The Wikimedia Foundation's Anti-Harassment Tools team is conducting research in preparation to design and build a user reporting system that makes it easier for people experiencing harassment and other forms of abuse to provide accurate, actionable information to the appropriate moderation channel.

This page documents a feature the Wikimedia Foundation's Anti-Harassment Tools team has committed to build.

🗣   We invite you to join the discussion!
🛠   Track in Phabricator at T166812.

Our goal is to provide a reporting system for Wikimedia communities that puts less stress on users who are submitting the reports while simultaneously generating higher-quality reports.

Community input and feedback will be essential for the success of our work. Join us at the User reporting system consultation 2019.

About


Reporting harassment


When harassment or abuse happens between two or more Wikimedia users, most wikis ask the users in conflict to resolve the issue themselves by discussing it on each other's talk pages. When this fails to resolve the situation, users can report the misconduct to wiki ‘moderators’ (other experienced users, administrators, and in extreme cases stewards or Wikimedia Foundation staff) in a variety of ways: on wiki talk pages, via email, or via IRC or another off-wiki communication channel. Most wikis have established noticeboards for reporting cases that require attention, or suggested email lists for sensitive reports that require privacy. The identified ways to report harassment include:[1]

Triaging reports


When a user receives or discovers a report of harassment, the first step is usually to triage the report: "Can I, as the recipient, facilitate the moderation, or is someone else better suited?"

We have found that for most simple cases of harassment, the user who receives the report will act as a mediator, working with both parties to define the problem and agree on a path forward. In cases where the receiving user does not want to become involved, they will refer the reporter to another admin, the WMF, a noticeboard, a policy page, ArbCom, a local affiliate, or law enforcement, depending on the severity of the incident.[1][2]

Triaging these reports can be time-consuming and frustrating, especially in areas where reports are commonly misfiled or mislabeled.

Routing reports to the appropriate moderation channel


Because there are a variety of places to report harassment, and because user misconduct can be closely related to content problems (neutrality, paid editing, edit warring, etc.) that have their own reporting workflows, it can be difficult to determine exactly where, or to whom, to report user misconduct. The Anti-Harassment Tools team believes the software should do the heavy lifting: users should not need special knowledge to find the most appropriate place to request assistance. This will help both reporters and moderators.

Our current direction for the user reporting system is a system that routes the reporter to the appropriate channel based on their responses to a short series of questions. For example, if someone reports that they have been unjustly reverted, the system might recommend they discuss it on the article's talk page; but if they report that they have been physically threatened, the system would recommend they email Trust & Safety for immediate attention. This system would be customizable for every wiki and would be able to evolve over time.
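To make the routing idea more concrete, the sketch below (in Python, purely for illustration) shows how per-wiki routing rules might map a reporter's answer to a destination channel. The question text, channel names, and data structures here are assumptions for this example only; nothing about the actual system has been designed or decided.

```python
# Illustrative sketch only: the questions, channel names, and data layout below
# are hypothetical examples, not the Anti-Harassment Tools team's actual design.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Channel:
    """A destination a report can be routed to (noticeboard, email queue, etc.)."""
    name: str
    instructions: str
    private: bool = False


@dataclass
class RoutingRule:
    """Maps one answer to the triage question onto a destination channel."""
    answer: str
    channel: Channel


@dataclass
class WikiRoutingConfig:
    """Per-wiki configuration: each community defines its own question and channels."""
    question: str
    rules: List[RoutingRule] = field(default_factory=list)
    fallback: Optional[Channel] = None

    def route(self, answer: str) -> Optional[Channel]:
        """Return the channel matching the reporter's answer, or the fallback."""
        for rule in self.rules:
            if rule.answer == answer:
                return rule.channel
        return self.fallback


# Hypothetical configuration for a single wiki.
talk_page = Channel("Article talk page",
                    "Discuss the disputed edit with the other editor.")
noticeboard = Channel("Administrators' noticeboard",
                      "File a public report for community review.")
trust_safety = Channel("WMF Trust & Safety",
                       "Email Trust & Safety for urgent, private handling.",
                       private=True)

config = WikiRoutingConfig(
    question="What best describes the problem you are reporting?",
    rules=[
        RoutingRule("content dispute", talk_page),
        RoutingRule("persistent harassment", noticeboard),
        RoutingRule("physical threat", trust_safety),
    ],
    fallback=noticeboard,
)

print(config.route("physical threat").name)  # -> WMF Trust & Safety
```

Because the configuration is plain data, each community could adjust its own questions and destination channels without code changes, which matches the goal of a system that is customizable for every wiki and can evolve over time.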

Updates


February 19


The User reporting system consultation 2019 has begun! Visit us at Community health initiative/User reporting system consultation 2019 to learn about the process we are using to include a wide array of voices in gathering requirements for a system that lets users report user misconduct and harassment.

January 16, 2019


2019 is off to a great, productive start! Our first research project is wrapping up early and will be shared here as soon as we can. We've also completed our Step 0: a meeting with all Wikimedia Foundation employees who care about this project (no surprise, it was a lot of people!). The discussions we had were immensely productive: we talked about escalation paths, prioritized the most important aspects of a reporting system, and enumerated a long list of things to discuss further on wiki with Wikimedians.

Our staff All Hands is in two weeks, after which we will kick off this process by inviting folks like you to discuss the concept and direction of an ideal user reporting system for our different wiki communities. We're aiming to have decisions made by June 2019 so we can start building prototypes in July. It's going to be a fast six months, and I look forward to arriving wherever we end up, together with you! ⛵️

December 20


I'm excited to say that we have a plan to build the User reporting system in 2019! 🚀

Our plan roughly has four overlapping stages:

  1. Research: Our team's researcher Claudia published her report on current reporting workflows last month and is currently working on a comparative analysis rubric of user-to-volunteer reporting systems, including Wikipedia.
  2. Discuss: This research should complement our public discussions about harassment reporting, which will start in January/February. We will reach out to existing Wikimedians, former users who no longer participate due to toxicity, and newcomers, to make sure we're addressing real problems with realistic solutions. Many of these discussions will happen here on-wiki. Additional research topics will emerge from these discussions.
  3. Design: When we have a solid understanding of what tool(s) or system(s) need to be built, we will begin creating several designs or prototypes to illustrate and test the leading ideas. We'll share these publicly for feedback to determine the best possible solutions.
  4. Build: When we're confident that we've identified the best possible ideas for allowing users to report other users for malicious behavior, our software development team will build the new tools.

This process will take the majority of 2019 and our rollout will likely occur in 2020. (And please note that this paragraph is an extremely abridged version of our plan.) We're looking forward to getting started as soon as we're back from our holiday break. ☃️

November 26

[Image: Snapshot of current reporting systems on English Wikipedia, November 2018]

Our team's Design Researcher has finished a report on current reporting workflows, which includes a diagram on the same topic. It defines some key terms, such as important stakeholders, types of reporting systems, and concerns outside of the mechanisms of reporting. The bulk of the report details the ways and places where editors can make reports, and highlights some valuable features as well as areas for improvement. You can find the link to the files in § Research on this page.

Our Trust & Safety department is continuing to design the first round of on-wiki consultations, as well as working to explore what other research projects would be beneficial to conduct as we start the community consultation process.

November 8


Conversations are happening about how to get started on this important project in 2019. Our team's new Design Researcher will soon begin research into current reporting workflows across our wikis, and our Trust & Safety department will design an on-wiki consultation process that includes a wide array of voices, so we can build consensus on what to build before our designer and developers begin writing software.

October 23


We had some fruitful discussions with community members at WikiCon NA 2018. Key takeaways include designing for the users who make reports, reconciling our culture of transparency with the need to maintain privacy in reporting, and making sure future systems fall under the purview of governing groups that have their communities' respect and trust to handle sensitive matters. We also discussed signposting: any new reporting system needs to be easy for users to find. Lastly, we discussed which voices we want to consult as we go about building these new systems, and how we might best reach them.

I've also gone ahead and cleaned up the "Things to Change" section, and hopefully it is easier to follow.

August 15


I've expanded § Requirements with five more sources: results from a 2018 survey about English Wikipedia's AN/I noticeboard, results from quantitative analysis of AN/I, the Harvard Negotiation and Mediation Clinical Program's analysis of AN/I, a summary of IdeaLab submissions on the topic of reporting harassment, and results from the 2017 Community Engagement Insights survey. I plan to add the 2018 IdeaLab submissions and the 2018 Insights survey results as soon as they are ready.

The "Things to Keep" section has not grown, mostly because the sources I'm adding do not solicit feedback about what works well but rather about what does not work well and what should change. The list of "Things to Change" is growing and will soon need to be sub-categorized in order to better understand common areas of frustration or opportunity. A trend that is already starting to crystalize is "who is responsible for handling harassment, and how can the software better help them make faster, more accurate decisions that stand up to scrutiny?" This is not an easy question — Wikipedia has no deadlines and while administrators have the tools to block users and slightly more social capital than non-administrators, there is no specific group of people responsible for making sure reports of misconduct are handled. (Short of ArbComs, who only handle severe cases.) This is also made difficult by the fact that moderation is often a thankless, unrewarding job that can attract unwanted attention. How can we find, train, and support moderators while still holding them accountable?

August 9


I've added the section § Triaging reports above based on our experience reading public cases of harassment, as well as the Worksheet #1 notes from our Wikimania roundtable. Participating on a Wikimedia wiki requires dedication, so unsurprisingly we found that the most common way to triage a report of harassment was for the receiving user to handle it themselves (11), followed by referring to another admin or trusted user (9), the WMF (6), or a public noticeboard (5). One person each mentioned referring to a policy page, ArbCom, a local affiliate, or law enforcement.

Also of interest from the first worksheet was the frequency of email in reporting harassment: 19 roundtable participants mentioned that they receive reports via email, 11 mentioned talk pages, 8 IRC, 6 social media, 5 in person, 4 a listserv, noticeboard, or Telegram, and 2 a telephone call or OTRS. It's becoming crystal clear that this reporting system will need to provide a way to report harassment in private. The big question will be how to allow for private reports that treat all parties with dignity inside an environment of transparency and accountability. No small feat!

August 3, 2018

[Image: Slides from our Wikimania roundtable]

This page has been updated with all the information to date about this project, most notably results and takeaways from a roundtable our team conducted in Cape Town, South Africa at Wikimania 2018. The anonymized raw notes can be found at Community health initiative/User reporting system/Wikimania 2018 notes, and my synthesis can be found lower on this page at § Requirements. At the roundtable we asked the 25 participants (from 8 different language Wikipedias) to take notes and discuss how harassment is reported on their wikis; we are still processing some of these notes and will add them here in the coming weeks.

Most significantly, at the roundtable we asked participants what currently works well about reporting harassment and what does not. We hope to build out a list of things to keep and things to change about reporting harassment on Wikimedia before we begin designing the new reporting system. It is important to us that we come to an agreement with most participants about these lists before proceeding, so that the solutions we design will solve real problems and be realistic to implement on all projects over the coming years.

It is also important to us that we include the voices of a wide variety of people in this process: active Wikimedia users, newcomers, people who no longer contribute due to harassment, and external experts. As not everyone will feel comfortable participating here on Meta Wiki, my team will anonymize, transcribe, and summarize all findings throughout this process.

Thank you for reading, thank you for helping us define an ideal solution, and please let us know your thoughts on the talk page. ✌️

Previous


Our team (the WMF's Anti-Harassment Tools team) began looking into reporting channels in late 2017, knowing we would be working on an improved system in 2018. This early work focused on English Wikipedia's Administrators' noticeboard for incidents (aka AN/I), in the form of both qualitative analysis and a survey for participants. The results are linked below in § Research. Given the amount of work required for blocking tools, this reporting system project was prioritized for the second half of 2018.

In April 2018 the Anti-Harassment Tools team met in Ft. Lauderdale, Florida to discuss our 2018 commitments and agreed on an initial direction: a modular 'routing system' that is customizable per wiki and directs users who want to report an incident to the most appropriate channel. We agreed that introducing any new channels without consolidating or sunsetting old channels would be irresponsible. As we progress through this project we hope to move tactically and make small improvements in small steps, rather than introduce large, disruptive changes.

Requirements


There are elements of the current reporting systems that work well and that we want to keep, and there are parts that are problematic and should change. Before we design the new system we would like to reach agreements on what to keep and what to change so we can design the best possible reporting system.

Things to keep

  1. The ability to report directly to WMF Trust & Safety if desired or needed.[1]
  2. A system which assumes good faith, for the moderators, victims, and accused.[1]
  3. The authority of local projects.[1]
  4. The ability to report and discuss in one’s native language.[1][3]
  5. Accountability for moderators.[1]
  6. Training of moderators and administrators (on wikis that have training).[1]
  7. A system that allows for some public reports, such as noticeboards for quick, straightforward, public cases.[1][2][3]
  8. A system that allows for some reports to be private.[1]
  9. A public system of documenting private cases.[1]
  10. A scalable group of moderators, so many people can act when needed.[1][3]
  11. A variety of different responses, appropriate to different situations.[1]
  12. The ability to reach out to a specific admin or moderator if so desired.[1][4]
  13. If you'd like to add to this list, please discuss on the talk page or submit a private submission via email to nkohli@wikimedia.org.

Things to change


Visibility


How easy is it to find these dispute resolution systems?

  1. The entry point to “report harassment” is not currently visible or findable.[1][3][2][5]
    1. We should make it easier for people who have been harassed to come forward.[3]
    2. The current system is not proactively described to users. People could be educated about how to handle incivility before they encounter a problem.
    3. There should be a way to mark certain conversations as harassment or problematic.[3]

Privacy and transparency


How do we balance the culture of transparency with needs for privacy and security?

  1. Not all reports should be 100% public.[1][2][3][5]
    1. Alternatives to public on-wiki reporting channels are not emphasized enough.
    2. Trolls and passersby can derail public discussions.[2]
    3. Publicly discussing the incident with the person who harassed you causes further harm.[2][5]
    4. Private reports are not always appropriately documented on-wiki. The system should allow for private reports to be appropriately transparent. Consider a system where more than one moderator knows about a private report (a rough illustration follows this list).
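To make the last item above more concrete, here is a minimal sketch, in Python and purely for illustration, of how a private report could be shared among several moderators while still leaving a redacted public record. The class names, fields, and redaction format are assumptions for this example only, not a proposed design.

```python
# Illustrative sketch only: class names, fields, and the redaction approach are
# hypothetical and do not represent any committed design.
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class PrivateReport:
    """A report visible only to the reporter and the moderators assigned to it."""
    case_id: str
    reporter: str
    reported_user: str
    details: str                                          # full, private description of the incident
    moderators: List[str] = field(default_factory=list)   # more than one, so no single moderator acts alone

    def can_view(self, username: str) -> bool:
        """Only the reporter and assigned moderators may read the full report."""
        return username == self.reporter or username in self.moderators

    def public_log_entry(self) -> str:
        """A redacted, on-wiki record so private cases remain accountable
        without exposing the reporter or the details."""
        return (f"Case {self.case_id}: a private report was received on "
                f"{date.today().isoformat()} and is being handled by "
                f"{len(self.moderators)} moderators.")


# Hypothetical usage: two moderators share responsibility for one private case.
report = PrivateReport(
    case_id="2019-0042",
    reporter="ReporterExample",
    reported_user="AccountExample",
    details="(kept private)",
    moderators=["AdminA", "AdminB"],
)
print(report.can_view("AdminA"))        # True
print(report.can_view("RandomReader"))  # False
print(report.public_log_entry())
```

The intent of such a shape is that no single moderator holds a private case alone, while the public log keeps private handling accountable without exposing the reporter or the incident details.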

Documentation


How should we document precedent, and how do we lay out definitions?

  1. The current systems are not consistent and do not draw on precedent from previous cases. Not every case is treated the same.[1][2][5]
    1. It is currently difficult to search and find similar cases.[2]
    2. There are gaps between setting precedent and updating necessary policies.[5]
  2. The current documentation about harassment is not sufficient, including:[1]
    1. Definitions and examples of harassment
    2. Who is responsible for moderating harassment (including users, administrators, and ArbCom)
    3. How to report harassment and/or other forms of user misconduct[2][5]

Ease of use


How do we make it easier for users to report harassment, and easier for administrators to act on these reports?

  1. Most current reporting systems do not use a consistent form, making it difficult for reporters to know what type of information moderators will need to properly address their issue.[1][2][5]
  2. The concept of w:WP:Boomerang distracts from many cases and makes moderators skeptical of all reporters. The system should assume good faith of all parties.[1][2]
  3. It is time-consuming to gather the facts of a case, such as important diffs. Consider building a system that allows for collaborative marking of evidence.[1][2][5]
  4. It should be easier to flag an article or user to moderators privately.[1][3]
  5. There are unwritten social rules for participating in dispute resolution and on noticeboards.[2][5]
  6. Public cases that have lengthy comment threads are difficult to navigate and follow.[5]
  7. Users must manually classify the problems they are facing.[5]
  8. Chatting synchronously is difficult for many users as IRC is limiting. Consider building a chat system connected to Wikimedia accounts.[1]

Efficiency


How can we make the process of acting on reports better?

  1. The current workflows for cross-wiki or global dispute resolution are cumbersome.[1][5]
  2. Initial responses can take too long and should be quicker.[1][3][5]
  3. Case closure can take too long and should be quicker.[2][5]
  4. Reports made to functionaries, the WMF, affiliates, or other volunteers are often not resolved.[4][3][6]
  5. Responses to reports made to functionaries, the WMF, affiliates, or other volunteers are often not useful or satisfactory.[4][2]

Responsibility


Who should take responsibility, over what domains, and at what stages, of the reporting process?

  1. The system should not be designed in a vacuum; rather, it should involve the community, external experts, and digital rights leaders.[1]
  2. The volunteer moderators need more training on how to make appropriate decisions and on self-care. These volunteers should be trained in a way that enables them to train each other.[1][3]
  3. The current system requires one moderator to act (resolve a case, set a block, etc.), which often makes that moderator a target of further harassment. Consider a system that uses a committee account or requires multiple users to respond to the harasser.[1][3]
  4. Not all cases need a moderator; in many situations users could mediate their own problems if they had useful advice.[3][2]
  5. Moderator shortages:
    1. There is no user group responsible for triaging reports; it is an adjunct responsibility for administrators. Consider clerking.[2][5]
    2. Sometimes moderators involved in an incident will be part of the decision-making process.[2]
    3. It can be difficult to find other users to mediate a situation.[1][3]
    4. Some issues would benefit from the opinions of multiple moderators, but there is no special marking for these cases.[5]
  6. Prioritization of cases is manual; there is no special marking for cases that require urgency (not including emailing emergency@).[5]
  7. Complex cases (multiple parties, long time frames, etc.) are time-consuming and unrewarding to moderate.[2]

Challenges


What are obstacles to consider when designing a reporting system?

  1. Clever users can "game the system" to their advantage.[2]
  2. Productivity (e.g. that a user edits frequently) is often used as an excuse to justify bad behavior.[1][2]
  3. Language barriers put non-native speakers of the wiki’s language at a disadvantage when reporting or defending themselves against a report of harassment.[1]
  4. There is no clear or consistent way to challenge an inadequately resolved report.[5]
  5. The current users in positions of authority on wikis do not fully understand the dangers of online harassment.[1][3]
  6. The current system does not encourage reporting harassment that you have observed but are not involved in.[1]

If you'd like to add to this list, please discuss on the talk page or submit a private submission via email to nkohli@wikimedia.org.

Research


See also: Research:User reporting systems

 
 
[Image: Enwiki Reporting system workflow]

The Wikimedia Foundation's Anti-Harassment Tools team wants to better understand these existing systems to identify any pain points or shortcomings that we can address with improved software. Our research will focus heavily on English Wikipedia, as the largest wiki community, but we aim to build a tool that can be used by any wiki of any size. Research completed to date includes:

See also


References
