Research:Understanding Wikipedia Moderation Policies

Created: 19:45, 13 April 2022 (UTC)
Collaborators: Kejsi Take, Rachel Greenstadt
Duration: 2022-03 – 2022-12
Tags: flagged_revisions, torblock, extension, policy, interview

This page documents a research project in progress.
Information may be incomplete and change as the project progresses.
Please contact the project lead before formally citing or reusing results from this page.


The main goal of our study is to understand the moderation policies and existing systems for editing Wikipedia. By understanding what problems users encounter when editing Wikipedia, we aim to develop suggestions and recommendations for improving the current systems or for using them as the basis for building new moderation systems.

Anonymity-Seeking Users. Anonymity-seeking users who rely on Tor are systematically blocked from contributing to numerous user-generated content sites, allegedly due to concerns about vandalism. However, this type of blanket ban may have unintentionally damaged the growth of these communities by excluding not only bad-faith actors but also good-faith contributors whose contributions might prove valuable. As researchers in the field of privacy, we are interested in finding ways to preserve the ability to seek anonymity, ensuring free expression and equitable participation, while not neglecting the threat of abuse and vandalism. Prior work shows that there is indeed potential value in allowing anonymous contributions and in taking a more measured approach to moderating content from untrusted sources[1].
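To make the blanket-ban mechanism concrete: TorBlock-style blocking generally works by comparing an editor's IP address against the published list of Tor exit nodes. The short Python sketch below is our own illustration, not the TorBlock extension's actual code; it assumes the Tor Project's public bulk exit list endpoint and uses a documentation-only example address.

    import requests

    # Illustrative sketch of a TorBlock-style check: compare a client IP against
    # the Tor Project's published list of exit-node addresses. This is a
    # simplification for exposition, not the extension's implementation.
    EXIT_LIST_URL = "https://check.torproject.org/torbulkexitlist"

    def fetch_tor_exit_ips():
        resp = requests.get(EXIT_LIST_URL, timeout=30)
        resp.raise_for_status()
        # The endpoint returns one exit-node IP address per line.
        return {line.strip() for line in resp.text.splitlines() if line.strip()}

    def is_tor_exit(client_ip, exit_ips):
        # A blanket ban would refuse edits whenever this returns True.
        return client_ip in exit_ips

    exit_ips = fetch_tor_exit_ips()
    print(is_tor_exit("203.0.113.7", exit_ips))  # 203.0.113.7 is a documentation-range address

A real deployment would refresh the list periodically and distinguish reading from editing; the point here is only that the check is coarse, catching good-faith and bad-faith Tor users alike.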

Moderation Policies. Current moderation policies and practices do not allow anonymity-seeking users to contribute. We aim to work closely with members of the Wikipedia community to better understand the stakeholder concerns behind several moderation policies, in the hope of finding an alternative path toward supporting anonymous contributions on Wikipedia. More specifically, we aim to study Flagged Revisions (FlaggedRevs) as a tool that aids moderation and could also serve as a pre-review check for anonymous contributors[2].
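For context, FlaggedRevs exposes per-page review state through the MediaWiki API, which suggests one way a pre-review check could be inspected or built on programmatically. The Python sketch below is our illustration, assuming German Wikipedia (a wiki that deploys FlaggedRevs) and an arbitrary example page title; the prop=flagged query module is the one the extension adds to the API.

    import requests

    # Query the FlaggedRevs review status of a page on a wiki that deploys
    # the extension. "Berlin" is an arbitrary example title.
    API_URL = "https://de.wikipedia.org/w/api.php"

    params = {
        "action": "query",
        "titles": "Berlin",
        "prop": "flagged",      # query module added by the FlaggedRevs extension
        "format": "json",
        "formatversion": "2",
    }

    resp = requests.get(API_URL, params=params, timeout=30)
    resp.raise_for_status()

    for page in resp.json()["query"]["pages"]:
        flagged = page.get("flagged")
        if flagged is None:
            print(f"{page['title']}: no reviewed (stable) version")
        elif "pending_since" in flagged:
            print(f"{page['title']}: edits pending review since {flagged['pending_since']}")
        else:
            print(f"{page['title']}: latest revision is reviewed (stable revision {flagged['stable_revid']})")

A pre-review workflow for anonymous contributors could hold their edits in the same pending state until a reviewer accepts them.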

As a side goal of our project, we also aim to make moderation systems more inclusive of anonymity-seeking users.

Methods

We are actively recruiting Wikimedia/Wikipedia members for a screening survey and a potential follow-up interview, to gain a better understanding of several moderation policies on Wikipedia. In particular, we are interested in contacting those who have knowledge of either the Flagged Revisions extension or the TorBlock extension. We have obtained approval from the New York University IRB office to conduct our research. We are looking for active moderators and administrators of communities that employ the Flagged Revisions extension as a moderation tool, as well as developers of the Flagged Revisions or TorBlock extensions, so that we can send out recruitment emails with a screening survey to determine their eligibility to voluntarily participate in an interview. We also plan to submit recruitment posts on various public forums to recruit interviewees on a broader scale. We intend to interview at least 20 participants.

    • The resulting paper and presentation materials from our study will be open access. The paper will be published on arXiv.

If you are reading this page and are interested in helping with our study, please feel free to fill out the screening survey to determine your eligibility for the interview: https://nyu.qualtrics.com/jfe/form/SV_aaRYQMZd29tCj5A.

Timeline

Our research began right after we obtained IRB approval on 3/28/2022. We plan to recruit participants and conduct interviews until the sample size is sufficiently large (at least 20 participants). Based on our communications with Wikimedia/Wikipedia members, we will plan the next steps of our research.

Policy, Ethics and Human Subjects Research

This study was reviewed by the New York University IRB office (study number IRB-FY2022-6346, approval date: 3/29/2022). If you have any questions or need further help, please contact the IRB office at (212) 998-4808 or email ask.humansubjects@nyu.edu.

References
