CivilServant's Wikimedia studies

CivilServant (now operating as Citizens and Technology Lab) is a nonprofit that collaborates with online communities to test ideas that help those communities flourish. We are currently working with several Wikipedia communities to test design interventions aimed at retaining new editors and enhancing the experience and motivation of experienced editors. These projects are also a first step towards what we hope will be a broader effort to help Wikipedians answer questions that are important to their communities.

CivilServant

CivilServant is committed to the principles of “Citizen Behavioral Science”: using research tools that can identify designs that serve online communities well, while working with those communities to ensure that the experimental process is open, transparent, and driven by the insights and needs of each community.[1]

CivilServant is an outgrowth of Nathan Matias’ PhD research with Ethan Zuckerman at the MIT Media Lab. CivilServant was initially incubated as a non-profit by the citizen media organization Global Voices, which has a history of supporting people around the world in adding indigenous language material to the web; it is currently in transition to becoming a project of Cornell University. CivilServant is funded through donations from individuals and foundations. It does not take funds from for-profit corporations.

In the past three years, CivilServant has worked with communities on Reddit to test ideas for preventing harassment, managing misinformation, and managing conflict in political discussions.

Wikipedia projects

CivilServant is currently collaborating on projects with several non-English language Wikipedia communities. These projects are financially and administratively independent of the Wikimedia Foundation, although we do have a Memorandum of Understanding with WMF and collaborate with individual WMF employees or teams in areas where our community-shaped work needs advice or support from the Foundation. We expect to run our research systems on the Wikimedia Cloud Services infrastructure.

Initial advisors to CivilServant's research with Wikipedians include Aaron Halfaker (Wikimedia Research), María Sefidari (Elected WMF Board Member), and Dariusz Jemielniak (Elected WMF Board Member).

Gratitude prompts

In this study, we aim to test whether prompts to thank other Wikipedia contributors can enhance editors' experience and motivation. The basic design of the study is described below; the exact design - including the treatment conditions and outcome measures - was developed in collaboration with four partnering Wikipedia communities.

In this research, we planned to test two kinds of appreciation messages. The first system, "Thanks", allows readers to privately thank a contributor for a specific contribution on a Wikimedia project, such as a new article or a spelling correction; a notification of appreciation is then sent to the contributor. A second system, "Love", allows any authenticated reader (someone with a username and password on the site) to post a personalized message of appreciation to a public page that lists all of the appreciation the person has received. In both cases, we will randomly assign participants to receive a prompt to express appreciation for others' contributions and observe the outcomes for senders and receivers. (In the final design we chose to test only the impact of giving and receiving "Thanks".)

The primary outcome of interest is editor productivity (i.e., whether editors make more contributions if a page they edited carried a gratitude prompt). Other possible measures include readers’ and editors’ attitudes towards other Wikipedians (measured by survey) and cascade effects (i.e., whether receivers of gratitude go on to send gratitude messages themselves).
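
As a rough illustration of this design (and not the study's actual implementation), the sketch below shows one way to pair deterministic random assignment with the primary productivity comparison in Python; the hash-based assignment scheme and the field names are assumptions made for the example.

  import hashlib
  from statistics import mean

  def assign_condition(username, salt="thanks-study"):
      """Deterministically assign an editor to the 'prompt' or 'control' arm
      by hashing the username with a study-specific salt (hypothetical scheme)."""
      digest = hashlib.sha256(f"{salt}:{username}".encode()).hexdigest()
      return "prompt" if int(digest, 16) % 2 == 0 else "control"

  def productivity_difference(editors):
      """Primary outcome sketch: difference in mean post-study edit counts between
      arms. Each item is assumed to look like {"username": ..., "edits_after": int}."""
      arms = {"prompt": [], "control": []}
      for editor in editors:
          arms[assign_condition(editor["username"])].append(editor["edits_after"])
      return mean(arms["prompt"]) - mean(arms["control"])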

Retaining newcomers

Wikipedia’s mission to provide a free encyclopedia depends not only on a motivated corps of experienced editors, but also on the ongoing recruitment and retention of new editors. WMF has made newcomer retention one of the initiatives of its Growth Team. In independent but complementary work, we plan to collaborate with non-English language Wikipedias to test their ideas for retaining newcomers. One such study would test the effectiveness of French Wikipedia's welcome message.

In planning our research projects, we initially anticipated testing the effectiveness of Snuggle at retaining new editors while also enhancing the experience of established editors. One of Snuggle's advantages is its use of machine learning to help identify "goodfaith" newcomers. While the communities we have spoken to are strongly interested in mentoring tools and find Snuggle promising, we are now exploring a modified version of Snuggle or other applications of machine learning to help communities identify newcomers who may most benefit from mentorship or other kinds of support.
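
For illustration only, the sketch below queries the public ORES API for the "goodfaith" score of a single revision; the wiki code and revision ID are placeholders, and the abbreviated response shape shown in the comments follows the ORES v3 API rather than anything specific to our studies.

  import requests

  def goodfaith_probability(wiki, rev_id):
      """Return the ORES 'goodfaith' probability for one revision."""
      url = f"https://ores.wikimedia.org/v3/scores/{wiki}"
      response = requests.get(url, params={"models": "goodfaith", "revids": rev_id})
      response.raise_for_status()
      score = response.json()[wiki]["scores"][str(rev_id)]["goodfaith"]["score"]
      # score is shaped roughly like:
      # {"prediction": true, "probability": {"true": 0.97, "false": 0.03}}
      return score["probability"]["true"]

  # e.g. treat an edit as likely good-faith if the probability is high:
  # if goodfaith_probability("frwiki", 123456789) > 0.9: ...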

WikiLovesAfrica 2020 recruitment

WikiLovesAfrica[2] is an annual photography contest in which anyone across Africa can contribute media to Wikimedia Commons. In 2020, CivilServant worked with WLA to test messages that recruit people to participate. In the future we may also test messages that guide people to add accurate metadata to the images they upload.

Impact Stats

CivilServant collaborated with Bangla, German and Slovak Wikipedias to test how knowledge of one's "impact stats" (information about how many views a user's edited pages have received) may motivate newcomers to continue to contribute. That study began fielding in December 2020 but was put on hold in February 2021.
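
To give a sense of where such "impact stats" can come from, the sketch below totals recent views for a set of articles using the public Wikimedia Pageviews REST API; the project, article titles, and date range are placeholders, and this is not the study's actual instrumentation.

  import requests

  PAGEVIEWS_API = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

  def article_views(project, title, start, end):
      """Sum daily views for one article between start and end (YYYYMMDD)."""
      url = f"{PAGEVIEWS_API}/{project}/all-access/user/{title}/daily/{start}/{end}"
      response = requests.get(url, headers={"User-Agent": "impact-stats-sketch"})
      response.raise_for_status()
      return sum(item["views"] for item in response.json()["items"])

  def impact_stat(project, titles, start, end):
      """A newcomer's hypothetical 'impact stat': total views across pages they edited."""
      return sum(article_views(project, title, start, end) for title in titles)

  # impact_stat("de.wikipedia.org", ["Berlin", "Hamburg"], "20201201", "20201231")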

Partnering with Wikipedia communities

In 2018 we reached out to a number of Wikipedia communities we identified as having both ORES integration (necessary for using AI technology to identify goodfaith newcomers) and enough newcomers each month to give the study adequate statistical power. Those Wikipedias were: Arabic, French, Persian, Polish, Portuguese, Russian and Spanish. The analysis we used to select those Wikipedias is described here: CivilServant Initial Data Analysis For Community Outreach.
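
For readers curious about the reasoning behind the newcomer-volume requirement, the sketch below shows a simple power calculation of the kind used for such selections; the retention rates are illustrative assumptions, not figures from the linked analysis.

  from statsmodels.stats.power import NormalIndPower
  from statsmodels.stats.proportion import proportion_effectsize

  # Illustrative assumptions only:
  baseline_retention = 0.10  # share of newcomers still editing after the study window
  treated_retention = 0.13   # hoped-for retention under the intervention

  effect_size = proportion_effectsize(treated_retention, baseline_retention)
  newcomers_per_arm = NormalIndPower().solve_power(
      effect_size=effect_size, alpha=0.05, power=0.8, alternative="two-sided")
  print(f"Newcomers needed per arm: {newcomers_per_arm:.0f}")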

In 2019 we are broadening our outreach to any Wikipedia community or project that has an idea it wants to test. As part of that outreach, we invite interested Wikipedians to attend a Research Summit in Stockholm on August 19th.

In each collaboration we work with a liaison in that community who acts as our guide and research partner. If you are interested in being one of our liaisons, we invite you to read more about the role and, if still interested, to contact CivilServant's research manager Julia Kamin.

Transparency, Privacy, and Open Knowledge: How We Use Data

We approach every study with the expectation that it could contribute to scientific knowledge. For that reason, in addition to community partnership and review, we also ask a university ethics board (IRB) to review our research procedures.

In every study, we work to integrate strong privacy protections alongside open, transparent, and accountable research. While we sometimes introduce even greater privacy protections in especially sensitive cases, here are some of our common practices for most studies (as of Jan 21, 2019):

  • Planning:
    • We always consult with Wikipedia communities on the risks and benefits of our research and data collection before starting a study
  • Data storage:
    • When conducting interviews or surveys, we store contact information separately from personal responses
    • We store and analyze survey data on an access-controlled filesystem that is encrypted at rest
    • We comply with GDPR requests for removal of personal information
    • A year after a study concludes, our standard practice is to delete contact information associated with the study
  • Data publication
    • To protect the anonymity of our participants, we will never publish names or contact information and will never share this information with third parties
    • We never publish open data of survey answers that can be linked back to individual users
    • Before deciding if and how to publish open data, we consult with our community liaisons and carry out threat modeling to ensure that our plan adequately manages the risks to participants
    • When publishing data for research transparency, we often use the following practices to reduce the risk of de-anonymization (a short sketch of these steps follows this list):
      • Generating study-specific unique identifiers for accounts
      • Omitting all information that is not essential to the analysis
      • Publishing aggregate counts rather than individual observations (for example, publishing the number of edits rather than each individual edit)
      • Reducing the precision of measures (for example, reporting registration year rather than a registration timestamp)
      • Reporting aggregate categories rather than exact values (for example, reporting someone as "experienced" rather than their precise count of previous edits)
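
As a concrete, purely illustrative sketch of the practices above, the example below applies them to a hypothetical participant table; the column names, bins, and salt are assumptions, not our actual release pipeline.

  import hashlib
  import pandas as pd

  def deidentify(df, salt):
      """Apply the de-identification steps above to a hypothetical participant table."""
      out = pd.DataFrame()
      # Study-specific unique identifiers instead of usernames
      out["participant_id"] = df["username"].map(
          lambda name: hashlib.sha256(f"{salt}:{name}".encode()).hexdigest()[:12])
      # Reduced precision: registration year rather than a timestamp
      out["registration_year"] = pd.to_datetime(df["registration_ts"]).dt.year
      # Aggregate categories rather than exact counts of previous edits
      out["experience"] = pd.cut(df["prior_edits"],
                                 bins=[-1, 10, 500, float("inf")],
                                 labels=["new", "intermediate", "experienced"])
      # Aggregate counts rather than individual edit-level observations
      out["edits_during_study"] = df["edit_count_during_study"]
      # Every other column is omitted as non-essential to the analysis
      return out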


Project Updates

People

This project's team currently includes:

Funding For CivilServant's Work with Wikipedia

This project was made possible through the support of a grant from Templeton World Charity Foundation, Inc., after an application process and review by academic reviewers. The foundation is supporting this grant because it is interested in our research questions and in helping CivilServant grow its core operations. Another goal of our grant is to develop software and processes that could help Wikipedians do future research on the questions that matter to you. As an independent research project, CivilServant's views are our own and do not necessarily reflect the views of Templeton World Charity Foundation, Inc. or the Wikimedia Foundation.

References

  1. Matias, J. N., & Mou, M. (2018, April). CivilServant: Community-Led Experiments in Platform Governance. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 9). ACM.
  2. "Celebrating Africa on Wikipedia". Wiki Loves Africa. Retrieved 2018-11-24.