Peer Review in Process now through July 3
The Learning & Evaluation team at the WMF will be actively reading and responding to comments and questions until July 3. Please submit comments, questions, or suggestions on the talk page!
On this page, you will find answers to: What are some long-term outcomes of Writing Contests? How many users are retained? How much content is high quality?
Across the 39 contests reporting, a total of 83 featured articles and 283 good articles were identified, but the process and standards for rating article quality differ from project to project.
"Good" and "Featured" articles, or their equivalents in other language Wikipedias, were examined among the articles submitted to contests. It is important to mention that not all contests aimed to add only high quality content or rate articles.[1] The range for number of good articles is zero to 201 and the range for the number of featured articles is zero to 21. Because we use the median, the average contest produced zero good articles and zero featured articles.[2]
“When you write regularly on Wikipedia, people do not normally notice your page, you are working in the dark. Sometimes an admin will take a look and see that everything is fine and won’t do anything. During the competition, there is a lot of focus. There is the event page with the list of all of the pages in the competition. The participants and other people like to go through the pages and see if they can help out. So it is a much more collaborative effort during the competition times.”
The average cost per higher quality article is $167 USD for both Good and Featured articles. However, it is important to remember that article quality rating standards differ from project to project.
We examined the cost per quality article. Looking at the five contests with both budget and quality ratings, we find that the average cost per good article is $167 USD, with a range of $7.50 to $1,000.00 USD,[3] while the average cost per featured article is also $167 USD, with a range of $5.40 to $1,000.00 USD.[4] Comparing quality ratings across projects, however, can be problematic, since the rubrics for rating articles as quality vary from project to project.
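As a rough illustration of the arithmetic, a contest's cost per rated article is its budget divided by the number of rated articles it produced. The figures below are placeholders, not the five contests' actual budgets, and averaging the per-contest ratios is only one way the summary figure could have been derived:

 # Hypothetical (budget_usd, good_article_count) pairs -- NOT the real contest data.
 contests = [
     (1200.00, 150),   # large output -> low cost per article
     (900.00, 3),      # small output -> high cost per article
     (500.00, 10),
 ]
 
 costs_per_article = [budget / count for budget, count in contests if count > 0]
 
 print("cost per good article by contest:",
       [f"${c:.2f}" for c in costs_per_article])
 print(f"average: ${sum(costs_per_article) / len(costs_per_article):.2f}")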
Participation, pages of text, and quality articles
While most inputs were not associated with good or featured articles, pages of text show a positive correlation with featured articles.
In the bubble graphs below, we examine participation, text pages, and whether articles were rated as "good" or "featured" articles. We see that the bubble sizes for featured articles tend to increase as text pages increase along the x axis, indicating a positive correlation.[5] While these data suggest a potential relationship between text pages and featured articles, they may be misleading due to the method used to mine good and featured articles. Specifically, we were not able to determine when articles received their quality ratings, only whether they had been rated.[6]
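To check the visual impression numerically, one could compute a correlation coefficient between text pages and featured articles per contest. A minimal sketch with invented values (the underlying per-contest figures live in the report's data tables, not here):

 from statistics import correlation  # Python 3.10+
 
 # Hypothetical per-contest values -- NOT the actual data behind the bubble graph.
 text_pages        = [5, 12, 40, 75, 160, 300]
 featured_articles = [0, 0, 1, 2, 6, 21]
 
 r = correlation(text_pages, featured_articles)
 print(f"Pearson r = {r:.2f}")  # a positive r would match the pattern seen in the graph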
“We use online writing contests as a ‘play area’ for trying to boost the commitment of Wikimedians. For example, you have Wikimedians who join your organization and you want to check if they are good at managing projects, we invite them to manage an online writing contest, often they are enthusiastic about this. If something goes wrong, it is not a big problem. It is a good way for us to test the level of commitment for new members of our volunteer community and give them ownership of projects.”
User:Kippelboy, Catalan Wikipedia
For each contest, we assessed user retention rates for up to three possible follow-up periods: during the first, third, and sixth month following the event start date.
A few things to remember with this section:
A user who makes five or more edits during the follow-up period is considered an active user. We examined editing activity both in terms of the number of edits made to any Wikimedia project and in terms of the number of edits made only to the contest project.
A survived user is defined as a user who makes at least one edit during the follow-up period. Again, we looked at editing activity both for any Wikimedia project and for the contest project alone. (A sketch of how both labels could be assigned appears after these methodology notes.)
To examine retention (whether participants were still editing after the event), we use the start date and time of contests. We do this in order to see whether contests engage editors after the contest begins, no matter how long the contest lasts.
For all the contests, we observed retention at 30 days, 3 months and 6 months following the start of the contest.
As outlined in the table below, for contests that had reached or passed the six-month follow-up point, we assessed retention at one, three, and six months following the contest start date. For more recent contests, retention was assessed at the furthest follow-up point they had reached, in place of the three- and/or six-month point.
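A minimal sketch of how the "active" and "survived" labels defined above could be assigned from a user's edit timestamps. The window boundaries approximate the footnoted definitions (a roughly 30-day window ending one, three, or six months after the contest start), and the function and field names are illustrative, not the actual analysis code:

 from datetime import datetime, timedelta
 
 def retention_status(edit_timestamps, contest_start, followup_months):
     """Classify one user at a follow-up point (1, 3, or 6 months).
 
     'active'   = five or more edits inside the follow-up window
     'survived' = at least one edit inside the window
     'lost'     = no edits inside the window
     """
     # Window: ends roughly `followup_months` months after the contest start,
     # counting back 30 days (an approximation of the calendar-month windows).
     window_end = contest_start + timedelta(days=30 * followup_months)
     window_start = window_end - timedelta(days=30)
 
     edits_in_window = sum(1 for t in edit_timestamps
                           if window_start <= t <= window_end)
 
     if edits_in_window >= 5:
         return "active"
     if edits_in_window >= 1:
         return "survived"
     return "lost"
 
 # Example: a user with a burst of edits right after the contest starts.
 start = datetime(2015, 1, 15)
 edits = [start + timedelta(days=d) for d in (1, 2, 3, 20, 25, 70)]
 print(retention_status(edits, start, followup_months=1))  # -> "active"
 print(retention_status(edits, start, followup_months=3))  # -> "survived"

In the report's terms, every "active" user also counts as "survived"; the percentages below list active editors first and then the additional users who survived with fewer than five edits.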
As illustrated by the green (new users) and blue (existing users) shaded columns in the graph below:
114 new and 627 existing users had met or passed the 1-month and 3-month follow-up periods
112 new and 588 existing participants had met or passed the 6-month follow-up period.
The dots illustrated in each of the green and blue columns in the chart below represent the number of editors who were active at (black dots) or survived to (white dots) each follow-up point.
For new users, survival rates by follow-up month ranged from 17.5% to 50.9% and demonstrated a drop after the first follow-up period, followed by a slight increase in the following period (i.e., 50.9% in Month 1; 17.5% in Month 3; 18.8% in Month 6).
For existing users, survival rates by follow-up month ranged from 77.7% to 93.3% and demonstrated proportionately lower retention at follow-up points further out from the contest (i.e., 93.3% in Month 1; 82.9% in Month 3; 77.7% in Month 6).
For the 30 writing contests with user data in this report, 114 new users participated.[10] At the one month follow-up stage, 36.8% of new users were retained as active editors.
One month after the event they participated in started, 41 new users (36.8%) were active editors and an additional 16 (14.0%) had made an edit.[11]
Three months after the event they participated in started, 10 new users (8.8%) were retained as active editors and an additional 10 (8.8%) as surviving editors.[12]
Six months after the event they participated in started, 16 new users (14.2%) were retained as active editors and an additional 5 (4.5%) as surviving editors.[13]
627 existing users participated in writing contests examined in this report. One month after their contest started, 93.3% of existing users survived and 89% were active editors.
One month after the event they participated in started, 556 existing users (89.0%) were active editors and an additional 29 (4.6%) had made at least one edit.[14]
Three months after the event they participated in started, 482 existing users (76.9%) were retained as active editors and an additional 38 (6.0%) as surviving editors.[15]
Six months after the event they participated in started, 434 existing users (73.8%) were retained as active editors and an additional 34 (5.8%) as surviving editors.[16]
"Most of the editors are young people, many are going an extra mile to edit Wikipedia, either going to an internet cafe or working on a five year old laptop.”
User:Shipmaster, Producer Prize on Arabic Wikipedia
The writing contest program leaders who reported are generally experienced, able to help other program leaders start similar programs, and proactive about producing blogs and other online resources related to these events.
We asked program leaders to share their practices around program replication and the shared learning resources related to the events they produce. This allows us to learn whether the program leaders were experienced in implementing contests. We are also able to learn how program leaders and others (e.g., chapters, press, bloggers) were promoting the results of the events, and whether they make resources available for others to use to produce their own events.
For the seven contests in which program leaders reported on replication strengths, 100% were run by experienced program leaders[17] who can help others in leading a similar program (see graph below). You can find which people or organizations ran these events in the data tables. All seven program leaders shared blogs or other online material, while it was less common to generate guides (43%) or printed resources (43%). Each of the seven programs generated shared knowledge in at least two ways.
↑ Comparing good and featured articles across Wikipedias is difficult because quality measures may differ significantly from one Wikipedia to the next. We used different category names based on how articles are classified as higher quality in different Wikipedias. All the programs represented focused on Wikipedia.
↑ Good articles: Median = 0, Mean = 11, SD = 39; Featured articles: Median = 0, Mean = 3, SD = 6
↑ For determining number of articles rated, both articles created and improved were assessed together rather than separately. In the future, we may need to distinguish between how many articles created were rated as good or featured and how many articles improved were rated as good or featured, in addition to better understanding quality ratings across different language Wikipedias.
↑ The 1-month follow-up window refers to the period ending one calendar month after the start date and counting back 30 days. This period often includes the contest period.
↑ The 3-month follow-up window refers to the period beginning two months after the event start date and ending three months after the event start date.
↑ The 6-month follow-up window refers to the period beginning five months after the event start date and ending six months after the event start date.
↑ New users were defined as usernames whose first use of the contest's project occurred no earlier than two weeks before the contest start date.
↑ 100% of the new active editors were editing on the targeted language Wikipedia project one month after their contest started.
↑ 90% of the new active editors were editing the targeted language Wikipedia project three months after their contest started.
↑ 81.3% of the new active editors were editing the targeted language Wikipedia project six months after their contest started.
↑ 97.8% of the existing active editors were editing on the targeted Wikipedia project one month after their contest started.
↑ 95.4% of the existing active editors were editing on the targeted project three months after their contest started.
↑ 93.8% of the existing active editors were editing the targeted project six months after their contest started.
↑ It was up to the person responding to the data collection to determine whether the program leader was "experienced".