What longer-term impacts do we see from the content generated by Other Photo Events? What do we observe about program participants?
We examined content production and quality improvement, user recruitment and retention, as well as measures of replication and learning as core outcomes across programs.
Read this page to learn how Other Photo Events delivers in three potential impact areas.
Of the 126,544 media files uploaded to Commons, more than 500 received Quality Image, Featured Picture, or Valued Image ratings.[1]
By February 2014, about six months after most of the events included in this report had ended, more than 500 ratings had been assigned across the 126,544 uploads examined.
Although images rated as Quality Images, Valued Images, or Featured Pictures appeared among the media uploaded for several of the contests, the number of rated images varied widely from contest to contest, which suggests that contests may have varying practices for rating and/or varying success in achieving these ratings. Contests with more media rated as Quality Images also tended to have more of their images used in articles.[5] (A computational sketch of these summary figures follows the notes below.)
Notes
1. Counts are inclusive such that an image rated both Quality and Featured is counted once in each category.
2. Number of rated photos per contest ranged from 0 to 1,314; Median = 4; Mean = 51; SD = 148.
3. Number of rated photos per contest ranged from 0 to 1,235; Median = 0; Mean = 32; SD = 184.
4. Number of rated photos per contest ranged from 0 to 61; Median = 0; Mean = 5; SD = 11.
5. Quality Images to unique images used: correlation coefficient = 0.56.
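For readers who want to reproduce these summary figures, the sketch below shows how the per-contest distribution statistics in the notes and the correlation in note 5 could be computed. The contest tallies in the example are placeholders, not the report's data; the actual per-contest counts are in the data tables.

```python
from statistics import mean, median, stdev

# Placeholder per-contest tallies for illustration only; the real figures
# come from the report's data tables.
quality_rated_per_contest = [0, 4, 12, 51, 210, 1314, 7]         # "Quality Images" per contest
unique_images_used_per_contest = [1, 9, 30, 120, 400, 2100, 15]  # images used in articles

def summarize(counts):
    """Range, median, mean, and standard deviation, as reported in the notes."""
    return {
        "min": min(counts),
        "max": max(counts),
        "median": median(counts),
        "mean": round(mean(counts), 1),
        "sd": round(stdev(counts), 1),
    }

def pearson_r(xs, ys):
    """Pearson correlation coefficient, the statistic cited in note 5
    (r = 0.56 for Quality Images vs. unique images used)."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(summarize(quality_rated_per_contest))
print(round(pearson_r(quality_rated_per_contest, unique_images_used_per_contest), 2))
```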
For each photo event, we assessed user retention rates at the furthest available follow-up point, meaning we assessed activity during the third, sixth, or twelfth month following the event.
A few things to remember about this section:
An active editor is defined as a user who made 5 or more edits during the follow-up period. We examined editing activity both in terms of edits made to any Wikimedia project and in terms of edits made only on Commons.
A surviving editor is defined as a user who made at least one edit during the follow-up period. Again, we looked at editing activity both for any project and specifically on Commons. (Both definitions are sketched in the example after this list.)
As outlined in the table below, for events which had reached or passed the 12-month follow-up point, we assessed retention 12 months following the event; for more recent events, retention was assessed at the furthest point possible, either the three- or six-month follow-up point.
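As a minimal sketch of how these two definitions apply, assuming a hypothetical tally of each participant's edits during a follow-up window:

```python
def classify_editor(edits_in_followup_window):
    """Apply the report's retention definitions to one user:
    5+ edits = "active" editor, 1+ edit = "surviving" editor."""
    if edits_in_followup_window >= 5:
        return "active"
    if edits_in_followup_window >= 1:
        return "surviving"
    return "not retained"

# Hypothetical follow-up edit counts keyed by username; in the report these
# were tallied both for all Wikimedia projects and for Commons only.
followup_edits = {"UploaderA": 12, "UploaderB": 2, "UploaderC": 0}

for user, edits in followup_edits.items():
    print(user, classify_editor(edits))
```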
As illustrated by the green (new editors) and blue (existing editors) shaded columns in the graph below:
At least 2,741 new and 1,228 existing users had met or passed the 3-month follow-up period.
2,470 new and 1,226 existing users had met or passed the 6-month follow-up period.
294 new and 276 existing users had met or passed the 12-month follow-up period.
The dots illustrated in each of the green and blue columns in the chart below represent the number of editors who were active (black dots) or survived (white dots) to each follow-up point.
For new users, retention rates hovered right around 1%, ranging from 1.0% to 1.2% “survived”, and were slightly lower at follow-up points further out from the contest (i.e., 1.2% in Month 3; 1.2% in Month 6; 1.0% in Month 12).
For existing users, retention rates ranged from 43% to 52% “survived”, with higher proportions of existing users retained at follow-up points further out from the contest (i.e., 44% in Month 3; 43% in Month 6; 52% in Month 12).
Notes
1. The 3-month follow-up window refers to the period beginning two months after the event start date and ending three months after the event start date.
2. The 6-month follow-up window refers to the period beginning five months after the event start date and ending six months after the event start date.
3. The 12-month follow-up window refers to the period beginning eleven months after the event start date and ending twelve months after the event start date.
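A small sketch of how these window boundaries can be derived from an event's start date; the start date shown is hypothetical and the month arithmetic uses a simple calendar helper rather than any particular tool used for the report.

```python
import calendar
from datetime import date

def add_months(d, months):
    """Add calendar months to a date, clamping the day to the target month's end."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def followup_window(event_start, followup_month):
    """Per the notes above: the N-month follow-up window runs from N-1 months
    after the event start date to N months after it."""
    return (add_months(event_start, followup_month - 1),
            add_months(event_start, followup_month))

# Hypothetical event start date for illustration only.
start = date(2013, 9, 1)
for n in (3, 6, 12):
    print(n, followup_window(start, n))
```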
For the 103 events examined in this report, 2,741 new users participated.[1] At the three-month follow-up stage, 0.8% of new users were retained as active editors.
At least 2,741 new users[2] participated in the 103 photo events examined in this report.
Three months after their event started, 22 new users (0.8%) were retained as active editors and an additional 10 (1.2%) as surviving editors.[3]
Six months after their event started, the same 22 new users persisted as active editors and the same 10 as surviving editors.[4]
Twelve months after their event started, the number of active new editors dropped to 1 (0.3%) and the number of surviving editors dropped to 3 (1.0%).[5]
Notes
1. New users were defined as usernames which used Commons for the first time no more than two weeks before the contest start date.
2. New users were defined as usernames which used Commons for the first time in the period between two weeks before the event start date and the event end date.
3. The majority of new active editors (86%) were editing on Commons, the targeted project, at three-month follow-up.
4. 86% of the new active editors were editing on Commons at six-month follow-up.
5. That new active editor was editing on Commons at twelve-month follow-up.
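Under the definition in note 2, a participant's status can be determined from the date of their first Commons activity relative to the event dates. A minimal sketch, with hypothetical dates:

```python
from datetime import date, timedelta

def classify_participant(first_commons_activity, event_start, event_end):
    """Per note 2: a participant is "new" if their first Commons activity
    falls between two weeks before the event start and the event end;
    otherwise they count as an "existing" user."""
    window_open = event_start - timedelta(weeks=2)
    if window_open <= first_commons_activity <= event_end:
        return "new"
    return "existing"

# Hypothetical dates for illustration only.
print(classify_participant(date(2013, 8, 25), date(2013, 9, 1), date(2013, 9, 30)))  # new
print(classify_participant(date(2011, 3, 10), date(2013, 9, 1), date(2013, 9, 30)))  # existing
```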
1,228 existing users participated in the photo events examined in this report. As with Wiki Loves Monuments, existing user retention was higher than new user retention for photo upload events. Participants demonstrated a slightly higher rate of “active” participation following these events as compared to Wiki Loves Monuments.
As with Wiki Loves Monuments, existing user retention was higher than new user retention for photo upload events, and participants demonstrated a higher rate of “active” participation following these events as compared to Wiki Loves Monuments.[1]
At least 1,228 existing users took part in the 103 photo events.
Three months after their event started, 480 existing users (39%) were retained as active editors and an additional 62 (5%) as surviving editors.[2]
Six months after their event started, 465 existing users (38%) were retained as active editors and an additional 61 (5%) as surviving editors.[3]
Twelve months after their event started, 127 existing users (46%) were retained as active editors and an additional 16 (6%) as surviving editors.[4]
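The percentages above follow directly from the raw counts and the cohort sizes reported earlier (1,228 existing users at the 3-month mark, 1,226 at 6 months, 276 at 12 months). A quick check:

```python
# Existing-user retention figures from this report: counts of active and
# additional surviving editors per follow-up month, with cohort sizes.
cohorts = {
    3:  {"eligible": 1228, "active": 480, "surviving": 62},
    6:  {"eligible": 1226, "active": 465, "surviving": 61},
    12: {"eligible": 276,  "active": 127, "surviving": 16},
}

for month, c in cohorts.items():
    active_pct = 100 * c["active"] / c["eligible"]
    surviving_pct = 100 * c["surviving"] / c["eligible"]
    combined_pct = active_pct + surviving_pct  # total "survived" rate
    print(f"Month {month}: active {active_pct:.0f}%, "
          f"additional surviving {surviving_pct:.0f}%, combined {combined_pct:.0f}%")
```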
Notes
1. In this round of reporting, Wiki Loves Monuments events retained existing users at a rate of 36 to 50 percent at 3-, 6-, and 12-month follow-ups.
2. 80% of the existing active editors were editing on Commons, the targeted project, at three-month follow-up.
3. 80% of the existing active editors were editing on Commons at six-month follow-up.
4. 100% of the active editors were editing on Commons at twelve-month follow-up.
Photo event program leaders are generally experienced and proactive at producing blogs and other online resources related to their events, which can help others implement their own contests.
96% of reporting events were run by an experienced program leader who can help others in conducting a similar program. You can find which people or organizations ran these events in the data tables.
We asked program leaders to share their practices around program replication and the types of shared learning resources produced in relation to their events. This allowed us to learn whether the reporting program leaders considered themselves experienced in implementing contests and able to help others design and implement similar programs. It also allowed us to learn how program leaders and others (i.e., chapters, press, bloggers) covered the events, and whether resources were made available for others to use in producing their own events.

Of the 23 photo events for which program leaders reported on these replication strengths, 22 (96%) were run by an experienced program leader who can help others conduct a similar program. (You can find which people or organizations ran these events in the data tables.) Most, 19 (83%), shared blogs or other online material, while it was less common to generate guides or printed resources. Still, each of those 23 programs generated shared knowledge in some way.
There is a wealth of potential here for replication and shared learning.
Further investigation is needed into how these resources circulate among program leaders so that they can learn from one another's experiences and practices.