Grants:PEG/Learning/2013-14
Grants by program type (share of funds):
- 8 general support (37% of funds)
- 2 tools (1% of funds)
- 5 outreach (12% of funds)
- 7 content (12% of funds)
- 10 Wiki Loves Monuments (28% of funds)

Projects affected:
- Commons
- MediaWiki
- Wikisource
- Wikiquote
- Wikivoyage
- 26 language Wikipedias

Grantee breakdown:
- 59% chapters
- 8% external organizations
- 17% groups
- 17% individuals
Overview
Background
[Image: Full report on PEG grants 2013-14.]
The Project and Event Grants (PEG) program has existed in its current form since 2012, and though it has been the subject of multiple process-oriented studies, the outcomes of the work being funded have never been systematically evaluated... until now! Unlike the Individual Engagement Grants and Annual Plan Grants programs, PEG operates on a rolling cycle: grant applications come in at all points throughout the year, and the duration of individual grants varies widely. As a result, to summarize and compare the work done through PEG, we examined all the final reports submitted during FY2013-14. These were primarily grants funded during FY2013-14 (n=22), but also included grants approved in FY2012-13 (n=12) and FY2010-11 (n=2).
In total, 36 grants and US$350K in funding were reported on by 27 different individuals, groups, and organizations, who executed projects and events to make progress toward the Wikimedia movement's strategic goals of reach, participation, and quality content. When examining the reports, three key questions were in play:
- What progress did the grants as a whole make against our movement's strategic goals (reach, participation, quality content)?
- What characteristics stood out in the highest performing grants?
- How were the grants managed and reported?
We are thrilled to see the work being done across a mix of organizations, groups, and individuals, and are excited to continue supporting projects and events that work toward our strategic goals. We see the need for more experimentation on projects that have clear linkages to the Wikimedia projects themselves, for example on-wiki interventions and writing contests.
Key takeaways
- Most PEG grantees are very focused on specific output results, such as images or articles, which are clear additions to Wikimedia content
- There are limited proxies around quality of content, which is an area we need to explore more
- PEG grantees demonstrated the ability to reach out to people. But with an average cost of $35 per person reached, it is critical that grantees improve the depth/quality of interactions and/or the scale of interactions.
- Best practices around improving interactions and following up with participants are needed
- Online writing contests produced the highest amounts of direct article contribution, and they were inexpensive to run
- Groups/organizations with an integrated strategy throughout the year performed best
- The larger grants (>$10K) have high rates of significant underspending; chapters in particular struggled to spend the larger amounts of money in general support grants
- Learning is taking place, but little of it has been systematically documented in learning patterns:
- To scale the impact of the programs occurring, we need clearer systems of documenting and sharing learning
Key gaps
- We have limited cost data by program; we need a more detailed breakdown of inputs by program
- We have limited data around quality of participation or quality of content: we need more information around the levels of engagement with participants, and the quality of content created
Major learnings
Focus
Projects that were most linked to impact were those that focused on a specific method for affecting the strategic priorities. They had clear theories of how to achieve overall impact: whether by improving the quality of the content, increasing the number of editors, or increasing the number of readers.
Case study:
For example, the highest-level strategic goal of increasing the number of active editors on our projects ("Participation") can be achieved by working on one of three components: increasing the number of new editors, converting new editors into longer-term editors, or retaining existing editors. The Women in Science and Math Workshop, a small experiment conducted by User:Keilana, focused on the first two steps of this chain: bringing in new users and converting these new editors into longer-term contributors. It did this under the specific hypothesis that new editors would be more likely to become expert editors if they had appropriate mentoring and help[1]. The Women in Science and Math Workshop explicitly targeted new users (female college students) who were passionate about a specific topic (women in STEM fields). The result of this careful focus? A small demonstration of a way to engage a female population, which ultimately resulted in more in-depth funding to provide a model for scaling this program elsewhere, as it continues to demonstrate the ability to retain editors.
On-wiki contests
5,280 articles were created through on-wiki contests; in fact, 60% of the articles written through PEG grants in FY2013-14 were created through just three online writing competitions. These projects are low cost, scalable across the different Wikipedia projects, and relatively simple to track. These types of contests work well for both photos and articles.
Case study:
Wikimedia Eesti held several photo and article competitions over the course of its 2013 general support grant, focusing each competition on a specific theme. One example is the annual competition they host in cooperation with the Institute of Danish Culture and the Danish Royal Embassy in Estonia, concentrating on Denmark and Danish culture. WMEE was able to create high-quality content by
- (1) requiring editing on a specific topic,
- (2) pointing people to specific articles to edit,
- (3) keeping the contest short enough to manage, and
- (4) giving awards primarily based on the quality of the contributions (rather than just the amount of content).
The result? The 6-week competition involved just 30 people but produced 135 articles, at a relatively low cost. In total, WMEE's seven writing contests had 112 participants producing 336 articles.
Complementary activities
The content acquisition projects (such as photo and writing contests) with the highest results involved specific, identified focus areas and multiple outreach strategies. Strategies included hosting photo campaigns periodically throughout the year to develop skills, and hosting edit-a-thons to put media content onto articles.
Case study:
Wiki Loves Monuments is an annual competition held in countries around the world, but Wikimedia D.C. had one of the highest proportions of photos used on Wikimedia project pages soon after the competition: 32% versus an average of 12%! Why? How? A few things came out in their report:
- (1) Wiki Loves Monuments was one piece of their broader outreach strategy, which included regular photo and editing events
- (2) Participants were given guidance to help build coverage of needed content areas[2]
Learning Patterns
The following learning patterns were created by PEG grantees reporting during 2013-14. Creating learning patterns was not a requirement for these grantees, and we are so appreciative of this extra work put forth by grantees to share their most important learnings with the broader movement. We're excited to see grantees and community members continue to share their learning more broadly:
Recommendations
For grantees
Program Planning
Executing and reporting
For Grants Advisory Committee
Proposal evaluation
Increase understanding of larger grants
Project type advice
For WMF
Program Strategy
Assessment
Future areas for study
- Market analysis: this report (and others) does not take into account the amount of participation or content in relation to the overall context of that language, geography, or topic area
- Measuring quality: we have limited proxies for quality; we need a better understanding of how to estimate it
- Measuring important offline interventions: we have limited understanding of the impact of specific types of events on developing free knowledge
See Also
Grants data by program type
editContent projects | Wiki Loves Monuments | Outreach | Conferences | Tools | General Support | |
---|---|---|---|---|---|---|
Typical Activities |
|
|
|
|
|
|
Average cost | $4,889 | $5,518 | $3,521 | $14,036 | $1,009 | $13,131 |
Number of grants | 7 | 10 | 5 | 4 | 2 | 8 |
Summative outcomes |
|
|
|
|
|
|
References
- ↑ This lever for change was demonstrated to be meaningful in the analysis of the WP:EN Teahouse project, which showed that experiencing Wikipedia as a community and perceiving that one has a role in it is a powerful incentive. See Research from Phase 2 of the Teahouse project.
- ↑ The group used the National Register of Historic Places (Wikiproject:NRHP) to highlight areas that lacked coverage on Wikipedia and to provide suggestions to contributors.