Grants:PEG/Learning/2013-14

PEG 2013-14 impact

  • 36 grants across six project focuses:
    • 8 general support (37% of funds)
    • 2 tools (1% of funds)
    • 5 outreach (12% of funds)
    • 7 content (12% of funds)
    • 10 Wiki Loves Monuments (28% of funds)
    • 4 conferences (10% of funds)
  • 19 different countries
  • 41% of grant funding to the Global South
  • Total grant spending reported (USD): $352,508
  • Average length of grant: 180 days
  • Many projects:
    • Commons
    • MediaWiki
    • Wikisource
    • Wikiquote
    • Wikivoyage
    • 26 language Wikipedias
  • Smallest grant: $420
  • Largest grant: $41,723
  • Mix of grantee types:
    • 59% chapters
    • 8% external organizations
    • 17% groups
    • 17% individuals
Overview and background

Abridged public presentation (Google hangout) from July 2014
Full report on PEG reports 2013-14.

The Project and Event Grants (PEG) program has existed in its current form since 2012, and though it has undergone multiple process-oriented studies, the outcomes of the work being funded have never been systematically evaluated... until now! Unlike the Individual Engagement Grants and Annual Plan Grants programs, PEG operates on a rolling cycle, with grant applications coming in at all points throughout the year and huge variance in the duration of individual grants. As a result, to summarize and compare the work done through PEG, we examined all the final reports submitted during FY2013-14. These were primarily grants funded during 2013-14 (n=22), but also grants approved in FY2012-13 (n=12) and FY2010-11 (n=2).

In total, 36 grants and roughly US$350K in funding were reported on by 27 different individuals, groups, and organizations, who executed projects and events in an effort to make progress towards the Wikimedia movement's strategic goals of reach, participation, and quality content. When examining the reports, three key questions were in play:

  1. What progress did the grants as a whole make against our movement's strategic goals (reach, participation, quality content)?
  2. What characteristics stood out in the highest performing grants?
  3. How were the grants managed and reported?

We are thrilled to see the work being done across a mix of organizations, groups, and individuals, and are excited to continue supporting projects and events that work towards our strategic goals. We see the need for more experimentation on projects that have clear linkages to the Wikimedia projects themselves, for example, on-wiki interventions and writing contests.

Key takeaways

  • Most PEG grantees are very focused on specific output results, such as images or articles, which are clear additions to Wikimedia content
    • There are limited proxies around quality of content, which is an area we need to explore more
  • PEG grantees demonstrated the ability to reach people. But with an average cost of $35 per person reached, it is critical that grantees improve the depth/quality of interactions and/or the scale of interactions (see the rough calculation after this list).
    • Best practices around improving interactions and following up with participants are needed
  • Online writing contests produced the highest volume of direct article contributions, and they were inexpensive to run
  • Groups/organizations with an integrated strategy throughout the year performed best
  • The larger grants (>$10K) have high rates of significant underspending; chapters in particular struggled to spend the larger amounts of money in general support grants
  • Learning is taking place, but little of it has been systematically documented in learning patterns:
    • To scale the impact of the programs occurring, we need clearer systems of documenting and sharing learning
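
To make the cost-per-person figure above concrete, here is a minimal sketch in Python that estimates cost per person reached by program, assuming the rounded participant and average-cost figures from the "Grants data by program type" table at the end of this report (tools grants are omitted because no participant counts were reported for them):

```python
# Rough cost-per-person estimate from the rounded figures in the
# "Grants data by program type" table; tools grants omitted (no
# participant counts were reported for them).
programs = {
    # program: (average cost in USD, number of grants, people reached)
    "Content projects":     (4_889,  7,   477),
    "Wiki Loves Monuments": (5_518, 10, 1_850),
    "Outreach":             (3_521,  5, 4_000),
    "Conferences":          (14_036, 4,   335),
    "General support":      (13_131, 8, 2_200),
}

total_spend = total_people = 0
for name, (avg_cost, n_grants, people) in programs.items():
    spend = avg_cost * n_grants
    total_spend += spend
    total_people += people
    print(f"{name}: ${spend / people:,.0f} per person")

# The overall estimate lands in the same ballpark as the $35 figure
# cited above; rounding in the table accounts for the difference.
print(f"Overall: ${total_spend / total_people:,.0f} per person")
```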

Key gaps

  • We have limited cost data by program; we need a more detailed breakdown of inputs by program
  • We have limited data around the quality of participation and the quality of content: we need more information on levels of engagement with participants and on the quality of the content created

Major learnings


Focus

A. Clear links to strategic priorities

Related Learning Patterns:

The projects most linked to impact were those that focused on a specific method for affecting the strategic priorities. They had clear theories of how to achieve overall impact: improving the quality of the content, increasing the number of editors, or increasing the number of readers.

Case study:

For example, the highest-level strategic goal of increasing the number of active editors on our projects ("Participation") can be achieved by working on one of three components: increasing the number of new editors, converting new editors into longer-term editors, or retaining existing editors. The Women in Science and Math Workshop, a small experiment conducted by User:Keilana, focused on the first two links in this chain: bringing in new users and converting these new editors into longer-term contributors. It did this under the specific hypothesis that new editors would be more likely to become expert editors if they had appropriate mentoring and help[1]. The Women in Science and Math Workshop explicitly targeted new users (female college students) who were passionate about a specific topic (women in STEM fields). The result of this careful focus? A small demonstration of a way to engage a female population, which ultimately resulted in more in-depth funding to provide a model for scaling this program elsewhere, as it continues to demonstrate the ability to retain editors.

On-wiki contests

B. On-wiki competitions

Related Learning Patterns:

5,280 articles were created through on-wiki contests; in fact, 60% of the articles written through PEG grants in FY2013-14 were created through just three online writing competitions. These projects are low cost, scalable across the different Wikipedia projects, and relatively simple to track. Contests of this type work well for both photos and articles.

Case study:

Wikimedia Eesti held several photo and article competitions over the course of its 2013 general support grant. To execute these competitions, it focused each one on a specific theme. One example is the annual competition hosted in cooperation with the Institute of Danish Culture and the Danish Royal Embassy in Estonia, concentrating on Denmark and Danish culture. WMEE was able to create high-quality content by

(1) requiring editing on a specific topic,
(2) pointing people to specific articles to edit,
(3) keeping the contest short enough to manage, and
(4) giving awards primarily based on the quality of the contributions (rather than the quantity alone).

The result? The six-week competition involved just 30 people but produced 135 articles at a relatively low cost. In total, WMEE's seven writing contests had 112 participants producing 336 articles.

Complementary activities

C. Complementary activities in program plan

Related Learning Patterns:

The content acquisition projects, such as photo and writing contests, with the highest results involved specific, identified focus areas and multiple outreach strategies. Strategies included hosting photo campaigns periodically throughout the year to develop skills, and hosting edit-a-thons to put the media content onto articles.

Case study:

Wiki Loves Monuments is an annual competition held in countries around the world, but Wikimedia D.C. had one of the highest proportions of photos used on Wikimedia project pages soon after the competition: 32% versus an average of 12%! Why? How?! A few things came out in their report (a sketch for measuring this in-use ratio follows the list):

(1) Wiki Loves Monuments was one piece of their broader outreach strategy, which included regular photo and editing events
(2) Participants were given guidance to help build coverage of needed content areas[2]
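
For anyone who wants to reproduce an in-use percentage like the one above, the sketch below uses the MediaWiki API on Commons (the categorymembers generator together with the globalusage property) to compute the fraction of files in a category that appear on at least one wiki page. This is an illustrative approach, not the method WMDC or WMF used, and the category name is only an example:

```python
import requests

API = "https://commons.wikimedia.org/w/api.php"

def photos_in_use_ratio(category: str) -> float:
    """Fraction of files in a Commons category used on at least one wiki page."""
    used: dict[str, bool] = {}
    params = {
        "action": "query",
        "generator": "categorymembers",
        "gcmtitle": f"Category:{category}",
        "gcmtype": "file",
        "gcmlimit": "50",
        "prop": "globalusage",
        "gulimit": "500",
        "format": "json",
    }
    session = requests.Session()
    while True:
        data = session.get(API, params=params).json()
        for page in data.get("query", {}).get("pages", {}).values():
            # A file counts as "in use" once any usage shows up in any batch.
            used[page["title"]] = used.get(page["title"], False) or bool(page.get("globalusage"))
        if "continue" not in data:
            break
        params.update(data["continue"])
    return sum(used.values()) / len(used) if used else 0.0

# Example call (category name is illustrative):
print(f"{photos_in_use_ratio('Images from Wiki Loves Monuments 2013 in the United States'):.0%}")
```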

Learning Patterns


The following learning patterns were created by PEG grantees reporting during 2013-14. Creating learning patterns was not a requirement for these grantees, and we are so appreciative of the extra work they put forth to share their most important learnings with the broader movement. We're excited to see grantees and community members continue to share their learning more broadly:

Recommendations


For grantees

Program planning, executing, and reporting
  • Grantees should link to Commons categories for all projects
  • Final reports should reflect outcomes for the full grant period in an accessible way - simply linking to monthly reports is not sufficient.
  • Refer back to your original goals, and assess whether or not you met those goals and why
  • Tool grantees should plan to count usage and/or view counts.
  • Be sure to track content generated through GLAM projects (see the sketch after this list)!
  • Report budgeted and actual spending by program
  • Create learning patterns throughout the project and afterwards!
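
As one way to act on the category-tracking recommendations above, a grantee could count a project's uploads with the standard MediaWiki API on Commons. A minimal sketch, assuming the project keeps all of its media in one tracking category (the category name below is only a placeholder):

```python
import requests

API = "https://commons.wikimedia.org/w/api.php"

def count_category_files(category: str) -> int:
    """Count the files in a Commons category, following API continuation."""
    count = 0
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": f"Category:{category}",
        "cmtype": "file",
        "cmlimit": "500",
        "format": "json",
    }
    session = requests.Session()
    while True:
        data = session.get(API, params=params).json()
        count += len(data["query"]["categorymembers"])
        if "continue" not in data:
            return count
        params.update(data["continue"])

# Placeholder category name; substitute the project's real tracking category.
print(count_category_files("Media from my PEG project 2013-14"))
```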

For Grants Advisory Committee

Proposal evaluation

  • Increase understanding of larger grants

  • If applicable, look at past performance of a returning applicant
  • Look for applications with activities that focus on developing quality content in a specific area
  • Ensure there are clear measures of success and a plan for how to monitor those metrics
  • Consider systematic proposal evaluation: there was lots of variance in the quality and depth of feedback received by grantees
Project type advice
  • External organizations may have access to high quality media, but they should demonstrate a link to Wikimedia and plan to upload content to Commons.
  • Outreach project applications should include clear plans for following up and tracking participants; consider the quality of the contact
  • Focus funding on education projects that work with the same group of students over several weeks or months.
  • Content projects that focus on photographing important national events or public figures have a high ratio of photos in use with significant page-views.
  • Look for contests on-wiki, which are low-cost and generate significant content

For WMF

Program strategy
  • Encourage more short, specific writing and photography contests; also encourage edit-a-thons following a photo event, to put content into the projects
  • Build best practices around on-wiki competitions: these work, and are inexpensive
  • Provide more mentorship to grantees and applicants in developing, executing, and measuring their ideas. Potentially integrate more tightly with IdeaLab.
Assessment
  • Explicitly request that grantees create (and report) a category to track all media content from their project.
  • In future reports, collect key metrics in a table in the same location and format.
  • Simplify forms as much as possible, and incorporate the most important elements for reporting: learning patterns to facilitate sharing; standardized metrics to be filled in by each grantee; cost by program to increase understanding of larger grants (a sketch of such a standardized record follows this list)
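
One possible shape for those standardized metrics: a single record each grantee fills in, which the grants team can then aggregate into one table. This is a sketch only; the field names are illustrative, not an official WMF schema:

```python
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class GrantReportMetrics:
    """Illustrative standardized metrics for one PEG final report."""
    grant_id: str
    program: str            # e.g. "Content", "Outreach", "Wiki Loves Monuments"
    budgeted_usd: float
    spent_usd: float
    events: int
    participants: int
    photos_uploaded: int
    photos_in_use: int
    articles: int

def write_metrics_table(rows: list[GrantReportMetrics], path: str) -> None:
    """Collect every grantee's metrics in one CSV: same location, same format."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(GrantReportMetrics)])
        writer.writeheader()
        writer.writerows(asdict(row) for row in rows)

# Example row (all values made up for illustration):
write_metrics_table(
    [GrantReportMetrics("PEG-2013-001", "Content", 5000.0, 4200.0, 3, 45, 800, 120, 60)],
    "peg_metrics_2013-14.csv",
)
```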

Future areas for study

  • Market analysis: this report (and others) does not take into account the amount of participation or content relative to the overall context of that language, geography, or topic area
  • Measuring quality: we have limited proxies for quality; we need better understanding of how to estimate this
  • Measuring important offline interventions: we have limited understanding of the impact of specific types of events on developing free knowledge

See also


Grants data by program type

Content projects: 7 grants, average cost $4,889
  • Typical activities: photo and writing contests; GLAM programs; Wikiexpeditions
  • Summative outcomes: 48 events; 477 people; ~31k photos; 4,358 articles

Wiki Loves Monuments: 10 grants, average cost $5,518
  • Typical activities: photography workshops; upload events; awards ceremonies
  • Summative outcomes: 25 events; ~1,850 people; ~80k photos; 8% of photos in use

Outreach: 5 grants, average cost $3,521
  • Typical activities: education programs; info sessions; Offline Wikipedia
  • Summative outcomes: 119 events; ~4,000 people; ~450 articles

Conferences: 4 grants, average cost $14,036
  • Typical activities: local editor conferences; Regional Leadership Meeting; Global Wikipedia Research
  • Summative outcomes: 4 events; 335 people; 15 articles

Tools: 2 grants, average cost $1,009
  • Typical activities: instructional videos; translation tools
  • Summative outcomes: none reported

General support: 8 grants, average cost $13,131
  • Typical activities: overhead costs; content programs, including WLM; outreach programs
  • Summative outcomes: 141 events; ~2,200 people; ~80k photos; 16% of photos in use; 4,386 articles

References

  1. This lever for change was demonstrated to be meaningful in the analysis of the WP:EN Teahouse project, which showed that experiencing Wikipedia as a community and perceiving that one has a role in it is a powerful incentive. See Research from Phase 2 of the Teahouse project.
  2. The group used the National Register of Historic Places (WikiProject NRHP) to highlight areas that lacked coverage on Wikipedia and provide suggestions to contributors