Title

Individual Engagement Grants demonstrate potential for impact

Body
 
Round 1 IEG projects

A year ago, Wikipedia didn't have a social media presence in China. With the support of a $350 Individual Engagement Grant, today 10,000 Chinese readers follow the Wikipedia account on Weibo, China's most active social networking site, and Chinese Wikipedians are able to use the channel to share Wikipedia's knowledge and organize events in China like Wiki Loves Monuments. A year ago, there were no guarantees that a few one-off donated accounts to paywalled journals could be grown into a digital hub providing free access to reliable sources for Wikipedians and pioneering new models of collaboration between Wikipedia and libraries. With the support of a $7,500 Individual Engagement Grant, today 1,500 Wikipedia editors have access to 3,700 free accounts, and The Wikipedia Library is laying plans to go global. Grantees like Addis Wang and Jake Orlowitz were clear about their goals, eager to engage with the community to understand its needs and priorities, and willing to take risks and experiment in search of pragmatic, scalable solutions. They incorporated experts and mentors into their process to build platforms that are larger than any one individual.

The Individual Engagement Grants program was launched a year ago with the idea of supporting individual Wikimedians like Addis and Jake to lead projects focused on experiments driving online improvements. This program, too, began as an experiment, with some risks and no guarantees. And so, as the first round of grants comes to a close, with the help of an assessment by WMF’s Grantmaking Learning & Evaluation team, we're taking a look at the impact of these projects and what we've learned so far.

Early indicators of impact


The first round of IEG funding distributed about US $60,000 to support eight experimental projects led by community members in six different countries. Half were focused on online community organizing, the rest either built tools or conducted offline outreach. More time is needed to determine the full impact of these grants on their target wikis or as scaled programs across wikis, but some early indicators suggest that these grants can have direct impact on the strategic goals of the Wikimedia movement.

Five projects have been highlighted for longer-term impact monitoring:

  • The Wikipedia Library -- the project has already more than paid for itself, and great potential remains for scaling the online library across languages: a $7,500 grant yielded $279,000 in donated sources, a 37.2x return on investment. Moreover, editors are demonstrating demand for the resources and putting them to use: references to the donated sources on Wikipedia have grown by 400-600% since free access became available.
  • Wikisource Strategy -- demonstrated impact with organized events: November 2013 saw the highest number of new Wikisourcers (171) since July 2011.
  • WikiArS -- a program demonstrating high-quality content donations and extending the scope of Commons quality assessment to drawings, not just photos: 30 images marked with a high designation of quality; 211 Wikipedia pages using images from the program; a 100% retention rate among participating schools.
  • The Wikipedia Adventure -- a game built to provide more coaching to newcomers on English Wikipedia, showing potential for creating more productive, longer-term editors: players were 1.2x more likely to edit than those who did not play.
  • Publicity in China -- the low-cost social media experiment on Weibo has already demonstrated the hunger for information in China, with 10,000 followers (25% of whom are women), and the most-reposted articles seem to drive readers to Wikipedia: those articles saw an overall 252% increase in page views in the three days following a post.

Targeted online community engagement experiments have potential


Projects that focused on online community organizing seem to demonstrate the greatest potential for results so far, particularly when a community need was clearly defined from the start. Investments in projects that solely built tools, without integration into existing community workflows, appear to be less effective.

“The best IEGs seem to be the ones that build some sort of platform -- a social media group, a curriculum, a library, a strategy -- and have demonstrated the possibility for that platform to have impact on a smaller (beta) level. These projects were explicitly designed to meet an expressed need within the community: the grantee had heard this need (e.g., through past experience, through surveys) and designed a creative solution to resolve the gap. The next step, ideally, was for the grantees to see if the designed solution did in fact affect the desired outcomes.” - L&E impact assessment
 
Wikisource Strategy surveyed Wikisourcers from over 30 countries

Projects that have been identified as most successfully completed have some common factors:

  • they have clear links to strategic priorities,
  • they’re designed to meet specific needs of end-users,
  • they have feedback loops built into the project,
  • they engage with the community start to finish,
  • they incorporate significant expert support where needed.

A grantee’s own knowledge of their community, coupled with an ability to call on the expertise and perspectives of others, appears to be an important ingredient for success. Individual Engagement Grantees often start with a proof-of-concept on their home wikis - from Wikisource to Chinese Wikipedia - but most projects are designed so that, if successful, they would be able to scale beyond the initial target language project. This requires participation and input from more than one individual. According to one grantee, “Counting on a network of competences can be extremely useful for the project.”

Community engagement in the Wikisource Strategy project, for example, involved a combination of face-to-face and anonymous online feedback, providing the project leaders with a range of community perspectives. A survey - distributed in eleven languages - pointed to several top technical issues plaguing Wikisourcers globally, helping the grantees prioritize efforts. The project organizers were also able to leverage volunteer technical experts via Google Summer of Code, with four students working on Wikisource projects.

In the WikiArS project, grantee David Gomez piloted approaches in eight schools in Catalonia, using his findings to build workflows that could be used globally. He engaged with Wikipedians to solicit needs for classroom assignments so that student-created images would be incorporated back into articles by the community, and he brought what he learned from the Commons image assessment process back to the schools in order to improve the assignments uploaded to Commons.

 
83% of IEG funds are spent on human capital

Underscoring the importance of time and expertise in online community engagement projects, we're finding that most IEG project funds were ultimately spent on human capital, either for the grantee’s own project management costs or to bring in outside expertise (see chart, right).

Experimental mentality and mentorship enable success


Experiments are about trying something new and learning from it. Good experiments need a safe-to-fail environment to thrive, and IEG has focused on creating this environment by baking innovation into the selection criteria and encouraging grantees to take risks and share learning along the way.

“In contrast to other WMF grants programs, the IEG reports jump out as brutally honest and transparent. Many reasons may contribute to this, but more than anything it appears that IEG is a space clearly separated and specified as experimental…Part of this seems to be the design and coaching on the reports themselves.” - L&E impact assessment

We think reporting should be a useful learning exercise for people and their projects rather than a boring requirement, and this approach appears to be a helpful framework for supporting grantees. As one grantee put it, “I like that, using Siko's word, we could "choose our adventure" for the monthly report. This proved valuable, as we decided on writing blog posts for the Wikimedia Blog, and this was useful and gave us attention and visibility.”

Mentorship and other forms of non-monetary support offered by WMF staff and volunteers to the grantees also appear to be important components of the program. The impact report notes that, “Individuals who have specific ideas for experimental, scalable programs seem to thrive in the IEG program, which provides high levels of resources beyond money.” As another grantee put it, “There's a whole skillset around program/project management that is not learned easily without patient and thorough help, and I very much appreciate the IEG program for being that mentor.” While we don’t intend to dial back the level of human support that we aim to offer, it is clear that we will need to invest in scaling up mentorship methods for the program in order to meet the needs of more individuals and projects.

Opportunities to improve remain


One way to scale access to the expertise that individuals need to successfully complete their projects is to involve more volunteer participants and advisors through IdeaLab. Further development is needed in the Lab, though, to fully realize the vision of a scalable system for matching people, ideas, needs and skills. Improving spaces like IdeaLab and providing more self-service tools that grantees and other volunteers can use in community projects could support more experimental dreamers in solving critical issues in the Wikimedia movement in years to come. As new rounds of IEG projects and proposals get underway, we’ll be watching to see how these trends play out over the longer term.

Do you have an idea for an experiment to improve Wikimedia in your community? Share it in the IdeaLab - we look forward to seeing you there! The next round of IEG proposals is due March 31st.

The complete IEG impact report and accompanying IEG process perspectives are both available on Meta-wiki. Many thanks to Jessie Wild and Jonathan Morgan of the Learning & Evaluation team for their analysis.

Siko Bouterse, Head of Individual Engagement Grants
