Program: A program is a group of activities that share a similar theory of change and often have the same mission or goals. Program Leader: A program leader is a person who plans, executes and, typically, evaluates programs. Program Implementation: A program implementation is an instance where a program leader plans and executes a program.
Data for the 2015 reports is captured from programs implemented September 2013 - September 2014.
Almost twice as many countries are represented compared to the 2013 reports, along with many non-English languages, as we dug further and deeper to capture as many potential case studies as possible from around the world.
Reporting increased substantially over the first beta report: six times more implementations, since we looked at all grant reports as well as voluntarily reported programs.
Many new and existing users, as well as offline volunteers, engaged through Wikimedia programs
31,341 participants engaged through the program events captured (30% new users, 9% existing active editors, and 60% other volunteer contributors)
1,130,292 pages were created and/or improved through the program events captured (65% new pages, 35% improved pages)
336 implementations had participant data for us to track retention
For workshops: new usernames were available from only 32 of 142 events, and the new editors who survived as active editors at the 6-month follow-up came from just 8 of those 32 workshops (25%). More importantly, we know nothing about the other 5,380 workshop participants (95%): whether they created accounts or already had them, their user status is entirely unknown.
Similarly, for edit-a-thons: new usernames were available from only 15 of 170 events, and the new editors who survived as active editors at the 6-month follow-up came from 6 of those 15 events (40%). More importantly, we know nothing about the other 1,008 edit-a-thon participants (44%): whether they created accounts or already had them, their user status is entirely unknown.
Next Steps
Use the report data to help direct learning and program selection, design, and planning for outcomes
Having data is part of a larger approach to accountability and shared learning about Wikimedia programs. The full picture includes accessible metrics that surface data to help us better understand these programs and how to implement them.
Review the reading guide to understand the numbers and graphs presented and how to interpret them.
Mission: Use the data relevant to you to find program leaders to reach out to, and learn from one another
Explore your program data and share learning through:
Learning Patterns: A set pattern, or guide, which explains how to do something in a problem-solution format. In the Learning Patterns Library on Meta you will find many successful strategies for planning, implementing, and evaluating projects and programs. We invite all community members to contribute by adding to, endorsing, and creating their own learning patterns, sharing effective practices across the movement.
Program Guides and Toolkits: Curated information and key resources to plan, run, and evaluate a program and connect to other program leaders.
Considering your assigned scenario, select a program and set reasonable targets for it based on your resources, using the reports for information.
Step 1: Familiarize yourselves with your community scenario.
Step 2: Based on resources and goals, choose one program
Step 3: Choose which outcome or outcomes (max 2) you will target
Step 4: Set reasonable goals and targets
Step 5: Check goals and targets
Most importantly be aware of:
Potential context differences in monetary value and community goals
Using the median and median range rather than mean average
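The preference for the median over the mean can be illustrated with a small sketch. The figures below are hypothetical and not from the report: program metrics are often skewed by a few unusually large events, so the mean overstates the typical result while the median stays close to it.

```python
import statistics

# Hypothetical participant counts for ten edit-a-thons:
# most events are small, one unusually large event skews the distribution.
participants = [8, 10, 11, 12, 12, 14, 15, 18, 20, 300]

mean = statistics.mean(participants)      # pulled up by the outlier event
median = statistics.median(participants)  # closer to the typical event size

print(f"mean:   {mean:.1f}")    # mean:   42.0
print(f"median: {median:.1f}")  # median: 13.0
```

A target of "about 42 participants per event" would be unrealistic for most program leaders in this sketch; "about 13" reflects what a typical implementation achieved.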
Share an observation, raise a question, suggest a solution
If you ran a program that delivered excellently against goals, speak up! Consider writing a blog or how-to guide highlighting your ideas on why your program was so successful.
If you have run a program and want to report key metrics to the Learning and Evaluation team, our collector is always open. Visit our reporting page to learn about the reporting form's contents and find the link to voluntary reporting.
Join the program leader mailing list for weekly updates about program evaluation, tools, and more.
If you are considering running a new program or updating an existing one, consider reaching out to experienced program leaders who have organized a similar program.
Acknowledgements
Thank you so much to the program leaders who contributed invaluable program data to this report. Without you this report would have been impossible. Thank you also to the developers who built WMF labs tools we used to gather and analyze data: Magnus Manske, Yuvipanda, and WMF Analytics researchers past and present. The additional data accessed through these tools has made this report richer and more informative for program leaders, evaluators, and the movement.
The team behind this report
This report series was produced by the Program Evaluation and Design unit of the WMF Learning and Evaluation team. This team is part of the Community Engagement department, and strives to bring forward the learning that takes place within the movement, through its many organizations, program leaders, and volunteer community members who execute, in whole or in part, Wikimedia activities in their communities.
Jaime Anstee, Program Evaluation Specialist
Edward Galvez, Program Evaluation Associate
María Cruz, Program Evaluation Community Coordinator