Program Evaluation and Design Workshop in Budapest
In June 2013, the first Program Evaluation and Design Workshop will take place in Budapest, Hungary. This workshop is being offered by the Wikimedia Foundation, with support from Wikimédia Magyarország.
Over the next couple of years, the Wikimedia Foundation will be building capacity among program leaders around evaluation and program design. A better understanding of how to increase impact through better planning, execution and evaluation of programs and activities will help us move a step closer to achieving our mission of offering a free, high-quality encyclopedia to our readers around the world. With this in mind, we are pleased to announce the first Program Evaluation and Design Workshop, on 22–23 June 2013 in Budapest, Hungary. Only 20 slots are available for this workshop, so please apply before the application deadline.
For detailed information on visiting Budapest, including weather, visa requirements, and getting to and around the city, please check out the "Visiting Budapest" page.
Our long-term goals for the workshop are:
- Participants will gain a basic shared understanding of program evaluation
- Participants will work collaboratively to map and prioritize measurable outcomes, beginning with a focus on the most common programs and activities
- Participants will gain increased fluency in the common language of evaluation (e.g. goals versus objectives, inputs and outputs versus outcomes and impact)
- Participants will learn and practice how to extract and report data using the UserMetrics API
- Participants will commit to working as a community of evaluation leaders who will implement evaluation strategies in their programs and activities and report back at the pre-conference workshop at Wikimania 2013
- Participants will have a lot of fun and enjoy networking with other program leaders!
Agenda
During the workshop in Budapest, we will only have a limited amount of time. Therefore, we will be focusing on some of the more common programs and activities:
- Wikipedia editing workshops where participants learn how to or actively edit (i.e. edit-a-thon, wikiparty, hands-on Wikipedia workshop)
- Content donations through partnerships with galleries, libraries, archives and museums (GLAMs) and related organizations
- Wiki Takes/Expeditions where volunteers participate in day-long or weekend events to photograph site-specific content
- Wiki Loves Monuments, which takes place in September
- Education program and classroom editing where volunteers support educators who have students editing Wikipedia in the classroom
- Writing competitions, which generally take place online in the form of contests, the WikiCup and other challenges – often engaging experienced editors to improve content.
Friday, June 21, 2013

| Time | Session |
|---|---|
| 19:00 | Optional dinner (no host) |

Saturday, June 22, 2013

| Time | Session |
|---|---|
| 09:00 | Meet at venue |
| 09:15 | What is Program Evaluation and Why We Evaluate |
| 09:55 | Stages and types of Program Evaluation |
| 10:20 | Short break |
| 10:30 | The aims of the current evaluation approach |
| 10:45 | Visioning |
| 11:30 | Program Evaluation Spotlights |
| 12:30 | Lunch (catered) |
| 13:30 | Theory of Change and Logic Models |
| 14:30 | Logic Model Break-out Session 1: Mapping through the chain of outcomes (short-, intermediate-, and long-term outcomes/impacts) |
| 16:30 | Afternoon break (coffee/tea) |
| 16:45 | Whole-group sharing |
| 18:00 | Pre-dinner break |
| 19:00 | Evening dinner |

Sunday, June 23, 2013

| Time | Session |
|---|---|
| 08:30 | Meet at venue |
| 09:00 | Check-in: take-aways from Day 1 |
| 09:20 | Data Sources |
| 10:50 | Logic Model Break-out Session 2: Identifying data sources and gaps |
| 12:00 | Lunch (catered) |
| 13:00 | Logic Model Break-out Session 3: Prioritizing outcome indicators |
| 14:00 | Whole-group presentation and processing |
| 15:00–15:45 | Wrap-up |
Presentations
- How to tell our stories better: from Iberoconf 2013
- Day One – A general presentation about Program Evaluation & Design
- Day Two – A presentation about various evaluation data sources
Workshop in Budapest Outcomes
Participant feedback
Twenty-six international participants came together in June 2013 for the pilot Program Evaluation & Design Workshop in Budapest, Hungary. The workshop brought together 21 Wikimedians from 15 countries. The participants – all with a track record of doing program work – represented five different program types: Edit-a-thons/Editing Workshops, GLAM Content Donations, Photo Upload Contests (Wiki Loves Monuments, WikiExpeditions, WikiTakes), On-wiki Writing Competitions (contests, e.g. the WikiCup) and the Wikipedia Education Program. Participants were asked to complete PRE and POST workshop surveys in order to assess the workshop's impact in terms of its stated objectives:
- Participants gain a basic shared understanding of program evaluation.
- Participants will work collaboratively to map and prioritize measurable outcomes, beginning with a focus on the most popular programmatic activities.
- Participants will gain increased fluency in the common language of evaluation (e.g. goals versus objectives, inputs & outputs versus outcomes & impact).
- Participants will learn about different data sources and how to extract data from the UserMetrics API.
- Participants will commit to working as a community of evaluation leaders who will implement evaluation strategies in their programs and report back to the group.
- Participants will have a lot of fun and enjoy networking with other program leaders!
The majority of the pilot workshop participants entered the workshop with no or only a basic understanding of seven of the ten program evaluation terms included in the survey; only the terms program, qualitative, and quantitative were well known to the group at the beginning of the workshop. By the end of the workshop, the majority left with an applied or expert understanding of nearly all the key terms included on the survey. Importantly, the core concept terms "theory of change" and "logic model," while still less understood than the other terms, demonstrated highly significant gains along a similar trajectory to the other terms that were less known at PRE survey time.
Specifically, understanding of each of the selected terms demonstrated the following growth from PRE to POST:
- Cohort: Understanding grew from 19% reporting applied or expert understanding at PRE to 78% at POST
- Inputs: Understanding grew from 38% reporting applied or expert understanding at PRE to 100% at POST
- Logic Model: Understanding grew from 25% reporting applied or expert understanding at PRE to 47% at POST
- Outcomes: Understanding grew from 40% reporting applied or expert understanding at PRE to 84% at POST
- Outputs: Understanding grew from 30% reporting applied or expert understanding at PRE to 95% at POST
- Metrics: Understanding grew from 50% reporting applied or expert understanding at PRE to 63% at POST
- Program: Demonstrated a growth trend from 63% reporting applied or expert understanding at PRE to 74% at POST
- Qualitative: Understanding was maintained, with 75% reporting applied or expert understanding at PRE and 74% at POST
- Quantitative: Demonstrated a growth trend from 75% reporting applied or expert understanding at PRE to 84% at POST
- Theory of Change: Understanding grew from 12% reporting applied or expert understanding at PRE to 53% at POST
In addition to this change in understanding of a new, shared vocabulary, participants also demonstrated a high level of success in grasping several core learning concepts that were presented and modeled throughout the course of the workshop. At POST survey time, participants rated their level of understanding of six key learning concepts from the workshop presentations, and they rated all of them highly.
Furthermore, the majority of the participants were highly satisfied with the process of, and logic models generated by, the break-out group sessions. At both PRE and POST survey time, participants shared one word or phrase that best represented their feelings about evaluation. At PRE survey time, responses, while somewhat "curious", also reflected a sense of feeling pressured to participate; at POST survey time, much more excitement was expressed, along with a fair amount of feeling overwhelmed. When asked what next steps they planned to implement in the next 45 days, the participants' most frequent responses were:
- Develop measures for capturing outcomes (47%)
- Conduct a visioning activity to articulate their specific program’s impact goals and theory of change (42%)
- Develop their own custom logic model to map their specific program’s chain of outcomes (42%)
Although most participants offered specific ways that the workshop could be improved, the majority felt confident in their ability to implement next steps in evaluating their programs. They also shared the ways that the Program Evaluation and Design Team could best support them in those next steps (i.e., broader community involvement, quality tools for tracking data, survey strategies, materials to teach other program leaders, and an online portal for engagement), toward which the team continues to direct progress.
Complete responses are summarized in the Results Summary (see below).
Program Evaluation and Design Workshop Logic Model Drafts
This page is currently a draft. Material may not yet be complete, information may presently be omitted, and certain parts of the content may be subject to radical, rapid alteration. More information may be available on the talk page.
Note to Program Leaders: While the Logic Model tool was mostly successful, for many there was still some confusion between inputs and outputs in terms of "activities", as well as confusion in identifying WHO did WHAT in terms of "Participation" outputs. In the current version seen below, instead of the former two-category system of outputs (i.e. Activities and Participation), there are now three categories under outputs (i.e. Participants, Activities, and Direct Products). You are welcome to share your perspective on this on the talk page, as it is something we are continuing to work on for clarity's sake.
Within each column are the items identified within each program's general theory of change, as delineated and mapped out in each of the program-based break-out session groups. The items are arranged in columns for the various Input, Output (Participants, Activities, and Direct Event Products), and Outcome (short-, medium-, and long-term outcomes) categories. Note: since there was confusion, and some gaps in the end products, related to the confusing output category "participation", we have revised the outputs mapping to include participants, activities, and direct products as separate output prompt categories. Please feel free to share on this article's talk page whether this helps with clarity and thorough mapping or whether it somehow further muddies the waters.
Program Name: Overarching themes example
Theory of Change Vision: Wikimedia Programs will recruit, retain, and support contributors to create and maintain high quality content across Wikimedia projects.
- Program Evaluation/Share Space/Overview Logic Model/Edit-a-thons and workshops
- Program Evaluation and Design/Share Space/Overview Logic Model/Wikimedia Education Programs
- Program Evaluation and Design/Share Space/Overview Logic Model/GLAM content donations
- Program Evaluation and Design/Share Space/Overview Logic Model/Image upload competitions: Wiki Takes/Wiki Loves
- Program Evaluation and Design/Share Space/Overview Logic Model/On-wiki writing contests
Click here to see the logic model created at the workshop
Pickles
- Documentation of programs/projects/evaluation – Where, how, when, and what to collect?
- Including community voices in the dialogue as opposed to imposing WMF’s narrowing focus on the movement
- Confusion over unexpected outcomes and whether they are important to take into account
- Defining outcomes versus outputs (an outcome being, simply, something that happens after you lose control of participants) – many still struggle with the distinction
- Too many modes of communication (i.e., mailing lists, wiki, events, etc.)
- I was positively surprised that it was possible to get certified Halal food in Hungary