Grants:APG/Proposals/2019-2020 round 1/Wiki Education Foundation/Progress report form

Purpose of the report


This form is for organizations receiving Annual Plan Grants to report on their progress after completing the first 6 months of their grants. The time period covered in this form will be the first 6 months of each grant (e.g. 1 January - 30 June of the current year). This form includes four sections, addressing grant metrics, program stories, financial information, and compliance. Please contact APG/FDC staff if you have questions about this form, or concerns about submitting it by the deadline. After submitting the form, organizations will also meet with APG staff to discuss their progress.

Metrics and results overview - all programs


We are trying to understand the overall outcomes of the work being funded across our grantees' programs. Please use the table below to let us know how your programs contributed to the Grant Metrics. We understand not all Grant Metrics or grantee-defined metrics will be relevant for all programs, so feel free to put "0" where necessary. For each program, include the following table and:

  1. Next to each required metric, list the outcome/results achieved for all of your programs included in your proposal.
  2. Where necessary, explain the context behind your outcome.
  3. In addition to the Global Metrics as measures of success for your programs, there is another table format in which you may report on any OTHER relevant measures of your programs' success.

For more information and a sample, see Grant Metrics.

Metric | Achieved outcome | Explanation
1. Number of total participants | 7,703 | Most students had enrolled prior to the COVID-19 disruption, so we are close to our goal for this statistic.
2. Number of newly registered users | 6,642 | This year, more and more people in our program are taking repeat courses.
3. Number of content pages created or improved, across all Wikimedia projects | 13,455 | Our Wikidata courses have significantly exceeded our expectations in terms of items edited, helping this number.
4. Quantity[1] | 5,518,063 | The disruption in higher education from COVID-19 affected this number.
5. Quality Articles[2] | 1,888 | The disruption in higher education from COVID-19 affected this number.


[1] Number of words added to the article namespace for Wikipedia programs, and number of statements improved for Wikidata.
[2] Number of articles that have at least a 10-point improvement in an ORES-based quality prediction score, which indicates significant improvement of the "structural completeness" of an article.
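
For readers unfamiliar with how a 10-point improvement can be measured, here is a minimal sketch of scoring a revision with the public ORES articlequality model and comparing two revisions. It is illustrative only: the evenly spaced 0-100 class weights and the example revision IDs are assumptions, not our production Dashboard code.

```python
# Illustrative sketch: compute an ORES-based "structural completeness" score for
# a revision and check whether an article improved by at least 10 points.
# Assumptions (not from this report): the evenly spaced 0-100 class weights below
# and the example revision IDs; Wiki Education's production weighting may differ.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"
# Map the six articlequality classes onto a 0-100 scale (assumed weighting).
CLASS_WEIGHTS = {"Stub": 0, "Start": 20, "C": 40, "B": 60, "GA": 80, "FA": 100}

def quality_score(rev_id: int) -> float:
    """Probability-weighted quality score for a single revision."""
    resp = requests.get(ORES_URL, params={"models": "articlequality", "revids": rev_id})
    resp.raise_for_status()
    probs = resp.json()["enwiki"]["scores"][str(rev_id)]["articlequality"]["score"]["probability"]
    return sum(CLASS_WEIGHTS[cls] * p for cls, p in probs.items())

def significantly_improved(first_rev: int, last_rev: int, threshold: float = 10.0) -> bool:
    """True if the predicted quality rose by at least `threshold` points."""
    return quality_score(last_rev) - quality_score(first_rev) >= threshold

# Example usage (hypothetical revision IDs):
# print(significantly_improved(900000000, 960000000))
```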

Telling your program stories - all programs


Wiki Education's programs continue to play a critically important role in the Wikimedia ecosystem, especially in relation to the strategic priority of knowledge equity. Two research projects published in the first six months of 2020 underscore the strategic impact we're having.

First, the Wikimedia Foundation's analytics team re-ran their previous analysis of Wiki Education's share of the English Wikipedia's new active editors. Last year, WMF found that in 2018, Wiki Education brought in 19% (see page 60) of all new active editors. In early 2020, we asked them to re-run the numbers for 2019; the results showed that the percentage rose slightly, to 19.8%. The graph shows a very consistent pattern of activity: Wiki Education brings two spikes in new active editors, at the start of the spring and fall terms, which has a noticeable and meaningful impact on the overall new active editor numbers.

Second, a new journal article, Content Growth and Attention Contagion in Information Networks: Addressing Information Poverty on Wikipedia, was published in June. In this paper, researchers from Boston University took a sample of articles edited by our program participants in 2016, located a control group of similar articles, and looked at the impact of our program. The findings clearly demonstrate the impact of our program:

  • Content added by our program ("exogenously added content"), according to the abstract, "leads to significant, substantial, and long-term increases in both content consumption and subsequent contributions" to the articles in question. In other words, articles improved through our programs receive more page views after they've been improved by our participants, and they receive more additional edits than articles in the control group.
  • In contrast, the articles in the control group received little to no attention. This means content areas that are already underdeveloped are likely to remain underdeveloped without an intervention like our program.
  • Extrapolating from these observations, the researchers analyze the effectiveness of possible strategies for increasing the quality and impact of "impoverished regions" of Wikipedia coverage. They conclude that specifically targeting a set of interconnected articles in an underdeveloped topic area — as our programs do — "can lead to as much as a twofold gain in attention relative to unguided contributions".

This research is huge for Wikimedia's knowledge equity work: It confirms that content doesn't naturally improve over time; if content in a particular area is underdeveloped, it is likely to remain so without a purposeful intervention. But it shows that purposeful interventions, especially those in programs like ours where a group of articles in a particular topic area all receive attention, can lead to a twofold increase in attention. This attention is both internal (from other Wikipedia articles) and external (from other sites). The article makes a strong case for why content programs like Wiki Education's will be critical to the knowledge equity work that's at the center of the 2030 Movement Strategy. In particular, it reinforces the wisdom of the strategic recommendation to Identify Topics for Impact, which our programs put into practice by empowering scholars to identify the key content gaps in Wikipedia from the perspective of their own knowledge communities.

Wikipedia Student Program

Wikipedia Student Program | Annual goal | Jan – June | Percentage
1. Number of total participants | 16,000 | 7,497 | 47%
2. Number of newly registered users | 15,500 | 6,571 | 42%
3. Number of content pages created or improved, across all Wikimedia projects | 15,500 | 6,059 | 39%
4. Words added to article namespace | 12,500,000 | 5,203,295 | 42%
5. Quality Articles | 4,500 | 1,793 | 40%
What's worked well
The Progress Tracker helps guide students through the article expansion process.
The new grading view highlights where each student editor is in their assignment.

The spring 2020 term — which coincides with the timeframe of this report — was poised to be Wiki Education's most successful ever. We onboarded 409 courses with planned Wikipedia assignments, the most we've ever supported. Then the COVID-19 pandemic hit the United States and Canada, with significant disruption to the higher education sector, where we work. The "What hasn't worked well" section will delve more into the challenges that emerged from this, but despite the disruption, we were still able to achieve a lot of impact with our program in the spring 2020 term — just not as much as was expected.

In preparation for our largest cohort of courses ever, we rolled out two technology pieces in spring 2020 to improve the user experience for students and instructors in our program, thereby reducing questions to our staff.

  • We rolled out a new Progress Tracker that scaffolds the article building work students do. When a student chooses an article, the Progress Tracker will appear for that article with a number of useful links. Students can then quickly access a sandbox link for them to craft a bibliography, another sandbox link for them to draft their article, and training modules relevant to each stage of the process. It also links to the sandbox pages where classmates will add their peer reviews.
  • Building off the Progress Tracker, we also debuted a new grading view for instructors. The new view includes links to the key sandbox pages where students complete the different phases of their assignment — preparing their bibliography, drafting their articles, reviewing the drafts of their peers, and editing live Wikipedia articles. The goal of this new view is to make assessing student work easier for faculty.
Carleton College students created this graphic to describe an innate immune response to a bacteria invasion, which they uploaded to Wikimedia Commons.

Student editors in the spring 2020 term made significant improvements to thousands of Wikipedia articles. Some examples of student work include:

  • A Purdue University botany student significantly expanded the article on biomass partitioning.
  • A Stanford Law School student created the article on U.S. Eighth Circuit Appeals Court case Planned Parenthood v. Rounds.
  • Article improvements by a Carleton College class on immunology received a large number of pageviews as the world sought disease information during the pandemic.
  • Students at the University of Kentucky in an African-American history class created and expanded numerous biographies of African Americans.
  • Reflective writing isn't allowed on Wikipedia, but a student from Middle Georgia State University expanded the article on reflective writing, taking it through the Did You Know process.
What hasn't worked well

The COVID-19 pandemic brought significant disruption to higher education in the United States and Canada. In the middle of the term, many universities suddenly asked students to vacate campuses, and all instruction moved from in-person to virtual. Wiki Education's support model is virtual, so this required very little shift from our end of things. However, this was a monumental change for most of the faculty. We asked about the impacts in our end-of-term instructor survey:

  • 58% of instructors selected "My students and I successfully completed the Wikipedia assignment despite the challenges stemming from COVID-19."
  • 19% of instructors selected "I continued to run the Wikipedia assignment, but found it more difficult due to the challenges stemming from COVID-19."
  • 10% of instructors selected "I had to discontinue the Wikipedia assignment due to the challenges stemming from COVID-19."
  • 13% of instructors selected "Other."

These results line up with our anecdotal experiences: A small percentage of our classes simply abandoned the assignment. A larger percentage tried, but found it challenging: Some students were less engaged in a virtual experience than they had been in face-to-face classes, some found themselves providing childcare or health care for family members, and some dropped out entirely. Even student editors who did continue no longer had access to physical books or other library reference materials; they were limited to all-digital sources for their citations. All of these factors contributed to a lower impact on Wikipedia than we had expected after onboarding 409 courses.

Nevertheless, we are extremely proud of the work our student editors did. Our numbers place us at around 80% of our expected productivity. Given the upending of higher education and the challenges faced by everyone during the pandemic, we feel that reaching 80% of our goal for the spring term is a laudable result.
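(As a rough benchmark from the table above: the 1,793 Quality Articles achieved is about 80% of the 2,250 that a half-year pace toward the 4,500 annual goal would imply.)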

The next six months

The COVID-19 pandemic didn't just disrupt the higher education sector: It also disrupted the philanthropic sector. Wiki Education is reliant on large institutional grants for our operations. Given the severity of the impact of the COVID-19 pandemic on the economy and the uncertainty around the endowments of institutional funders, we don’t expect to generate the revenue that would be necessary for keeping our operations on the same level as in the past. That's why, in order to not jeopardize our organization’s future and Wiki Education’s positive impact on the public’s access to accurate and trustworthy information, our fiscal year 2020–21 plan calls for a hopefully temporary yet significant cut of our expenses, including painful layoffs of staff. These budget cuts mean we will not be able to support as many courses as we have in prior terms, as we have fewer staff devoted to the program.

In order to support as many courses as possible, we will institute an application process to participate in the Student Program for fall 2020. We will support as many courses with a full assignment as we feel like we can, and we will ask some faculty to have students leave work in sandboxes until we have a chance to review it. We expect this will enable us to still support a large number of courses, providing high quality content to Wikipedia in areas with content gaps, even given our reduced staff capacity.

Wikipedia Scholars & Scientists Program

Wikipedia Scholars & Scientists Program | Annual goal | Jan – June | Percentage
1. Number of total participants | 175 | 159 | 91%
2. Number of newly registered users | 120 | 59 | 49%
3. Number of content pages created or improved, across all Wikimedia projects | 700 | 933 | 133%
4. Words added to article namespace | 300,000 | 292,356 | 97%
5. Quality Articles | 120 | 95 | 79%
What's worked well
In February 2020, Scholars & Scientists Program Manager Ryan McGrady and Executive Director Frank Schulenburg met with program participants from our 2019 collaboration with the National Archives, to improve articles related to women's suffrage in anticipation of their Rightfully Hers exhibit covering the same topic.

In the first six months of 2020, we ran eight Wikipedia editing courses for subject matter experts as part of our program. We also ran two smaller experiments to re-engage alumni from past programs. The editing courses continue to produce great content for Wikipedia, especially in relation to areas of equity.

  • In partnership with the WITH Foundation, we ran a course about improving articles related to healthcare for people with disabilities. One participant wrote this great blog post about her experiences improving articles she has a personal connection with.
  • The American Physical Society sponsored a course aimed at improving articles on women physicists.
  • We partnered with 500 Women Scientists to run a course on adding biographies of women scientists.
  • Similarly, a Women in Red branded course improved many articles on notable women.
  • A Society of Family Planning course brought medical experts to improve articles related to family planning and abortion.

One of the best impacts of this program so far in 2020 has been our COVID-19 course. While our courses usually follow a fee-based model, in which we generate revenue for the organization to run the course, we made an exception and launched a free course in May, aiming to bring subject matter experts to Wikipedia to improve articles on state and regional responses to COVID-19. Improvements included:

  • Significant expansions to the articles on the state responses of North Dakota, Maine, and Florida.
  • Creation of the article about the impact on Navajo Nation, and additions to state articles about the impacts on local Native American tribes, such as the Northern Arapaho tribe in the Wyoming article.

The COVID-19 course has been such a success that we sought to replicate it. We secured funding from an individual donor to run four more courses, one of which was just getting started in June, to continue improving articles related to the response to and impact of the pandemic in the United States. We look forward to seeing the impact these courses continue to have.

What hasn't worked well

While we were ideally positioned in this program when the switch to virtual happened in March (we've been using Zoom to run these courses since the start of the program), we found that the partners who sponsor the courses were not ready to move forward with new ones. Since we charge a fee for hosting a course, we found it difficult to interest new sponsors during the economic instability immediately following the pandemic declaration. We also wanted to be sensitive to not being too sales-y, which wouldn't fit the ethos of either Wikipedia or Wiki Education. As we move into the second half of 2020, we are pursuing more opportunities like sponsored courses themed around COVID-19 responses to help us continue to grow our revenue from this program.

The next six months

The decrease in budget mentioned in the Student Program section will also affect this program. With less staff time to devote to Scholars & Scientists courses, we've adapted the program in an effort to simplify our offerings. Following the model of our successful COVID-19 course, we're reducing the time for the course from 12 weeks to 6 weeks, with the curriculum focused more on how to edit one particular kind of Wikipedia article, rather than a larger "how to edit Wikipedia" syllabus. We are also carefully monitoring how our partners are responding to the economic instability in the U.S., and will adapt our strategy accordingly.

Wikidata Scholars & Scientists Program

Wikidata Scholars & Scientists Program | Annual goal | Jan – June | Percentage
1. Number of total participants | 150 | 47 | 31%
2. Number of newly registered users | 100 | 12 | 12%
3. Number of content pages created or improved, across all Wikimedia projects | 2,000 | 6,463 | 323%
4. Statements improved | 15,000 | 22,412 | 149%
What's worked well
Wikidata Program Manager Will Kent, center, meets with staff from the San Francisco MOMA who took his course.

So far in 2020, we have hosted five courses in the Wikidata branch of our Scholars & Scientists Program. These courses, originally launched in 2019, have proven to be wildly popular, attracting participants from top GLAM institutions including the Smithsonian, the Chicago Art Institute, SFMOMA, and the Frick, among others, and even a handful of staff from Wikimedia chapters! We've found these six-week courses are extremely popular with librarians in particular, and many can apply the skills they learn about Wikidata directly to their jobs. For example:

  • A librarian from the Smithsonian shared her learnings from the course.
  • A Stanford Law School librarian shared what he got out of the course.
  • Carnegie Hall's digital collections staff took the course, and were so inspired they've rethought their departmental strategy with Wikidata in mind, shared here.

Not only are we teaching GLAM staff critical skills, but they are also making meaningful edits to Wikidata. While we are less than halfway to our annual goal in terms of number of participants, we have already dramatically exceeded our goals for number of items edited and statements improved. We're really excited to see the impact these courses will continue to have on Wikidata content.

What hasn't worked well

We had expected that many of the participants taking our Wikidata courses would be new editors; they're designed to be newbie-friendly. But our registration numbers continue to show that many participants have existing Wikimedia accounts. In fact, 74% of participants in our courses so far are returning editors. We believe these people have attended an edit-a-thon or other event, or tried to contribute on their own, and ran into barriers. We are grateful they have found our courses, and that we are able to provide the extended training they need to become productive contributors to Wikidata, but that also means we are likely to miss our overall goal for this metric in 2020. Nevertheless, we see these participants as re-engaged lapsed Wikimedians, in most cases, so we believe the spirit of the statistic is intact.

The next six months

Our expense reductions described in the Student Program section will also affect our Wikidata program. We will still host courses on editing Wikidata, but we are likely to host fewer than previously anticipated. Nevertheless, we are excited about the impact these program participants will have on Wikidata.

Other projects


Programs & Events Dashboard continues to serve as key infrastructure for programs throughout the Wikimedia movement, providing impact metrics for more than 1,350 programs and events so far this calendar year. We improved the reliability of the update system, added support for tracking sets of articles via the PagePile tool, and started a pair of Google Summer of Code internship projects that will (respectively) improve the Dashboard's error monitoring capabilities and reduce loading times and bandwidth usage for editors with slow connections.
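
As a rough illustration of what the PagePile support involves, the sketch below fetches the article titles contained in a pile so they can be tracked. The endpoint parameters and response shape are assumptions about PagePile's public API, and the Dashboard's own integration may differ.

```python
# Minimal sketch: fetch the article titles in a PagePile so they can be tracked.
# Assumptions: the pagepile.toolforge.org api.php endpoint with action=get_data
# and format=json, returning a "pages" list; the Dashboard's own integration may
# differ. The pile ID in the usage example is hypothetical.
import requests

PAGEPILE_API = "https://pagepile.toolforge.org/api.php"

def fetch_pagepile_titles(pile_id: int) -> list[str]:
    """Return the page titles contained in the given PagePile."""
    resp = requests.get(
        PAGEPILE_API,
        params={"id": pile_id, "action": "get_data", "format": "json"},
    )
    resp.raise_for_status()
    return resp.json().get("pages", [])

# Example usage (hypothetical pile ID):
# for title in fetch_pagepile_titles(12345):
#     print(title)
```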

Revenues received during this six-month period


Please use the exchange rate in your APG proposal.

  • Important note
    • The anticipated column may list revenues anticipated for the whole year instead of only the 6 months. Please make sure that the time period is clear in the table.
    • In the explanation column, always mention relevant information about the numbers: what period they refer to, etc.

Table 2. Please report all spending in the currency of your grant unless US$ is requested.

  • Please also include any in-kind contributions or resources that you have received in this revenues table. This might include donated office space, services, prizes, food, etc. If you are able to provide a monetary equivalent (e.g. $500 for food from Organization X for service Y), please include it in this table. Otherwise, please highlight the contribution, as well as the name of the partner, in the notes section.
Revenue source | Currency | Anticipated | Q1 | Q2 | Q3 | Q4 | Cumulative | Anticipated ($US)* | Cumulative ($US)* | Explanation of variances from plan
Selling Services | USD | $25,625 | $22,613 | $84,740 | - | - | $107,353 | $25,625 | $107,353 | We have strategically pushed revenue growth in this area since submitting our plan.
Selling Impact | USD | $2,450,950 | $390,492 | $42,799 | - | - | $433,291 | $2,450,950 | $433,291 | Our Wikimedia Foundation APG money arrived in December instead of January and was thus included in the prior quarter, making this number look artificially low.
Total | USD | $2,476,575 | $413,105 | $127,539 | - | - | $540,644 | $2,476,575 | $540,644 |

* Provide estimates in US Dollars


Spending during this six-month period


Please use the exchange rate in your APG proposal.

  • Important note
    • The budget can be the budget for the whole year (in which case the percentage should be around 50% at the half-year mark) or for the half year (in which case the percentage should be around 100%). Please make that clear in the table.
    • In the explanation column, always mention relevant information about the numbers: what period they refer to.

Table 3. Please report all spending in the currency of your grant unless US$ is requested.

(The "budgeted" amount is the total planned for the year as submitted in your proposal form or your revised plan, and the "cumulative" column refers to the total spent to date this year. The "percentage spent to date" is the ratio of the cumulative amount spent over the budgeted amount.)
Expense | Currency | Budgeted | Q1 | Q2 | Q3 | Q4 | Cumulative | Budgeted ($US)* | Cumulative ($US)* | Percentage spent to date | Explanation of variances from plan
Student Program | USD | $348,971 | $86,261 | $84,417 | - | - | $170,679 | $348,971 | $170,679 | 49% | We are on track.
Wikipedia Scholars & Scientists | USD | $455,844 | $122,029 | $124,459 | - | - | $246,488 | $455,844 | $246,488 | 54% | We are on track.
Wikidata Scholars & Scientists | USD | $317,822 | $82,308 | $84,544 | - | - | $166,851 | $317,822 | $166,851 | 52% | We are on track.
Technology | USD | $398,517 | $88,352 | $69,551 | - | - | $157,903 | $398,517 | $157,903 | 40% | Our software developer left in April, and we did not re-hire for the position within the quarter, leaving this slightly lower than budgeted.
General/HR/Finance/Admin/Board/Fundraising | USD | $830,536 | $248,864 | $155,869 | - | - | $404,733 | $830,536 | $404,733 | 49% | We are on track.
TOTAL | USD | $2,351,690 | $627,815 | $518,840 | - | - | $1,146,654 | $2,351,690 | $1,146,654 | 49% | We are on track.

* Provide estimates in US Dollars


Compliance


Is your organization compliant with the terms outlined in the grant agreement?


As required in the grant agreement, please report any deviations from your grant proposal here. Note that, among other things, any changes must be consistent with our WMF mission, must be for charitable purposes as defined in the grant agreement, and must otherwise comply with the grant agreement.

  • No changes.

Are you in compliance with all applicable laws and regulations as outlined in the grant agreement? Please answer "Yes" or "No".

  • Yes

Are you in compliance with provisions of the United States Internal Revenue Code (“Code”), and with relevant tax laws and regulations restricting the use of the Grant funds as outlined in the grant agreement? Please answer "Yes" or "No".

  • Yes

Signature

Once complete, please sign below with the usual four tildes.

Resources


Resources to plan for measurement


Resources for storytelling
