Grants:APG/Proposals/2018-2019 round 1/Wiki Education Foundation/Impact report form


Purpose of the report

This form is for organizations receiving Annual Plan Grants to report on their results to date. For progress reports, the time period for this report will be the first 6 months of each grant (e.g. 1 January - 30 June of the current year). For impact reports, the time period for this report will be the full 12 months of this grant, including the period already reported on in the progress report (e.g. 1 January - 31 December of the current year). This form includes four sections, addressing global metrics, program stories, financial information, and compliance. Please contact APG/FDC staff if you have questions about this form, or concerns about submitting it by the deadline. After submitting the form, organizations will also meet with APG staff to discuss their progress.

Global metrics overview - all programs

We are trying to understand the overall outcomes of the work being funded across our grantees' programs. Please use the table below to let us know how your programs contributed to the Global Metrics. We understand not all Global Metrics will be relevant for all programs, so feel free to put "0" where necessary. For each program, include the following table and:

  1. Next to each required metric, list the outcome achieved for all of your programs included in your proposal.
  2. Where necessary, explain the context behind your outcome.
  3. In addition to the Global Metrics as measures of success for your programs, there is another table format in which you may report on any OTHER relevant measures of your program's success.

For more information and a sample, see Global Metrics.

Overall

Metric | Achieved outcome | Explanation
1. number of total participants | 16,780 | We met our overall goal of 16,185.
2. number of newly registered users | 15,085 | We nearly met our overall goal of 15,655.
3. number of content pages created or improved, across all Wikimedia projects | 19,772 | We nearly met our overall goal of 20,475.
4. quantity: words added to the article namespace | 13,431,661 | We exceeded our goal of 11,950,000.
5. quality: number of articles increasing by at least 10 points on the ORES scale | 4,476 | We nearly met our goal of 4,700.


Telling your program stories - all programs

Wiki Education Strategy

For Wiki Education, 2019 was our first full calendar year enacting our new strategy, and it was a year of great success. As we executed the work to support the three pillars of our strategy, Equity, Quality, and Reach, we increased our overall impact on English Wikipedia and Wikidata through our programs, and on the global Wikimedia community through our support of the Programs & Events Dashboard.

In 2019, Wiki Education was responsible for 3.1% of all active editors on English Wikipedia, based on our count of monthly editors from our programs divided by the numbers at stats.wikimedia.org. Based on Wikimedia Foundation statistics, we were responsible for 19.8% of all new active editors on English Wikipedia. The sheer scale we're operating at on English Wikipedia demonstrates the outsized impact we had on Wikipedia in 2019.
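
For readers who want to reproduce this kind of figure, the following is a minimal sketch of the calculation: our monthly program editor counts divided by the site-wide active editor counts published at stats.wikimedia.org, averaged over the year. The numbers in the sketch are illustrative placeholders, not our actual 2019 data.

```python
# Minimal sketch of the active-editor share calculation described above.
# The monthly counts here are illustrative placeholders, not Wiki Education's
# actual 2019 figures; real numbers come from the Dashboard and stats.wikimedia.org.

program_monthly_editors = {          # editors active through Wiki Education programs
    "2019-01": 1200, "2019-02": 1900, "2019-03": 2400,
}
enwiki_monthly_active_editors = {    # site-wide counts from stats.wikimedia.org
    "2019-01": 39000, "2019-02": 40000, "2019-03": 41000,
}

monthly_shares = [
    program_monthly_editors[month] / enwiki_monthly_active_editors[month]
    for month in program_monthly_editors
]
average_share = sum(monthly_shares) / len(monthly_shares)
print(f"Average monthly share of active editors: {average_share:.1%}")
```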

Highlights from 2019 included:

  • Maintaining the massive impact of our Student Program on English Wikipedia content and new editors.
  • Creating an earned income revenue stream with our Scholars & Scientists Program, a first in the Wikimedia universe.
  • Launching a new series of Wikidata courses.
  • Supporting more and more global Wikimedia program leaders with our Programs & Events Dashboard.

Student Program

In our Wikipedia Student Program, we supported more than 16,000 student editors as they wrote Wikipedia articles as a class assignment. In order to focus our attention on growing our new Scholars & Scientists Program in 2019, we set our goals to maintain our 2018 impact, and we succeeded in sustaining our outsize impact on English Wikipedia.

Quantitative targets

Measure of success | Goal (2019) | Total | % completed | Notes
Total Participants | 16,000 | 16,565 | 104% | We exceeded this goal.
Newly Registered | 15,500 | 14,999 | 97% | We're noticing a trend of more and more student editors taking multiple courses within our program, so we very slightly missed this goal.
Content Pages Improved | 19,200 | 14,986 | 78% | Student editors worked on fewer articles than expected but added more content to those articles. Overall, this is a good thing.
Quantity | 11,200,000 | 12,843,562 | 115% | We exceeded this goal.
Quality Articles | 4,500 | 4,259 | 95% | We fell just short of this goal.

What we did

Wiki Education's Senior Wikipedia Expert Ian Ramjohn gave a talk at the Ecological Society of America conference about how student editors are improving Wikipedia's ecology content through our Student Program.

In 2019, we supported more than 16,000 editors as they took on Wikipedia assignments in their higher education classes within the United States and Canada. These students participated in 831 courses across 382 universities, taught by 625 professors. The scalable solution Wiki Education has built for this program means we were able to maintain this impact with only three full-time staff devoted to the program: one Program Manager who works with faculty, and two Wikipedia Experts who answer student editors' questions, review their work on-wiki, and help students navigate the process of contributing to Wikipedia.

In 2019, the majority of our courses (58%, on track with our baseline of 57% in 2018) were taught by returning instructors, a key marker of success for us. Even with our success in retaining instructors, the variability of academia (sabbaticals, changing teaching schedules, faculty leaving academia, etc.) means we need an ongoing pipeline of new instructors. The visibility work we've done in the past and our extensive word-of-mouth network meant that 70% of new instructors who taught with Wikipedia for the first time in 2019 came to our program organically, meaning we'd had no contact with them before they created a course page. The other 30% came from our targeted outreach, both to equity-related groups like the National Women's Studies Association and to science associations in support of our ongoing Communicating Science initiative.

Once we bring the instructors to our Dashboard platform to create a course page, we of course also need to support them. We've developed numerous internal processes to keep track of all of these courses, using the Dashboard's technology to detect common situations that warrant a closer look. The Wikipedia Expert assigned to each class gets email notifications based on various student behaviors — nominating an article for DYK, editing an article under discretionary sanctions, having an article nominated for deletion, etc. These notifications prompt the Wikipedia Expert to check whether they need to take action. They also respond to individual requests for help from student editors and instructors.
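
To illustrate the kind of check behind these notifications, here is a rough Python sketch of one of them — detecting whether an article a student is working on has been nominated for deletion — using the public MediaWiki API. The Dashboard's real alert system is part of its own codebase; the template names below are assumptions about English Wikipedia's tagging conventions.

```python
# Illustrative sketch only: the real Dashboard has its own alert pipeline.
# This shows one way a "nominated for deletion" check could be done against
# the public MediaWiki API. Template names are assumptions about current
# English Wikipedia conventions.
import requests

API = "https://en.wikipedia.org/w/api.php"
DELETION_TEMPLATES = {               # assumed template names
    "Template:Article for deletion/dated",
    "Template:Proposed deletion/dated",
}

def is_nominated_for_deletion(title: str) -> bool:
    """Return True if the article transcludes a known deletion template."""
    params = {
        "action": "query",
        "prop": "templates",
        "titles": title,
        "tllimit": "max",
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=30).json()
    for page in data.get("query", {}).get("pages", {}).values():
        templates = {t["title"] for t in page.get("templates", [])}
        if templates & DELETION_TEMPLATES:
            return True
    return False

if __name__ == "__main__":
    print(is_nominated_for_deletion("Landscape genetics"))
```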

At the end of the term, we do a long "course closing" process, where we look into what the students in the course did on Wikipedia more systematically. Wikipedia Experts assess the quality of the overall class into one of four categories: "Excellent", "Good", "Fair", and "Hurts Wikipedia". For courses that get a "Hurts Wikipedia" rating, we discuss whether we believe the instructor can remedy the source of the problem. If they can, we speak directly with the instructor to go over what they would need to do differently to work with us in the future. If we think it's not fixable (e.g., the instructor insists that students add original research), then we ask them to do a different assignment outside of Wikipedia. This process enables us to support a large number of courses and new student editors in a scalable way.

What worked well

As expected, student editors in our Student Program produced great content for Wikipedia. Here are some examples:

  • An ecology student from Washington University in St. Louis expanded a very short article on the fruit fly species Drosophila subobscura to Good Article status.
  • A Rice University student completely rewrote the article on Education in Mali.
  • A student from California Institute of Technology in an organic geochemistry class created the article on Crenarchaeol, a lipid that can be preserved for hundreds of millions of years.
  • A University of British Columbia student in a history class wrote a new article about Shareefa Hamid Ali, an Indian woman who campaigned for the rights of women and children. She lobbied for legislation to restrict child marriages, and was one of 15 women to attend the first UN Commission on the Status of Women in 1947.
  • An evolution student from the University of Central Arkansas was responsible for a new article on landscape genetics, a field of scientific study. This new article gets more than 100 views a day.
  • An environmental sciences student at the Georgia Institute of Technology rewrote the article on the geological concept Vergence.
  • A law student at Stanford Law School rewrote the article on Buried Bodies Case, explaining the legal doctrine and ethical questions for the legal profession that arose from the case.

In addition to the excellent work student editors produced on Wikipedia, we tried a small social media pilot in fall 2019, where we sent emails to our instructors encouraging them to share the impact their students had on Wikipedia during the fall term. The results were great! Many instructors included links or screenshots of their Dashboard pages when they shared their impact.

Tweets

@janniaragon[1]
In Fall 2019, my students added to Wikipedia in a @WikiEducation assignment. Check out the amazing work they did. link
@islamoyankee[2]
I just got the stats from my @thenewschool Intro. to Islam class @WikiEducation project: 350K views on the pages my 17 students edited. That's amazing. Thanks @stephenniem for the idea.
@DanicaSavonick[3]
A huge thanks to the @WikiEducation team who helped my @suny_cortland Digital Divides students make a ton of edits to Wikipedia! Students' reflections convey how much they learned about the politics of knowledge through this editing assignment.
@d_giovannelli[4]
My students in #MarineMicrobialDiversity @UninaIT edited @Wikipedia articles as part of the course. 19 master students involved, 5 articles edited, 256 new citations and 133,115 characters added. Average completeness went from 38.6 to 67.8%. I'm very proud of them! #Teaching

What didn't work well

We mentioned in our first progress report that we'd decided to delay the start of the Wikidata Student Program after an unsuccessful grant application. We had a second learning when we tried to create a Slack channel where instructors could engage with each other to build community, something several instructors had requested in the past. While 99 instructors joined, virtually no discussion happened. The handful of times instructors asked a question, Wiki Education staff answered it. A post-term survey showed an overwhelming majority of instructors did not find it useful, so we are discontinuing it for 2020.

Connecting it to our strategy

Equity

Our Student Program is a really great opportunity to bring new voices and new perspectives to Wikipedia. Our demographics have remained steady over the last few years:

  • 58% of student editors identify as women, and 2% identify as nonbinary, increasing Wikipedia's gender diversity.
  • 40% identify as a race other than white; we believe this also increases the English Wikipedia's racial diversity, but without editor data to compare it to we cannot make an assessment.
  • 42% of our student editors speak more than one language, bringing new cultures and perspectives to English Wikipedia.

These student editors also write in diverse content areas. In 2019, student editors improved hundreds of biographies of women or other underrepresented people on Wikipedia. They tackled underdeveloped content areas like African politics. One class collaboratively improved the article on American Indian boarding schools in Wisconsin. Another class improved topics related to American Sign Language. The sheer volume of courses we work with enables us to improve knowledge equity across a diverse range of topics.

81% of fall 2019 instructors said the Wikipedia assignment helped their students to become more socially and culturally aware (e.g., the ability to identify underrepresentation and other content gaps stemming from bias).

In 2019, Wiki Education hired Dr. Alexandria Lockett, Assistant Professor of English and long-time participant in Wiki Education's Student Program, as a consultant to assess our tools and resources as they relate to our current strategic goal of equity. Dr. Lockett has devoted her career to exploring issues of equity as they relate to pedagogy, higher education, and open educational initiatives. As such, she is well-positioned to ensure that Wiki Education's materials and resources meet our stated goals surrounding equity. Dr. Lockett conducted a deep analysis of Wiki Education's training materials, Dashboard timeline, and assignment design wizard, suggesting ways we could improve them in relation to knowledge equity. She recommended asking specifically about knowledge equity as a learning outcome in our instructor survey, which we did in fall 2019, leading us to identify that this is a key learning objective for faculty. She encouraged us to ask about instructor demographics, which led us to learn that nearly 65% of our faculty identify as women, around 15% higher than academia at large.

We will spend time in 2020 addressing Dr. Lockett's recommendations around knowledge equity, particularly:

  • Helping instructors to incorporate issues of knowledge equity into all aspects of their Wikipedia assignment, e.g., getting students and instructors to think about their relatively privileged positions vis-à-vis knowledge and how they can share that information in the most accessible and equitable way possible.
  • Helping instructors and student editors to think about sourcing in a way that improves knowledge and content equity, e.g., encouraging students to find sources that are not only reliable, but that come from a more diverse group of authors.
  • Helping students to find articles to work on that directly address Wikipedia's equity gaps, e.g. encouraging students to choose articles that cover traditionally underrepresented and marginalized populations and topics.
  • Ensuring that we make equity a central part of all aspects of the Wikipedia assignment. The Wikipedia assignment is currently structured as a scaffolded project, and we want to make sure that each step in the project includes equity work where possible. This means that students will think about equity at every stage: from first learning how to critically evaluate Wikipedia articles, to choosing the articles they'll ultimately contribute to, to providing their peers with feedback, to making their final live contributions.

Quality

We've maintained our excellent quality of work throughout the Student Program. This can be seen by the number of articles that improved by more than 10 points on the ORES scale in our overall Student Program metric. While ORES doesn't directly measure quality, we're using it as a proxy to demonstrate that a computer can detect that the information students add to Wikipedia looks like a real encyclopedia article. Our own assessment of quality of work by our Wikipedia Experts backs this up.
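
As an illustration of how such a score can be computed, the sketch below queries the ORES articlequality model (the public ORES endpoint available at the time of this report) and collapses the class probabilities into a single weighted number. The class weights and revision IDs are assumptions for illustration only; the Dashboard's own metric pipeline may weight classes differently.

```python
# Illustrative sketch: collapsing ORES articlequality probabilities into a single
# weighted score. The Dashboard has its own implementation; the class weights and
# revision IDs below are assumptions chosen purely for illustration.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"
# Map article quality classes to points on a 0-100 scale (assumed weighting).
CLASS_WEIGHTS = {"Stub": 0, "Start": 20, "C": 40, "B": 60, "GA": 80, "FA": 100}

def weighted_quality(rev_id: int) -> float:
    """Return a probability-weighted quality score for a single revision."""
    params = {"models": "articlequality", "revids": rev_id}
    data = requests.get(ORES_URL, params=params, timeout=30).json()
    probs = data["enwiki"]["scores"][str(rev_id)]["articlequality"]["score"]["probability"]
    return sum(CLASS_WEIGHTS[cls] * p for cls, p in probs.items())

# An article "improves by 10 points" when the score of its end-of-term revision
# exceeds the score of its start-of-term revision by at least 10.
before = weighted_quality(900000000)   # hypothetical start-of-term revision ID
after = weighted_quality(930000000)    # hypothetical end-of-term revision ID
print(f"Improvement: {after - before:.1f} points")
```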

As part of the Quality element within our Strategy, one objective identified relates to our partnership work. We maintain active partnerships with around a dozen academic associations in the U.S.; these associations encourage faculty in their discipline to join our Student Program. Partnerships have been an important element in programmatic growth for us over the years, as instructors look to them for leadership and pedagogical recommendations.

Reach

While most students don't work on high-traffic articles, a handful do (for example, one student improved a section in the article on Troy, which receives nearly 1 million views each year). Mostly, however, the sheer size of the corpus of articles students work on as part of their coursework for this program means students reach hundreds of millions of viewers with their work.

Scholars & Scientists Program

Quantitative targets

Wikipedia

Measure of success | Goal (2019) | Total | % completed | Notes
Total Participants | 175 | 131 | 75% | We had fewer participants, but we were able to achieve more on wiki than expected per participant.
Newly Registered | 155 | 69 | 45% | We remain surprised by how many of our participants had previously tried to edit Wikipedia but were stymied.
Content Pages Improved | 525 | 617 | 118% | We exceeded this goal.
Quantity | 250,000 | 394,047 | 158% | We exceeded this goal.
Quality Articles | 100 | 133 | 133% | We exceeded this goal.

Wikidata

Measure of success | Goal (2019) | Total | Notes
Participants | 0 | 75 | We saw an opportunity and launched this new, unplanned program.
Newly registered | 0 | 17 | We saw an opportunity and launched this new, unplanned program.
Items edited | 0 | 3,771 | We saw an opportunity and launched this new, unplanned program.
Statements added | 0 | 7,768 | We saw an opportunity and launched this new, unplanned program.

What we did

Wiki Education's Wikidata Program Manager Will Kent led a Wikidata workshop in New York City in July.
Wiki Education's Director of Partnerships Jami Mathewson spoke with program participants on a panel at the Society of Family Planning Conference in October.

At the beginning of 2019, we migrated our Scholars & Scientists Program to an exclusively fee-for-service model. In the program, Wiki Education staff offer structured, virtual classes on how to contribute to Wikimedia projects; participants pay a fee to enroll and are given a certificate upon successful completion of the course. We tested three models for courses:

  • Individual payer: In this model, Wiki Education hosts a course at a specific time over a specific duration; individuals sign up and pay a fee to participate in that course.
  • Institutional payer: In this model, a particular organization purchases an entire course for their staff/members/etc.
  • Hybrid: In this model, an institution supports a course but works with Wiki Education to fill the slots; the institution either sponsors the entire course or simply co-brands it while we charge individual participants.

Once participants are signed up in the class, we offer weekly meetings using Zoom videoconferencing and communicate using the Slack messaging platform between meetings. Participants enroll in the Dashboard course page for their course, where they have a detailed milestone-based timeline and a series of outside-of-class training modules. The Dashboard tracks their edits to provide data about their impact and to facilitate the support of Wiki Education staff. One staff member serves as a lead trainer of the course; a second serves as a Wikipedia Expert, answering questions and supporting the participants on-wiki.

In 2019, we sought to get more data points on which models would work, and what a future for this program might look like. At the beginning of the year, we had an active program around Wikipedia editing, with an established 12-week curriculum. A few months into the year, we saw an opportunity to also host Wikidata courses. While we had not planned to launch Wikidata courses, we wanted to take advantage of our perceived opportunity. In July, we launched our first Wikidata courses. We started with an in-person workshop in New York City, which we used mostly as an opportunity to gain learnings about our curriculum. Then we launched 6-week virtual courses, at both the beginner and the intermediate level, which we continued throughout 2019.

What worked well

We're extremely proud of the quality of content added by participants in both our Wikipedia and Wikidata courses.

  • In an Advanced Wikipedia course, a group of alumni from previous courses collaboratively improved the article on the Nineteenth Amendment to the United States Constitution, bringing it up to Good Article status on the English Wikipedia.
  • A University of Chicago graduate student significantly expanded the article on Water resource policy.
  • An obstetrician-gynecologist significantly expanded and rewrote the highly viewed English Wikipedia article on Tubal ligation.
  • A graduate student in developmental biology and genetics noticed there were Wikipedia articles on both Embryo and Human embryo, but the Embryo article lacked information about non-human embryos. So the new editor significantly expanded sections on plant and animal embryos.
  • A professor of neurosurgery rewrote the article on the Nigrostriatal pathway, a neural pathway important in Parkinson's and schizophrenia.
  • A Wikidata course participant noticed an equity gap in Wikidata's criminal justice modeling: A person could be convicted of a crime, but not exonerated of one. The participant proposed "exonerated of" as a new property, and now P7781 can help address this issue.
Wiki Education's Scholars and Scientists Program Manager Ryan McGrady presented about our learnings from this program during Wikimania 2019.

In addition to the impact on Wikipedia and Wikidata, these programs produced two other outcomes: (1) learnings, and (2) revenue. As noted above, Program Manager Ryan McGrady presented on the learnings from the Wikipedia side of the program at Wikimania 2019. As part of our ongoing commitment to transparency and sharing our programmatic learnings with the broader Wikimedia community, we also published an extensive evaluation of our Wikidata course pilot, with detailed outcomes, on Meta. And since we charge money for this program, we generated revenue! The financial section below indicates our income from our fee-for-service model; we are pleased that we have been able to bring in earned income for our organization, and we look forward to expanding these efforts in 2020.

As part of our learnings, we hired two consultants to work with us on the program in 2019, and we found these to be extremely useful exercises. In spring 2019, we worked with instructional design consultant Michael Atkinson to evaluate the content and teaching. Michael created best practices documentation for us around things like running courses, using virtual learning environments, and hiring new trainers, and offered content suggestions and a framework for improving the content of our trainings based on the competencies we're teaching. We used Michael's recommendations to make a number of improvements to our program. In fall 2019, we worked with marketing consultants M+R to help understand the best messaging and approaches for recruiting partners and individuals interested in a "Women in STEM" Wikipedia training course. M+R interviewed our past participants, current partners, and some champions in our community about what they considered most valuable about our courses. This research resulted in a Messaging Manual with suggested changes to our messaging and recruitment tactics. The project wrapped up at the end of 2019, and we look forward to implementing the suggestions next year.

What didn't work well

Since this was our first year of running the program in a fee-for-service model, we didn't quite know how to project the income from the program. We made our best guess in our initial proposal, but the actuals varied from our initial estimate. Now that we've gone through a calendar year, have more established leads in the pipeline, and have a better sense of when people are more and less interested in taking courses — as well as the marketing learnings we've gained over the last year — we anticipate being able to better project revenue and impact in the future.

Connecting it to our strategy

Screen capture of Wiki Education program participants' edits to the English Wikipedia article on the Nineteenth Amendment to the United States Constitution, as seen here: https://dashboard.wikiedu.org/courses/Wiki_Scholars/NARA_Wiki_Scholars_-_Advanced_(2019-05)/articles/edited?showArticle=51316504
Equity

Many of the courses we've recruited for the Wikipedia branch of this program have improving Wikipedia's knowledge equity as a specific goal. Four courses focused exclusively on topic areas related to women's suffrage; one focused on biographies of women important to the history of the Merrimack River Valley in New England; two focused on family planning topics; and among the rest, many individual editors were motivated to improve English Wikipedia's knowledge equity gaps by focusing on underdeveloped content areas. We have built knowledge equity discussions into the course curriculum for each course, as we think it's important to encourage all participants to consider inclusivity in their work.

In the Wikidata branch of the program, participants were similarly motivated to improve equity-related issues. Librarians from the University of Toronto are using Wikidata to create subject headings to describe items from their Indigenous Perspectives Collection; previously, subject headings contained more problematic language in describing indigenous populations. Staff at the Experience Design Department at the Art Institute of Chicago have been trying to determine what percentage of their collection was created by artists who identify as women; now, armed with the knowledge from our course, they're working with Wikidata to get the answer to this question and other equity-related questions. Additionally, the "exonerated of" property now in Wikidata thanks to our courses is a good example of our work on equity: social justice advocates in the United States focus a lot of attention on the criminal justice system's inequities.

Quality

The high quality work these participants produce continues to amaze us. In particular, we're pleased with the outcomes of our work with the National Archives courses related to women's suffrage. Our Advanced course, which worked collaboratively to bring the article on the 19th Amendment up to Good Article status, is a great example of quality work. Another editor from one of our National Archives courses brought the article on Woman suffrage parade of 1913 up to Good Article status. We had some really great article improvements from our course with the National Science Policy Network and two courses with the Society of Family Planning.

On the Wikidata end of things, a great example of the quality of partnerships is how our course inspired two archivists from Carnegie Hall to use Wikibase as their internal collection data management solution. The participants established the Carnegie Data Lab, and a blog post they wrote describes how they're leveraging what they learned through our course to improve their understanding of what should and shouldn't be in Wikidata, and how to best engage in their work.

Reach

A key element of our Reach strategic priority was to start a project with Wikidata: As more and more digital assistants use Wikidata as the backbone for answering users' questions, it becomes more and more critical to improve the quality and quantity of information on Wikidata. Creating a program aimed at improving the quality of Wikidata items is a major effort toward reach in that sense. We have been impressed with our participants' edits to Wikidata in general, but especially the work they've done to expand Wikidata's identifiers. Identifiers on Wikidata allow museums and libraries to connect information in records databases to Wikidata. So far, our participants have added six identifiers that apply to thousands of books, works of art, and other cultural heritage objects. The Frick Art Reference Library Artist File ID will allow artists in the Frick's collection to be connected to their Wikidata items. This work allows Wikidata's users to better define data and makes linked data more discoverable, thanks to the knowledgeable metadata experts we've brought to Wikidata.
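
As a rough illustration of how this kind of identifier coverage can be measured, the sketch below counts Wikidata items that carry a given external-identifier property via the Wikidata Query Service. The property ID is a placeholder, since the specific property numbers for the new identifiers are not listed here.

```python
# Rough illustration: counting Wikidata items that use a given external identifier
# via the Wikidata Query Service. P0000 is a placeholder; substitute the real
# property ID for the identifier in question.
import requests

SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"
PROPERTY_ID = "P0000"  # placeholder property ID

query = f"""
SELECT (COUNT(DISTINCT ?item) AS ?count) WHERE {{
  ?item wdt:{PROPERTY_ID} ?value .
}}
"""

response = requests.get(
    SPARQL_ENDPOINT,
    params={"query": query, "format": "json"},
    headers={"User-Agent": "identifier-coverage-sketch/0.1 (example)"},
    timeout=60,
)
count = response.json()["results"]["bindings"][0]["count"]["value"]
print(f"Items using {PROPERTY_ID}: {count}")
```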

On the Wikipedia end of things, we'd be remiss if we didn't mention one giant win: We coached medical professionals to successfully edit English Wikipedia's abortion article, which averages 3,000 views per day. Our experts improved the sections on methods and safety. Bringing experts to improve high-traffic articles like this demonstrates the work we're doing to fulfill our strategy's Reach priority.

Other projects

Programs & Events Dashboard
Programs & Events Dashboard won an inaugural "Coolest Tool Award" at Wikimania

2019 was the most active year yet for Programs & Events Dashboard. More than 27,000 users participated in programs and events tracked by the P&E Dashboard. The Dashboard saw 2,198 new programs and events in 2019 (up from 1,820 in 2018) and was used by 1,455 different program organizers (up from 1,128 in 2018). The events listed on the Dashboard, and the editors leading them, represent the most complete census of program organizers in the Wikimedia movement — an unintended benefit for the Wikimedia Foundation's program evaluation team, which used our Dashboard data as the starting point for identifying and surveying program leaders.

The P&E Dashboard has become central to the programmatic efforts of a number of other Wikimedia organizations, especially ones focused on programs for new users. Art+Feminism, for example, runs about 300 edit-a-thons each year, mostly in March. The Dashboard — and making sure the organizers of each event set up and use a Dashboard page for it — is central to their coordination efforts, both to surface problems like pages in danger of deletion and to get cumulative data about the impact of their events. This year, they are running a new program of stipends to cover event expenses, and having a Dashboard page set up ahead of time is the prerequisite for getting a stipend for one of these events. Many Wikimedia chapters and other groups are similarly using it to keep track of and collect data for their events; about 190 different campaigns — that is, sets of related events — were added to the Programs & Events Dashboard in the last year. The Dashboard also won the Coolest Tool Award in the Outreach category at Wikimania this year!

Our continued efforts to improve P&E Dashboard for the wide array of global use cases resulted in a few notable new features this year: we improved support for projects that track edits across multiple wikis; in support of #1lib1ref, we added "references added" as a core stat for wikis that have compatible ORES data available; and we added the option of using PetScan queries as the basis for tracking a specific set of articles.
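
As an illustration of the PetScan-based workflow, the sketch below fetches a saved PetScan query (identified by its PSID) as JSON to build a list of article titles to track. The Dashboard's actual implementation lives in its own codebase; both the PSID and the response layout assumed here are illustrative.

```python
# Sketch only: fetching a saved PetScan query (its "PSID") as JSON to build the
# list of article titles to track. The Dashboard's real PetScan support lives in
# its own codebase; the PSID and the response layout assumed here are illustrative.
import requests

PETSCAN_URL = "https://petscan.wmflabs.org/"
PSID = 123456  # hypothetical saved-query ID

response = requests.get(
    PETSCAN_URL,
    params={"psid": PSID, "format": "json", "doit": 1},
    timeout=120,
)
data = response.json()

# Assumed layout: the first result set's "a" key holds the page list.
pages = data.get("*", [{}])[0].get("a", {}).get("*", [])
titles = [page["title"] for page in pages]
print(f"Tracking {len(titles)} articles from PSID {PSID}")
```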

Wikimedia community engagement
Wiki Education won the Education Impact Award during WikiConference North America.

Speaking of awards, we also won the Education Impact Award during WikiConference North America in 2019! Most of Wiki Education's staff traveled to Boston for the conference. Staff, board, and program participants gave at least 17 talks during the event. It was a great opportunity for us to connect with the North American community of editors, and learn from others' experiences as well.

Three staff members also attended Wikimania and one attended WikidataCon this year, giving us the opportunity to share our learnings and learn from the global Wikimedia community as well. As described in our midterm report, we also participated in the Prague Hackathon, the Wikimedia Summit, and the Wikipedia + Education Conference.

Visiting Scholars

As mentioned in our last report, we've moved the Visiting Scholars Program into maintenance mode, in which we no longer actively dedicate staff time to it. By the end of 2019, only two Scholars were still active, meaning we missed a handful of goals we had set for the program. The content that was added was nonetheless impressive: a total of 16 articles by Visiting Scholars were promoted to Good or Featured Article status in 2019. Our 2020 proposal does not include goals for the Visiting Scholars program.

Measure of success | Goal (2019) | Progress to date | % completed | Notes
Total Participants | 10 | 9 | 90% | We did not add another Scholar as expected, as we are winding down this program.
Newly Registered | 0 | 0 | n/a | As expected.
Content Pages Improved | 750 | 398 | 53% | Several Scholars have been less active than expected.
Quantity | 500,000 | 194,052 | 39% | Several Scholars have been less active than expected.
Quality Articles | 100 | 84 | 84% | The work our Scholars have done has been of extremely high quality, however.

Revenues received during this period (6 months for progress report, 12 months for impact report)

Please use the exchange rate in your APG proposal.

Table 2. Please report all revenues in the currency of your grant unless US$ is requested.

  • Please also include any in-kind contributions or resources that you have received in this revenues table. This might include donated office space, services, prizes, food, etc. If you are able to provide a monetary equivalent (e.g. $500 for food from Organization X for service Y), please include it in this table. Otherwise, please highlight the contribution, as well as the name of the partner, in the notes section.
Revenue source | Currency | Anticipated | Q1 | Q2 | Q3 | Q4 | Cumulative | Anticipated ($US)* | Cumulative ($US)* | Explanation of variances from plan
Selling Impact | USD | $2,455,000 | $475,080 | $372,877[1] | $900,287 | $288,169 | $2,036,413 | $2,455,000 | $2,036,413 | A large foundation grant renewal expected in Q4 of 2019 has now been pushed to 2020, leaving a gap in calendar year 2019; additionally, the first WMF payment for 2020 reached us in 2019, so that made our 2019 numbers different than planned.
Selling Services | USD | $280,000 | $24,691 | $79,596 | $26,200 | $49,012 | $179,499 | $280,000 | $179,499 | Our ability to project earned income is still evolving, as this is a new program. We see that the program is working and we anticipate getting better at predicting the revenue over time.
TOTAL | USD | $2,735,000 | $499,771 | $452,473 | $926,487 | $337,181 | $2,215,912 | $2,735,000 | $2,215,912 |

* Provide estimates in US Dollars

"Selling Impact" is the bucket term we use for foundation and individual philanthropic support for our organization's impact. "Selling Services" is the bucket term we use for our earned income model, splitting out money we expect to raise by providing our services.

[1] Based on the language in the grant agreement for a restricted grant we signed in late Q2 (with the payment received in early Q3), our accountants recognized the revenue in Q2. Thus, we've adjusted this number from what we reported in our midterm report to reflect how our accountants recognized the money in our organization's financials.

Spending during this period (6 months for progress report, 12 months for impact report)

Please use the exchange rate in your APG proposal.

Table 3 Please report all spending in the currency of your grant unless US$ is requested.

(The "budgeted" amount is the total planned for the year as submitted in your proposal form or your revised plan, and the "cumulative" column refers to the total spent to date this year. The "percentage spent to date" is the ratio of the cumulative amount spent over the budgeted amount.)
Expense | Currency | Budgeted | Q1 | Q2 | Q3 | Q4 | Cumulative | Budgeted ($US)* | Cumulative ($US)* | Percentage spent to date | Explanation of variances from plan
Student Program | USD | $510,123 | $121,721 | $115,456 | $108,896 | $108,920 | $454,993 | $510,123 | $454,993 | 89% | We reduced spending vs plan by 10-15% everywhere to match our revenue.
Scholars & Scientists Program | USD | $680,797 | $150,326 | $158,666 | $144,416 | $154,653 | $608,060 | $680,797 | $608,060 | 89% | We reduced spending vs plan by 10-15% everywhere to match our revenue.
Technology | USD | $410,854 | $84,867 | $91,392 | $82,178 | $82,199 | $340,636 | $410,854 | $340,636 | 83% | We reduced spending vs plan by 10-15% everywhere to match our revenue.
General/HR/Finance/Admin/Board/Fundraising | USD | $1,004,468 | $244,295 | $212,661 | $177,241 | $214,419 | $848,616 | $1,004,468 | $848,616 | 84% | We reduced spending vs plan by 10-15% everywhere to match our revenue.
TOTAL | USD | $2,606,242 | $601,208 | $578,176 | $512,731 | $560,190 | $2,252,305 | $2,606,242 | $2,252,305 | 86% |

* Provide estimates in US Dollars
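
As a quick sanity check of the arithmetic described above (cumulative spending is the sum of the four quarters, and the percentage spent to date is cumulative divided by budgeted), here is a minimal sketch using the Student Program row from Table 3.

```python
# Minimal sanity check of the Table 3 arithmetic using the Student Program row:
# cumulative spending is the sum of the four quarters, and "percentage spent to
# date" is the cumulative amount divided by the budgeted amount.
budgeted = 510_123
quarters = [121_721, 115_456, 108_896, 108_920]

cumulative = sum(quarters)             # 454,993
percentage_spent = cumulative / budgeted

print(f"Cumulative: ${cumulative:,}")
print(f"Percentage spent to date: {percentage_spent:.0%}")  # 89%
```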


Compliance

Is your organization compliant with the terms outlined in the grant agreement?

As required in the grant agreement, please report any deviations from your grant proposal here. Note that, among other things, any changes must be consistent with our WMF mission, must be for charitable purposes as defined in the grant agreement, and must otherwise comply with the grant agreement.

  • No significant deviations.

Are you in compliance with all applicable laws and regulations as outlined in the grant agreement? Please answer "Yes" or "No".

  • Yes.

Are you in compliance with provisions of the United States Internal Revenue Code (“Code”), and with relevant tax laws and regulations restricting the use of the Grant funds as outlined in the grant agreement? Please answer "Yes" or "No".

  • Yes.

Signature

Once complete, please sign below with the usual four tildes.

Resources

Resources to plan for measurement

Resources for storytelling
