Future Audiences/Experiment:Add a Fact


Add A Fact was an experiment developed by the Future Audiences team at the Wikimedia Foundation that ran from August to December 2024, building on a previous related experiment (Citation Needed).

As with all Future Audiences experiments, the goal of Add A Fact was to learn how we might continue to sustain and grow Wikipedia in a changing online knowledge landscape. In this case, we were seeking to understand if/how people can make editorial contributions off-platform (that is, without going directly to Wikipedia.org), and whether generative AI supports or hinders this process:

  1. This experiment: We are interested in understanding how/if being able to add facts to Wikipedia in a light-touch way could help existing Wikipedia editors speed up their process.
  2. Potential future experiments: We are also interested in getting Wikipedians’ thoughts on how/if being able to add facts to Wikipedia in a light-touch way could engage new casual/non-Wikipedian audiences to contribute productively.

Common questions are answered in our FAQ.

How it works

Demo video of Add A Fact

Add A Fact is an experimental extension for the Chrome browser. An autoconfirmed English Wikipedia account is required to use it. You can download the extension from the Chrome Web Store here:

  1. After downloading the extension and logging in (we also recommend pinning it to your Chrome toolbar), you can use it on any webpage other than Wikipedia or the Chrome Web Store.
  2. While reading any secondary source on the web (a news item, a scholarly article, etc.), you can open Add A Fact and highlight a short claim that you may want to add to Wikipedia.
  3. A large language model (LLM) will check if the selected claim is related to any existing Wikipedia articles, and will present information about whether the fact is fully, partially, or not present in these articles. You may also search for an article of your choosing.
  4. Once you select a Wikipedia article to add your fact to, Add A Fact will give you the option of sending a pre-filled template message to the article’s talk page, which includes the selected text, any additional comments you’d like to add, and a structured citation (generated with Citoid). This message will be signed with your Wikipedia username. (A rough sketch of this step and the source checks below appears after this list.)
  5. If the URL of the source you are on appears on WP:Reliable sources/Perennial sources, you will receive a warning message about your source’s reliability (but will still be able to add a suggested fact from this source).
  6. If the URL of the source you are on appears on the spam blocklist, you will not be able to add a suggested fact from this source.
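To make steps 4–6 concrete, here is a minimal TypeScript sketch of how a browser extension could run the source checks, fetch a citation, and post the pre-filled talk-page message. The two endpoints (the MediaWiki Action API and the Citoid citation service) are real; the function names, list contents, and message wording are hypothetical simplifications, and the actual extension’s implementation (tracked in Phabricator) may differ.

```typescript
// Hypothetical sketch of steps 4–6. Real endpoints: the MediaWiki
// Action API and the Citoid citation service. Everything else
// (function names, list contents, message wording) is illustrative.

const WIKI_API = "https://en.wikipedia.org/w/api.php";
const CITOID_API = "https://en.wikipedia.org/api/rest_v1/data/citation/mediawiki/";

// Stand-ins for WP:Reliable sources/Perennial sources and the spam
// blocklist; the real extension would consult the live lists.
const PERENNIAL_SOURCES = new Set(["example-tabloid.com"]);
const SPAM_BLOCKLIST = new Set(["example-spam.com"]);

async function suggestFact(
  article: string,
  claim: string,
  comment: string,
  sourceUrl: string,
): Promise<void> {
  const host = new URL(sourceUrl).hostname.replace(/^www\./, "");

  // Step 6: blocked sources cannot be suggested at all.
  if (SPAM_BLOCKLIST.has(host)) {
    throw new Error("This source is on the spam blocklist.");
  }
  // Step 5: perennial sources trigger a warning but do not block.
  if (PERENNIAL_SOURCES.has(host)) {
    console.warn("This source is listed at WP:Reliable sources/Perennial sources.");
  }

  // Step 4a: ask Citoid for structured citation data about the source.
  const citoidRes = await fetch(CITOID_API + encodeURIComponent(sourceUrl));
  const [citation] = await citoidRes.json();

  // Step 4b: fetch a CSRF token; the user's logged-in session is
  // carried by cookies.
  const tokenRes = await fetch(
    `${WIKI_API}?action=query&meta=tokens&format=json`,
    { credentials: "include" },
  );
  const token = (await tokenRes.json()).query.tokens.csrftoken;

  // Step 4c: post a new section on the article's talk page. The
  // trailing "~~~~" expands into the user's signature on save.
  const text =
    `Suggested fact: "${claim}"\n\n${comment}\n\n` +
    `Proposed citation: {{cite web |url=${sourceUrl} ` +
    `|title=${citation?.title ?? ""} |access-date=${citation?.accessDate ?? ""}}}\n~~~~`;

  await fetch(WIKI_API, {
    method: "POST",
    credentials: "include",
    body: new URLSearchParams({
      action: "edit",
      title: `Talk:${article}`,
      section: "new",
      sectiontitle: "Add A Fact: suggested addition",
      text,
      token,
      format: "json",
    }),
  });
}
```

Posting to the talk page rather than editing the article directly keeps a human reviewer in the loop, which matches the design of the experiment.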

To curb potential misuse and spam, Add A Fact users are limited to sending a maximum of 10 facts per day during this early experimental period. Recent edit proposals made with this tool can be found here.
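As a rough illustration of how such a cap might be enforced client-side, here is a hypothetical sketch using chrome.storage.local (assuming a Manifest V3 extension with the promise-based storage API; the storage key and data shape are invented):

```typescript
// Hypothetical sketch of a 10-facts-per-day cap, tracked in
// chrome.storage.local. The key name and data shape are invented.
const DAILY_LIMIT = 10;

interface FactQuota {
  date: string; // YYYY-MM-DD
  count: number;
}

function todayKey(): string {
  return new Date().toISOString().slice(0, 10);
}

async function canSubmitToday(): Promise<boolean> {
  const { factQuota } = await chrome.storage.local.get("factQuota");
  const quota = factQuota as FactQuota | undefined;
  const used = quota?.date === todayKey() ? quota.count : 0;
  return used < DAILY_LIMIT;
}

async function recordSubmission(): Promise<void> {
  const { factQuota } = await chrome.storage.local.get("factQuota");
  const quota = factQuota as FactQuota | undefined;
  const count = quota?.date === todayKey() ? quota.count + 1 : 1;
  await chrome.storage.local.set({ factQuota: { date: todayKey(), count } });
}
```

A client-side counter like this is easy to reset, so in practice the limit would also need to be checked server-side; the sketch only shows the shape of the check.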

Add A Fact is a temporary experiment and is subject to change based on user feedback. You can follow Add A Fact’s development in Phabricator.

Research questions

  1. Do people on the internet want to contribute good-faith information to Wikipedia?
  2. Who are the people who would be interested in doing this? For example:
    • The general public – people who have a casual relationship to Wikipedia (are aware of and may visit it from time to time, but wouldn't consider themselves members of our movement, may not donate, etc.)
    • People who are Wikipedian-like in some way – e.g., Reddit moderators, subgroups on the Internet (fandoms, communities, fact-checkers, etc.); donors
      • What could incentivize non-Wikipedians to do this? For example:
        • Add extra incentives: e.g., wrap the "add a fact" functionality into another useful end-user tool, such as Citation Needed (if we discover it is useful/attractive to end-users)
        • Radically lower the barrier to entry: e.g., make the functionality run in the background, like spellcheck (automatically identifying claims from reliable sources that look like they should be added to Wikipedia)
        • Other?
    • Existing Wiki(p/m)edians
  3. How might we deliver these contributions into existing or new pipelines for human review/oversight/addition to Wikipedia?

Insights

Data & insights from the Add A Fact experiment

Key takeaways:

  • Some Wikipedians are curious to try new tools for contributing facts and sources to Wikipedia from outside our projects.
  • These contributions generated productive discussions and additions to Wikipedia, as well as some pushback from existing community members, who were concerned about the legal and ethical implications of using AI-assisted tools to add content to Wikipedia.
  • More thought is required on how to create sustained, productive contribution workflows off-platform. As with Citation Needed, most Wikipedians did not continue to use the extension, pointing to a need to consider other user experience approaches.
  • Contributing off-platform remains a promising direction to explore; however, more discussion is needed about whether and when it is appropriate for AI-assisted tools to add content to Wikipedia at scale.

See also

  • WikiGrok (2014-15): on-wiki experiment to encourage casual Wikipedia readers to contribute a structured Wikidata fact to a topic (by answering a simple question about the article they were reading).
    • Findings: high overall engagement and quality of responses (especially when aggregated). The main blocker was the cost of maintaining and scaling the infrastructure to power suggested questions (at the time, a graph database was the best solution, but there were no affordable, scalable open-source options on the market).
  • Citation Hunt: a game hosted on Toolforge that allows anyone to search for and add a reference to an unsourced claim on-wiki.
  • Wikidata for Web: an extension that displays data from Wikidata on various websites and also allows extraction of data from these websites to input into Wikidata.
  • Article Feedback Tool: a tool piloted to engage readers to participate on Wikipedia and to help editors improve articles based on reader feedback.
    • Findings: Readers welcomed the opportunity to engage with Wikipedia in a new way, but since they were asked to provide freeform feedback on article quality, most of their output was not useful for improving the content. From the final report: “Over a million comments were posted during this experiment: on average, 12% of posts were marked as useful, 46% required no action, and 17% were found inappropriate by Wikipedia editors. However, a majority of editors did not find reader comments useful enough to warrant the extra work of moderating this feedback.”