Wikimedia Foundation Annual Plan/2023-2024/External Trends

Last year, the Wikimedia Foundation shared a list of external trends, prompted by the "Puzzles and Priorities" that CEO Maryana Iskander identified during her incoming Listening Tour. This year, the Foundation set out to update the trends and is asking the wider Movement to share its thoughts on these topics. When considering the world around us, a variety of perspectives makes for a clearer picture of reality and better-informed decisions. We welcome and encourage your feedback on the draft ideas below.

This is an opportunity to look outward. As a Movement, we need to keep asking what the world needs from us now. It is also a starting point for shared understanding, even where views differ: trends analysis requires us to take a long-term view and to track what is important to us – even if we have differing views about how to address it.

With that said, we live in a complex, fast-changing world. This is not a comprehensive list of threats and opportunities facing our Movement, but rather a few of the most pressing issues we face.

Search & Content

Update from 2022: Social platforms continue to disrupt traditional search engines, but artificial intelligence (AI) threatens even more significant disruption.

Personality-driven experiences are increasingly drawing younger audiences to social platforms (TikTok, Instagram) and away from traditional search engines. Social platforms are testing out new search features to keep users engaged. Traditional search engines are testing different strategies to stay competitive and remain destinations in their own right – strategies that reduce the SEO ranking of links to Wikimedia content in external search results.

The explosion of generative AI could benefit knowledge creation and consumption, but it creates uncertainty and risk for our role in the knowledge ecosystem. In just two months, ChatGPT became the fastest-growing consumer web application of all time. Traditional search engines and browsers (Google, Bing, DuckDuckGo) have begun to pilot AI chatbot-assisted search, leveraging GPT and other large language models (LLMs) – which rely in part on Wikipedia as a source of training data and a knowledge store, but do not always represent or attribute information coming from our projects accurately.

These technologies are emergent and evolving quickly, and they could eventually be used in ways that help us advance the Wikimedia mission – e.g. by adding efficiencies to content creation, moderation, and technical contribution workflows on our projects, as well as providing new ways to make content more findable and accessible to readers. However, there are barriers to integrating generative AI tools into our projects – e.g. open questions about the copyright status of the output of generative AI, cost to maintain and run, and concerns about bias and inaccuracy.

There are also major challenges to our sustainability presented by this technology. The use of AI assistants as an entry point to search could exacerbate existing challenges around attribution and disintermediation, further distancing consumers from contributing to or financially supporting our projects. Widespread AI-assisted content creation (both on and off our projects) also risks overrunning our projects with unreliable and/or harmful content and irrevocably damaging our mission and brand. In March 2023, we hosted the first of what may become a regular conversation with community members interested in this topic to discuss opportunities, challenges, and next steps.

Disinformation

Information warfare is intensifying. The use of information warfare as a political and geopolitical weapon by governments and political movements is intensifying and growing more complex and subtle, as well as more dangerous: disinformation campaigns are increasingly accompanied by physical threats, blackmail, arrests, and other forms of intimidation.

Machine-generated content is expanding. The ability of artificial systems to generate high-quality content is expanding and, crucially, its societal mainstreaming is unfolding quickly in most major markets. How Wikimedia positions itself could help shape the field.

Encrypted disinformation and misinformation attack vectors are growing. Fears about digital privacy and information warfare push disinformation further into closed channels, where encryption makes monitoring and prediction more challenging, allowing disinformation to thrive and polarization to advance further. As an open platform that runs counter to this damaging trend – monitorable, stewarded, cared for, public, and open – we can demonstrate that our model functionally addresses the problem.

Wikimedia has become a notable target. In 2022, disinformation narratives and dedicated attacks against the Movement, individual volunteers, and the Foundation increased, creating increasingly severe risks for our volunteers and for the Foundation's reputation.

Regulation

Wikimedia has become more international as an organization, which means that more laws of more countries apply to us. To protect our projects and people, we will need to comply with an increasingly broad range of laws around the world. This includes laws in the United States; broad laws in the European Union, such as the Digital Services Act; and, on a case-by-case basis, laws in other countries in multiple regions of the world, such as those governing defamation and privacy claims. We must be prepared to fight harmful government actions in court and to publicly advocate against harmful laws in even more countries as we continue to grow.

More is demanded of hosting providers than ever before. Governments are under increasing political pressure to address a variety of perceived harms and biases online. In the near future, the US Supreme Court will hear multiple cases involving Section 230 of the Communications Decency Act (CDA 230), a seminal internet law, potentially disrupting well-established intermediary protections that platforms like Wikipedia rely upon. Meanwhile, penalties for hosting harmful content are growing, including criminal liability in some cases, such as under the UK Online Safety Bill.

Lawmakers aren't thinking about Wikipedia. Legislation continues to conflate Wikimedia with for-profit platforms, and few policymakers understand Wikimedia's volunteer-led content moderation model. We need to educate governments and policy influencers about Wikimedia's model – and about how laws should protect and support it.

Our relationship with for-profit tech platforms is important and complicated. We need each other. But to protect Wikimedia's model, projects, and people from harmful regulation, lawmakers and policy influencers must be educated about how we are different from large for-profit platforms. We must invest in ways to ensure that lawmakers and policy influencers understand how our volunteer-led model works and our Movement's positive role in society.

