Wikimedia Foundation elections/2024/Questions for candidates/Question 3

In the 2024-25 draft Wikimedia Foundation Annual Plan, there is a statement that Wikimedia content is becoming less visible as part of the Internet's essential infrastructure, because an increasingly closed and artificial intelligence-mediated internet doesn't attribute the source of the facts, or even link back to the Wikimedia projects. What responsibility do the Board and the Wikimedia Foundation have in enforcing the CC-by-SA licensing of content from all projects against AI or other digital media formats that do not respect copyright law?

Bobby Shabangu (Bobbyshabangu)

I believe laws are crucial for our protection, but current laws may not be adequate to safeguard Wikimedia from AI companies using our content without proper attribution. I think the Board should not passively wait for change though. It should act now by collaborating with mission-aligned organisations like Creative Commons to advocate for the enforcement of CC-by-SA licensing. This situation also presents an opportunity to engage with major open source AI organisations that rely on our facts. Partnering with them to ensure proper attribution of Wikimedia content could be mutually beneficial and promote ethical use of information across other digital platforms.

Deon Steyn (Oesjaar)

The Board has to act; it cannot simply set this matter aside. Its responsibilities include taking legal action against violators, establishing monitoring mechanisms, advocating for stronger legal frameworks, developing standards and best practices for AI, providing educational resources, and running awareness campaigns.

Collaboration with technology companies is crucial to ensure compliance with licensing terms. The Board and the technology companies must also develop technical solutions that facilitate proper attribution and linking.

Overall, the Foundation and its Board need to take a proactive and multi-faceted approach to enforce CC-by-SA licensing in the era of AI and digital media. I can think of legal enforcement, advocacy, community engagement and collaboration with technology companies.

Erik Hanberg (Erikemery)

In general, I believe the board should be proactive in its enforcement of copyright law for its content. The Creative Commons license is not a giveaway.

Unfortunately, when it comes to AI in particular, my understanding of the legal and copyright questions posed by chatbots and similar tools is that winning a lawsuit on these grounds is not certain. The law has not caught up to the technology, and a court decision that goes the wrong way could be a blow. In that regard, some strategic patience may be required as the WMF seeks favorable conditions for a legal strategy. But, again, I believe the board should actively defend its copyright claims where it can and where it has a likelihood of success.

Farah Jack Mustaklem (Fjmustak)

No response yet.

Christel Steigenberger (Kritzolina)

Yes, we see a world that doesn’t respect the way we want to share our knowledge. Often it diminishes our efforts as volunteers by not giving credit where credit is due. And I think this is a problem that goes beyond the scope of this question. Licenses are only a small part of the issue. On this topic I see a clear responsibility of the Board to act in its core capacity of developing strategies on how to tackle problems volunteers face. The Wikimedia Foundation has the responsibility to research and monitor the situation (it already does, this is why we see these issues listed) and to act on the strategies the Board, I am confident, is already working on.

Lane Rasberry (Bluerasberry)

Our community of content creators develops open media. We generously and freely share this media, but there are many corporations with bad intent who capture the content, conspire to make it closed, then convert our free access into their paid product. This is a known problem that affects more than just Wikimedia content, and it is bigger than just the Wikimedia platform. When corporations abuse the system, they are following copyright law, because they are large enough to control governments and write the laws. It may be legal, but at least the Wikimedia Foundation can support the user community in telling the world that corporate capture of the commons is unethical.

To solve the problem of enforcing Creative Commons licenses, I would like to propose a more formal multiyear partnership with the Creative Commons organization. Wikimedia and Creative Commons are interdependent, and if we want expertise with their licenses, then it would be less expensive and more sensible to fund their advocacy instead of duplicating their efforts with Wikimedia Foundation staff. Their budget was US$4 million in 2022, compared to $180 million for the Wikimedia Foundation. Because we need them, and because of this difference in financial power, it is appropriate for us to share something.

There have been Wikimedia community complaints that the Wikimedia Foundation endlessly grows its own staff bureaucracy. I do not support wild growth of the Wikimedia Foundation, but our donors give because they trust us to protect the world. Part of our protection should be sharing money with movement partners, and Creative Commons is one of those.

Besides Creative Commons, I think we should commit multi-year grants to other, smaller allied organizations that support our mission, including the Internet Archive, which maintains our links to deleted websites; OpenStreetMap, which manages our maps and is experiencing corporate capture of its open data and user community; the Flickr Foundation, which has provided so much image support to Wikimedia Commons; and the Tor Project, which provides essential online privacy services in the media environment.

There is no hope of the Wikimedia Foundation alone resisting corporations. As the biggest and best-funded nonprofit steward of an online community, the Wikimedia Foundation can find success in bringing all the smaller, more vulnerable communities together.

Lorenzo Losa (Laurentius)

The new pattern of artificial intelligence-mediated access to information poses new challenges that we are not fully equipped to face. It is too early to fully understand the impact it will have on our projects and on the world in general - but we still have to react, because otherwise it would likely be too late. In particular, how generative artificial intelligence interacts with copyright laws and licenses is still unclear. Several lawsuits are under way between AI vendors and publishers or authors, and some have led to unexpected results.

Even beyond AI, enforcing the Creative Commons licenses used in our projects is hard for the Wikimedia Foundation because the copyright is owned by the individual contributors. The Foundation has no more rights to the content of Wikipedia than any other person in the world. The Foundation can have a role in communicating the issue and advocating for solutions, and (with some limits) in supporting individuals who want to do more, but it cannot itself take a license violation to court.

Wikimedia Enterprise might give us some small additional opportunities, because through it the Foundation has contractual relationships with some large companies - even though this is not directly connected to AI, because Wikimedia Enterprise is not generally needed for AI training (the dumps are sufficient for that). This gives us a venue for advocating for uses of project content that are more in line with what we'd like to see - but nothing more; it is not a tool to force compliance.

Maciej Artur Nadzikiewicz (Nadzik)

Wikipedia is a strong global brand, but it cannot change the world by itself. It cannot fight alone. The Wikimedia Movement needs allies, like-minded organisations, and mission-oriented organisations that can support it. It also needs a Board of Trustees that is committed to the cause and understands the challenges ahead.

The Board cannot be operational; it shouldn't work on enforcement or micromanage the staff. However, it is the Board's role to plan the organisation's overall strategy and ensure that the next Wikimedia Foundation Annual Plans appropriately focus on the changing world around us. The strategy and Annual Plans should increase resources for public policy efforts; this is one of the most vital activities right now, and, if ignored, it could lead to our downfall faster than any of the other dangers facing us.

We cannot win against big corporations; some of them pay their CEO alone more in salary than our entire Movement spends in one year on EVERYTHING. We need allies in other organisations; we need allies who are activists; we need allies in other projects; we need allies in the regulatory sphere. We need to educate people on the value of a non-BigTech internet; we need to make sure that the idea of Open Knowledge has allies and supporters.

I have been a Board member at Wikimedia Europe for two years now. During this time, I have appeared at multiple public hearings at the Polish Parliament and various other institutions – User:Nadzik/Policy. I have seen firsthand the impact we can make when appropriate resources are invested in securing the safety of our Movement. We need to strengthen this effort and make it one of our priorities.

Mohammed Awal Alhassan (Alhassan Mohammed Awal)

First of all, I believe that significantly investing in the development of APIs or tools that automatically embed attribution information when Wikimedia content is used is the surest way to ensure that the CC-by-SA licensing of content from Wikimedia projects is respected. The Board and the Foundation can facilitate this by enforcing proper attribution and share-alike requirements when content is used by AI systems and other digital platforms.

When the Board takes responsibility for advocating proper attribution and the ethical use of open content, raising awareness among AI developers, digital media companies, and the broader public about the importance of respecting open licenses and the legal and ethical implications of failing to do so, copyright violations are likely to fall. Monitoring for non-compliance, issuing cease-and-desist letters, and pursuing legal remedies when necessary through the Foundation's legal team will also reduce the incidence of CC-by-SA violations.

There should also be clear guidelines and best practices for AI developers and digital media companies on how to properly attribute or cite Wikimedia content, in the form of resources, toolkits, tutorials, and documentation on complying with CC-by-SA licensing. Equally, there should be a dedicated legal team to address licensing violations and provide legal support to volunteers and community members who identify non-compliance.

To achieve this, the Wikimedia community should be properly engaged in the enforcement process. The Board and the Foundation can strengthen this by engaging the community, encouraging volunteers to help monitor the use of Wikimedia content and report violations, and recognising and supporting the community's efforts in this regard. I believe that through a combination of legal action, advocacy, technological solutions, and community engagement, the Board and the Foundation will not only protect the integrity and visibility of Wikimedia content but also uphold the principles of open knowledge and the legal frameworks that support it.
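
To make the attribution-embedding idea above a little more concrete, here is a minimal illustrative sketch. It is not an existing Wikimedia tool and not part of the candidate's answer: it assumes the public MediaWiki Action API (the standard api.php endpoint) and shows how a reuser-facing helper could look up an article's canonical URL and assemble a CC BY-SA credit line to display next to reused content. The helper name and the exact wording of the credit line are assumptions for illustration.

```python
# Minimal illustrative sketch (not an official Wikimedia tool): build a
# CC BY-SA credit line for a Wikipedia article so a reuser (for example an
# AI-mediated interface) could embed attribution alongside the content.
# The helper name and credit-line wording are assumptions.
import requests

API_ENDPOINT = "https://en.wikipedia.org/w/api.php"  # any MediaWiki Action API endpoint works


def build_attribution(title: str) -> str:
    """Return an attribution string with a link back to the source article."""
    params = {
        "action": "query",
        "format": "json",
        "titles": title,
        "prop": "info",
        "inprop": "url",  # ask the API for the canonical page URL
    }
    response = requests.get(API_ENDPOINT, params=params, timeout=10)
    response.raise_for_status()
    pages = response.json()["query"]["pages"]
    page = next(iter(pages.values()))  # one title requested, so one page entry
    return (
        f'Text from "{page["title"]}" ({page["fullurl"]}), Wikipedia, '
        "licensed under CC BY-SA 4.0 "
        "(https://creativecommons.org/licenses/by-sa/4.0/)."
    )


if __name__ == "__main__":
    # Example: print a ready-to-display credit line for one article.
    print(build_attribution("Wikimedia Foundation"))
```

In practice such a helper would also need to handle missing pages, non-English wikis, and revision-level author credit; that is the kind of tooling work the answer above suggests the Foundation could invest in.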

Rosie Stephenson-Goodknight (Rosiestep)

The Board’s responsibility is to foster high-level strategic thinking and to guide the CEO in this regard. The WMF’s responsibility is to operationalize the strategy. Within the WMF, this topic ("Wikimedia content is becoming less visible as part of the Internet's essential infrastructure, because an increasingly closed and artificial intelligence-mediated internet doesn't attribute the source of the facts, or even link back to the Wikimedia projects.") falls within the Legal Department. As there are rapid changes in technology at the same time as there is a need for advocacy, it's vital for the CEO and the Legal Department to keep the Board abreast of challenges and opportunities. At the same time, the community's perspective needs an avenue for discourse, in all the ways that the various communities prefer to share points of view. There is no easy answer, except that we need to be agile and bold.

Tesleemah Abdulkareem (Tesleemah)

Thank you for the question. According to the drafted [plan of WMF 2024-2025], there is indeed an emphasis on Wikimedia content gradually fading, both because AIs do not reference the encyclopedia's content and because the articles are now widely spread; with translations available, access to the English content has been reduced, since the common language of Google is English.

The WMF, governed by the Board, is doing a great job with Wikimedia Enterprise, which allows the reuse of content for public consumption and also allows partnerships. To enforce CC-BY-SA licensing, especially on content written by AIs, I would say Wikimedia Enterprise can partner with these digital corporations. There must be a way for Meta AI to work out its citations, as I have seen it reference Wikipedia articles a couple of times; other digital platforms such as ChatGPT and Gemini, among others, can be made to do the same.

There should also be legal consequences, with copyright law clearly stated. In addition, there should be more sessions and training around copyright for editors, contributors, and the audience.

All of this will go a long way towards ensuring proper licensing and preventing copyright infringement when Wikimedia content is used.

Victoria Doronina (Victoria)

This is an excellent question. So far, the WMF's approach to entities that use our content for commercial purposes, such as Google, has been to politely ask for some charity. As a result, the WMF has a productive collaboration with Google, which prominently displays Wikipedia articles in the right panel of its search results. Google has also become a multi-year paying client of Wikimedia Enterprise, which is one of the sources of the WMF's diversified income.

As far as I know (and I have been wrong before), there is no ongoing relationship with the companies that create AIs. Meanwhile, some US newspapers have negotiated to sell their content to these companies for AI training. However, they did it via lawsuits, and what is acceptable and expected from for-profit companies can do more damage than good for a charity such as the WMF; we don't have millions to spend on lawsuits with an uncertain outcome.

I think the Board should encourage the CEO to explore possible mutually beneficial models of interactions with AI companies. If I am reelected, I can start exploring this question.