Community Resilience and Sustainability/Conversation Hour May 2 2023

You are invited to the quarterly Conversation Hour led by Maggie Dennis, Vice President of Community Resilience and Sustainability, on May 2, 2023, at 18:00 UTC.

Maggie and others from the Community Resilience and Sustainability team will discuss Trust and Safety, the Universal Code of Conduct, Committee Support, and Human Rights.

This conversation will happen on Zoom. If you are a Wikimedian in good standing (not Foundation or community banned), write to answers(_AT_)wikimedia.org at least one hour before the conversation to let us know you will be attending and to share your questions. Please place “CR&S” in the subject line. Someone will follow up with the Zoom details.

If you do choose to participate in this conversation, Maggie would like to bring some expectations to your attention:

  • I can't and won't discuss specific Trust and Safety cases. Instead, I can discuss Trust and Safety protocols, practices, and approaches, as well as some of the mistakes we've made, some of the things I'm proud of, and some of the things we're hoping to do.
  • I will not respond to comments or questions that are disrespectful to me, to my colleagues, or to anyone in our communities. I can talk civilly about our work even if you disagree with me or I disagree with you. I won't compromise on this.

You may view the conversation on YouTube and submit questions live in Zoom and YouTube.

The recording, notes, and answers to questions not answered live will be available after the conversation ends, usually in one week. Notes from previous conversations can be found on Meta-wiki.

Notes

Turkey is apparently having a very divisive election next month and has blocked Wikipedia previously. What is the WMF doing to help protect our projects?
There are three challenges that the election might bring: (1) risk from the geopolitical event itself, which could lead to societal unrest; (2) risks to the project, particularly the Turkish Wikipedia, so the stewards and admins are working closely with Trust and Safety to make sure the platform is safe; and (3) risks to Wikipedians – Trust and Safety is working with local communities to make sure Wikipedians are safe and to allow Wikipedians to raise concerns proactively.
There is the risk of on-wiki issues spilling into off-wiki problems; we are working with and sharing relevant information with local community partners to be able to mobilize and act, in case something happens.
The Movement Charter Drafting Committee published initial draft chapters of the Charter in November 2022. When will the next chapters of the Movement Charter be published and what is the community engagement plan around them?
Right now, we are wrapping up community review for the draft ratification methodology. We will be posting updated drafts of the previous chapters (Preamble, Values and Principles, and Roles and Responsibilities statement of intent) on Meta-wiki. We are also going to be publishing drafts of new chapters (Roles & Responsibilities, Global Council, Hubs and Decision-Making, and an accompanying Glossary) in July 2023, ahead of Wikimania. This will be paired with another round of community consultation to seek community feedback on the drafts.
There will be an in-person meeting in June to work out the more contentious points of the Movement Charter and prepare for the publication of more chapters and the engagement ahead of Wikimania.
In the ratification proposal that was shared by the MCDC in April, it was suggested that every affiliate will have a single vote. Many active organizers in the movement are members of multiple user groups, so the suggested method would give more weight to these individuals and to small user groups. Is this something that might be fixed with the ongoing affiliate strategy process prior to the ratification?
No ratification process that we propose will be perfect. For example, a person who is a member of a user group and edits the projects will be able to contribute three times: (a) via the user group, (b) as an individual contributor, and (c) as part of a project. That’s why we asked specific questions on how to ensure a fair election within affiliates (chapters, thematic orgs and user groups). The Movement Charter is ratified only if all four “voting groups” ratify: (a) affiliates, (b) projects, (c) individual contributors, and (d) Board of Trustees.
Is the charter just a defining document? Or does it have applications?
The Charter will be defining some roles and responsibilities of different entities, like the Global Council. There are real-life applications of those definitions and parameters.
On the annual plan: CR&S is integral to the APP. There is a topic around creating a set of metrics to measure community health. What is envisioned around this topic? Why is it a priority, and what will come out of it?
CR&S is not the team that builds the metrics, but we know the “community health” metrics intimately. We want a thriving community where people are set up to succeed. It’s hard to know whether the community is safe and healthy if we don’t have measurements. For example, with regard to harassment, we have tried to establish a baseline against which we can improve (that is, to reduce the number of harassment incidents).
Clear metrics also make the Wikimedia Foundation accountable to the communities: they show whether we are meeting our targets and, at the same time, where we can improve through feedback and collaboration with communities. For example, the Foundation’s Global Data & Insights team has run quarterly surveys on how safe communities feel, and based on those results, the Foundation collaborates with the communities that indicate they do not feel safe to improve their environments and make sure contributors feel safe. (The Portuguese Wikipedia was referenced.)
I have heard the Movement Strategy and Governance team was dissolved with the recent layoffs. Many of them were community points of contact, especially in non-English speaking communities. How do you answer people who say that this change means the Foundation does not consider community important?
The Foundation has not lessened its support of communities, especially non-English speaking communities. The intention is to consolidate this support – community support and outreach – under one team to centralize administrative and operational work. That team is Movement Communications, which resides within the Communications department and not within CR&S. Yes, there were some terminations to reduce redundancies…
Regarding Movement Strategy, the Foundation is still committed to the Movement Strategy, which has been integrated into the core of the Foundation’s work to allow for cross-departmental collaboration (e.g. hubs coordination).
Trust and Safety staff are not made public. The staff and contractors page says there are over 20 people from 4 continents, but it also still lists people in Movement Strategy and Governance that I know were laid off. How did the layoffs impact Trust and Safety? How are you adjusting your work?
We are tightening our management and stabilizing our resources. We are committed to continuing support for human rights, T&S operations, and disinformation cases, but T&S policy work is leaner following the ratification of the UCoC Enforcement Guidelines, as we start to build out the Universal Code of Conduct Coordinating Committee.
The community’s needs are changing, and the Foundation wants to prioritize those needs, not only by making sure resources are stable but also by being appropriately responsive. The Foundation has always prioritized Trust and Safety. The tools Trust and Safety uses are the same tools the community uses, so it is a shared interest we will work on together.
I read something about the right to be forgotten in the AP. Does it apply to contributors who have done something wrong?
The “right to be forgotten” is handled case by case and is not intended to apply where a user has done something wrong, but there might be some cases where it would apply. It is protected under European privacy law. Removing content is a decision that the community would have to make. If the community decides that vanishing an account would negatively impact anti-vandalism work, then it probably would not apply.
The Terms of Use update includes stronger language on Undisclosed Paid Editing and Child Protection issues. Now what?
The community consultation around the Terms of Use (ToU) wrapped up in April. The next step is for the ToU to go to the Board of Trustees for review and approval. Once approved, we’ll need to figure out implementation, with the community’s input. Part of the plan is socialization of the ToU to make sure all communities (big and small) know the full impact of the ToU and its implications.
The recent anti-LGBTQ+ bill in Uganda is really concerning. What steps is the Foundation taking to protect its LGBTQ+ volunteers?
Context at the time of the recording: The bill has passed and is going to the Ugandan President to sign or veto. The bill imposes very stiff punishments, up to and including the death penalty, for activities like “promoting homosexuality,” which can be interpreted as carte blanche for law enforcement to imprison anyone they deem to be homosexual.
The Human Rights team within the Foundation has developed a video to educate community members on digital privacy – how to protect ourselves online. And we have strengthened our local partnerships in Sub-Saharan Africa with partners who can provide support and resources on the ground in case of emergency.
For reference: Queering Wikipedia 2023 Conference
How much detail would we expect to see from Legal as (or after) the new mediation/arbitration methodology gets its first use? It would aid awareness if more information could be shared, at least after a case concludes.
It is probably not widely shared information – what can be shared, with whom, and how widely depends on the nature of the case, and some cases might require Non-Disclosure Agreements for access. Legal tries to be as transparent as possible, but it will be reviewed on a case-by-case basis.
I understand why the Foundation doesn’t discuss active human rights cases. My question is about your approach. What does the Foundation do when it finds out that a community member has been jailed for editing? How is that different from a terrorist threat on a person? Or on the page of an article about a school or government building?
Two different workstreams: (a) threats of violence against schools, government buildings, etc. and (b) threats against individuals within the community. For (a), we pass it on to the respective local law enforcement or the people who can handle those threats. For (b), we have a protocol that starts with assessing the context and coming up with a plan to react in a timely manner.
When an individual is jailed for their editing, we assess the situation and conduct an investigation to see if there were other activities they were doing that also might have put them at risk (for example). The Human Rights team works with Trust and Safety internally at the Wikimedia Foundation, and with local partners external to the Foundation. Every case is different; the context determines our response.
One of the things I've observed working with new editors over the years is that when they run into trouble with other editors — from simple rudeness, to encountering POV pushing, to the perception that an experienced editor is following them around and giving them grief, to occasional outright abusive behavior — they are completely lost when it comes to what they can or should do about it. On other sites, there are obvious ways to report other users or content, to flag it for moderators. On our projects, there's no one single answer, because we have community processes for dealing with different types of problems. But it's a huge learning curve to even know which community process might be the right one for a new user with a problem. (I believe this is one of the bigger reasons why Wikipedia has a reputation as a toxic community in some circles.) How do we get better in that area?
The English Wikipedia had an etiquette noticeboard that was shut down, as well as a Request for Comment: User process that was also shut down. In both cases, there was a commitment to build something better that never happened. This is a problem we have struggled with as a community for many years. We are trying to build understanding and acceptance of reasonable expectations of conduct/manners online in our communities through the Universal Code of Conduct and the forthcoming Universal Code of Conduct Coordinating Committee (U4C).
The community has also thought about this problem for a long time; it is reflected in the Movement Charter's attention to “safety and inclusion.” New users lack familiarity with which avenues to take when there’s a problem. The Private Incident Reporting System is being piloted in some communities to get to a minimum viable product that can be rolled out. Because communities self-govern, there won’t be a “one size fits all” approach to this; every community can customize it for its local context.
We should do what we can to make sure our newcomers understand the challenges that they may face as contributors. For example, we discourage people from creating usernames with their real names. What we try to achieve is “informed consent” about what they’re signing up to do.
Is the Trust & Safety Case Review Committee still active? When will we see the numbers for the first quarter of 2023? Why are so many appeals ineligible for review?
The Case Review Committee is indeed still active, and in fact we are onboarding a few new members. The numbers for last quarter have just been published. Cases are eligible for appeal based on a review by Foundation attorneys relative to the severity of the case. This determination is made at the time the sanction is issued, to avoid any bias at the time of request. In the most extreme cases, which we term Category 3, the Foundation’s action was deemed legally necessary or the situation was deemed dangerous enough that appeal is not possible. There is also a Category 2, where the Case Review Committee may not automatically overturn a decision but may advise that the Foundation consider doing so.
When the CRC was first established, we imagined that they would receive more appeals than they do from Category 1, which involves on-wiki harassment but not of such severity or of such a nature as to fall under the higher categories. We also imagined that they might receive more appeals from people who write to Trust & Safety asking the Foundation to act in cases where the Foundation declines. (Generally such requests are declined because the issue is deemed more appropriately handled by community bodies, because the complaint is less severe, or because the community in question has appropriate processes for self-governance.) The majority of requests the Foundation receives are declined.
In practice, we have found that this is not the case. More appeals come in from those who are sanctioned, and who are sanctioned at Category 3. While I can’t generally disclose which Foundation bans fall into which category (our policies forbid us from discussing individual cases in any specific detail), I can point out that of the 23 bans issued last year by the Foundation, 16 of them were related to the MENA action in December, which we publicly discussed. Those were unappealable. The majority of Foundation sanctions issued are unappealable.
This does not mean that the work the CRC has done is not valuable, however. In fact, one case they sent back requesting stronger handling has occasioned a review with global functionaries of our approach to one particular kind of issue. (Very sorry, but I can’t name that issue here.) And where they have requested more information from or recommended new practices to the Trust and Safety team, this has influenced our practices going forward.