Community Wishlist Survey 2022/Larger suggestions/Identifying Lobby Teams/Proposal
- Problem: How to detect corporate (rather than individual) sock-puppetry, where teams of paid or otherwise conflicted editors collude to steer a narrative. While Wikipedia has strong tools for moderating individual attempts to distort, rewrite, or promote a particular viewpoint, it is far weaker against large corporate interests, especially when the narrative being pushed appears at face value to be robust and peer-reviewed but is actually politically or financially motivated. As one of the most visited websites in the world, where people come to find out 'the truth', Wikipedia has a growing responsibility to guard against politically and financially motivated agendas.
- Proposed solution: This behaviour could be identified with modern graph-analysis tools applied to edit and comment behaviour. Current sock-puppet investigations use tools such as stylometric conformance analysis, but additional tools could be implemented that look at cross-user behaviours, especially where comments and edits are frequently shared and follow patterns indicating a coordinated focus on specialised content. This is particularly noticeable where a group of editors is aggressively defensive of a given subject area.
As an example of a potential positive hit: in any area of expertise it is not unusual for editors to agree most of the time; however, when a pair of editors agrees all of the time, their relationship becomes suspect.
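The agreement heuristic above can be sketched in a few lines. This is a minimal illustration, not an existing MediaWiki tool: the function name, the input format (one tuple per shared discussion or edit interaction), and the thresholds are all hypothetical choices, and a real implementation would draw interaction data from revision histories and talk pages.

```python
from collections import defaultdict

def flag_suspect_pairs(interactions, min_shared=20, threshold=0.95):
    """Flag editor pairs whose agreement rate is suspiciously high.

    interactions: list of (editor_a, editor_b, agreed) tuples, one per
    discussion or edit event in which both editors participated.
    Returns (pair, agreement_rate, shared_count) tuples, most suspect first.
    """
    totals = defaultdict(int)      # shared events per editor pair
    agreements = defaultdict(int)  # of which, agreements
    for a, b, agreed in interactions:
        pair = tuple(sorted((a, b)))  # treat (A, B) and (B, A) as one pair
        totals[pair] += 1
        if agreed:
            agreements[pair] += 1
    suspects = []
    for pair, n in totals.items():
        rate = agreements[pair] / n
        # Only flag pairs with enough shared history to be meaningful
        # AND near-total agreement.
        if n >= min_shared and rate >= threshold:
            suspects.append((pair, rate, n))
    return sorted(suspects, key=lambda s: -s[1])
```

The `min_shared` floor matters: two editors who agreed in their only three interactions are unremarkable, whereas total agreement across dozens of contested discussions is the pattern the proposal describes. A fuller graph-based version would treat flagged pairs as edges and look for densely connected clusters of editors.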
- Who would benefit: All readers and editors who rely on Wikipedia for neutral, accurate information.
- More comments:
- Phabricator tickets:
- Proposer: 20040302 (talk) 20:37, 10 January 2022