Grants:IEG/Editor Interaction Data Extraction and Visualization
Project idea
What is the problem you're trying to solve?
In this grant we aim to address two problems.
- First, researchers have requested some of the data that we aim to make available through this project, in order to study editor engagement and other strategically important objectives such as increasing geographic and gender diversity and increasing the likelihood that new editors will remain active over time.
- Second, as communities consider changes to their guidelines and processes to improve editor engagement, data needs to be easy to understand and use, even for people who are unfamiliar with data analysis.
What is your solution?
Outputs
- We will make a rich variety of editor interaction data available, with priority for data production arranged in consultation with researchers, WMF staff, and the community.
- Possible examples of editor interaction data sets:
- Surviving new editors
- New active editors
- Users who propose articles in the Draft namespace and Articles for Creation
- Teahouse participants and hosts
- Self-identified male and female contributors
- Candidates for administrator, arbitrator, oversighter, or checkuser roles
- The Wikipedia Adventure participants
- Grants:IEG/Reimagining Wikipedia Mentorship mentors and mentorship recipients
- Project space participant interactions in deletion nominations, arbitration case pages, Did you know / Good article / featured content candidates, and WikiProject pages
- Editor interaction data will be made easier to understand by using visualizations and metrics. These visualizations and metrics:
- Will demonstrate how interactions have changed over time
- Will demonstrate how interactions differ among Wikimedia sites and among particular areas such as the Teahouse, the Draft namespace, and Articles for Creation
- Will be easy to understand for community members who may apply the research in their communities for efforts to improve editor retention, editor diversity, and similar strategically important objectives
Outcomes and impact
- As a result of our research, we expect researchers and communities to have an easier time understanding how editors interact with each other, how those interactions are changing over time, and how subsets of users and projects differ in their interactions. This may reveal which subsets of Wikimedia communities are healthy in terms of contributor interactions and which are most hostile, and it may show whether policy and project changes, such as the implementation of the Draft namespace or the changes to mentorship, are having positive effects on user interactions.
- As Kerry Raymond said succinctly: "...The end game here is to reduce editor attrition resulting from unpleasant interactions."
Project plan
Activities
Step 0: Determine how best to represent editor interactions for purposes of our research
EpochFail proposes that the core event of an editor interaction can be represented as a triple:
- <interaction> ::= <person> <person> <timestamp: int>
- <person> ::= inst. of Human
- <timestamp> ::= int
Since the wiki software represents persons as users -- registered and anonymous -- it seems clear that we need to simplify to:
- <interaction> ::= <actor: user> <actee: user> <timestamp: int>
- <user> ::= <registered user> | <anonymous user>
- <registered user> ::= <id: int>
- <anonymous user> ::= <text: str>
We'd also like to carry a payload of metadata about the event (was it positive or negative? what was the topic of conversation? etc.):
- <interaction> ::= <user> <user> <timestamp: int> <meta>
- <meta> ::= <type: str> ...
- ... ::= A relevant data structure for the type.
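In code, the grammar above might map onto a structure like the following (a minimal Python sketch; class and field names mirror the grammar and are not a finalized schema):

```python
from dataclasses import dataclass, field
from typing import Union


@dataclass
class RegisteredUser:
    id: int            # MediaWiki user ID


@dataclass
class AnonymousUser:
    text: str          # IP address


User = Union[RegisteredUser, AnonymousUser]


@dataclass
class Interaction:
    actor: User        # the user who performed the action
    actee: User        # the user the action was directed at
    timestamp: int     # Unix timestamp of the event
    meta: dict = field(default_factory=dict)  # type-specific payload
```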
Now for an example:
Revision 1 creates the section; since nobody has replied yet, no interaction event is generated:

```
== His stolen watch. ==
The article is missing information about [...]
```

Revision 2 (an anonymous editor replies to the registered editor who opened the section) generates an event:

```
== His stolen watch. ==
The article is missing information about [...]
: What information are you talking about? Was his [...]
```

```json
{
  "actor": {"text": "123.123.123.123"},
  "actee": {"id": 987654},
  "timestamp": 1984567890,
  "meta": {
    "type": "talk_page_section",
    "section": {
      "index": 1,
      "title": "His stolen watch."
    },
    "conversers": 2
  }
}
```

Revision 3 (the registered editor replies back) generates the mirror-image event:

```
== His stolen watch. ==
The article is missing information about [...]
: What information are you talking about? Was his [...]
:: Yes it was. There's an article in the [...]
```

```json
{
  "actor": {"id": 987654},
  "actee": {"text": "123.123.123.123"},
  "timestamp": 1984567890,
  "meta": {
    "type": "talk_page_section",
    "section": {
      "index": 1,
      "title": "His stolen watch."
    },
    "conversers": 2
  }
}
```
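One naive extraction strategy for events like these, sketched below under stated assumptions (only level-2 headers delimit conversations; section renames are ignored; resolving the actee is omitted; all names are illustrative, not a settled API): diff consecutive revisions and emit an event for each section whose body grew.

```python
import re

# Matches only level-2 headers (== Title ==); a deliberate simplification.
SECTION_RE = re.compile(r"^==([^=].*?)==\s*$", re.MULTILINE)


def split_sections(wikitext):
    """Return (title, body) pairs for each level-2 section of a talk page."""
    sections = []
    matches = list(SECTION_RE.finditer(wikitext))
    for i, match in enumerate(matches):
        start = match.end()
        end = matches[i + 1].start() if i + 1 < len(matches) else len(wikitext)
        sections.append((match.group(1).strip(), wikitext[start:end]))
    return sections


def section_events(old_text, new_text, actor, timestamp):
    """Emit one event per section whose body grew between two revisions.

    Resolving the actee (e.g. the author of the post being replied to)
    requires looking back through earlier revisions and is omitted here.
    """
    old_bodies = dict(split_sections(old_text))
    # start=1 to match the section index used in the example above
    for index, (title, body) in enumerate(split_sections(new_text), start=1):
        if len(body) > len(old_bodies.get(title, "")):
            yield {
                "actor": actor,
                "timestamp": timestamp,
                "meta": {
                    "type": "talk_page_section",
                    "section": {"index": index, "title": title},
                },
            }
```

Applied to revisions 1 and 2 of the example, this yields the first event minus the actee and conversers fields, which require looking back through the section's history.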
Step 1: Extract interactions
With few exceptions, editor interactions are not explicitly stored in MediaWiki's database. This means that interactions and their characteristics will need to be extracted from the revision histories of articles and talk pages. Initially, we propose to build software for extracting the following interactions.
- User talk page interaction
- When one editor posts to another editor's talk page, this is as close to an explicit interaction as they come, and it is likely to represent a very direct interaction.
- Non-user talk page interaction
- When two editors add content under the same level 2 header of a talk page, they are most likely involving themselves in the same conversation. This interaction may be further limited or labelled based on the sequence of additions and explicit @mentions.
- Reverts/disagreements and reintroductions
A very meaningful interaction that takes place between editors is a revert -- where one editor rejects the contributions of another. Advantageously, there are well-documented strategies for detecting reverting behavior, as well as public datasets from which we can draw interaction events (a simplified detection sketch follows below).
Beyond mere identity reverts (restoring a previous revision), one can detect finer-grained disagreement at the single-word level, such as word deletions, undos of reintroductions, and undos of deletions. It is also possible to extract positive interactions between editors, such as one editor reintroducing content by a second editor that had previously been deleted by a third. An extension of the wikiwho approach for authorship attribution already exists that can extract these interactions.
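As an illustration of the identity-revert approach (a sketch only; production code should use an established library such as mwreverts, and the dict fields here are assumptions rather than a settled schema):

```python
def detect_identity_reverts(revisions, radius=15):
    """Detect identity reverts in a page's revision history.

    `revisions` is an iterable of dicts with 'id', 'user', 'sha1', and
    'timestamp' keys, ordered oldest first. A revision whose SHA1 matches
    an earlier revision within `radius` edits reverts every revision in
    between. Self-reverts are not filtered out here.
    """
    recent = []  # sliding window of the last `radius` revisions
    for rev in revisions:
        for i, past in enumerate(recent):
            if past["sha1"] == rev["sha1"]:
                # Everything between `past` and `rev` was reverted.
                for reverted in recent[i + 1:]:
                    yield {
                        "actor": rev["user"],
                        "actee": reverted["user"],
                        "timestamp": rev["timestamp"],
                        "meta": {"type": "revert",
                                 "reverted_rev_id": reverted["id"]},
                    }
                break
        recent.append(rev)
        recent = recent[-radius:]
```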
There is also an additional set of interactions that may prove more difficult to extract, but may still be important:
- ... for Discussion
- AfD/CfD/RfC and other discussion spaces like them focus discussions on a particular page of Wikipedia. Participants in these discussions may not be directly interacting with each other; however, we may still use their actions to understand their relation to each other (e.g. disagreements/agreements on which articles should be deleted). In this case, semantic analysis may be a problem since, for example, negativity may be directed towards the topic of an article rather than another editor.
- Reference desk, Help desk & Teahouse
- These discussion spaces tend to involve a small set of persistent question responders and a large set of ephemeral questioners. Luckily, parsing them will likely work with the same code that is used to parse talk page interactions.
- Page deletions
- similar to AfD/CfD/RfC
- Requests for Adminship
- similar to AfD/CfD/RfC but with a more personal focus on the candidate for adminship
- Block and unblock
- similar to AfD/CfD/RfC but with a more personal focus on the candidate for blocking or unblocking
- Right changes
- similar to AfD/CfD/RfC but with a more personal focus on the candidate for rights changes
- Arbitration cases
- complex discussions of personal conduct and site policies
- Format (illustrated after this list)
- sender_id
- sender_text
- receiver_id
- receiver_text
- interaction_type
- rev_id
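For illustration, the event from the Step 0 example would flatten into this format roughly as follows (a sketch; the row type and the rev_id value are hypothetical):

```python
from typing import NamedTuple, Optional


class InteractionRow(NamedTuple):
    sender_id: Optional[int]      # None when the sender is anonymous
    sender_text: Optional[str]    # IP address when the sender is anonymous
    receiver_id: Optional[int]
    receiver_text: Optional[str]
    interaction_type: str         # e.g. "talk_page_section", "revert"
    rev_id: int                   # revision that produced the interaction


# Revision 2 of the Step 0 example, flattened (rev_id is made up):
row = InteractionRow(None, "123.123.123.123", 987654, None,
                     "talk_page_section", 654321)
```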
Step 2: Label interactions
TODO: Discuss sentiment analysis and other strategies for classifying interactions.
Step 3: Visualize & analyze interactions
We'll explore visualization and analysis strategies for editor interactions to identify those that appear useful for drawing insights about the nature and effects of interactions. For example, we'll develop software to visualize interactions between editors across many kinds of on-wiki interactions during selectable timeframes. These analyses and visualizations can draw on the strength, number, frequency, tone, and type of interactions. The tool will also take into account reverts, revdels, and logged user interactions such as blocks or rights changes.
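As one possible starting point (a sketch that assumes interactions arrive in the flat format from Step 1; the field names follow that format), interaction events can be aggregated into a weighted directed graph, which several of the tools listed below can then render:

```python
import networkx as nx


def build_interaction_graph(interactions):
    """Aggregate interaction rows into a weighted directed graph.

    Nodes are editors (user ID or IP); an edge actor -> actee records
    how many interactions of which types occurred between the pair.
    """
    graph = nx.DiGraph()
    for row in interactions:
        actor = row["sender_text"] or row["sender_id"]
        actee = row["receiver_text"] or row["receiver_id"]
        if not graph.has_edge(actor, actee):
            graph.add_edge(actor, actee, weight=0, types=set())
        graph[actor][actee]["weight"] += 1
        graph[actor][actee]["types"].add(row["interaction_type"])
    return graph
```

Filtering the input rows before building the graph (e.g. by revision or by timestamp, if carried along) would give the selectable timeframes mentioned above.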
It may be possible to create these visualizations with existing tools. If we can find a way to produce satisfactory visualizations relatively quickly, we may run Step 3 in parallel with earlier steps.
If done well, visual aids enhance the ability of both expert and lay readers to understand data quickly.
A prototype of a network graph visualization for editor disagreements in individual articles ("whoVIS") can already be tested at [1]; it includes deletes, undos of reintroductions, and undos of deletes as "disagreement actions", with the possibility of future inclusion of "positive" interactions like reintroductions.
Other example tools:
- stats.wikimedia.org (visualize things in two dimensions)
- https://snuggle-en.wmflabs.org/
- http://www.tableausoftware.com/
- Gephi
- R & ggplot2
- Spreadsheet graphs
Budget
Project Manager / Research Analyst: $25/hour, 16 hours/week for 6 months (26 weeks) = $10,400. Progress will be reviewed on a weekly basis, and at the customary midpoint and endpoint reporting phases. Some of this time may be spent attending workshops or developing new skills that prove necessary or useful during the course of the grant. The time estimate is approximate: I asked a scientist who is not directly involved with this project about time estimates and milestones for research projects, and was told that making such estimates is routinely difficult or impossible for research projects. However, if I exceed 16 hours per week on this project, I will not ask for additional funds during this grant round.
There are no additional funding requests at this time.
If a conference seems to be especially relevant to this work then we may submit a separate Travel and Participation Support request.
At Step 3 of this project, if visualization tools that we have already identified and other open-source or commercial tools are unable to produce visualizations that meet the needs of our intended audience (researchers and editor engagement supporters), we may make a supplemental request to hire a contractor. At this point we are not including a separate budget for visualization, and any request would be submitted through the appropriate review process, most likely an IEG extension request.
Depending on how far we get with this 6 month grant and how successful we are, we may request an extension.
Community engagement
- Participants in the Research, Editor Engagement, Mobile, GLAM, and Gendergap mailing lists will be consulted about which data sets they would like to see prioritized
- Periodic updates will be sent to the Research mailing list and other lists where readers express interest in the project
- Periodic meetings on IRC or Google Hangouts will be offered to discuss the project, its ongoing work, and its results
- Periodic posts may be made on Village Pump pages, the Wikimedia Blog, the Signpost, the Editor Engagement email list, and other communication tools.
Sustainability
- Data and the methods used to extract it will be publicly documented for others to re-use. There have been previous requests from researchers for the kinds of information that will be covered in this study, so we anticipate that our data and methods will be reused.
- Visualizations and visualization methods will be publicly documented for others to reuse.
Measures of success
- Editor interaction data sets will have been published in the order of priority that was set in consultation with the communities identified above.
- Communication tools identified above will be used regularly to update interested researchers, WMF staff, and community members and to invite their suggestions.
- Visualizations using the highest-priority editor interaction data sets will have been uploaded to Commons under a Wikimedia-compatible license, and the methods for producing similar visualizations will be described so that they can be reused by other researchers and members of the Wikimedia community.
Get Involved
Participants
- Grant proposers:
- Pine: Project Manager / Research Analyst
- User:EpochFail: Research Scientist / Software Engineer
- Other interested people please sign up below and indicate your potential role:
- Volunteer Extract interaction data EpochFail (talk) 22:15, 24 September 2014 (UTC)
- Volunteer I'll provide editor relationship datasets and possibly a D3-based intra-article editor relationship graph visualization. Fabian Flöck (talk) 12:49, 4 October 2014 (UTC)
- Volunteer Not quite sure what role at this time, but fits with what I have previously proposed. Kerry Raymond (talk) 01:58, 18 October 2014 (UTC)
Community Notification
Please paste links below to where relevant communities have been notified of your proposal, and to any other relevant community discussions.
- Analytics mailing list
- Education mailing list
- Editor Engagement mailing list
- Gendergap mailing list
- Research mailing list
- Wikimedia-l mailing list
Endorsements
- I think it's a really important step to make Wikipedia processes more transparent for everyone. Fabian Flöck (talk) 12:47, 4 October 2014 (UTC)
- The abrasive behaviour of the Wikipedia community is often mentioned as a cause of editor attrition. A better understanding of editor interaction is necessary to identify "good" and "bad" patterns of interactions, which might then be used to encourage/discourage certain kinds of behaviours through either social (policies, guidelines, e.g. WP:BITE) or technical means (disallowed by the edit software). Kerry Raymond (talk) 02:03, 18 October 2014 (UTC)
- Providing this can be anonymised more effectively so as to avoid naming and shaming editors, I would support this proposal. WereSpielChequers (talk) 04:57, 19 October 2014 (UTC)