WikiCredCon 2025/Notes
Problem Topics (please feel free to expand)
Best Practices
Type of Work Labels
References
Diverse Voices
One approach is to encourage the use of news aggregators such as:
- Ground News (https://ground.news/): Ground News is a platform that makes it easy to compare news sources, read between the lines of media bias and break free from algorithms.
Journalist Expertise
Questions:
- Is there a way we can build trustworthiness in specific domains for journalists?
Methods
Locally Sourced
Actionable Feedback
editResources
Proposed strategies for proactive defenses
How can the Wiki community defend against information disorder? In rough order from least drastic to most drastic:
- Expanding data coverage of media outlets on Wikidata (see the query sketch after this list)
- Red-teaming disinformation attacks
- Teams of volunteers to test Wiki's defenses and suggest how they can be improved (technical + non-technical)
- Promotion of media literacy strategies
- Prebunking of claims or common techniques (ex. propaganda techniques, fallacies, common misconceptions)
- Critical ignoring
- Engagement journalism
- Deradicalization
- Directly partner with fact-checkers, information disorder researchers, Internet governance organizations, social media platforms and search engines
- Decentralized, archived Wikimedia ecosystem via IPFS (example: https://github.com/ipfs/distributed-wikipedia-mirror)
- Decentralized Internet archives via IPFS
- Addressing news deserts (reliable sources may not exist in a given locale, so helping to fund existing journalists or train new ones to report there)
- If/when secondary reliable sources are lacking, should Original Research policy be revisited?
- Should an ethical source license be considered?
- Revising architecture of the Internet (ex. Semantic Web, Decentralized Web), including built-in content provenance (ex. C2PA) and automated archiving/version control of web pages
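As a concrete illustration of the first item above, here is a minimal sketch of a Wikidata coverage-gap check. It assumes the public SPARQL endpoint and common item/property IDs (Q11032 newspaper, P31 instance of, P279 subclass of, P856 official website); the "missing official website" criterion is only an example of one gap worth finding, not a recommended workflow.

```python
# Hypothetical coverage-gap check for media outlets on Wikidata.
# Assumptions: Q11032 = newspaper, P31 = instance of, P279 = subclass of,
# P856 = official website; "missing official website" is just an example gap.
import requests

SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"

QUERY = """
SELECT ?outlet ?outletLabel WHERE {
  ?outlet wdt:P31/wdt:P279* wd:Q11032 .          # a (subclass of) newspaper
  FILTER NOT EXISTS { ?outlet wdt:P856 ?site . } # with no official website recorded
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 50
"""

def outlets_missing_websites():
    """Return (item URI, label) pairs for outlets lacking an official-website claim."""
    response = requests.get(
        SPARQL_ENDPOINT,
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "WikiCredCon-notes-sketch/0.1 (example only)"},
        timeout=60,
    )
    response.raise_for_status()
    rows = response.json()["results"]["bindings"]
    return [(row["outlet"]["value"], row["outletLabel"]["value"]) for row in rows]

if __name__ == "__main__":
    for uri, label in outlets_missing_websites():
        print(label, "-", uri)
```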
Tools
Existing Tools
Source Evaluation
- Veri-fyi Tool (https://veri-fyi.toolforge.org/):
Automated Fact Checking
- Factiverse: discount code available for wiki editors/conference attendees
Citation Related
- Notes from the "How Do We Monitor 100 Million Citations?" session
- Citation Watchlist by SuperHamster
- Unreliable/Predatory Source Detector (UPSD) by Headbomb
- Cite Unseen: Wiki gadget to classify references.
- LibraryBase:
- RobustLinks: Proposal to proactively address the issue of link rot.
- FABLE (Finding Aliases for Broken Links Efficiently) (toolforge): Automated approach to revive dead links (see the archive-lookup sketch after this list)
- IA Reference Explorer:
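To make the link-rot problem these tools address concrete, here is an illustrative helper (not part of FABLE, RobustLinks, or IA Reference Explorer) that asks the Internet Archive's public Wayback "availability" API whether an archived snapshot exists for a possibly dead citation URL.

```python
# Illustrative only: look up the closest Wayback Machine snapshot for a URL.
import requests

WAYBACK_API = "https://archive.org/wayback/available"

def nearest_snapshot(url: str, timestamp: str | None = None) -> str | None:
    """Return the URL of the closest archived snapshot, or None if none is known."""
    params = {"url": url}
    if timestamp:                       # e.g. "20200101" to prefer snapshots near a date
        params["timestamp"] = timestamp
    response = requests.get(WAYBACK_API, params=params, timeout=30)
    response.raise_for_status()
    closest = response.json().get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

if __name__ == "__main__":
    # Hypothetical dead link, used only to demonstrate the call.
    print(nearest_snapshot("http://example.com/some-dead-citation"))
```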
Tools that we would like to be created
- Information Nutrition Labels for Wikipedia Articles (from Sally Lehrman's session): "This article was written by #xnumber people since #datecreated. It contains #ynumber words and #znumber references." Clicking on this could link to a graph showing the concentration of edits over time, which could be an indicator of how recent/up to date the article is. (See the sketch after this list.)
- Newbie Indicator (flagging "student driver" editors): Frequently a new editor contributes with good intentions but does not adhere to Wiki best practices. Too often experienced editors will correct these errors in ways that do not encourage future editing - i.e.reverting/blanking/etc with either no explanation or hostility. If we flagged these new users we could try and soften these responses and build a more welcoming environment for new editors. Pseudo code would be:
User has less than 50 edits Attach words "I am a newbie editor. If I have erred in editing, please let me know what happened and how I should be editing. Thank you." to any edit published en:User:LoveElectronicLiterature
- Baby Steps (flagging "newbie" editors): The article-edit suggestions are too complex and are scaring newbies. I think they just flag anything, so we are throwing college-level editing (citations, tone, scandals, conflicts) at kindergartners (new editors). Get a bot that identifies only sentence structure, misspellings, and typos to flag "baby step" easy edits (see the sketch after this list). — en:User:LoveElectronicLiterature
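A hedged sketch of the Information Nutrition Labels idea above, using the public MediaWiki Action API for English Wikipedia. It covers only the editor/revision half of the label; word and reference counts would additionally need the article text, and the article title and label wording are illustrative.

```python
# Sketch: count distinct editors and revisions for an article via the
# MediaWiki Action API. Title and label wording are illustrative.
import requests

API = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "WikiCredCon-notes-sketch/0.1 (example only)"}

def nutrition_label(title: str) -> str:
    editors, revisions, created = set(), 0, None
    params = {
        "action": "query", "format": "json", "prop": "revisions",
        "titles": title, "rvprop": "timestamp|user", "rvlimit": "max",
        "rvdir": "newer",  # oldest revision first, so the first timestamp is the creation date
    }
    while True:
        data = requests.get(API, params=params, headers=HEADERS, timeout=30).json()
        page = next(iter(data["query"]["pages"].values()))
        for rev in page.get("revisions", []):
            revisions += 1
            editors.add(rev.get("user", "(hidden)"))
            created = created or rev["timestamp"]
        if "continue" not in data:     # no more pages of revision history
            break
        params.update(data["continue"])
    return (f"This article was written by {len(editors)} people "
            f"in {revisions} revisions since {created}.")

if __name__ == "__main__":
    print(nutrition_label("Misinformation"))
```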
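A hedged sketch of the Newbie Indicator pseudo-code above: it looks up an editor's total edit count via the MediaWiki API and returns the suggested courtesy note when the count is under the 50-edit threshold. The function name and constants are illustrative; this is not an existing gadget.

```python
# Sketch: return the proposed courtesy note for editors with fewer than 50 edits.
import requests

API = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "WikiCredCon-notes-sketch/0.1 (example only)"}
NEWBIE_THRESHOLD = 50
NEWBIE_NOTE = ("I am a newbie editor. If I have erred in editing, please let me "
               "know what happened and how I should be editing. Thank you.")

def newbie_note(username: str) -> str | None:
    """Return the courtesy note if the user has fewer than 50 edits, else None."""
    params = {
        "action": "query", "format": "json", "list": "users",
        "ususers": username, "usprop": "editcount",
    }
    data = requests.get(API, params=params, headers=HEADERS, timeout=30).json()
    user = data["query"]["users"][0]
    editcount = user.get("editcount", 0)   # absent for anonymous or unknown users
    return NEWBIE_NOTE if editcount < NEWBIE_THRESHOLD else None
```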
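A hedged sketch of the Baby Steps idea above: a toy text pass that flags only simple, low-stakes fixes (double spaces, a few common misspellings, very long sentences) as "baby step" tasks for new editors. The misspelling list and thresholds are assumptions, not an existing bot's rules.

```python
# Toy "baby step" flagger: surfaces only easy fixes for new editors.
import re

COMMON_MISSPELLINGS = {"recieve": "receive", "seperate": "separate",
                       "occured": "occurred", "definately": "definitely"}
MAX_SENTENCE_WORDS = 40

def baby_step_flags(text: str) -> list[str]:
    """Return human-readable flags for easy fixes found in the given text."""
    flags = []
    if "  " in text:
        flags.append("double spaces found")
    for wrong, right in COMMON_MISSPELLINGS.items():
        if re.search(rf"\b{wrong}\b", text, re.IGNORECASE):
            flags.append(f"possible misspelling: '{wrong}' -> '{right}'")
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if len(sentence.split()) > MAX_SENTENCE_WORDS:
            flags.append(f'very long sentence: "{sentence[:60]}..."')
    return flags
```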
Tools that will be available soon
- The tapestry project (tapestries.media): beta access to be available in March 2025; contact Bob Stein (futureofthebook(at)gmail(dot)com) for access rights.
Organizations
editThe Trust Project (https://thetrustproject.org/)
... The Trust Project built the Trust Indicators by asking people what they value in the news – and what wins and loses their trust. Then we married their insights with bedrock journalism values to come up with eight core disclosures that every reader, listener and viewer deserves to know. ...
Index of Unreliable News Sources (https://iffy.news/)
- Veri-fyi Tool (https://veri-fyi.toolforge.org/):
- Fact-check Feed (articles by US fact-checkers, 2016–present)
- Fact-check Search tool
- News Netrics (media site performance metrics)
- Unreliable News repo
- CredScore (hypothesis)
Factiverse (https://www.factiverse.ai)
Our mission is to empower knowledge professionals to verify information fast and help them protect their organizations from reputational, financial, and legal harm.
Try it out at https://editor.factiverse.ai/.