Machine Wikipedia

This is a proposal for a new Wikimedia sister project.
Status of the proposal
Status: under discussion
Details of the proposal
Project description: According to Tim Berners-Lee's proposal for the Semantic Web (sometimes called "Web 3.0"), we should make the existing web machine-readable. But the current editions of Wikipedia (English, French, etc.) are not machine-readable. Although Wikidata provides some structured machine-readable data, it does not realize the Semantic Web vision, because Wikidata only provides structured data for one concept, while an article may contain many concepts that are not included in its Wikidata item.

So I propose to create "Machine Wikipedia", alongside the other editions of Wikipedia (such as English), written in a machine-readable language, e.g. RDF (Resource Description Framework) triples. This way, chatbots and other machines can access the information they need more accurately and conveniently. This new edition of Wikipedia (Machine Wikipedia) can be populated with RDF triples both by humans and by artificial intelligence using natural language processing (NLP).
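To make the idea concrete, here is a minimal sketch of how facts from an article could be stored and queried as RDF-style subject-predicate-object triples. The `ex:` and `rdf:` prefixes and the identifiers below are illustrative placeholders, not real Wikidata or RDF URIs, and the query function is a toy pattern matcher rather than a real triple store:

```python
# Facts from an article expressed as (subject, predicate, object) triples.
# All identifiers here are illustrative placeholders, not real URIs.
triples = [
    ("ex:Steve_Jobs", "rdf:type", "ex:Human"),
    ("ex:Steve_Jobs", "ex:co_founded", "ex:Apple_Inc"),
    ("ex:Steve_Jobs", "ex:father", "ex:Abdulfattah_Jandali"),
]

def query(store, subject=None, predicate=None, obj=None):
    """Return every triple matching the (possibly partial) pattern."""
    return [
        (s, p, o)
        for (s, p, o) in store
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# A chatbot asking "who is Steve Jobs's father?" becomes a pattern match:
print(query(triples, subject="ex:Steve_Jobs", predicate="ex:father"))
```

Because every fact has the same three-part shape, a machine needs no natural-language understanding to answer such a question; it only matches patterns against the triple set.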

I also propose to create fully textual articles, called "Machine articles", written in RDF. Like articles in other editions of Wikipedia, these articles can be written by humans and by AI (via NLP). "Machine Wikipedia" could be implemented quickly, and bots and other machines would benefit greatly from this edition.

Besides its benefits for chatbots and machines, encoding the editions of Wikipedia into RDF can help to bring the existing human-readable Wikipedias into harmony with one another.

For example: if the English Wikipedia lacks some data about Steve Jobs (for example, his father's name) that the French Wikipedia has, we can detect that deficiency in the English edition by using "Machine Wikipedia", because after the English edition is encoded into RDF, the result will lack this data. We can then alert the editors of the English edition to add a sentence conveying the fact stated in the French one.
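The comparison described above can be sketched as a simple set difference over the two editions' encoded triples. The triple sets below are hypothetical examples standing in for the output of the encoding step, not data taken from either wiki:

```python
# Hypothetical triple sets for two encoded editions; a real system
# would produce these by encoding each wiki's articles into RDF.
english_triples = {
    ("Steve_Jobs", "co_founded", "Apple_Inc"),
}
french_triples = {
    ("Steve_Jobs", "co_founded", "Apple_Inc"),
    ("Steve_Jobs", "father", "Abdulfattah_Jandali"),
}

# Facts present in the French encoding but absent from the English one:
missing_from_english = french_triples - english_triples
for s, p, o in sorted(missing_from_english):
    print(f"English edition lacks: {s} {p} {o}")
```

Each triple reported by such a comparison could then be turned into a suggestion shown to editors of the edition that lacks it.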

I should note that Machine Wikipedia can be an accumulation of all the human-readable Wikipedias (English, French, etc.), containing all of their data but without any redundant information.
Is it a multilingual wiki?: RDF, OWL, etc.
Potential number of languages: In triples of RDF
Proposed tagline: Make Wikipedia machine-readable (the goal of the Semantic Web is to make Internet data machine-readable)
Proposed URL: machine.wikimedia.org
Technical requirements
New features to require: This project encodes human-readable data into machine-readable data, a task that may be performed both by humans and by AI (via NLP). Some NLP processes should therefore be implemented to carry out this encoding, and humans should then verify that the encoding is correct.
Development wiki: Abstract Wikipedia
Interested participants
Hooman Mallahzadeh


Proposed by


Tim Berners-Lee

Alternative names

  • Machine Wikipedia
  • RDF Wikipedia
Related projects/proposals

Abstract Wikipedia

Domain names

  • machine.wikipedia.org
  • rdf.wikipedia.org

Demos


People interested
