Community Wishlist Survey 2021/Wikidata

Wikidata
21 proposals, 319 contributors, 672 support votes
The survey has closed. Thanks for your participation :)



Improve Derived Statements

  • Problem: at present, derived statements are shown only after pressing a button at the bottom of the item page, and the button itself is only available via a preference (the relateditems gadget).
  • Who would benefit: anyone browsing items, by increasing the visibility of interconnections among items.
  • Proposed solution: add a section that is visible by default, compressed and expandable.
  • More comments: the minimum information shown immediately in this section could depend on the values of P31, and should at least include counters. Expanding the section would make all derived statements visible. Another similar solution would also work.
  • Phabricator tickets:
  • Proposer: Bargioni (talk) 09:29, 23 November 2020 (UTC)[reply]

Discussion

Voting

Bibliographical references/sources for Wikidata items

  • Problem:

Items (e.g. buildings, archaeological sites, paintings, artists ...) described on various Wikimedia platforms benefit greatly from good bibliographical sources. However, today one can only add such a source to an individual Wikipedia article (per language) and/or to a specific statement on Wikidata. As a result, many sources relevant to items/subjects remain hidden from all the languages of Wikipedia, and knowledge remains "unused" and "undiscovered".

  • Who would benefit:

Editors on Wikipedia, the user community in general (more available sources = better articles) and all readers

  • Proposed solution:

The ability to add a list of bibliographical references per Wikidata item in a standard, data-driven, structured way (different from, for instance, a list of works by a specific author). As a result, a list of bibliographical sources linked to the Wikipedia page via its corresponding Wikidata item would automatically be available in each language. For example, one could use this to link historical sources, archaeological reports, or historic maps to, for instance, a town or artist.

  • More comments: Advantages:
    1. More sources on Wikimedia platforms will be used
    2. More references per article add to the discussion, as well as to Wikipedia's role as a tertiary source
    3. More data/research becomes visible through Wikimedia platforms, which adds to their overall value within the information community
    4. Links scientific research with the valorisation of knowledge for a large-scale audience through Wikipedia
  • Phabricator tickets:
  • Proposer: Hilke Arijs (talk) 17:15, 19 November 2020 (UTC)[reply]

Discussion

  • I love this proposal. Wikidata has everything it requires. It is multi-lingual. You can register already today references described by source (P1343) to link an item e.g. to a book/edition. The only thing that is required is a Wikidata module for Wikipedia to extract a bibliography from Wikidata. Geert Van Pamel (WMBE) (talk) 17:49, 19 November 2020 (UTC)[reply]
  • @Hilke Arijs: While in principle this proposal would improve Wikidata items, I feel like the proposal suffers from a lack of clarity, and it's not clear whether you're proposing a technical change that would somehow enable this or a comprehensive import that would actually allow this to be useful in the short term (at the moment, there's not enough relevant Wikidata data for such a feature to be genuinely useful). There are already properties that exist to relate works and subjects (main subject (P921) and described by source (P1343)), and there are already modules which allow data from multiple statements to be extracted (Commons' Wikidata infobox already does it). Either way, the proposal could be fulfilled without any changes to the underlying software, so it's possible that this wouldn't be within the survey's scope to begin with. Jc86035 (talk) 19:08, 20 November 2020 (UTC)[reply]
  • Could also be related to/benefit from mw:Global templates. Geert Van Pamel (WMBE) (talk) 11:08, 21 November 2020 (UTC)[reply]
  • Actually, it could be implemented as an option, like we currently have with Wdsearch. Geert Van Pamel (WMBE) (talk) 11:08, 21 November 2020 (UTC)[reply]
  • We could first do two simple things:
    • copy a (structured) reference from a wiki to wikidata
    • use this multiple times in the wikidata item for the properties it refers to.
Example: we have most info about a person from ref1. Then we can use ref1 for the person's name, profession, place and date of birth, nationality, and family. It would be enough to fill in ref1 as the reference, with ref1 being a nicely formatted, full reference copied from the wiki. Currently we can only do that by repeating a simple URL multiple times (or filling in several source fields in every property, again and again). --FocalPoint (talk) 06:37, 24 November 2020 (UTC)[reply]
You can edit the reference once and then copy-paste the whole reference data to each relevant statement using the DuplicateReferences gadget. Any reference structure has to take into account the possibility of retrieving data for one statement. Having a kind of redirection for the reference is a dangerous tool unless it is well built to be updated automatically. Just take the case where the reference is deleted but not the redirections. From my point of view this will generate more problems. Snipre (talk) 08:29, 24 November 2020 (UTC)[reply]

Voting

Edit Wikidata in a map!

  • Problem: Wikidata is great, but editing it still requires quite some knowledge of things like SPARQL. This makes the whole project less accessible. Everyone knows a map. Why can't we just fork a tool like WikiShootMe so that users could edit items directly on a map? Sure, not all items are geography-related, but many are. WikiShootMe is great for adding images. Now imagine that instead of adding images we could add other data.
  • Who would benefit: Wikidata community, Wikidata newbies.
  • Proposed solution: Forking WikiShootMe and making it more versatile, applying its concept of adding a value to P18 in a map window to other properties as well.
  • More comments:
  • Phabricator tickets:
  • Proposer: Aktron (talk) 10:50, 17 November 2020 (UTC)[reply]

Discussion

Voting

  • Problem: Wikipedia redirects cannot be linked to Wikidata items
  • Who would benefit: Mainly Wikidata users seeking relevant Wikipedia text
  • Proposed solution: In the Wikidata UI, allow Wikipedia redirects to be linked to items
  • More comments: Some Wikipedia redirects already link to Wikidata items. For example, en:517 BC correctly links to d:Q715545. However, Wikidata's UI prevents the creation of such useful links. To create one, an editor must first make a "bad" edit to Wikipedia, replacing the redirect with a dummy article, then link to that "article" in Wikidata, then self-revert on Wikipedia to restore the redirect.
  • Phabricator tickets:
  • Proposer: Certes (talk) 02:14, 23 November 2020 (UTC)[reply]

Discussion

I was going to say that I proposed this already here: Community_Wishlist_Survey_2015/Wikidata, but I misread your problem, which is also known as the "Bonnie and Clyde problem" and is explained in detail at d:Help:Handling sitelinks overlapping multiple items. My proposal wasn't to change the Wikidata UI, but to change the way redirects are implemented on the Wikipedia side, so that instead of #REDIRECT [[Target]] you would see something like #REDIRECT [[d:Qid|Target]], where the Qid is optional (most redirects for Q5 items are alternate spellings and don't need anything other than the direct local target). Your work-around is a known kludge to get around the Bonnie & Clyde problem; I delete such links when I see them, whether they were created deliberately or left behind when articles get deleted, because they gum up my SPARQL results. Jane023 (talk) 11:18, 29 November 2020 (UTC)[reply]

Editors create these links to help Wikidata. If you are deleting them because they are unwelcome, just let us know, and we'll stop creating them and I'll withdraw (or at least not support) this proposal. Certes (talk) 21:45, 8 December 2020 (UTC)[reply]
@Jane023: Once this is implemented, it should be possible to exclude redirects from SPARQL results, as they will have an associated badge (see T235420). So instead of having a kludge that messes up query results (as you do now), you would have a proper implementation that alleviates your problem. In other words, you would be able to choose whether or not to include redirects in query results, which you can't do currently, AFAIK. Kaldari (talk) 03:16, 9 December 2020 (UTC)[reply]

Does this finally mean that an English-language Wikipedia entry (c|w)ould show multiple interwiki links (the version created from Wikidata, that is) from another language? Shyamal (talk) 08:17, 9 December 2020 (UTC)[reply]

@Shyamal: This would make most manual interwiki links unnecessary. There are still some cases where individual Wikipedias have policies that make certain redirects unwelcome. ChristianKl 11:47, 9 December 2020 (UTC)[reply]
@ChristianKl: I was not clear: not manual interwikis, of course. But now that a language-A Wikipedia entry may be covered by two entries in language B (linked via two Wikidata items), how does the Wikipedia entry guide a reader from language A to two possible pages in B? (I am assuming the sidebar will ideally have multiple targets under language B when one views the language-A entry.) Here is a case in question: the Korean entry at https://ko.wikipedia.org/wiki/%EB%8F%99%EA%B3%A8%ED%95%B4%EB%A9%B4%EB%AA%A9 (d:Q139079) has no interlanguage link to the English version or vice versa, because the English entry is covered under https://en.wikipedia.org/wiki/Homosclerophorida (d:Q13140211). Adding the redirect in Wikidata is one aspect, but how would it be interpreted in the Wikipedia sidebar? I presume the priority would be to ease user navigation between KO and EN. Shyamal (talk) 11:53, 9 December 2020 (UTC)[reply]
This feature alone doesn't provide a way to guide users from A to two possible B pages (B1 and B2). A Wikipedia that hosts B1 and B2, and that wants people from A to be able to reach B, could create a bot that automatically creates pages saying "A is covered in B1 and B2" and then adds a sitelink from A to that new page.
Whether or not bots that create such pages are seen as desirable by individual Wikipedia communities remains to be seen. In case they aren't, it's also possible to create a gadget that provides some virtual form of those B1 or B2 pages.
This feature will mean that the data such a bot or gadget would need exists. ChristianKl 12:48, 9 December 2020 (UTC)[reply]
This feature would help us to guide readers from language B to language A (e.g. from article nl:Bonnie Parker to redirect en:Bonnie Parker which targets the section en:Bonnie and Clyde#Bonnie Parker). It neither helps nor hinders the 1→many link from language A to language B. Certes (talk) 13:01, 9 December 2020 (UTC)[reply]
Some canvassing happened at d:Wikidata:Project_chat#Finally_fixing_the_Bonnie_and_Clyde_problem. I still oppose this change because it would break the current data model in regard to uniqueness. I'm sure more people oppose this change, but I guess these pages are just to cheer on proposals. Multichill (talk) 22:37, 9 December 2020 (UTC)[reply]
@Multichill: the uniqueness of the data model gets defined via the notability policy, and sitelinks to redirects change nothing about notability. ChristianKl 01:16, 11 December 2020 (UTC)[reply]
@Multichill: Isn't that a data model that very much needs to be broken? What is the down side of breaking it? Kaldari (talk) 20:17, 15 December 2020 (UTC)[reply]
There was an RFC in 2018 with majority support for allowing links to redirects, but action on the Phabricator ticket to improve the UI stalled, as the devs wanted to consider other options despite consensus. The solution involves badging the sitelinks to redirects so that they can be filtered in queries (as Kaldari noted above), and templating the redirect pages to flag them as intentional (the latter practice is already happening on English Wikipedia). Sorry, I don't have the links on hand but will try to find them. Pelagic (talk) 17:33, 15 December 2020 (UTC)[reply]
Here is the RFC I was thinking of: d:Wikidata:Requests for comment/Allow the creation of links to redirects in Wikidata. —Pelagic (talk) 18:27, 15 December 2020 (UTC)[reply]
On the data abstraction: redirect pages are still mediawiki pages, and they are in the main namespace, just like articles. This URI https://en.wikipedia.org/w/index.php?title=Bonnie_Parker&redirect=no is still "schema:about" Bonnie Parker, even though its content is an instruction to visit the section of another page. Traditional printed encyclopaedias have see entries in their indices for a reason. What I hope for is, when adding a sitelink in the UI and selecting a redirect, to get the choice to either add the redirect itself (where it links to a section or is semantically distinct from its target) or the redirect-target (redirect for misspelling or alternate name). This URI https://en.wikipedia.org/wiki/Bonnie_and_Clyde#Bonnie_Parker is also schema:about Bonnie Parker. If you don't want to allow sitelinks to redirects, then allow them to article sections. In both cases you are stretching the expectation that sitelinks point to articles: either extend it in one direction to include main-namespace non-article pages, or in the other direction to include sections of articles. Either way the reader should be able to navigate from an article in one language to a section of an article in another language, where the former and latter are both about the same subject. We already have sitelinks to non-mainspace non-article pages, even though w:en:WP:Village Pump isn't about d:Q16503, it's an instance of Q16503. The primary point of sitelinks isn't the RDF or the semantics, it's to allow users to move between related content. Pelagic (talk) 18:19, 15 December 2020 (UTC)[reply]

Voting

Time and date handling improvements

  • Problem: Dealing with time and dates on Wikidata is a long-standing problem, with several aspects that usually are brought forward to discuss. This is an attempt to summarise the most pressing aspects, by selecting the relevant Phabricator tickets and checking all previous Community wishlist requests:
  1. allowing time values more precise than "day" (task T57755) – a recurring request, dating back to October 15, 2013, for a potential time precision up to seconds rather than being limited to days;
  2. supporting non-Gregorian/Julian calendars (task T252627) – another non-trivial request, for handling sources that express dates and occurrences in calendars other than Gregorian and Julian;
  3. fixing annoying problems with displaying date values (task T63958 and task T95553) – pretty self-explanatory, mostly related to better localisation of values, which is nonetheless needed.
These are the three main aspects, though many other things might need to be looked at. Our common hope is to finally find someone who can tackle these problems.
  • Who would benefit: primarily Wikidata contributors, but also Wikidata re-users at large
  • Proposed solution: add functionalities and fix current problems in this field
  • More comments:
  • Phabricator tickets:
    • task T57755 – Allow time values more precise than day on Wikidata
    • task T63958 – Use existing $dateFormats to format dates on Wikidata
    • task T95553 – Full stop in messages such as Wikibase-time-precision-century is incorrect in English
    • task T252627 – Support for additional calendar models
  • Proposer: Sannita - not just another it.wiki sysop 13:56, 18 November 2020 (UTC)[reply]
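For context, here is a rough sketch (not the actual Wikibase code) of how the stored data model relates to the request in task T57755: Wikibase time values already carry a numeric precision field, but values finer than "day" (12 = hour, 13 = minute, 14 = second) cannot currently be entered. The precision numbering follows the Wikibase data model documentation; the formatting function itself is an illustrative assumption.

```python
# Sketch of how Wikidata's time datamodel encodes precision. The numeric
# precision codes are per the Wikibase data model docs; the rendering logic
# below is illustrative only (Gregorian calendar, no localisation).
PRECISION_NAMES = {
    9: "year", 10: "month", 11: "day",
    12: "hour", 13: "minute", 14: "second",
}

def describe_time(value: dict) -> str:
    """Render a Wikibase time value at its stated precision."""
    time = value["time"]            # e.g. "+2013-10-15T00:00:00Z"
    precision = value["precision"]  # e.g. 11 for "day"
    date, clock = time.lstrip("+").split("T")
    year, month, day = date.split("-")
    if precision == 9:
        return year
    if precision == 10:
        return f"{year}-{month}"
    if precision == 11:
        return f"{year}-{month}-{day}"
    # Precisions 12-14 exist in the data model but cannot be entered today;
    # supporting them is exactly what task T57755 asks for.
    return f"{date} {clock.rstrip('Z')} ({PRECISION_NAMES[precision]})"

print(describe_time({"time": "+2013-10-15T00:00:00Z", "precision": 11}))
```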

Discussion

Voting

Support ISO 8601-2:2019 to specify uncertainty about times

  • Problem: Many times sources specify a date with some uncertainty. A source might say that a person died between 1341 and 1345, or that it was approximately created at a certain date.
  • Who would benefit: Anyone who wants to record data with uncertainty in a way similar to what the sources say.
  • Proposed solution: Adopt the ways of specifying a time interval, qualification of a date, and unspecified digit(s) from the right, as laid out in ISO 8601-2:2019 (described at http://www.loc.gov/standards/datetime/edtf.html). This would allow writing 1341/1345 for a date of death between 1341 and 1345.
  • More comments:
  • Phabricator tickets: https://phabricator.wikimedia.org/T207705
  • Proposer: ChristianKl21:33, 17 November 2020 (UTC)[reply]
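To illustrate the three EDTF constructs named above, here is a minimal parsing sketch. The notation follows the Library of Congress EDTF specification ("?" uncertain, "~" approximate, "%" both, "X" for an unspecified digit); this is only a demonstration of the notation, not a proposed Wikibase implementation.

```python
import re

# Minimal sketch of parsing the ISO 8601-2 (EDTF) constructs named in the
# proposal: time intervals, qualified dates, and unspecified digits.
def parse_edtf(s: str) -> dict:
    # Time interval: "1341/1345" -> a date between 1341 and 1345
    m = re.fullmatch(r"(-?\d{4})/(-?\d{4})", s)
    if m:
        return {"kind": "interval",
                "earliest": int(m.group(1)), "latest": int(m.group(2))}
    # Qualification: "1984?" (uncertain), "1984~" (approximate), "1984%" (both)
    m = re.fullmatch(r"(-?\d{4})([?~%])", s)
    if m:
        qual = {"?": "uncertain", "~": "approximate",
                "%": "uncertain-approximate"}[m.group(2)]
        return {"kind": "qualified",
                "year": int(m.group(1)), "qualification": qual}
    # Unspecified digits from the right: "198X" -> some year in the 1980s
    m = re.fullmatch(r"(\d+)(X+)", s)
    if m:
        lo = int(m.group(1) + "0" * len(m.group(2)))
        hi = int(m.group(1) + "9" * len(m.group(2)))
        return {"kind": "unspecified", "earliest": lo, "latest": hi}
    raise ValueError(f"unsupported EDTF value: {s}")

print(parse_edtf("1341/1345"))
```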

Discussion

Voting

Sort statements as you wish

  • Problem: Sometimes, users want to check on a larger number of items whether these already have a given statement (Pid). Example: it can be very annoying to look up whether a given human (Q5) already has a religion or occupation statement; these may be buried somewhere among a lot of other statements…
  • Who would benefit: anyone who edits Wikidata
  • Proposed solution: It would be very useful to have some statements – chosen by you! – right at the top when you open the page of an item (perhaps right below the instance of statement). Or even to sort the statements freely.
  • More comments:
  • Phabricator tickets:
  • Proposer: Geogast (talk) 17:19, 23 November 2020 (UTC)[reply]
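The lookup the proposer describes can be sketched over entity JSON of the shape returned by the wbgetentities API: check which items already carry a statement for a given property such as occupation (P106). The sample entities and QIDs below are invented for illustration.

```python
# Sketch: given entity JSON (the "claims" structure as returned by the
# wbgetentities API), check which items already have a statement for a
# property. The sample data is made up; real data would come from the API.
def has_statement(entity: dict, prop: str) -> bool:
    return bool(entity.get("claims", {}).get(prop))

items = {
    "Q100001": {"claims": {"P31": [{"id": "x"}], "P106": [{"id": "y"}]}},
    "Q100002": {"claims": {"P31": [{"id": "z"}]}},
}

# Items still lacking an occupation (P106) statement
missing = [qid for qid, e in items.items() if not has_statement(e, "P106")]
print(missing)
```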

Discussion

It's a good idea, but did you know that you can go directly to a statement using the URL path Qid#Pid, e.g. Wikidata:Q2826599#P3206? PAC2 (talk) 17:42, 23 November 2020 (UTC)[reply]

In fact, I didn't. Thanks for this! But I would love a more practical approach than writing #Pid in the URL every time. --Geogast (talk) 18:59, 23 November 2020 (UTC)[reply]
If users are coming from an external site (e.g. Wikipedia), an anchor linking to the middle of the page is very confusing. It would be better if the anchored property were at the top of the page, so there would be context, AND you could select multiple properties to be grouped at the top. (Example code: Wikidata:User:Zache/vector.js, and link Wikidata:Q2826599#P17,P571.) --Zache (talk) 18:52, 27 November 2020 (UTC)[reply]
Yes, that was my idea: the anchored properties go to the top. And: you choose the properties (opt-in); anyone who doesn't need this feature won't use it and won't get confused. --Geogast (talk) 18:44, 30 November 2020 (UTC)[reply]

Voting

Extend Gadget - Drag'n'drop

  • Problem: Drag'n'drop is a great gadget that transports statements from Wikipedia to Wikidata with a simple drag and drop. It would be nice if you could use this method for Commons as well, to quickly insert images.
  • Who would benefit: Everyone who uses the gadget
  • Proposed solution: Expansion of this gadget
  • More comments:
  • Phabricator tickets:
  • Proposer: Crazy1880 (talk) 18:21, 25 November 2020 (UTC)[reply]

Discussion

You can check my rewrite of this gadget from some time ago: add importScript('User:Yarl/DragNDrop.js') to your /common.js subpage. Yarl (talk) 14:36, 14 December 2020 (UTC)[reply]

Voting

Slow and fast edits

Discussion

Voting

Duplicates and merge candidates

  • Problem: There is an increasing number of items that are empty or possible duplicates
  • Who would benefit: Wikidata editors
  • Proposed solution: Improve on prior art like Projectmerge to detect duplicates not only by labels but by comparing properties and links with other items; migrate the WD:DNM do-not-merge lists to something more usable (for example, as suggested on the discussion page, migrate them to P1889 statements)
  • More comments:
  • Phabricator tickets:
  • Proposer: Sabas88 (talk) 12:38, 20 November 2020 (UTC)[reply]
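One possible shape for the proposed detection, comparing properties and links rather than labels alone. The Jaccard measure and the weights below are assumptions for illustration, not the algorithm of Projectmerge or any existing tool.

```python
# Illustrative sketch of scoring two items as merge candidates by overlap of
# labels, statements, and sitelinks. Measure and weights are invented.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def merge_candidate_score(item1: dict, item2: dict) -> float:
    labels = jaccard(set(item1["labels"].values()), set(item2["labels"].values()))
    claims = jaccard(
        {(p, v) for p, vals in item1["claims"].items() for v in vals},
        {(p, v) for p, vals in item2["claims"].items() for v in vals},
    )
    links = jaccard(set(item1["sitelinks"]), set(item2["sitelinks"]))
    # Weighted combination; a high score flags a likely duplicate pair.
    return 0.4 * labels + 0.4 * claims + 0.2 * links

a = {"labels": {"en": "Jane Doe"}, "claims": {"P31": ["Q5"]}, "sitelinks": {"enwiki"}}
b = {"labels": {"en": "Jane Doe"}, "claims": {"P31": ["Q5"]}, "sitelinks": set()}
print(round(merge_candidate_score(a, b), 2))
```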

Discussion

Voting

Expand automatic edit summaries

  • Problem: When one's watchlist is set to display edits made on linked statements on Wikidata, they are always displayed as numerical codes even if labels exist on the Wikidata entries. For example, this diff on enWikipedia's watchlist displays as "Created claim: Property:P4552: Q5456; Added reference to claim: Property:P4552: Q5456", whereas on Wikidata it's two diffs with two edit summaries, "Added reference to claim: mountain range (P4552): Andes (Q5456)" and "Created claim: mountain range (P4552): Andes (Q5456)".
  • Who would benefit: People who use their watchlist on a non-Wikidata project to monitor changes to the Wikidata item linked to an article they have watchlisted. On enWikipedia some templates draw information from Wikidata, so making it easy to monitor the edit content may be beneficial.
  • Proposed solution: The watchlist should display the language label, where it exists, in lieu of the numerical code; in this case the summary would be "Created claim: Property:mountain range: Andes; Added reference to claim: Property:mountain range: Andes", perhaps with "Property" omitted if it makes the summary overlong.
  • More comments: I hope I didn't send this – a re-do of two previous proposals along the same lines – in too late.
  • Phabricator tickets: phab:T108688; phab:T171027 may be worth paying attention to, since it's a technical issue that could impact this project.
  • Proposer: Jo-Jo Eumerus (talk, contributions) 09:46, 29 November 2020 (UTC)[reply]
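The proposed summary rewriting could be sketched as a simple substitution over a prefetched ID-to-label map. In practice the labels would come from a wbgetentities call; here the map is hardcoded and the function is an illustrative assumption, not existing MediaWiki code.

```python
import re

# Sketch: substitute entity IDs in a cross-wiki edit summary with labels
# from an ID -> label map (hardcoded here; a real implementation would fetch
# labels via the wbgetentities API in the viewer's language).
def humanize_summary(summary: str, labels: dict) -> str:
    def repl(m):
        entity_id = m.group(0)
        # Fall back to the raw ID when no label is known.
        return labels.get(entity_id, entity_id)
    return re.sub(r"\b[PQ]\d+\b", repl, summary)

labels = {"P4552": "mountain range", "Q5456": "Andes"}
print(humanize_summary("Created claim: Property:P4552: Q5456", labels))
```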

Discussion

Voting

Anti-vandalism tools for Wikidata

  • Problem: Wikidata has a lot of vandalism and the tools to fight it are not very good.
  • Who would benefit: Admins, all wikis that use Wikidata
  • Proposed solution: Develop tools to fight vandalism. A start would be w:en:WP:TWINKLE, to automate reverting, warning a user, blocking, and leaving the right templated messages in all those cases. Then we could also get something like w:en:WP:HUGGLE to load changes in real time and get them patrolled.
  • More comments: This will attract an army of editors to fight vandalism on Wikidata, similar to English Wikipedia. This will improve trust in Wikidata.
  • Phabricator tickets:
  • Proposer: Rschen7754 01:57, 17 November 2020 (UTC)[reply]

Discussion

  • Note that there is currently a ticket for Huggle (T183141) marked high-priority for (at least partial) support of Wikidata. Courtesy ping since they authored said ticket: Petrb. Perryprog (talk) 02:39, 17 November 2020 (UTC)[reply]
  • Yes, this is needed so badly. {{u|Sdkb}}talk 02:43, 17 November 2020 (UTC)[reply]
  • There are some bugs with issuing warnings, but otherwise Huggle seems to work mostly OK for me... I always keep Wikidata in my feed alongside English Wikipedia. My only real complaint is phab:T199500, which is actually a big problem, but all things considered, using Huggle is still more efficient than Special:RecentChanges. As for Twinkle, the first step is to get the UI localized. That's in progress now and slated to be completed by the end of the year. MusikAnimal talk 02:57, 17 November 2020 (UTC)[reply]
  • There is also the fact that recreations are a lot harder to track due to them being at new entity ids instead of at the same title, so title blacklisting or creation protection isn't available... --DannyS712 (talk) 03:27, 17 November 2020 (UTC)[reply]
  • Nice idea. Yes, we do need a Twinkle-like tool that works on Wikidata. Actually, would modifying a global Twinkle be a good idea, to fit the requirements of Wikidata, Commons, and Wikisource? QueerEcofeminist [they/them/their] 05:40, 18 November 2020 (UTC)[reply]
  • This is an important proposal, but I do not think that Huggle and Twinkle can be made particularly useful for Wikidata RC patrolling. Those tools let us basically make ad-hoc revision-based assessments, but we rather need tools for a much more user-based patrolling process. In other words: the question usually is whether a given user generally makes good faith edits (and to a lesser degree has the skills to get it right), rather than whether a particular edit was made with good faith.
    Modifications in Wikidata are often composed of several “atomic” edits (i.e. a collection of edits that are usually close to a smallest possible increment); it is often not useful to look at individual edits (diffs) in particular, as only the overall picture tells us the full story. This is even more important in situations involving more than one item (sitelink moves, mergers, etc.).
    Another key feature for Wikidata patrolling are editorial filter options, so that the patroller can efficiently filter edits of a certain type of modification (e.g. limited to a specific language or property). There are some tools out there doing exactly this, but improvements are certainly possible. —MisterSynergy (talk) 09:20, 18 November 2020 (UTC)[reply]
  • Since more and more wikis rely on Wikidata, it's fundamental to protect the integrity of the information in it. --Andyrom75 (talk) 22:12, 20 November 2020 (UTC)[reply]
  • I know this proposal is vague but I think that is okay. More than any particular vandalism tool, we just need any vandalism tool so that we can encourage more discussion about what kind of vandalism Wikidata experiences and how we prepare a long term effort to counter it. This works as an open proposal - just start with the technical development of any vandalism tools on Wikidata. Blue Rasberry (talk) 18:17, 24 November 2020 (UTC)[reply]
  • Tools are needed, but rather than Twinkle/Huggle, you need an ORES AI built on the existing reversion dataset. High-confidence vandalism can get reverted by bot, with less confident vandalism sent to a response queue, with human intervention per editor rather than per edit, trying to pivot editors to productive editing. Good-faith mistaken edits should go to a coaching queue. You would need to train a task force of responders to act in a firm but fair way; edit-warring over individual edits is a failed model. The tools will not attract the vandal fighters; rather, the response team must be recruited to increase data quality. Slowking4 (talk) 02:13, 27 November 2020 (UTC)[reply]
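The triage flow described in the last comment could look roughly like this, using the "damaging" model that the ORES scoring service exposed for wikidatawiki at the time. The thresholds and queue names are invented for illustration, and the example works on a canned score in the documented ORES response shape rather than a live HTTP call.

```python
# Sketch of an ORES-based triage flow for Wikidata edits. The URL pattern
# follows the ORES v3 API; thresholds and queue names are assumptions.
ORES_URL = "https://ores.wikimedia.org/v3/scores/wikidatawiki/?models=damaging&revids={revid}"

def triage(damaging_probability: float) -> str:
    if damaging_probability > 0.9:
        return "bot-revert"      # high-confidence vandalism: revert by bot
    if damaging_probability > 0.5:
        return "response-queue"  # less confident: human, editor-centric review
    return "ok"                  # likely fine (or coaching for good-faith slips)

# Canned score mirroring the ORES JSON response shape, instead of a live call.
canned_score = {"damaging": {"score": {"probability": {"true": 0.93, "false": 0.07}}}}
p = canned_score["damaging"]["score"]["probability"]["true"]
print(triage(p))
```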

Voting

Access to Wikidata from external (Media)Wikis

  • Problem: Wikidata should be accessible from non-Wikimedia wikis as well.
German: It should be possible to access Wikidata's data from external wikis, too.
  • Who would benefit: MediaWikis hosted outside of WMF infrastructure could use data from Wikidata in an easy way, which could in turn encourage contributions to this project.
German: Many wikis in all languages outside the Wikimedia world would gain access to permanently up-to-date data.
  • Proposed solution: I think there is a need for a MediaWiki extension which could easily {{#invoke}} data from Wikidata in a WMF-external MediaWiki. Especially if an external wiki is already linked to Wikidata by an external-id property (e.g. P6228 or P8713), there should be a technical possibility to configure, in the external wiki, how the local wiki is linked to Wikidata: say, by page ID (as used with P6228) or page title (P8713), plus a specific property to look up the corresponding Wikidata item. This should happen in the background, so that an {{#invoke}} module for external MediaWikis could parse the specific values from all the given statements of the linked item.
  • More comments:
German: A bidirectional interface function for reducing redundant information and data to and from Wikipedia software solutions and regional wikis. --Kasa Fue (talk) 17:42, 25 November 2020 (UTC)[reply]
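One way the proposed background lookup could work is a SPARQL query keyed on the external-id property. The endpoint and property IDs come from the proposal; everything else below is an assumption about one possible implementation, and the sketch only builds the query and request URL rather than sending them.

```python
# Sketch: an external wiki resolving its local page to a Wikidata item via an
# external-id property (e.g. P6228, as named in the proposal). Builds the
# SPARQL query and WDQS request URL; no HTTP request is actually made.
from urllib.parse import urlencode

WDQS = "https://query.wikidata.org/sparql"

def lookup_query(external_id_property: str, local_value: str) -> str:
    # Find the item whose external-id statement matches this wiki's page.
    return (
        "SELECT ?item WHERE { "
        f'?item wdt:{external_id_property} "{local_value}" . '
        "} LIMIT 1"
    )

query = lookup_query("P6228", "12345")
request_url = WDQS + "?" + urlencode({"query": query, "format": "json"})
print(query)
```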

Discussion

  • Open citizen science with regiowikis and Wikidata depends highly on open data and its linkages! --Jeb (talk) 14:55, 27 November 2020 (UTC)[reply]
  • The advantages of using InstantCommons for the integration of external media suggest that an analogous "InstantWikidata" could bring similar advantages for data. --Erfurth (talk)

Voting

Allow access to some inverse statements through Lua and parser functions

  • Problem: Wikidata currently has more than 100 inverse properties, which are in some ways redundant: in theory, if two statements are exact inverses of each other, one should be unnecessary. It is possible to use WDQS (and user scripts, etc.) to find inverses of statements (those matching a given value and property), but this can't be done normally on Wikimedia wikis, which means that a lot of data is effectively impossible to use on wikis. Furthermore, property proposals are occasionally declined because the properties would be unnecessary for querying, even though they would be useful for infoboxes and other templates. Allowing inverses to be accessed would significantly reduce the complexity of using data from certain properties, and would reduce the maintenance burden for properties which currently have inverses.
  • Who would benefit: Template designers, Wikidata editors, Wikimedia readers (particularly on smaller wikis)
  • Proposed solution: Creation of a parser function, Lua function and/or similar in Wikibase to allow inverse statements to be accessed (with a hard limit on how many entities can be returned from one query), automatic display of inverses on Wikidata item pages and other entity pages (also with a hard limit), and some way of integrating the data that is currently stored using inverse label item (P7087). Alternatively, another approach to allowing the data to be surfaced could be taken if that would be more technically feasible.
  • More comments: See also Community Wishlist Survey 2019/Wikidata/Automatically add inverse properties, which targeted the same problem but suffered from the proposed solution's lack of clarity. It was within the top 10 until an objection was raised on technical grounds. Some databases, such as MusicBrainz, already display their "inverses" automatically, so it may be possible to use the approaches of such databases as inspiration or as a basis.
  • Phabricator tickets: phab:T209559
  • Proposer: Jc86035 (talk) 18:46, 20 November 2020 (UTC)[reply]
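For comparison, here is the WDQS query that currently substitutes for the proposed Lua/parser access: find every item pointing at a given item via a given property (e.g. everything whose part of (P361) value is some target item), with the hard result limit the proposal asks for. A Wikibase-side function could expose the same lookup; the sketch below only builds the query string and does not send it.

```python
# Sketch of the "inverse statement" lookup as a WDQS SPARQL query: which
# items have a statement <prop> pointing at <target_qid>? A hard LIMIT
# mirrors the proposal's suggested cap on returned entities.
def inverse_statements_query(prop: str, target_qid: str, limit: int = 50) -> str:
    return (
        "SELECT ?item WHERE { "
        f"?item wdt:{prop} wd:{target_qid} . "
        f"}} LIMIT {limit}"
    )

print(inverse_statements_query("P361", "Q46"))
```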

Discussion

  • Instead of the software enforcing data integrity by maintaining bidirectional relations and allowing navigation along them, this is left to the editors. As help provided by the software, endless lists of constraint violations and popup messages are created. Enough said? I more and more get the impression that Wikidata is not at all focused on manual editing. Strong support for any solution on bidirectional relations. --Herzi Pinki (talk) 19:32, 20 November 2020 (UTC)[reply]
  • This would be very helpful. It will help with making infoboxes with content from Wikidata, remove a reason to have redundant data on Wikidata, and help to get more consistent data. For some properties, the same statements would also be displayed and editable on the item pages of both the subject and the object. --Dipsacus fullonum (talk) 14:00, 21 November 2020 (UTC)[reply]
  • This would be good, but I am worried that these would probably be 'expensive' queries. Some sort of caching is probably needed. On the other hand, the inverse properties can be bot-maintained on Wikidata, and we could get a lot stricter with insisting that the inverse property always exists (e.g., with bots enforcing the inverse properties, or removing values that are missing the inverse, on a daily basis). Thanks. Mike Peel (talk) 20:23, 28 November 2020 (UTC)[reply]
    @Mike Peel: I think the latter would be the most tenable alternative, although it would obviously be more difficult to maintain, and I think it would be outside the scope of the wishlist survey. I don't think it would make sense to pursue it unless a more permanent solution that involves changes to Wikibase is definitively ruled out. Jc86035 (talk) 10:31, 29 November 2020 (UTC)[reply]
    Sorry, no, bot-maintained inverses are a BAD idea. A vandal adds a bad statement. Bot adds inverse. Vandal's statement is reverted. Bot re-adds the original as it's the inverse of the previously bot-added inverse. And other variations on the same problem. I know you're doing some of these already, but it should not be expanded unless all these complexities are clearly addressed. ArthurPSmith (talk) 15:02, 30 November 2020 (UTC)[reply]
    @ArthurPSmith: In those cases the editor needs to remove both of the inverse properties, not just one of them. It's a little bit more work but it's not that much. However, I'm all for a better solution here! Thanks. Mike Peel (talk) 16:50, 30 November 2020 (UTC)[reply]
    @Mike Peel: Realistically, would the queries being expensive actually be a significant problem for the parser functions? MediaWiki already has several expensive parser functions, which the proposed parser functions seem roughly comparable to, and it seems unlikely to me that a single infobox would ever need to have 100 "expensive" queries for inverse statements. Given w:WP:PERF and MediaWiki's already-aggressive page caching, I would hope that this wouldn't be a problem inherent to the proposal. Jc86035 (talk) 17:18, 30 November 2020 (UTC)[reply]
    I'm more worried about the CPU time it would take rather than whether it is classified as 'expensive' in mediawiki terms. The infoboxes on Commons already occasionally run into the 10s limit, and I know de.wikivoyage has similar issues. Thanks. Mike Peel (talk) 17:30, 30 November 2020 (UTC)[reply]
  • Yes, yes, yes. Completely agree with Jc86035 and Dipsacus fullonum. Lack of this feature is limiting the ability of Wikidata to change the world.--Vojtěch Dostál (talk) 20:24, 28 November 2020 (UTC)[reply]

Voting

  • Problem: red links cannot be linked to Wikidata, although in most cases a decent Wikidata item already exists
  • Who would benefit: future writers of the article, plus readers who hit the red link and can already get some info from Wikidata or other languages
  • Proposed solution: Find a format like [[:d:Q8493024:red link|Show this text]] to connect the items
  • More comments:
  • Phabricator tickets:
  • Proposer: Edoderoo (talk) 19:26, 20 November 2020 (UTC)[reply]
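The proposed markup could be recognised with a simple pattern. A minimal sketch in Python, assuming the exact `[[:d:Qnnn:target|label]]` form shown in the proposal (the syntax itself is only a suggestion, not an agreed format):

```python
import re

# Hypothetical pattern for the proposed red-link syntax, e.g.
# [[:d:Q8493024:red link|Show this text]]
RED_LINK = re.compile(r"\[\[:d:(Q\d+):([^|\]]+)\|([^\]]+)\]\]")

def parse_red_link(wikitext):
    """Extract (item id, red-link target, display text) from the proposed
    markup, or return None when the text contains no such link."""
    m = RED_LINK.search(wikitext)
    if m is None:
        return None
    return m.group(1), m.group(2), m.group(3)
```

For example, `parse_red_link("[[:d:Q8493024:red link|Show this text]]")` yields `("Q8493024", "red link", "Show this text")`, which a wiki could render as a red link carrying a pointer to the item.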

Discussion

  • I see the point in this proposal but a reader of say a Wikipedia shouldn't come to Wikidata by following a normal looking link without warning, so the link should somehow be marked as a Wikidata link. Also there must be a way to convert the links to Wikidata to ordinary links once a local article on the link topic is created. That can maybe be a bot job. --Dipsacus fullonum (talk) 13:37, 21 November 2020 (UTC)[reply]
  • @Edoderoo: I don't think the proposal is clear enough. w:Template:Interlanguage link already exists (and is available on multiple wikis), and its functionality seems to overlap substantially with what you've described. Are you proposing functionality that that template already has, or are you proposing that similar functionality should be implemented in MediaWiki or in an extension? Jc86035 (talk) 17:26, 21 November 2020 (UTC)[reply]
This template is an en-wiki thing. I want to link red links in any language, without editing on en-wiki. There is really no overlap at all. Edoderoo (talk) 17:57, 21 November 2020 (UTC)[reply]
The template is on en-wiki, not Wikidata. I also thought about something that finds red links via Wikidata and deletes them (or maybe writes all the red links to a special page so users can handle them manually themselves). I'll support any solution to this problem. Euro know (talk) 12:26, 22 November 2020 (UTC)[reply]

Voting

Wikidata edit content storage method

  • Problem: Wikidata currently saves each small change as a separate edit. Because of this, the number of edits keeps growing and it is difficult to keep track of the edit history.
  • Who would benefit: all users
  • Proposed solution: I suggest changing to a system that saves several related changes together as one edit. Editing the label, description, and "Also known as" fields would be treated as one edit even if multiple places are changed. Likewise, adding multiple values to one property would be one edit. These are two examples; I suggest grouping Wikidata edits together in other cases as well.
  • More comments:
  • Phabricator tickets:
  • Proposer: Mario1257 (talk) 16:22, 30 November 2020 (UTC)[reply]
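The grouping rule the proposal describes could be approximated client-side when rendering a history. A minimal sketch, with an assumed edit record shape (`user`, `field`, `ts`) and an assumed rule that consecutive term edits by the same user within a time window form one logical edit; none of this reflects the real Wikibase schema:

```python
def coalesce_edits(edits, window=60):
    """Group consecutive edits by the same user into one logical edit when
    they touch the terms (label/description/aliases) and fall within
    `window` seconds of each other. `edits` must be ordered by timestamp.
    Purely illustrative: field names and the rule are assumptions."""
    TERMS = {"label", "description", "alias"}
    groups = []
    for edit in edits:
        if (groups
                and edit["user"] == groups[-1][-1]["user"]
                and edit["field"] in TERMS
                and groups[-1][-1]["field"] in TERMS
                and edit["ts"] - groups[-1][-1]["ts"] <= window):
            groups[-1].append(edit)  # same logical edit as the previous one
        else:
            groups.append([edit])    # start a new logical edit
    return groups
```

Grouping only in the display layer, as Silver hr suggests below, keeps the underlying data at full granularity while making the history easier to navigate.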

Discussion

  • As a general principle, data should be recorded at the smallest possible/practical level of granularity. Displaying data can then be done in various ways, including in an aggregated or filtered form. So, I'm inclined to think that perhaps the edit history should be reworked to be smarter about showing edits so that navigation is easier. Silver hr (talk) 03:28, 1 December 2020 (UTC)[reply]
  • Mario1257, can you clarify this wish for us? Are you referring to changing and simplifying the user experience, or no? Please get back to us by December 7th. Thanks! — The preceding unsigned comment was added by IFried (WMF) (talk)
    Re-ping Mario1257 since the first one didn't go through. Sorry about that! We have about 24 hours left before voting. MusikAnimal (WMF) (talk) 17:29, 7 December 2020 (UTC)[reply]

Voting

Show the ORES score of the linked Wikipedia page

  • Problem: The ORES quality score of a Wikipedia page is not visible, nor available in Wikidata or the Wikidata Query Service.
  • Who would benefit: Moderators, editors, volunteers, spam fighters, all people maintaining or improving Wikipedia pages, all readers
  • Proposed solution: Show the ORES score in the Wikidata list of Wikipedia pages. The reader could immediately get an impression of the quality of the article in the different languages. The value could also be made available via the Wikidata Query Service. A query could then be run for good/bad/mediocre articles related to a certain domain, instance, or subclass, to identify articles needing improvement. That list could then be used by any tool that works with item Q-numbers.
  • More comments: See also:
  • Phabricator tickets:
  • Proposer: Geertivp (talk) 13:27, 21 November 2020 (UTC)[reply]
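ORES article-quality models return a probability per quality class for a revision; the displayed "score" is usually the most probable class, or a single weighted number useful for sorting query results. A minimal sketch, using enwiki's wp10 class names (Stub through FA) and treating the probability-mapping shape as an assumption about the API response:

```python
# enwiki wp10 article-quality classes, worst to best
ORDER = ["Stub", "Start", "C", "B", "GA", "FA"]

def predicted_quality(probabilities):
    """Return the most probable quality class from an ORES-style
    mapping, e.g. {"Stub": 0.7, "Start": 0.2, ...}."""
    return max(probabilities, key=probabilities.get)

def weighted_score(probabilities):
    """Collapse class probabilities into a single 0..5 number, handy for
    sorting good/bad/mediocre articles in a query result."""
    return sum(ORDER.index(c) * p for c, p in probabilities.items())
```

A value like `weighted_score` is the kind of single number that could be exposed through the Wikidata Query Service and sorted on.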

Discussion

Voting

Display reference in edit summary when a reference is added

  • Problem: The edit summary for this diff does display that a reference was added, but not which reference it is. References can be unreliable, spam links etc. so having them be easy to monitor is desirable.
  • Who would benefit: People who patrol Wikidata items for problematic edits, since the content of the diff is immediately displayed.
  • Proposed solution: Add the content of the reference to the edit summary; in this case it would be "Added reference (imported from:English Wikipedia) to claim: mountain range (P4552): Andes (Q5456)"
  • More comments: I hope that this isn't too late, it's a repost of two similar requests in the preceding years. This feature would be useful as well if it displayed in crosswiki watchlists. There may be length issues when the reference is long.
  • Phabricator tickets:
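Rendering a reference into a summary is mostly string formatting plus truncation for the length issue noted above. A minimal sketch, where the input shape (property/value string pairs) and the output wording are illustrative assumptions, not the real Wikibase summary format:

```python
def reference_summary(ref_parts, limit=100):
    """Render reference property/value pairs into a short edit-summary
    fragment, truncating so long references stay within summary limits.
    Illustrative only: the real format is up to Wikibase."""
    text = ", ".join(f"{prop}: {value}" for prop, value in ref_parts)
    if len(text) > limit:
        text = text[: limit - 1] + "…"
    return f"Added reference ({text})"
```

For the example in the proposal, `reference_summary([("imported from", "English Wikipedia")])` produces a summary fragment naming the actual reference rather than just the fact that one was added.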

Discussion

Voting

Creating new objects or connecting pages to existing objects while avoiding duplicates

  • Problem: Connecting newly created articles to existing objects, or creating new objects for unconnected pages (when, how, by whom, ...), for hundreds of newly created articles per day in different language versions, and avoiding duplicates among the currently 90 million objects, has been discussed for years again and again without a real solution, for example at d:Wikidata:Requests for permissions/Bot/RegularBot 2
  • Who would benefit: everyone, through improved data quality, i.e. fewer duplicates
  • Proposed solution:

At d:Wikidata:Contact_the_development_team/Archive/2020/09#Connecting_newly_created_articles_to_existing_objects_resp._creating_new_object_-_additional_step_when_creating_articles,_categories,_etc. a possible solution has been discussed:

An additional step after saving a newly created article etc.: present the user with a list of possibly matching Wikidata objects (e.g. a list of persons with the same name; this could use an algorithm similar to the duplicate check / suggestion list in PetScan, duplicity example), or the option to create a new object if none matches (depending on the type of object, some values could already be pre-filled from the article, e.g. from categories or infoboxes). From my point of view, one current problem is that many creators of articles, categories, navigational items, templates, disambiguations, lists, commonscats, etc. are either not aware of the existence of Wikidata, or forget to connect a newly created page to an existing object or to create a new one. If the connection is instead made later by a bot, this can lead to (more) duplicates, which then have to be merged manually.

In addition, there could be specialized bots (depending on the type of the objects, e.g. one bot for humans, one for films, one for buildings, etc.) which check for various IDs (like GND, VIAF, LCCN, IMDb, ...) and create new items or connect matching items based on those IDs, in order to avoid creating duplicates.

Also, if someone uses the translation function to create a translated article in another language version, the new article could be connected automatically to the object of the original article. At the moment, after a version import (following a translation), the link to the Wikidata object is often lost and the article has to be reconnected manually a second time.
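The ID-based duplicate check described above boils down to a lookup: extract the external identifiers found in the new page and look each one up in an index of existing items. A minimal sketch, assuming such an index (built from a dump or query) keyed on (property, value) pairs; property names and the sample values are made up for illustration:

```python
def find_existing_item(new_ids, index):
    """Given the external IDs found in a new article (e.g. from authority
    control templates) as a mapping like {"VIAF": "..."} and an index
    mapping (property, id) -> QID, return the QID of a matching existing
    item, or None if a new item should be created.
    Property labels here stand in for real Wikidata property IDs."""
    for key in new_ids.items():  # each key is a (property, value) pair
        qid = index.get(key)
        if qid is not None:
            return qid
    return None
```

A per-type bot (humans, films, buildings, ...) would differ mainly in which properties it indexes; the matching step stays the same.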

Discussion

  • Just to give the scope of the problem if nothing is done: I needed several months to integrate 2,000 items without P31/P279 that had accumulated in biochemistry from freshly created en-WP articles. Wikipedia editors are left alone with the task of creating WD items for their articles. I am now facing 1,000 more items from de-WP articles. Any guidance that can be given to WP editors will be helpful. --SCIdude (talk) 08:05, 17 November 2020 (UTC)[reply]
  • We already have constraints that flag items with non-unique IDs. I don't think that those cases should be automatically resolved by bots and in any case bots are created by the community and don't need to be created by the community wishlist team.
I don't think the way to make Wikidata more popular is to force Wikipedia editors to do the work of interfacing with Wikidata when they create new articles. I don't think either dewiki or enwiki would activate such a feature if it were available to them. ChristianKl 15:33, 17 November 2020 (UTC)[reply]
On the contrary, a whole heck of a lot of the opposition at en.WP is precisely because the integration is so loose. Integration was rolled out before support for client watchlists was provided (and then, when the amount of changes proved too much, the devs scaled that back). --Izno (talk) 18:14, 17 November 2020 (UTC)[reply]
  • This is problematic: how can the system say "this article should be connected with this item"? An easier solution would be: "Hey, on Wikidata there is an item with the same label as your article and without a sitelink to this wiki. Is it the same?" And this system would need some modifications, e.g. for Wikisource, where "The Book" is usually not the same thing as "The Book" on Wikipedia, but only an edition. JAn Dudík (talk) 15:11, 18 November 2020 (UTC)[reply]

Voting

New batch edit API

Discussion

  • @GZWDer: Could you elaborate on the issue? What is the problem with 1000 WDQS updates? How should the "batch edit API" work? Are you proposing, for example, to make the same change to 1000 items at the same time? Apologies for my ignorance, I admit I'm not the most Wikidata-savvy. I'm hoping a little clarification will benefit voters, too, and not just me :) Thanks, MusikAnimal (WMF) (talk) 22:36, 2 December 2020 (UTC)[reply]
    • WDQS currently reloads an item when it is edited. So when an item is edited 100 times, it must be reloaded 100 times. Moreover, many small reloads are more expensive than one large one.--GZWDer (talk) 12:35, 3 December 2020 (UTC)[reply]
  • I have a plan to import more than 200 million items (scientific articles). However, at one edit every second we would currently need roughly seven years to complete this.--GZWDer (talk) 21:25, 9 December 2020 (UTC)[reply]
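The time estimate above is simple arithmetic: at exactly one edit per second, 200 million item creations take about 6.3 years of wall-clock time, consistent with the rough "seven years" figure, and a batch API that raised effective throughput would cut this proportionally. A small sketch of the calculation:

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365.25

def years_to_import(n_items, edits_per_second=1.0):
    """Wall-clock years needed to create n_items at a fixed edit rate."""
    return n_items / edits_per_second / SECONDS_PER_YEAR
```

For instance, `years_to_import(200_000_000)` is about 6.3, while a hypothetical batched rate of 50 items per second brings the same import under seven weeks.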

Voting

  • Problem:

Wikidata Query Service result links to WMF wiki pages are rendered as urlencoded URLs. The urlencoded page titles aren't human-readable. Full URLs also consume so much screen space that rendering commonly falls back to single-column mode, so the result is not rendered as a table.

  1. Example of problematic result: https://w.wiki/jgK
  2. Example url value: <https://fi.wikipedia.org/wiki/Min%C3%A4_ja_Morrison>
  • Who would benefit:

All users who are querying links to Wikimedia wiki pages using Wikidata Query Service.

  • Proposed solution:

Add a SPARQL prefix for WMF wikis OR change the HTML rendering so that the rendered link looks like <wp:fi:Minä_ja_Morrison> in the result view.

Example result: https://w.wiki/jyS
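The unreadable titles come straight from percent-encoding, so on the client side they can already be recovered with standard library calls; a minimal sketch (the `wp:lang:Title` output form just mirrors the proposal's example, independent of whatever fix WDQS itself adopts):

```python
from urllib.parse import unquote, urlsplit

def readable_title(url):
    """Turn a Wikimedia page URL into a compact human-readable
    'wp:lang:Title' form similar to the one proposed above."""
    parts = urlsplit(url)
    lang = parts.hostname.split(".")[0]          # e.g. "fi" from fi.wikipedia.org
    title = unquote(parts.path[len("/wiki/"):])  # decode percent-encoding
    return f"wp:{lang}:{title}"
```

Applied to the example value above, `readable_title("https://fi.wikipedia.org/wiki/Min%C3%A4_ja_Morrison")` gives `wp:fi:Minä_ja_Morrison`.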

Discussion

Voting