@GZWDer: Could you elaborate on the issue? What is the problem with 1000 WDQS updates? How should the "batch edit API" work? Are you proposing, for example, to make the same change to 1000 items at the same time? Apologies for my ignorance, I admit I'm not the most Wikidata-savvy. I'm hoping a little clarification will benefit voters, too, and not just me :) Thanks, MusikAnimal (WMF) (talk) 22:36, 2 December 2020 (UTC)
WDQS currently reloads an item whenever it is edited, so an item that is edited 100 times must be reloaded 100 times. Moreover, many small reloads are more expensive than one large one.--GZWDer (talk) 12:35, 3 December 2020 (UTC)
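(For illustration, a minimal sketch of the workaround available today, assuming a logged-in requests.Session and a CSRF token: bundling many statement additions into a single wbeditentity edit means WDQS reloads the item once, rather than once per statement. The helper name and the string datatype below are placeholders, not part of any proposed batch API.)

    import json
    import requests

    API = "https://www.wikidata.org/w/api.php"

    def add_statements_in_one_edit(session, csrf_token, item_id, property_id, values):
        # Build every claim locally, then write them all with one wbeditentity
        # call: one edit, one revision, and therefore one WDQS reload.
        claims = [
            {
                "mainsnak": {
                    "snaktype": "value",
                    "property": property_id,
                    "datavalue": {"value": v, "type": "string"},  # assumes a string datatype
                },
                "type": "statement",
                "rank": "normal",
            }
            for v in values
        ]
        return session.post(API, data={
            "action": "wbeditentity",
            "id": item_id,
            "data": json.dumps({"claims": claims}),
            "summary": "Add several statements in a single edit",
            "token": csrf_token,
            "format": "json",
        }).json()

A single call like add_statements_in_one_edit(session, token, "Q42", "P356", dois) would replace dozens of separate wbcreateclaim edits, and dozens of separate reloads.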
I have a plan to import more than 200 million items (scientific articles). However, at one edit per second we would currently need about seven years to complete this.--GZWDer (talk) 21:25, 9 December 2020 (UTC)
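(For reference, the back-of-the-envelope arithmetic behind that estimate, assuming exactly one creation edit per item and no overhead:)

    SECONDS_PER_YEAR = 365 * 24 * 3600    # 31,536,000
    items = 200_000_000                   # one creation edit per item
    print(items / SECONDS_PER_YEAR)       # ~6.34 years at 1 edit per second

With "more than" 200 million items, plus retries and rate-limit backoffs, roughly seven years is a fair estimate.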
Support QuickStatements is slow and tends to fail randomly. The current mass-editing experience is miserable, and developers implicitly penalize active users by introducing more and more rate limits. Come on, since 2013 I have not seen a single improvement in this direction, only restrictions. And I perfectly understand that a "batch edit API" is not just "yet another API on top of existing primitives". This should be a targeted redesign of the editing pipeline, not necessarily starting with the "bulk" part; it could start, for example, with the non-bulk operations commonly used in QuickStatements, like "atomically insert a statement with its references and qualifiers, or update the existing statement, if any". That's my only vote. The other proposals are definitely cool, but they would kill Wikidata as the user base grows, since more users will just hammer the existing APIs until denial of service. Lockal (talk) 16:29, 16 December 2020 (UTC)
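(For illustration only: a purely hypothetical sketch of what such a non-bulk "upsert" primitive could look like. The action name wbupsertclaim and its parameters are invented here and do not exist in the Wikibase API; today the same result takes several round trips through real modules such as wbcreateclaim, wbsetqualifier and wbsetreference, which can fail part-way through.)

    # Hypothetical request shape -- nothing below is an existing API.
    upsert_request = {
        "action": "wbupsertclaim",                         # invented action name
        "entity": "Q42",                                   # example target item
        "property": "P356",                                # DOI (example property)
        "value": "10.1000/example-doi",                    # match key: property + value
        "qualifiers": {"P813": "+2020-12-16T00:00:00Z"},   # e.g. "retrieved"
        "references": [{"P248": "Q12345"}],                # "stated in": placeholder item
        "on_conflict": "update",                           # invented flag: update vs. skip
    }

The point is the semantics: match an existing statement by property and value, update its qualifiers and references if it exists, otherwise create the whole statement, all in a single server-side edit.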