IRC office hours/Office hours 2014-08-19
Google Summer of Code 2014 and FOSS Outreach Program for Women/Round 8 wrap-up meeting hosted on 2014-08-19 in #wikimedia-officeconnect.
[14:59:54] <qgil> Alright, let's get started, changing topic etc
[14:59:54] <guillom> Better!
[14:59:55] <bawolff> Nemo_bis: Well responding to Google calendars is sometimes hard: "Google Calendar invitations cannot be forwarded via email. This event belongs to bawolff+wn@gmail.com and you are logged in as bawolff@gmail.com"
[15:00:37] <hexmode> bawolff: isn't gmail great?
[15:00:43] <bawolff> I know
[15:00:50] <kart_> Hello World.
[15:00:57] <bawolff> Its also a pita for hangout invites
[15:01:27] <qgil> Well, let's see. This meeting is absolutely experimental. We aim to showcase more than 20 projects on IRC.
[15:02:02] <qgil> Just for fun, if you are a GSoC or OPW intern, please raise your hand
[15:02:11] <qgil> (mentors will come next)
[15:02:15] * wctaiwan does so
[15:02:17] * mvolz raises hand
[15:02:31] * MatmaRex hangs around
[15:02:31] * jlschatz_intern raises hand
[15:02:33] <rohit-dua> \me raises hand
[15:02:35] <andrewbogott> o/
[15:02:39] <rohit-dua> o/
[15:02:40] <kunalg> o/
[15:02:41] <andrewbogott> oops, not an intern
[15:02:42] <sandaru> o/
[15:02:43] <qgil> andrewbogott, booo
[15:02:46] <qgil> :)
[15:02:46] * tonythomas o/
[15:03:16] <aaron_xiao> o/
[15:03:18] <ashley> rar
[15:03:24] <fhocutt> o/
[15:03:32] <kondi> o/
[15:03:33] <hardikj> o/
[15:04:05] <discoveranjali> o/
[15:04:35] <qgil> Ok, we have quorum already now. While new interns can keep raising hands, what about mentors, please raise your hands as well
[15:04:41] * siebrand raises his hand.
[15:04:44] <divec_> ✋
[15:04:46] <subbu> o/
[15:05:00] <bd808> \o
[15:05:03] <prtksxna> ヾ(^-^)ノ
[15:05:22] <aharoni> hola
[15:05:26] <aharoni> I'm trying to get Vikas.
[15:05:49] <kart_> \0
[15:06:17] * Nemo_bis raises hand for https://www.mediawiki.org/wiki/Extension:Translate/Mass_migration_tools
[15:06:28] <Raylton> o/
[15:06:29] <Jeff_Green> o/
[15:06:34] * andrewbogott 's hand still up, blank look on face
[15:06:50] <prtksxna> ヘ( ^o^)ノ\(^_^ ) andrewbogott
[15:06:50] <ali_king_intern> o/
[15:07:05] <qgil> ok ok, looking good (please keep raising your hands while I basically copy & paste the invitation you have probably read already)
[15:07:16] <qgil> This is how the showcase will work:
[15:07:23] <qgil> * We will start from East to West, based on the location of the intern.
[15:07:23] <qgil> First stop: Taipei.
[15:07:32] <qgil> * Each intern (or mentor if the intern can't make it) will point to the
[15:07:32] <qgil> project page and a demo (if possible), and will add a short comment written
[15:07:33] <qgil> in advance. We will leave a bit of time for comments questions, and then we
[15:07:33] <qgil> will move on.
[15:07:42] <qgil> open discussion will follow, if time permits.
[15:08:08] <qgil> (We could have used MeetBot but I decided not to complicate more things)
[15:08:18] <aharoni> o/
[15:08:29] <qgil> So let's get started:
[15:08:36] <tonythomas> yay !
[15:08:42] <qgil> ***** Wen-Chien Chen, Taipei, Taiwan
[15:09:03] <wctaiwan> hi
[15:09:03] * hexmode raises hand
[15:09:03] <wctaiwan> I'll just paste the stuff I wrote, I guess >.>
[15:09:03] <wctaiwan> For GSoC I worked on adding a ContentHandler backend to MassMessage for storing and managing delivery lists. In a nutshell, it changes MassMessage so that users can create and manage lists of pages to deliver to using a GUI that has validation and autocomplete, and the lists are stored internally in JSON.
[15:09:09] <wctaiwan> The project page is at https://www.mediawiki.org/wiki/Extension:MassMessage/Page_input_list_improvements and there’s a test wiki with the new code at http://mm-ch.wmflabs.org/
[15:09:20] <wctaiwan> The code hasn’t been deployed yet; but it’s basically ready and mostly waiting for database changes on WMF wikis (which are in progress) since the database doesn’t fully support ContentHandler features right now.
[15:09:42] <wctaiwan> (that's it)
[15:09:54] <wctaiwan> I'm here if anyone has any questions / comments :)
[15:09:55] <guillom> If I may, I just want to say how excited I am about this, as a regular user of MassMessage :) Thank you!
[15:10:03] <wctaiwan> \o/
[15:10:16] <qgil> questions / comments for wctaiwan ?
[15:10:36] <guillom> wctaiwan: Once your work is deployed, how difficult would it be to add that GUI?
[15:10:47] <wctaiwan> er, it's basically a new special page
[15:10:54] <guillom> Or is that already implemented?
[15:11:01] <wctaiwan> so once it's deployed, you go to Special:CreateMassMessageList and go from there
[15:11:15] <guillom> Oh.
[15:11:20] * guillom goes to test the demo :)
[15:11:26] <wctaiwan> http://mm-ch.wmflabs.org/wiki/Test!!! is what a delivery list looks like
[15:11:41] <qgil> I'm very happy to see code deployed by the end of GSoC / OPW projects!
[15:11:55] <wctaiwan> it hasn't yet :P but it should be doable soon-ish
[15:12:01] <guillom> Awesome :) Thanks again!
[15:12:05] <wctaiwan> :D
[15:12:08] <qgil> wctaiwan, biggest surprise or difficulty?
[15:12:27] <wctaiwan> there was a test that was passing locally but failing on jenkins
[15:12:43] <wctaiwan> took legoktm and myself a long time to figure that out >.> (it was a caching issue)
[15:13:36] <wctaiwan> it all went relatively well; really. I thought the project was relatively simple, but it seemed to be able to take up the entire coding period all the same. :P
[15:13:37] <qgil> wctaiwan, it sounds that your project turned out quite well. A good start for this meeting!
[15:13:43] <wctaiwan> yeah :)
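A rough sketch of the idea behind wctaiwan's project: delivery lists become structured JSON (stored via ContentHandler) rather than free-form wikitext, which is what lets the new special page validate and autocomplete entries. The schema below is illustrative, not the exact format MassMessage uses.

```python
import json

# Illustrative delivery list, stored as JSON instead of a free-form page.
delivery_list = {
    "description": "Tech news subscribers",
    "targets": [
        {"title": "User talk:Example"},
        {"title": "Project:Village pump"},
    ],
}

def valid_targets(data):
    """Every target must be an object with a non-empty 'title'."""
    return all(
        isinstance(t, dict) and t.get("title", "").strip()
        for t in data.get("targets", [])
    )

# Round-trip through JSON, as a wiki page store would, then validate.
print(valid_targets(json.loads(json.dumps(delivery_list))))  # → True
```

Because the data is structured, bad entries can be rejected before saving instead of failing silently at delivery time.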
[15:13:56] <qgil> Let's start traveling West
[15:14:00] <Raylton> -log
[15:14:03] <qgil> in order to have time for everybody
[15:14:10] <qgil> ***** Aaron Xiao, Beijing, China
[15:14:18] <aaron_xiao> o/
[15:14:22] <aaron_xiao> Project page: https://www.mediawiki.org/wiki/Extension:UniversalLanguageSelector/Fonts_for_Chinese_wikis
[15:14:31] <aaron_xiao> Chinese Hanzi has too many glyphs; even the simplest font takes several MB, which is unacceptable for a web page. So I worked on the FontTailor project.
[15:14:38] <aaron_xiao> As a demo: WenQuanYiMicroHei, a font for Hanzi, takes about 4.5MB. Now the server tailors the font for every wiki page to contain only the characters used.
[15:14:45] <aaron_xiao> Please visit http://fonttailor.wmflabs.org/ . With a debug tool you can see only 20KB is downloaded. ( You can also click on the red tofu to see the new tofu-detection feature, which is more accurate by comparing pixels. )
[15:14:54] <aaron_xiao> Another example is at http://fonttailor.wmflabs.org/index.php/Test which has more characters. About 40KB is downloaded.
[15:15:06] <aaron_xiao> (that’s all)
[15:15:37] * siebrand cheers aaron_xiao on!
[15:16:13] <kart_> aaron_xiao: +1. Wonderful and needed stuff :)
[15:16:24] <guillom> Looks great!
[15:16:27] <aaron_xiao> Thanks!
[15:16:27] <qgil> Impressive
[15:16:42] <siebrand> aaron_xiao: the second line on http://fonttailor.wmflabs.org/index.php/Main_Page has better readability. Feels a bit "heavier".
[15:16:58] <guillom> Yay for better language support while staying bandwidth-conscious.
[15:16:59] <aaron_xiao> Yes, it's another font
[15:17:04] <subbu> sounds nifty. one qn. are there scenarios where client-side-js might load additional content that will be affected by this?
[15:17:07] <qgil> Also a very good combination for GSoC (quite global, usually) with a local impact (a Chinese student working on Chinese font support). This is great.
[15:17:22] <aaron_xiao> If the font tailor works, it'll look better, just like your feeling
[15:17:35] <divec_> I'd also like to point out that this is a general problem for all websites in Chinese (not just Wikipedia)
[15:18:04] <aaron_xiao> subbu: no
[15:18:04] <divec_> People generally know a few thousand characters ... but not the same few thousand ... there's often some placename or colloquialism that won't render
[15:18:42] <divec_> So this is really groundbreaking work!
[15:18:46] <wctaiwan> yeah, I wonder how well this would generalise. Up until now I've thought webfonts in Chinese would be infeasible, but something like this would allow e.g. the Chinese Wikipedia to switch to non-standard fonts.
[15:19:05] <wctaiwan> (that is, not relying on system fallbacks, which afaik is what it does right now)
[15:19:28] <subbu> aaron_xiao, can you say a bit more? is it because client-side js in mediawiki is restricted?
[15:20:04] <aaron_xiao> Do you mean tailor the font on client?
[15:20:21] <aaron_xiao> I'm a little confused
[15:20:47] <wctaiwan> aaron_xiao: I think subbu means that if AJAX loads additional content from the server, will this "just work"?
[15:20:59] <wctaiwan> (something that isn't in the original HTML served to the client)
[15:21:17] <qgil> Please take it to wikitech-l or other venue, we don't have time to get into much details
[15:21:23] <subbu> ok, for later :)
[15:21:25] <qgil> sadly
[15:21:31] <aaron_xiao> ok
[15:21:43] <qgil> Now entering the Indian subcontinent + Sri Lanka
[15:21:43] <subbu> thanks aaron_xiao. very neat.
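The core trick aaron_xiao describes can be sketched in a few lines: collect the set of characters a page actually uses, then subset the multi-megabyte font to just those glyphs server-side (a real implementation might use a subsetting library such as fontTools; only the character-collection step is shown here, as an illustration).

```python
def used_characters(page_text):
    """Characters the tailored font must cover, sorted and de-duplicated."""
    return "".join(sorted(set(page_text)))

# A full Hanzi font such as WenQuanYiMicroHei covers tens of thousands of
# glyphs (~4.5MB); a single page typically needs only a tiny fraction.
page = "维基百科，自由的百科全书"
charset = used_characters(page)
print(len(charset), "distinct characters to subset")
```

The server would then serve a webfont containing only those glyphs, which is why the demo pages download ~20-40KB instead of 4.5MB.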
[15:21:56] <qgil> Aditya Chaturvedi, Allahabad, India
[15:22:03] <qgil> ***** Aditya Chaturvedi, Allahabad, India
[15:22:09] <qgil> (keeping the format)
[15:22:12] <apexkid1> yup
[15:22:35] <apexkid1> Project: https://www.mediawiki.org/wiki/Catalogue_for_MediaWiki_extensions
[15:22:57] <apexkid1> Summary:
[15:22:59] <apexkid1> Extensions form a core component of MediaWiki, allowing it to be enhanced with additional functionality. There are currently about 2000 extensions available on MediaWiki.org. However, for wiki users, it is hard to identify and assess which extension fits a particular need
[15:23:06] <apexkid1> A third-party site, WikiApiary, maintains data on extensions in a structured manner and tracks MediaWiki use in the wild. The idea is to make this process more intuitive and ensure a good browsing experience by enhancing WikiApiary to accommodate more information about extensions in a structured manner, integrating some form of user feedback system, and improving the overall user interface. Along
[15:23:08] <apexkid1> with this, the plan is that the extended information gained on WikiApiary will be syndicated back to MediaWiki.org
[15:23:17] <apexkid1> Demo:
[15:23:18] <apexkid1> User ratings in action: https://wikiapiary.com/wiki/Extension:ConfirmEdit
[15:23:20] <apexkid1> Pushing ratings (and other data later) on mediawiki: https://www.mediawiki.org/w/index.php?title=Extension:ConfirmEdit&diff=prev&oldid=1099093
[15:23:34] <apexkid1> Work to be done:
[15:23:35] <apexkid1> https://www.mediawiki.org/wiki/Thread:Template_talk:Extension/Modifying_the_template_to_allow_User_Ratings
[15:23:41] <apexkid1> 3rd party Partnership: WikiApiary ( www.wikiapiary.com )
[15:23:47] <apexkid1> end*
[15:24:10] <qgil> hi apexkid1 , let me say that I have an email half-written that I will send to wikitech-l
[15:24:17] <qgil> in response to the thread you started
[15:24:24] <qgil> there, basically I say two things:
[15:24:34] <qgil> one, thank you very much! this looks very good
[15:24:55] <siebrand> apexkid1: Nice. I see that the rating is not yet in the extension template, as it's not exposed on https://www.mediawiki.org/wiki/Extension:ConfirmEdit. You may want to (ask someone to) do that.
[15:25:03] <qgil> two, don't worry about integration with mediawiki.org, we knew that deploying anything on Wikimedia servers would be more complicated, as you are seeing
[15:25:24] <siebrand> Oh, you linked that. Sorry.
[15:25:28] <qgil> (and three, sorry for the not very welcoming replies you received from veteran and actually quite smart contributors) :(
[15:25:47] <Yaron> apexkid1: are there any technical obstacles left, or is it just "political" (people disagreeing with the addition)?
[15:26:16] <qgil> so yay for new contributors being brave and engaging in discussion with the established community -- all we were new once and we know it's not easy
[15:26:21] <apexkid1> qgil: It's always a pleasure to work with quite smart contributors :)
[15:27:09] <qgil> Yaron, "political", there is an ongoing thread in wikitech-l
[15:27:14] <apexkid1> Yaron: Technically I do not think anything is missing. The only thing the community has to agree on is the correct requirement analysis.
[15:27:16] <qgil> ok, let's move on
[15:27:21] <Yaron> I'm aware of the thread, yes.
[15:27:29] <qgil> Anjali Sharma, Allahabad, Uttar Pradesh, India
[15:27:32] <qgil> ***** Anjali Sharma, Allahabad, Uttar Pradesh, India
[15:27:38] <discoveranjali_> o/
[15:27:44] <apexkid1> Thanks everyone!
[15:27:49] <discoveranjali_> * All relevant links to work available at: http://www.mediawiki.org/wiki/User:Discoveranjali/OPW_Progress_Report
[15:28:03] <discoveranjali_> Mine was a documentation internship, part of OPW. I worked on plans and implementations to increase the social media profile of Wikidata. It includes: * Development of a social media message calendar * Headers for social media profiles (that are already up) * A slideshow tutorial to help newbies get introduced to Wikidata * Edits to current documentation and plans for interactive online discussions
[15:28:24] * subbu didn't know that mw had 2000 extensions available .. mindboggling
[15:28:47] <siebrand> subbu: I think that may actually be only the tip of the iceberg.
[15:28:59] <Nemo_bis> subbu: more like 4000
[15:29:00] <qgil> (please let discoveranjali_ present)
[15:29:16] * Nemo_bis is listening
[15:29:27] <discoveranjali_> thanks :)
[15:29:50] <kim_bruning> wow, there's so many SoC coders!
[15:30:02] <qgil> discoveranjali_, that's it, or are you sharing links etc?
[15:30:19] <discoveranjali_> the work was quite distributed
[15:30:28] <discoveranjali_> so all links are in the above link
[15:30:45] <qgil> ok, did you deliver everything you planned?
[15:31:11] <discoveranjali_> Not completely....
[15:31:37] <discoveranjali_> just had one more plan to create write-up for a Wikidata app
[15:31:46] <qgil> That is not a problem per se, did you have a chance to fine tune plans with your mentors?
[15:32:10] <discoveranjali_> Yeah, she directed me to do more important things first
[15:32:18] <qgil> good!
[15:32:23] <mvolz> My favourite wikidata items were Q42 and Q1337. someone was clever when they were loading the db :)
[15:32:26] <discoveranjali_> and which needed to be delivered soon
[15:32:48] <discoveranjali_> mvolz :D
[15:33:00] <bawolff> lol
[15:33:10] <qgil> thank you discoveranjali_ , so you have been delivering items that the Wikidata team is using or about to use now?
[15:33:36] <discoveranjali_> my work was mostly on social media front
[15:33:58] <discoveranjali_> so that initiated tasks are ongoing
[15:34:09] <discoveranjali_> like weekly discussions, regular posts
[15:34:20] <qgil> good, thank you. ok, let's continue
[15:34:27] <discoveranjali_> updated headers etc.
[15:34:33] <qgil> ***** Jatin Mehta, Allahabad, India
[15:34:40] <jmehta> Hi
[15:34:52] <jmehta> Project Link: https://www.mediawiki.org/wiki/Extension:Semantic_Forms/Select2_for_autocompletion
[15:34:53] <jmehta> Progress Report: https://www.mediawiki.org/wiki/Extension:Semantic_Forms/Select2_for_autocompletion/Progress_Report
[15:35:04] <jmehta> The goal of this project was to switch the currently available autocompletion in Semantic Forms from jQuery UI Autocomplete to Select2.
[15:35:05] <jmehta> Select2 JS library has everything from a solid multi-select feature to AJAX loading, to rich item formatting - very easy to use and powerful.
[15:35:07] <jmehta> It supports searching, remote data sets, and infinite scrolling of results.
[15:35:08] <jmehta> Tagging: ability to add new items on the fly.
[15:35:10] <jmehta> Flexible accent folding is also available.
[15:35:11] <jmehta> Working with large, remote datasets: ability to partially load a dataset based on the search term.
[15:35:13] <jmehta> Templating: support for custom rendering of results and selections.
[15:35:15] <jmehta> demo at:
[15:35:16] <jmehta> - http://discoursedb.org/w/index.php?title=Picture_IDs_are_perfectly_sensible&action=formedit
[15:35:18] <jmehta> - http://discoursedb.org/wiki/Special:FormEdit/Visita_CPD_TecnoAlcala/Random_test_page
[15:35:36] <jmehta> end*
[15:36:42] <qgil> Good! Did you implement all the featured you planned?
[15:36:52] <jmehta> yes, all of them :)
[15:37:35] <qgil> The Semantic MediaWiki community will be happy, SemanticForms are used everywhere
[15:37:56] * apexkid1 loved using semantic forms for my gsoc too
[15:37:59] <jmehta> yeah :)
[15:38:04] <kim_bruning> semantic forms ftw
[15:38:14] <qgil> any surprises? any big lesson learned?
[15:38:16] <sandaru> I used semantic forms for my internship too :)
[15:38:16] <siebrand> He said semantic!
[15:38:22] <bawolff> GET HIM!
[15:38:28] <Yaron> :)
[15:38:28] <bawolff> ;)
[15:38:45] <kim_bruning> what's wrong with SMW? :-P
[15:39:05] <siebrand> Nuttin. Love it, just don't understand it.
[15:39:13] <jmehta> yeah, it was fun doing this project. I got to learn javascript more in depth. In fact, I learned modular javascript, this was the coolest part of this project
[15:39:16] <Nemo_bis> kim_bruning: that Wikimedia users get envious of their rich neighbours
[15:39:23] <guillom> https://upload.wikimedia.org/wikipedia/mediawiki/6/69/Hesaidsemanticga2.jpg
[15:39:30] <kim_bruning> siebrand, eh? srsly? It's not so hard to grok
[15:39:39] <bawolff> kim_bruning: There's just too much meaning
[15:39:39] <qgil> ok, ok
[15:39:40] <siebrand> kim_bruning: so much to do, so little time.
[15:39:45] <bawolff> so much structure
[15:39:50] <qgil> please bear with the oldtimers and their old jokes
[15:39:54] <kim_bruning> siebrand, I know that feeling ;-)
[15:39:55] <guillom> :P
[15:39:58] * siebrand cheers guillom on!
[15:40:09] <qgil> let's continue
[15:40:13] <qgil> ***** Dinu Kumarasiri, Colombo, Sri Lanka
[15:40:13] <kim_bruning> qgil, we can make new jokes if you like :-)
[15:40:17] * kim_bruning hushes
[15:40:18] <sandaru> o/
[15:40:23] <siebrand> Anyway, awesome work. Auto completion is generally a great feature.
[15:40:31] <jmehta> Thanks :)
[15:40:33] <sandaru> My FOSS OPW project is Welcome to Labs : Welcoming new contributors to Wikimedia Labs and Wikimedia Tool Labs.
[15:40:35] <sandaru> https://www.mediawiki.org/wiki/Welcome_to_labs
[15:40:35] <sandaru> Progress reports : https://www.mediawiki.org/wiki/Welcome_to_labs/Progress_Reports
[15:40:49] <sandaru> The purpose of my project is to improve the documentation in Wikimedia Labs and complete the project documentation. With the support of my mentor Andrew Bogott I created a template defining all the required fields which are
[15:40:49] <sandaru> useful to newcomers as well as other contributors. Then from that template we created a form and added it to the project page, where project admins can fill in the form and complete the documentation of their projects easily.
[15:40:50] <sandaru> Ex:https://wikitech.wikimedia.org/wiki/Special:FormEdit/Nova_Project_Documentation/Nova_Resource:Testlabs/Documentation
[15:40:50] <sandaru> So I have been nagging project admins for a while to finish this.
[15:41:05] <sandaru> We have made an FAQ page mainly targeting newcomers, to get their problems solved easily without going through all the help and getting-started documentation
[15:41:06] <sandaru> https://wikitech.wikimedia.org/wiki/Help:FAQ
[15:41:19] <sandaru> That's all :)
[15:41:21] <Nemo_bis> ugh new pages to watchlist
[15:41:54] <qgil> thank you sandaru , how was the "nagging project admins"part?
[15:42:02] * Nemo_bis tortures sandaru a bit for lacking a category
[15:42:11] <kim_bruning> Nemo_bis, Template:Sofixit
[15:42:12] <sandaru> :D
[15:42:32] <kim_bruning> new user intros ftw, I might use this ;-)
[15:42:34] <sandaru> ok up to now. But there are still incomplete projects
[15:42:39] * kim_bruning hadn't gotten into labs yet
[15:42:56] <Nemo_bis> kim_bruning: to add a cat I'd need to read and understand the page first :p
[15:43:31] <kim_bruning> Nemo_bis, O:)
[15:43:56] <qgil> hey, questions and comments to sandaru please
* Nemo_bis wonders if we're going at a sufficient pace
[15:44:07] <andrewbogott> Because the labs project docs are structured now (rather than just a big text field) I can sort and query in order to find stale projects or new projects or projects that seek volunteers… it's going to be a big improvement.
[15:45:03] <sandaru> :)
[15:45:16] <qgil> Nemo_bis might be right
[15:45:20] <mvolz> are these doc pages for projects or instances?
[15:45:24] <qgil> ok, pedal to the metal
[15:45:29] <andrewbogott> mvolz: projects
[15:45:30] <sandaru> projects
[15:45:32] <kim_bruning> even FASTER?
[15:45:38] <qgil> Just a bit
[15:45:41] <kim_bruning> yeah, there's lots of projects, I guess, still :-/
[15:45:47] <qgil> otherwise the last ones will be angry at us
[15:45:58] <qgil> Thank you for your understanding
[15:45:58] <kim_bruning> that's true too
[15:46:04] <qgil> ***** Deepali Jain, Roorkee, India
[15:46:48] <qgil> Raylton?
[15:46:51] <Djain> Hi!
[15:46:58] <Raylton> hello qgil
[15:47:06] <qgil> ^^^^
[15:47:11] <Raylton> project page https://meta.wikimedia.org/wiki/Book_management_2014
[15:47:21] <qgil> Isn't Deepali here?
[15:47:23] <Djain> Progress Page: https://meta.wikimedia.org/wiki/Book_management_2014/Progress
[15:47:29] <qgil> ahá
[15:47:29] <Raylton> test page http://tools.wmflabs.org/bookmanagerv2/wiki/Muggles%27_Guide_to_Harry_Potter/Introduction
[15:47:45] <Djain> Introduction: The project focuses on stabilization of BookManagerv2 extension for management, editing and reading of books in Wikibooks and Wikisource.
[15:48:10] * Djain Demo Book on tools lab: http://tools.wmflabs.org/bookmanagerv2/wiki/Book:The_Interpretation_of_Dreams
[15:48:55] <qgil> So this is in fact a continuation of the work done by a previous GSoC project, right?
[15:49:03] <Djain> Yes.
[15:49:13] <Djain> Major points:
[15:49:28] <Djain> 1. Support for books with a large number of sections.
[15:50:03] <Djain> 2. Functionality to import old books from Wikibooks and Wikisource.
[15:50:13] <Djain> 3. Import book metadata.
[15:50:44] <Djain> 4. Some enhancements in book reading interface like Readability options and fullscreen mode.
[15:50:58] <qgil> I hope you stick around helping the Wikisource projects, they are very grateful to technical contributors. :)
[15:51:13] <Djain> Yes I certainly plan to. :)
[15:51:32] <qgil> Very good!
[15:51:35] <Nemo_bis> So to actually use this a Wikisource needs to do a lot of migration?
[15:52:04] <Nemo_bis> Am I going to ever see Collection made usable for Wikisource/Wikibooks books?
[15:52:28] <qgil> Nemo_bis, you wanted to increase the pace... :)
[15:52:40] <Nemo_bis> qgil: we can have 2 or 3 conversations at once
[15:52:43] <Djain> Now instead of manually creating book sections for Bookmanagerv2, books can be imported automatically.
[15:52:44] <Nemo_bis> just start the next
[15:52:48] <qgil> Moving on, please move any interesting discussions to wikitech-l
[15:53:10] <qgil> Nemo_bis, this might be simple for you, but not as simple or not very nice for an intern showcasing their project...
[15:53:34] <qgil> This is why I'm trying to keep clean cuts. Thank you for your understanding, and moving on.
[15:53:42] <qgil> ***** Amanpreet Singh, Roorkee, Uttarakhand, India
[15:54:33] <qgil> mmm, this is the project with many mentors
[15:54:41] <apsdehal> hi
[15:54:46] <apsdehal> I worked on Wikidata Web Annotator for Wikidata that helps in annotating information from websites and feed them on Wikidata with proper references.
[15:54:52] <qgil> hi, good!
[15:55:27] <apsdehal> It uses many subprojects for its working, all are open source and hosted on WMFLabs.
[15:55:49] <apsdehal> Mainly all are created under this
[15:56:15] <apsdehal> Project: https://mediawiki.org/wiki/Wikidata_annotation_tool
[15:56:35] <qgil> Did you complete the features you planned?
[15:56:44] <apsdehal> Project Report: https://mediawiki.org/wiki/Wikidata_annotation_tool/updates
[15:56:52] <apsdehal> Yes all of them.:)
[15:56:59] <qgil> Good! How was the experience with so many mentors?
[15:57:19] <apsdehal> The demo is available at http://apsdehal.in/WAFBookmarklet/bookmarklet/bookmarklet.html
[15:57:19] <apsdehal> Just drag it to your bookmark bar and you are ready to launch it on any website
[15:57:19] <apsdehal> It may take time to load since CSS optimization is still to be done.
[15:57:58] <apsdehal> I think this would help Wikidata reach new heights.
[15:57:58] <apsdehal> Moreover, I have created an OAuth application that can be reused for interacting with the API of any kind of wiki, including all sister projects. I hope this will be used further
[15:58:23] <apsdehal> I am here if there are any questions for me.
[15:58:41] <qgil> apsdehal, being further used in Wikimedia usually depends on having a maintainer keeping pushing... I hope you want to stick around pushing for your projects.
[15:58:55] <apsdehal> I am currently on mobile it may take time for me to reply.
[15:59:03] <qgil> in any case all this looks very good, thank you!
[15:59:06] <apsdehal> Yes I am always there for my project
[15:59:10] <qgil> :)
[15:59:15] <qgil> ***** Rohit Dua, New Delhi, India
[15:59:18] <rohit-dua> hi
[15:59:18] <apsdehal> There is so much scope of improvement
[15:59:37] <rohit-dua> project-page: https://www.mediawiki.org/wiki/Google_Books,_Internet_Archive,_Commons_upload_cycle
[15:59:40] <apsdehal> And experience was good with many mentors Thanks
[15:59:43] <rohit-dua> Tool-Page: http://tools.wmflabs.org/bub/
[15:59:50] <qgil> (thanks)
[15:59:55] <rohit-dua> Summary:
[16:00:01] <rohit-dua> BUB is a web-tool built in Python hosted on WMFLabs that downloads books from public libraries like Google-Books, and then uploads them to the Internet Archive.
[16:00:09] <rohit-dua> It has a single-upload option as well as mass-uploads. For mass-uploads the tool has downloaded over 11,000 books (with the help of Nemo_bis's Google IDs).
[16:00:25] <rohit-dua> Link to uploads by bub: https://archive.org/search.php?query=subject%3A%22bub_upload%22&sort=-publicdate
[16:00:34] <rohit-dua> The github repo is not updated.
[16:00:41] <rohit-dua> Demo:
[16:00:47] <rohit-dua> The tool can be accessed from http://tools.wmflabs.org/bub/
[16:00:47] <Nemo_bis> At times, archive.org didn't have enough capacity to OCR everything ^^
[16:01:00] <rohit-dua> The tool can be used by selecting a library, then entering the Id/url and the email(for notification), and then confirm and submit.
[16:01:49] <qgil> Already working, very good! What about the Commons part of the cycle?
[16:02:18] <rohit-dua> For uploading to commons, IA-Upload tool is being used
[16:02:39] <Nemo_bis> but bub sends a link by email to prefill stuff
[16:02:49] <rohit-dua> Users get an email after OCR is completed, with a direct link to IA-Upload
[16:03:19] <qgil> "has downloaded over 11,000 books" means that these books are already in Commons as well?
[16:03:30] <rohit-dua> no in archive.
[16:04:18] <rohit-dua> For OCR to complete it may take 15-20 minutes depending on the book.
[16:05:02] <qgil> No time to discuss, but I want to add that this is very exciting. I hope this tool is known by the IA team now.
[16:05:08] <qgil> thanks
[16:05:13] <Nemo_bis> IIRC they noticed
[16:05:20] <rohit-dua> thanks :-)
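For flavour, this is roughly how a tool like BUB might mint a stable archive.org item identifier from a library code and a book ID before uploading; the actual naming scheme used by the tool is an assumption here, shown only to illustrate the pipeline step between "download from library" and "upload to the Internet Archive".

```python
import re

def ia_identifier(library, book_id):
    """Build an archive.org-safe identifier: lowercase prefix + cleaned ID."""
    # Strip anything outside [A-Za-z0-9] so the ID is safe in a URL/identifier.
    clean = re.sub(r"[^A-Za-z0-9]", "", book_id)
    return "bub_{}_{}".format(library.lower(), clean)

# e.g. a Google Books volume ID:
print(ia_identifier("gb", "WZLcAAAAQAAJ"))  # → bub_gb_WZLcAAAAQAAJ
```

A deterministic identifier like this also lets the tool detect that a book was already uploaded before re-downloading it.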
[16:05:23] <qgil> ***** Kunal Grover, Delhi, India
[16:05:33] <kunalg> Hi
[16:05:52] <kunalg> The project had 5 parts which are listed here.
[16:06:00] <kunalg> https://www.mediawiki.org/wiki/Extension:Translate/Usability_improvements_2014
[16:06:10] <kunalg> This is the status of the tasks here, some bugs had additional discussions also which are linked here
[16:06:14] <kunalg> https://www.mediawiki.org/wiki/User:Kunalgrover05/Progress_Report
[16:06:24] <kunalg> The changes
[16:06:25] <kunalg> Page language selection for MediaWiki (ready, but dependent on another patch; the current version doesn't have all features)
[16:06:29] <kunalg> https://translatewiki.net/wiki/Special:PageLanguage
[16:06:34] <kunalg> Language bar design(waiting to be merged)
[16:06:38] <kunalg> https://sandbox.translatewiki.net/wiki/Plop2
[16:06:47] <kunalg> Changes in Special:AggregateGroups- Ability to edit groups' names and descriptions, Read only version, Performance improvements(Merged)
[16:07:06] <kunalg> Deleting or moving translation units should update translation page(One patch merged, one pending)
[16:07:10] <kunalg> Optional title translation
[16:07:16] <kunalg> Patches still waiting to be merged:
[16:07:18] <kunalg> https://gerrit.wikimedia.org/r/#/q/owner:kunalgrover05%2540gmail.com+status:open,n,z
[16:07:46] <kunalg> That's all
[16:08:28] <siebrand> qgil, guillom : https://sandbox.translatewiki.net/wiki/Plop2 may interest you a lot.
[16:08:39] * guillom clicks.
[16:08:40] <qgil> Links to gerrit patches and test wikis, I'm positively surprised how many projects reached this level
[16:08:55] <siebrand> Design by the venerable pginer.
[16:09:16] <qgil> (I had tried to be neutral and objective with this GSoC project which tries to fix a problem I reported...) :)
[16:09:22] <Nemo_bis> (Error code: ssl_error_bad_cert_domain)
[16:09:45] <siebrand> qgil: I was not aware you were aware, so I waited to point you to it until GSoC was over ;)
[16:09:48] <qgil> I have ben surprised how well you have worked with maintainers and the designer
[16:10:11] <qgil> ok, very good kunalg , thank you!
[16:10:20] <kunalg> qgil: Thanks :)
[16:10:29] <siebrand> Nemo_bis: That's an NSA scare you can ignore.
[16:10:44] <qgil> (I wasn't sure about picking 5 major Bugzilla reports as opposed to a single project, but you had very good mentors pushing for them...)
[16:10:46] <kart_> kunalg: nice work!
[16:11:01] <qgil> ***** Tony Thomas, Kerala, India
[16:11:04] <Nemo_bis> siebrand: I know, just a fyi in case you want higher conversion rates for your links
[16:11:11] <tonythomas> hi ! :)
[16:11:30] <tonythomas> Adding in my prepared speech: My project was to generate VERP return paths for every sent email from WMF wikis, so that the default wiki@wikimedia.org is replaced with some wiki-{blah}@wikimedia.org. Later, on reception of a bounce, our BounceHandler extension API handles it and unsubscribes the user on observing a bounce frequency pattern. We could get the
[16:11:30] <tonythomas> thing into beta by yesterday.
[16:11:30] <tonythomas> Thanks to Jeff, Lego and bd808, we got the first VERP email (sent to me: https://dpaste.de/raoy ). You can demo the new VERPed email sending here: http://en.wikipedia.beta.wmflabs.org/wiki/Special:EmailUser (though it is not working for a demo). We deployed it in log-only mode, so that we can analyse the pattern for a month and fix a bounce limit.
[16:12:27] <Nemo_bis> yay real mails
[16:12:30] <tonythomas> Challenges: making sure that the $local_part of the email address stayed below the 64-character upper limit
[16:12:45] <tonythomas> thanks to csteipp for his great contributions
[16:13:09] <tonythomas> also -> POSTing the email to the bouncehandler API
[16:13:44] <tonythomas> oh no - actually http://en.wikipedia.beta.wmflabs.org/wiki/Special:EmailUser was fixed after I wrote my speech
[16:13:55] <tonythomas> its working and sending VERPE'd emails
[16:14:18] <valhallasw`cloud> tonythomas: does this also give us the SPF 'Sent by enwiki on behalf of <user>'?
[16:14:19] <mvolz> better than the other way around! :)
[16:14:45] <valhallasw`cloud> (I see the spf header uses the wikimedia mail address instead of your, so I suspect it does)
[16:15:01] <tonythomas> valhallasw`cloud: the Authentication part ?
[16:15:28] <qgil> two things I remember about tonythomas: he came early and with very clear ideas; he got a -2 from... Tim-away (?) because of a wider discussion on technology selections, and he found a new course like a pro. Very well done.
[16:15:46] <valhallasw`cloud> tonythomas: I don't quite know how it works, but with SPF you need to mark emails as 'sent on behalf' to prevent 'this message might not have been sent by X'-messages. I think :-p
[16:16:12] <tonythomas> qgil: thanks. and as I promised in my reply email - we wrote a new extension SwiftMailer - github.com/wikimedia/mediawiki-extensions-SwiftMailer :) Thanks to the team
[16:16:20] <Jeff_Green> valhallasw`cloud: SPF doesn't actually involve marking the email, you post a DNS record for the domain that authorizes IPs/subnets as senders
[16:16:39] <Jeff_Green> I'm not sure where the "Sent by enwiki" thing you mention comes from?
[16:16:42] <valhallasw`cloud> Jeff_Green: unless you send email as someone else (which is what Special:SendEmail does)
[16:16:45] <qgil> as Nemo_bis said, yay for real emails - Wikimedia servers handle gazillions of them
[16:16:52] <qgil> ok, moving on...
[16:16:57] <qgil> ***** Vikas S Yaligar, Surathkal, Karnataka, India
[16:17:14] <vikasyaligar> Hello !
[16:17:27] <vikasyaligar> I worked on "Automatic cross-language screenshots", where my job was to create automation for generating cross-language screenshots for the VisualEditor user guide.
[16:17:48] <vikasyaligar> Project page: https://www.mediawiki.org/wiki/Language_Screenshots
[16:17:53] <vikasyaligar> Progress: https://www.mediawiki.org/wiki/Automatic_cross-language_screenshots/progress
[16:17:59] <vikasyaligar> Demos:
[16:18:02] <vikasyaligar> English: https://www.mediawiki.org/wiki/Help:VisualEditor/User_guide/
[16:18:08] <vikasyaligar> Hebrew: https://www.mediawiki.org/wiki/Help:VisualEditor/User_guide/
[16:18:19] <vikasyaligar> This way we have covered 20 languages, depending on translation status => https://translatewiki.net/w/i.php?title=Special%3AMessageGroupStats&x=D&group=ext-visualeditor-0-all&suppressempty=1#sortable:3=desc
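The per-language selection vikasyaligar describes can be pictured with a toy helper: pick the languages whose translation is complete enough, then build the per-language page URLs to screenshot. The threshold and statistics here are made up; real completeness figures come from the MessageGroupStats link above.

```python
def screenshot_targets(stats, base, threshold=0.9):
    """Pick languages worth screenshotting and build their page URLs.

    `stats` maps language code -> fraction translated (hypothetical
    numbers for illustration).
    """
    picked = sorted(code for code, done in stats.items() if done >= threshold)
    return {code: f"{base}/{code}" for code in picked}
```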
[16:18:23] <siebrand> (language code is missing in the links)
[16:18:31] <siebrand> https://www.mediawiki.org/wiki/Help:VisualEditor/User_guide/en https://www.mediawiki.org/wiki/Help:VisualEditor/User_guide/he
[16:18:40] <vikasyaligar> oops; https://www.mediawiki.org/wiki/Help:VisualEditor/User_guide/he for hebrew
[16:18:48] <vikasyaligar> thank you siebrand
[16:19:39] <siebrand> I hope aharoni doesn't see that the Hebrew translation is hopelessly out of date :)
[16:20:05] <qgil> vikasyaligar, there is not better demo than a feature actually deployed :)
[16:20:53] <aharoni> qgil, vikasyaligar : \o/
[16:20:58] <aharoni> yes, actually deployed ;)
[16:21:01] <vikasyaligar> qgil: yup :)
[16:21:13] <aharoni> siebrand: I see it
[16:21:31] <qgil> vikasyaligar, vikasyaligar how difficult is now for other teams / projects to use this feature? is it documented somewhere with instructions?
[16:21:33] * siebrand looks disappointed now.
[16:21:35] <aharoni> the whole guide was heavily updated a few days ago
[16:22:33] <vikasyaligar> qgil: yup it is documented in https://www.mediawiki.org/wiki/Language_Screenshots
[16:22:42] <qgil> perfect, thank you!
[16:22:48] <qgil> ***** Pratik Lahoti, Pune, India
[16:23:08] <kondi> not here :(
[16:23:11] <Nemo_bis> I am
[16:23:16] <Nemo_bis> Hello all! BPositive/Pratik Lahoti, mentored by me and Nikerabbit/Niklas Laxström, has worked on Translate extension's Mass migration tools. ur1.ca/i09ny The problem: each multilingual wiki, like Meta and its spin-off mediawiki.org, in 10 years before Translate, accumulated thousands of translation pages and hundreds of thousands of strings which we need to manually migrate for Translate adoption.
[16:23:22] <Nemo_bis> The goal: make extremely hard work slightly less hard. The approach: a special page to import old translations, where copy-and-paste work is made less wrist-damaging by simple markup heuristics; and one to reduce manual basic preparations for new pages, with a set of regexes.
[16:23:27] <Nemo_bis> Hard work means few users, but important ones to support: they're happy to see this, but so far little usage other than myself. You can see it used e.g. at ur1.ca/i09li , or just try yourself on a Translate wiki, e.g. at https://www.mediawiki.org/wiki/Special:PageMigration / PagePreparation. The tool is currently restricted to translation administrators.
[16:23:30] <aharoni> qgil, vikasyaligar - thank you
[16:23:32] <Nemo_bis> We tried hard to stick with MediaWiki best practices and have all work documented on bugzilla ur1.ca/i09m9 , so you can easily see what the work is and was about. ur1.ca/i09mr June was the most intense month and saw PageMigration deployed, then BPositive started a full time job and we slowed down a lot. ∎
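A toy version of the markup heuristics Nemo_bis mentions: the real PagePreparation special page applies a much larger set of regexes (templates, categories, headings), but the essential move is splitting the page into translation units and wrapping them in Translate markup.

```python
import re

def prepare_for_translation(wikitext):
    """Split a page body at blank lines into translation units and wrap
    it in <translate> tags, Translate-extension style. Returns the
    prepared wikitext and the number of units found."""
    units = [u.strip() for u in re.split(r"\n\s*\n", wikitext.strip()) if u.strip()]
    prepared = "<languages/>\n<translate>\n" + "\n\n".join(units) + "\n</translate>"
    return prepared, len(units)
```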
[16:24:20] <qgil> So is it incomplete now?
[16:24:24] * siebrand cheers BPositive, Nemo_bis, and Nikerabbit on!
[16:24:25] <Nemo_bis> Sort of
[16:25:10] <siebrand> Is there a usable product?
[16:25:11] <qgil> Well, at least we got some progress, but there is a lesson learned here regarding looking for a job while being an intern -- otherwise understandable
[16:25:13] <Nemo_bis> The basic pieces are in place, but there are some features missing and this means that on a certain amount of pages the feature will be less useful than we'd like it to be
[16:25:53] <qgil> In Wikimedia, progress is progress. :)
[16:26:00] <Nemo_bis> I tried it on some dozens pages from https://www.mediawiki.org/wiki/Project:Language_policy/Migration_list and it was helpful in a majority of cases
[16:26:04] <qgil> thank you Nemo_bis Nikerabbit BPositive
[16:26:33] <qgil> Nemo_bis, is the work left documented in the form of bug reports or similar?
[16:26:47] <Nemo_bis> qgil: yes, there should be a bug report for each thing
[16:27:00] <qgil> ok, this is very good, thank you!
[16:27:02] <Nemo_bis> there are a couple with some scope definition issue
[16:27:02] <siebrand> Nemo_bis: Thanks. Something is much better than nothing :)
[16:27:22] <Nemo_bis> siebrand: indeed, several pages I would never have had the courage to migrate without this ,)
[16:27:27] <qgil> ***** kondi, India
[16:27:34] <kondi> Hi
[16:27:41] <kondi> Project page: https://www.mediawiki.org/wiki/Extension:LocalisationUpdate/LUv2
[16:27:42] <qgil> (and we have to stick to 5 minutes per person)
[16:27:49] <kondi> My project required me to write a service (LUv2, for the lack of a better name) and rewrite the LocalisationUpdate extension (WIP).
[16:27:55] <kondi> The basic idea is to optimize the process of updating l10n messages. The service is intended to keep the latest possible version of the collection of translation messages from translatewiki.net. Clients can subscribe to this service to receive a 'push' when updates are available.
[16:28:00] <kondi> Here, the push is only the new/modified messages (no downloading whole files) for a particular language (not all 4xx languages).
[16:28:05] <kondi> No demo available atm, sorry for that. O:-) But I hope to make it available sometime this week. I'm usually lurking on mediawiki-i18n and wikimedia-dev, so ping me if you have any specific questions.
[16:28:11] <kondi> That's about it!
[16:28:16] <kondi> I think. :-)
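The "push only new/modified messages" idea kondi describes boils down to a delta computation on the server and a merge on the client. This is an illustration of the concept, not LUv2's actual wire format.

```python
def l10n_delta(old, new):
    """Server side: the per-language payload to push. Only messages that
    are new or whose translation changed are included, never whole files.
    `old`/`new` map message key -> translated string for one language."""
    return {k: v for k, v in new.items() if old.get(k) != v}

def apply_delta(cache, delta):
    """Client side: merge the pushed delta into the local message cache."""
    updated = dict(cache)
    updated.update(delta)
    return updated
```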
[16:28:53] <qgil> kondi, I hope that regardless of GSoC deadlines you are able to complete the interesting work you started!
[16:29:32] <qgil> questions / comments?
[16:29:38] <kart_> qgil: Hopefully :)
[16:29:57] <kondi> qgil: yes, the service is the major component. That is complete now. The extension shouldn't be a problem.
[16:30:17] <siebrand> Where's the current code?
[16:30:31] <kondi> https://github.com/konarak/LUv2/tree/epicmess (please bear with me, it'll have much nicer, proper commits in about a week)
[16:30:37] <kondi> siebrand: ^
[16:30:43] <siebrand> THanks kondi
[16:30:45] <qgil> ok, thank you kondi
[16:30:50] <qgil> ***** Hardik Juneia, India
[16:30:53] <hardikj> Hi all, this summer I worked with the Parsoid team to build a wikitext linter (Linttrap) which will enable us to lint broken wikitext whenever a page is parsed.
[16:30:54] <hardikj> GSOC application - https://www.mediawiki.org/wiki/User:Hardik95/GSoC_2014_Application
[16:31:02] <hardikj> Linttrap will detect broken wikitext found on wiki pages and will also collect stats about certain wikitext usage patterns. Once a page is parsed, Linttrap uses a Parsoid-based logger facility to log the issues to a web service. We call it Lintbridge: http://lintbridge.wmflabs.org/. Currently Lintbridge is hosted on Wikimedia Labs and uses MongoDB to store all the issues.
[16:31:06] <hardikj> Lintbridge offers a REST API which can be used by bots and other applications to fix the broken wikitext. Linttrap uses this REST API to store issues in Lintbridge.
[16:31:11] <hardikj> Demo - http://lintbridge.wmflabs.org/_html/issues
[16:31:13] <hardikj> API - http://lintbridge.wmflabs.org/_api/issues
[16:31:15] <hardikj> and here is the stats page - http://lintbridge.wmflabs.org/stats
[16:31:17] <hardikj> here are the types of broken wikitext / issues we collect currently (we have plans to add more) http://lintbridge.wmflabs.org/_html/type
[16:31:20] <hardikj> we currently have tested this on 2000 pages from http://parsoid-tests.wikimedia.org/topfails/0
[16:31:22] <hardikj> feel free to browse the site :)
[16:31:26] <hardikj> a small example - http://lintbridge.wmflabs.org/_html/issues/53e7cf9f640eaa742884358e
[16:31:32] <hardikj> future plans: integrate these issues with the Checkwiki project. We had a chat with the Checkwiki folks and most of the integration plans are ready; the integration script is also ready, hosted here: https://github.com/hardikj/lintbridge/blob/master/Utils/pop_checkwiki.js
[16:31:43] <hardikj> Checkwiki - http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Check_Wikipedia
[16:31:58] <hardikj> overall, this summer has been an awesome experience for me. I would like to thank subbu, gwicke, cscott and all other members of the Parsoid team :)
[16:32:04] <hardikj> (mark and arlo)
[16:32:19] <hardikj> sorry for a long summary
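A toy detector in the spirit of Linttrap, flagging one class of issue and emitting records shaped like hypothetical Lintbridge documents. Parsoid's real linter works on the parsed DOM, not on regexes over raw wikitext, and the issue-type names below are invented; the actual categories are listed at lintbridge.wmflabs.org/_html/type.

```python
import re

def lint(wikitext, page):
    """Flag deprecated HTML tags in wikitext, as a stand-in for one of
    the issue types Linttrap collects. Each record carries the page,
    issue type and character offset, roughly mirroring what a linter
    would POST to a service like Lintbridge."""
    issues = []
    for m in re.finditer(r"<(font|center|tt)\b", wikitext, re.I):
        issues.append({"page": page,
                       "type": "deprecated-tag",
                       "offset": m.start(),
                       "tag": m.group(1).lower()})
    return issues
```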
[16:33:07] <MatmaRex> hardikj: is there documentation for what the various issue types mean?
[16:33:08] <qgil> hardikj, I will confess that I asked your mentors to check your proposal and yourself thoroughly
[16:33:45] <qgil> because I wasn't sure about you being "just" a good communicator or also a good developer. I'm very happy to see that you are a good developer with good communication skills. ;)
[16:33:50] <MatmaRex> e.g. things like "multi-template"
[16:34:06] <hardikj> MatmaRex, no, not yet :(
[16:34:22] <MatmaRex> this looks really interesting, but i find myself unable to use the UI :/ (i also tried it out when you emailed wikitech some weeks ago)
[16:34:24] <hardikj> but we'll have it soon, with proper API documentation
[16:34:32] <subbu> MatmaRex, there are currently some comments in the code, but it probably needs some docs.
[16:34:48] <MatmaRex> aight. thanks
[16:35:09] <qgil> In any case, another project showcased in Labs before the deadline. Excellent!
[16:35:13] <hardikj> https://github.com/wikimedia/mediawiki-services-parsoid/blob/master/lib/dom.linter.js
[16:35:26] <qgil> ok thank you hardikj ... and we move to Europe!
[16:35:31] <qgil> ***** Jack Phoenix, Turku, Finland
[16:35:35] <hardikj> thank you all :)
[16:35:49] <siebrand> I think we're in danger of going over time by about 30 minutes or so.
[16:35:58] <ashley> rar everyone
[16:36:23] <ashley> I wrote a new, modern, scalable and attractive MediaWiki skin called BlueSky, compatible with MediaWiki 1.23 and newer, based on wikiHow's current default skin (which, for various reasons, isn't exactly portable outside the wikiHow environment as-is). The original proposal is available at https://www.mediawiki.org/wiki/A_modern,_scalable_and_attractive_skin_for_MediaWiki and I'm pleased that...
[16:36:25] <ashley> ...we only had to make one relatively minor change to the specs during the summer (minimum required version was bumped from 1.22 to 1.23).
[16:36:26] <ashley> Unlike wikiHow's, BlueSky's default color scheme is blue, but plenty of other themes are available and ready to use.
[16:36:28] <ashley> Progress reports, for those interested, are at https://www.mediawiki.org/wiki/User:Jack_Phoenix/GSoC_2014 ; on-wiki documentation is available at https://www.mediawiki.org/wiki/Skin:BlueSky , and live demo is at http://social-tools.wmflabs.org/w/index.php?title=Root&useskin=bluesky . To test themes, just append the usetheme parameter, followed by the correct theme name, to the URL, like...
[16:36:29] <ashley> ...this: http://social-tools.wmflabs.org/w/index.php?title=Root&useskin=bluesky&usetheme=red
[16:37:15] <bawolff> pretty
[16:37:17] * bawolff like
[16:37:34] <siebrand> ashley: Can it be translated on translatewiki.net yet?
[16:37:49] * qgil with his 3rd party MediaWiki admin hut likes it
[16:37:56] <bawolff> I like that it integrates different colour themes
[16:38:11] <ashley> siebrand: not as far as I'm aware, since the last time I checked TWN didn't support mediawiki/skins/ repositories :-(
[16:38:20] <siebrand> ashley: It does.
[16:38:23] <bawolff> I see lots of people trying to mess with css to change colour of monobook/vector, and users should not have to do that just to change colours
[16:38:42] <siebrand> ashley: If you add a complete qqq.json, it might just happen soon.
[16:39:05] <wctaiwan> bawolff: unfortunately still no UI exposed to change the theme for individual users :/
[16:39:17] <ashley> siebrand: ah, must be a new feature, wasn't the case back in...February or so? when I wrote about the Nimbus skin on TWN's support page; but yeah, qqq should be easy enough to add :)
[16:39:21] <siebrand> ashley: https://gerrit.wikimedia.org/r/#/q/topic:mediawiki-skins+project:translatewiki,n,z
[16:39:31] <siebrand> ashley: Added during Wikimania
[16:39:48] <MatmaRex> (i made siebrand do it! :P)
[16:39:59] * siebrand denies everything.
[16:40:00] <ashley> wctaiwan: theme selector UI is something that'd be lovely to have, but out of the scope of my particular project; that being said, IMO themes altogether should be a core feature...maybe something for GSoC 2015? ;-)
[16:40:05] <ashley> siebrand: \o/
[16:40:09] <qgil> ashley, this project + the cleanup of Vector + other initiaives these days are changing MediaWiki's look & feel for everybody
[16:40:28] <qgil> ok, thank you ashley
[16:40:31] * wctaiwan cheers for ashley and MatmaRex and other skin people :)
[16:40:31] <qgil> ***** Bartosz Dziewoński, Kraków, Poland
[16:40:37] * wctaiwan cheers again :P
[16:40:37] <MatmaRex> https://www.mediawiki.org/wiki/Separating_skins_from_core_MediaWiki
[16:40:37] <MatmaRex> I worked to separate core skins out of MediaWiki, removing cross-dependencies with MediaWiki itself, and making it possible for non-core skins to have all of the capabilities of core ones.
[16:40:37] <MatmaRex> I think the project went well, I completed the basic plan and most of the optional items. The timing of various parts has proven different than I expected, but in the end it all balanced out.
[16:40:37] <MatmaRex> There isn't a good way to showcase the changes, as this was largely an "internals" project. You might compare the contents of skins/ directory (and mediawiki/skins repository) in MW 1.23 and now, or review the updates to https://www.mediawiki.org/wiki/Manual:Skinning.
[16:40:44] <MatmaRex> there goes the blurb. questions? :)
[16:41:04] <siebrand> Oh it was noticed :)
[16:41:19] <siebrand> Darn ugly skin that "no default skin available" skin.
[16:41:51] <wctaiwan> it'll pay off in the long term, I think
[16:41:53] <qgil> MatmaRex, you were already an insider. Was it simply to get time funded to work, or did you have different types of interactions with your "mentors", or any surprise?
[16:42:44] <MatmaRex> qgil: that was one of the reasons, and also to get leverage to push things through; big patches can rot for a very long time without a consistent project behind them
[16:43:08] * siebrand nods.
[16:43:09] <qgil> I hope the experiment paid off
[16:43:17] <siebrand> I think it has.
[16:43:21] <MatmaRex> qgil: i was a pretty low-maintenance mentee, i think :) it ended up being mostly mentors emailing me about what they could do, rather than the other way around
[16:43:23] <wctaiwan> +1
[16:43:45] <MatmaRex> most of my patches went through the standard review and were merged by whoever; Jon and Ori stepped in when something was stuck too long
[16:43:50] <qgil> an experiment. ok, thank you
[16:43:54] <qgil> ***** Riilke, Germany
[16:44:02] <siebrand> Nikerabbit also did two rounds as a GSoC student, in a similar situation as MatmaRex, in 2009/2010. It's a great option for current MediaWiki devs who are still in university or high school.
[16:44:47] <qgil> mmm
[16:45:02] <qgil> No time to wait
[16:45:06] <qgil> ***** Ali King, Edinburgh, UK
[16:45:11] <siebrand> O, o. Points deducted :)
[16:45:30] <ali_king_intern> My project is redeveloping the RDFIO extension for importing data in RDF format into Semantic MediaWiki. I've put out an improved minor release (1.9.6), but the major one is still under development.
[16:45:32] <ali_king_intern> Here is the extension page with the most recent release
[16:45:33] <ali_king_intern> http://www.mediawiki.org/wiki/Extension:RDFIO
[16:45:47] <ali_king_intern> The next version in development, 2.0.0, so far has basic template-updating functionality, and I'm now working on edge cases and supporting infrastructure. This works for cases where the template is already in the page; or, if a new page is created in a category with an associated template, it adds and then populates the template call.
[16:45:48] <ali_king_intern> Here is the GitHub repo with most up to date development
[16:45:50] <ali_king_intern> https://github.com/zahara/RDFIO/tree/develop
[16:45:51] <ali_king_intern> Project reports:
[16:45:53] <ali_king_intern> http://www.mediawiki.org/wiki/Extension:RDFIO/Template_matching_for_RDFIO/Reports
[16:46:04] <ali_king_intern> I'm not as far advanced with the project for a couple of reasons. Firstly some things took longer than expected - this was mostly to do with troubleshooting things like the development environments, which delayed the start of development. Secondly I had to take some time out of the project (totalling about 2 weeks) for personal reasons.
[16:46:18] <ali_king_intern> Work is well underway on the final release, and I hope to put this out by early September. I'd also like to continue work on other enhancements after this - there are a few items on our project Trello board for future action. This includes implementing a triple store shared between wikis, possibly improvements to the RDF library used, and other changes for improved interoperability.
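The template-updating step ali_king_intern describes (RDF data populating a template call in a page) can be sketched as a mapping from RDF predicates to template parameters. The predicate names, template name and mapping below are invented for the example; RDFIO's real matching logic is considerably richer.

```python
def triples_to_template(triples, template, mapping):
    """Build the wikitext template call that imported RDF triples would
    populate. `triples` is a list of (predicate, object) pairs for one
    subject; `mapping` maps predicates to template parameter names.
    Unmapped predicates are simply skipped."""
    params = []
    for pred, obj in triples:
        if pred in mapping:
            params.append(f"|{mapping[pred]}={obj}")
    return "{{" + template + "".join(params) + "}}"
```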
[16:47:03] <qgil> Have you learned as much as you expected with this project?
[16:47:34] <ali_king_intern> Certainly learned a lot - I spent much of the first few weeks of the project searching for and being overwhelmed by information. Once I started coding, the pace of learning really picked up - should definitely have got started with that sooner.
[16:47:41] <qgil> I mean... don't worry about delays, personal problems happen to everybody. If you are learning and you are excited about the project, then that is what counts.
[16:47:47] <ali_king_intern> My understanding of the problem definitely improved once I got stuck in. I've realised that the way I drew up the project plan is not really in line with the way I work in practice, so there's a lesson for future projects.
[16:48:26] <ali_king_intern> And definitely enjoyed the experience (even the frustrating bits)
[16:48:31] <qgil> It would be great if you could blog about this, especially this last part. We love new lessons learned!
[16:49:06] <mvolz> fyi London is East of Edinburgh :)
[16:49:10] <ali_king_intern> Definitely. I've also been asked to speak at TechMeetup about it next month, so thinking about that one...
[16:49:33] <qgil> ooops
[16:49:46] <qgil> It was a problem of copy paste, not of geography
[16:49:53] <qgil> thank you ali_king_intern
[16:49:54] <ali_king_intern> mvolz: haha - same timezone, later alphabetically
[16:49:58] <qgil> ***** Marielle Volz, London
[16:50:01] <mvolz> :D
[16:50:09] <mvolz> Final report: https://www.mediawiki.org/wiki/User:Mvolz/Weekly_Reports#.28not_so_final.29_Report:_August_18th
[16:50:10] <mvolz> Demo: https://en.wikipedia.org/wiki/User:Mvolz/veCiteFromURL#Known_bugs
[16:50:10] <mvolz> My project, citoid, is a node service that generates citations given a url/search term, and a plug-in for VisualEditor to insert those citations.
[16:50:10] <mvolz> Project page: https://www.mediawiki.org/wiki/Citoid
[16:50:10] <mvolz> The demo is a gadget and is from mid internship-ish. It's hardcoded for en wiki.
[16:50:10] <mvolz> Demo: https://en.wikipedia.org/wiki/User:Mvolz/veCiteFromURL
[16:50:11] <mvolz> What I've been working on since then is modifying it to be able to be used on any wiki by using template data.
[16:50:11] <mvolz> Good news is: I have everything working on localhost! But I still have a lot to do to make it production ready :).
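The TemplateData-driven step mvolz describes can be pictured as a small mapping from citoid's metadata fields to a given wiki's citation template parameters. The field names, parameter names and template here are invented for the sketch; the real plug-in reads the mapping from TemplateData rather than a hard-coded dict.

```python
def to_citation(meta, param_map):
    """Render scraped citation metadata as a template call.

    `meta` stands in for what a citoid-like service returns for a URL;
    `param_map` plays the role of TemplateData, telling the client which
    template parameter each metadata field maps to on this wiki."""
    parts = [f"|{param_map[k]}={v}" for k, v in sorted(meta.items()) if k in param_map]
    return "{{cite web" + "".join(parts) + "}}"
```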
[16:51:22] <qgil> mvolz, are willing to do this work? :)
[16:51:37] <qgil> you
[16:51:41] <mvolz> Namely, I have to commit the changes to Extension:TemplateData and Extension:VisualEditor that are required for everything to work. I plan to do that over the next weeks... I'm sure many -2s are in my future
[16:51:53] <qgil> :D
[16:52:28] <qgil> Are the VE maintainers helpful and responsive?
[16:52:54] <mvolz> yup
[16:53:11] <qgil> this feature will be so welcomed by editors when it's ready
[16:53:20] <qgil> thank you mvolz , and now we cross the Atlantic (and most America)
[16:53:32] <qgil> ***** Jaime Lyn Schatz, Redmond, WA, US
[16:53:39] <jlschatz_intern> I've been working on the Open Historical Map (OHM). The overall OHM project page: http://wiki.openstreetmap.org/wiki/Open_Historical_Map
[16:53:40] <jlschatz_intern> The goal of the project was, essentially, to put a time slider on a fork of OpenStreetMap to enable users to search for historical maps that have been loaded into the system. My mentor has been muninn-project/Rob Warren from Carleton University, CA.
[16:53:54] <jlschatz_intern> My original OPW proposal page:
[16:53:55] <jlschatz_intern> https://www.mediawiki.org/wiki/Historical_OpenStreetMap
[16:53:57] <jlschatz_intern> My logs:
[16:53:58] <jlschatz_intern> https://www.mediawiki.org/wiki/User:JaimeLyn/Weekly_Reports
[16:54:00] <jlschatz_intern> An example of how we used the issue tracker on GitHub:
[16:54:01] <jlschatz_intern> https://github.com/OpenHistoricalMap/ohm-website/issues/15
[16:54:10] <jlschatz_intern> My apologies for not having a demo. Heroku does not play nicely with the version of Rails the project uses. With more time, I could get it working (but with more time, I could get more of the **actual** project working, instead of just yak shaving).
[16:54:24] <jlschatz_intern> The codebase is quite complex. It is actually three separate codebases, written in several languages: JavaScript, Ruby, C, C++ and Python (with a healthy serving of Bash scripts for interpreting map paths!). I was able to integrate a minimum viable slider into the website and have it feed the year portion of the query through and up to the server. I've also modified the renderer to accept time as a parameter.
[16:54:25] <jlschatz_intern> A big stumbling block (besides not knowing C/C++ and spending some internship time studying the languages!) has been the difficulty of keeping the renderer backwards-compatible with non-time queries, but I'm still working on it!
[16:54:35] <jlschatz_intern> There's still so much more to do. Three months was simply not enough time for me to complete the entire project. I plan to keep contributing.
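The time-slider filtering jlschatz_intern describes boils down to an interval test per map feature: show it when the slider year falls inside its validity interval. The field names below are assumptions for illustration; the real renderer works on OSM data and tags, not Python dicts.

```python
def visible_in(features, year):
    """Return the names of features that should render for a slider year.
    A feature is shown when year lies in [start_date, end_date]; a
    missing bound means 'always existed' / 'still exists'."""
    shown = []
    for f in features:
        start = f.get("start_date", float("-inf"))
        end = f.get("end_date", float("inf"))
        if start <= year <= end:
            shown.append(f["name"])
    return shown
```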
[16:54:43] * kim_bruning hands jlschatz_intern a nice golden yak
[16:54:48] <jlschatz_intern> The project can use more contributors! The team lead Susannaanas has started an IdeaLab page for one piece of the project: https://meta.wikimedia.org/wiki/Grants:IdeaLab/Build/Make_an_editing_interface_for_the_Map_template
[16:54:52] <kim_bruning> (now you can shave *golden* yak hairs)
[16:54:58] <jlschatz_intern> thanks kim_bruning :)
[16:55:57] <jlschatz_intern> Wow. That's quite a wall of text. Kind of matches the scope of the project :/
[16:57:15] <qgil> I want to mention that this project is a good example of a "lobby" (sorry for the wording) like "Maps" pushing for projects regardless of project/org boundaries
[16:57:15] <qgil> knowing that the end result will benefit all of us
[16:57:16] <qgil> I'm very happy to see that your project is now a reality, and I hope it is ready to be deployed soon
[16:57:17] <qgil> Have you kept a good connection with the Maps people in Wikimedia?
[16:57:18] <qgil> (oh, my typing)
[16:58:05] <jlschatz_intern> I've dealt mostly with Susanna in Finland. Hoping to keep the connections growing and getting better
[16:58:30] <qgil> ok, sorry for the abstract-ish question. thank you jlschatz_intern
[16:58:30] <qgil> ***** Frances Hocutt, Seattle, WA, USA
[16:58:42] <qgil> (am I disconnected?)
[16:58:49] <fhocutt> Hello!
[16:58:49] <fhocutt> My project is "Evaluating and Improving MediaWiki web API client libraries". https://www.mediawiki.org/wiki/Evaluating_and_Improving_MediaWiki_web_API_client_libraries .
[16:58:54] <siebrand> qgil: I see your text.
[16:58:55] <rfarrand> qgil: nope
[16:58:56] <fhocutt> I wrote a standard for third-party API client libraries (https://www.mediawiki.org/wiki/API:Client_code/Gold_standard) and used it to evaluate the libraries available in Java, Perl, Python, and Ruby.
[16:58:56] <fhocutt> When I began this project, https://www.mediawiki.org/wiki/API:Client_code was sorted only by language and it was difficult for a developer who was new to the API to evaluate whether a given library would work for them. Now API:Client code has links to the evaluations when present! If you want to add to the evaluations yourself, feel free to ask me any questions on #mediawiki or wikitech-l.
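The kind of basic call the evaluations exercised in each library can be shown with a minimal request builder and response parser for the MediaWiki web API. The endpoint choice is arbitrary and the JSON shape below is a sample; check the live API response for the authoritative structure.

```python
import json
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # any MediaWiki API endpoint

def siteinfo_url():
    """Build the URL for a basic action=query&meta=siteinfo request,
    the sort of smoke-test call a client library evaluation starts with."""
    params = {"action": "query", "meta": "siteinfo", "format": "json"}
    return API + "?" + urlencode(sorted(params.items()))

def parse_sitename(body):
    """Pull the wiki's name out of the JSON response body."""
    return json.loads(body)["query"]["general"]["sitename"]
```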
[16:59:07] <qgil> ping?
[16:59:16] <fhocutt> I am here!
[16:59:21] <kim_bruning> qgil, pong
[16:59:40] <fhocutt> For the final portion of my project I'm contributing documentation improvements and a search feature to the Java Wiki Bot Framework (https://github.com/eldur/jwbf). I have made the README considerably more friendly to new developers, and am finishing up the search feature and a new document for developers this week.
[16:59:56] <kim_bruning> oh, that's handy. There's now a java wiki bot too?
[17:00:03] <kim_bruning> How about the python one?
[17:00:04] <qgil__> hi
[17:00:05] <fhocutt> there has been for some time now!
[17:00:11] <kim_bruning> qgil, ih
[17:00:23] <kim_bruning> fhocutt, could have fooled me. Cool beans :-)
[17:00:24] <qgil> ok, back
[17:00:27] <fhocutt> I evaluated that too! https://www.mediawiki.org/wiki/API:Client_code/Evaluations
[17:00:35] <fhocutt> four of them, actually.
[17:01:01] <fhocutt> qgil: did you get my descriptions?
[17:01:15] <qgil> yes, sorry, I had some lag
[17:01:32] <fhocutt> Any questions?
[17:02:04] * jlschatz_intern has no comments, just applause :)
[17:02:14] <fhocutt> thanks, jlschatz_intern!
[17:02:37] <qgil> fhocutt, thank you very much for helping to work in this wild field of MediaWiki APIs
[17:02:58] <fhocutt> you're welcome! It was interesting to see all the implementations of the same idea.
[17:03:10] <qgil> we are putting effort into improving our API documentation (and the APIs themselves) on several fronts, and your project has been very helpful
[17:03:20] <fhocutt> I'm very glad to hear that.
[17:03:41] <qgil> I also hope this project fits into your next steps...
[17:04:03] <qgil> thank you fhocutt
[17:04:05] <fhocutt> I have plans to go back through the projects in a month or two and see what's changed, and update the evaluations.
[17:04:13] <qgil> and last but not least....
[17:04:17] <qgil> ***** Helen Halbert, Vancouver, BC, Canada
[17:04:29] <thepwnco> Hello. My project was a collection of various activities related to outreach and increasing the profile of Wikidata through improved documentation.
[17:04:39] <thepwnco> Because it's not *so* exciting to share an assortment of updated Help pages, here's a link to a page with the first two interactive tutorials for Wikidata created using the Guided Tours extension: http://www.wikidata.org/wiki/Wikidata:Tours (there are more on the way)
[17:05:00] <thepwnco> I also started my internship a bit later than most OPW participants, so this last week I am pushing to launch a completely new Main page for Wikidata. A mock-up is available here: http://www.wikidata.org/wiki/Wikidata:Portal_Redesign/draft
[17:05:51] <siebrand> Oh, I fixed some Dutch translations for that just an hour ago :)
[17:06:04] <thepwnco> siebrand: thank you :)
[17:06:11] <qgil> thepwnco, this looks like a lot of work done!
[17:06:14] <siebrand> Didn't dare to hit the button, as I was warned that it would start making edits on my behalf.
[17:06:54] <thepwnco> qgil: it was, and as others before me have mentioned, there is still a lot more to be done
[17:06:58] <fhocutt> thepwnco, I wish that this had existed when I started my project! Nice work.
[17:07:14] <qgil> Also have no doubt that working on Help pages is a very good investment of time in any project. Thank you for this in the name of all the new users that will never meet the author of these texts. :)
[17:07:29] <thepwnco> fhocutt: thanks, that is nice to hear
[17:07:40] <qgil> Thank you thepwnco
[17:07:53] <thepwnco> :D my pleasure
[17:07:58] <qgil> Well, all in all we almost fit all the presentations in 2 hours!
[17:08:08] <qgil> I expected less interns showing up
[17:08:14] <andre__> haha
[17:08:26] <qgil> and I also expected more "failure" among those showing up
[17:08:28] * fhocutt grins
[17:08:33] <fhocutt> we're a responsible bunch.
[17:08:37] <qgil> so BIG THANK YOU to all of you -- mentors included
[17:08:55] <qgil> and well, now we will have the reviews
[17:09:01] <kart_> qgil: Thanks!
[17:09:02] <qgil> and some of you will finish the last bits
[17:09:03] <kondi> Thank you for administering qgil! :-)
[17:09:22] <kunalg> qgil: Thanks
[17:09:25] <kart_> kondi: see you tomorrow ;)
[17:09:27] <qgil> and... hopefully you will want to stay, to learn more and meet more people and make more users more happy
[17:09:39] <jlschatz_intern> Yes - not an easy task of admining this session!! +1 qgil
[17:09:52] <kondi> kart_: we meet on the internet every day anyway. Now let's meet AFK! ;-)
[17:10:13] <hardikj> thanks qgil :)
[17:10:31] <andre__> Thanks Quim for having this session
[17:10:34] <andre__> ...and thanks to everybody who participated! It was really interesting to see the diversity of achievements, and feeling all the motivation around here!
[17:11:02] <kart_> Bye!
[17:11:12] <rfarrand> nice work everyone :
[17:11:13] <rfarrand> :)
[17:11:20] <aaron_xiao> thanks qgil and my mentor divec_ :)
[17:11:36] <qgil> I will still remind you about wrapping up projects etc, but in any case THANK YOU
[17:11:36] <qgil> also thank you to andre__ and rfarrand for their help co-administering the programs
[17:11:36] <qgil> EOF