Talk:Brazil Program/Education program/Goal

Growth of PTWP


Discussion of the measures relating to the growth of PTWP.

  • Is it possible for us to measure the number of people recruited through general university outreach? What should our goal be?
From my experience, this is incredibly difficult to measure, with the exception of a handful of individuals. Until we have a system in place for measuring the effectiveness of outreach events, I think we should keep this number very low, a small handful at most. Other thoughts? I was surprised to see the Cairo goal being equal to the number of students recruited. Jwild 23:46, 19 January 2012 (UTC)

Community Impact

  • Should we have separate measures for the Portuguese Wikipedia community and the Brazilian Wikimedia community?
  • What is the appropriate ratio of Campus Ambassadors who are Wikipedians? What is the appropriate CA:student ratio? (For context: in India it was 1:30; in the US, about 1:12.)
I think the share of Wikipedians should be higher for pt.wikipedia - maybe 75%? Jwild 00:03, 20 January 2012 (UTC)

Student Reactions

  • Should we say "prefer"? Or should this be phrased more as "students would recommend a Wikipedia assignment type course to a friend"?
I am not sure that preference matters here at all - often students won't say they prefer a model that they found valuable but hard. Also, I could see a student wanting to take a Wikipedia-assignment course in certain topics, but not feeling comfortable answering this question so broadly as to cover the gamut of all courses. Jwild 00:11, 20 January 2012 (UTC)

Professor goals


Should we add something about teaching professors about OER and copyright? (from User:Everton137) Jwild (talk) 21:16, 15 March 2012 (UTC)

My thought is that we should add a question in the post-semester survey asking whether students/teachers now know more about OER/IP/copyright following the semester, but I don't think that is necessarily an explicit measure of success. For example, I think we would not fund this project if it only resulted in learning about copyright. But we *would* continue to fund this program if it solely resulted in more high-quality content. (Though of COURSE we want the teachers/students to learn about copyright, and that is one of the selling points of the program in general. But as a measure of success for the WMF, I'm not sure.) Jwild (talk) 21:34, 15 March 2012 (UTC)
Jessie, I never meant that learning about copyright issues would be the main goal of the project, and I am sorry if that is how it seemed. As for the professors' goals, the main thing we have to do with them is keep them inside the project. They are the ones who teach dozens, if not sometimes hundreds, of university students every year. If they believe, in a conscious way, that Wikipedia can really be used as a teaching tool, this is the first step toward increasing the number of editors on Wikipedia: the students. Copyright issues are part of the Wikimedia Foundation's pillars, i.e., all content that we produce and share is free knowledge (according to the Wikimedia Foundation and Open Knowledge Foundation definitions - freedom defined and the open definition, respectively). Professors should be aware of what that means; otherwise it will be difficult for them to explain to their pupils what they are doing and how important it is that knowledge be free. I remember when we talked to Nitika: copyright issues were a big problem with the Indian classes, so I believe that if a professor doesn't clearly understand such things, we could have a lot of forbidden content being inserted into Wikipedia or other Wikimedia Foundation projects, which would have a bad impact on the project as a whole and would not be well received by the Wikimedia community. Just as an example, imagine an arts professor, with a class of 30, having each student upload 5 images to Commons with all rights reserved. There would be a lot of conflict between the new editors (students) and the Commons community, which would have to deal (as volunteers, it's good to remember) with a high amount of work that could be avoided if we prepared professors, with the ambassadors' help, to explain that to students. --Tom (talk) 17:34, 22 March 2012 (UTC)

Prioritizing goals


I think we should prioritize our decision in the Background section about wanting to keep the program small in terms of the number of students, professors, and courses. These were the strategies we decided upon for HOW we are going to mitigate the challenges of Brazil. Following on that point, I think this translates into not adding additional classes this semester, even if it means sacrificing the total number of students that actually edit. I think it makes sense to reduce the number of desired students before sacrificing the quality of service we can provide the professors during the pilot. Jwild (talk) 00:19, 16 March 2012 (UTC)

I agree, Jessie. Since we are starting a pilot, I think the number of professors we have now for analysing and understanding the challenges we'll face in Brazil is good. Even as a pilot program, I believe we have already had a lot of outreach through stories that appeared in large newspapers in Brazil, mainly, I believe, due to the awesome work done by Juliana, Otavio and her students, which is one single successful class - with an extraordinary job done, we must say. If we want to duplicate, triplicate, or whatever, the number of classes/professors/students that we consider feasible for Brazil in the coming semesters, I believe the key to success is to work well with the classes that have joined the project. This is what will make people believe in the project, this is what will make professors embrace the idea of using Wikipedia as a teaching tool, and this is the beginning of the path to keeping new editors inside Wikipedia.
Since the end of last year, we have talked to a lot of professors, and we included those who responded most enthusiastically to the project, or in time for us to give the necessary support. Now we have to include in our roadmap better timing for doing that, especially if we are increasing the number of classes in a sustainable way, which I believe we should. --Tom (talk) 17:22, 22 March 2012 (UTC)

Retaining students post-semester


Retaining student editors after the semester would certainly be great, but I think it should be excluded as one of the metrics of success for this particular program. Back in the original design of the Education Program, it was intentionally excluded from the metrics of success, since the aim of the program was to bring in high-quality editors for a short period of time, who would naturally be replaced by the next semester's editors, even if they chose NOT to keep editing. That said, I think it is a measure we should still track when looking at long-term trends. Actually, a group from the Uni. of Michigan just did a research project and found that in fact the students that come from the Education Program were 20000x more likely to continue editing than other first-time editors (4% vs. 0.0002%)! Jwild (talk) 00:19, 16 March 2012 (UTC)

I agree that retaining students shouldn't be a metric of success. However, as Argenton pointed out last Friday: "Como vão manter o projeto sustentável sem esgotar os voluntários?" (How will you keep the project sustainable without exhausting the volunteers?) I assume that no one will be a CA or OA forever, so we should start thinking about multiplying our highly engaged volunteers. As I see it, there are two possibilities. 1) The WMF knows the Brazilian community is too small, so the project will take a lot more time to grow. This possibility doesn't require extra effort to understand what makes a student highly engaged, but the project's future is in danger after the boom phase, when volunteers start to leave. 2) We should start thinking now about how to engage more students as highly engaged volunteers. So far we know that students want some kind of academic reward, and I think Juliana is doing something related at Unirio. Argenton gave some ideas that didn't sound reasonable to me (e.g. lots of lectures) because they look like a huge effort by our small community for an uninterested public. My opinion is that we should collect evidence to understand what could make a student highly motivated and work out solutions for next semester. OTAVIO1981 (talk) 14:40, 19 March 2012 (UTC)
I have a proposal for the future recruitment of ambassadors. I do not know what other universities practice for this purpose. I based my thoughts on common practices in Brazilian universities, which is why it may not be applicable in other countries. Please see this link and make your comments; it is very important for enriching the idea. My excuses, in advance, for the long text. Vini 175 (talk) 23:17, 19 March 2012 (UTC)
Are you sure that is the right link? It is not working for me.... Thanks for taking the time to do this! Jwild (talk) 00:30, 22 March 2012 (UTC)
Now it works! Thank you! :) Vini 175 (talk) 00:34, 22 March 2012 (UTC)

Educational impact of the program

This is just brainstorming for the moment

What do we want from an education program in Brazil? (a) To increase the number of experts in different fields as contributors (professors, teachers and advanced students)? Would this raise awareness of Wikipedia as a teaching and collaboration tool and bring new contributors? Would it bridge the gap between academia and Wikipedia? (b) Do we want to increase the number of articles, or to produce high-quality articles (new and improved) with an educational impact for readers of the Portuguese Wikipedia? Which readers are we aiming at?

I have been thinking about (b), the educational impact on readers, depending on the answer to (a). If the answer to (a) is to increase the number of experts, what is the impact of the content being produced? How many readers do we have for this content (expected to be of high quality!) being created?

Let me take as an example a good article created by advanced physics students during the first phase of the pilot in Brazil, w:pt:Teorema da decomposição de Helmholtz. This article was moved recently to the main namespace, and it was accessed about 6 times per day over the last 4 days, which is not statistically significant at the moment, but my guess is that it won't be accessed as much as more basic subjects in physics. This is an advanced topic which only a few Portuguese speakers can understand or would find useful. At the same time, we have a matrix of physics articles' quality versus importance:

Here we can see there is a huge number (about 150) of articles with importance 4 and quality 1. A matrix like that could pass through a review by more experts, and we could have one for each field of study. If we take as an example a physics topic that is taught at basic school, w:pt:aceleração (acceleration), we can see from the last 90 days of access statistics that it gets from 100 to 500 accesses per day (23.5k in total), which would surely have a bigger impact on readers as a learning tool if the quality of the article were better.

Think of the list of articles for the offline Wikipedia focused on basic schools, and the possibility of having the essential articles organized by experts in each field, as in this particular example:

For sure, having the articles of a particular field of study at high quality would have a bigger impact on readers. Would this reduce the prejudice among teachers and professors regarding Wikipedia? With a strategy of bringing in experts to use Wikipedia as a teaching (professors) and learning (students) tool, would the content created, which can be very specialized, have the same impact for readers?

If we think about new editors, will the program raise enough awareness to bring in new Wikipedians? What is the main interest of an academic in contributing to Wikipedia, or in having their students contribute? Should we focus on university students who will become future teachers at basic schools (licentiates), or should we also involve future researchers and university professors? Would research groups be an interesting group to involve in an education program? Could they be more efficient than an undergraduate class, if we think about the quality of the articles being created?

I have asked more questions here than I have given answers, which I hope we can answer through this second phase of the pilot, to find the best path for an education program in the Brazilian context. Maybe other important questions should be asked as well, so feel free to add them. --Ezalvarenga (talk) 22:34, 5 August 2012 (UTC)

I must completely agree with you, Everton. The education program should focus on the excellent basic content that every encyclopedia has, not on advanced topics that sometimes look like original research. OTAVIO1981 (talk) 20:27, 14 August 2012 (UTC)

Timelines, evaluation of the Program's impacts and its continuity


There's a challenge regarding our capacity to evaluate the program's impacts in parallel with the planning and implementation of the project on a continuous basis: while we should be planning from October on, and recruiting professors and ambassadors throughout November and December for 2013 so we can start the semester at the very beginning of March, when classes start (January and February are terrible months to be in touch with academia in Brazil), we know we won't have enough data and information to evaluate the program and its methodologies by that time. What kind of timelines should we establish, and how can we make changes in the process if we need to start working on 2013 before having conclusions about 2012?

One suggestion is that, for the first semester of 2013, we invest in the continuity of the program and take January, February and March to evaluate the 2012 program in depth, together with professors and ambassadors; if we conclude that major changes should take place, we implement them in the second semester of 2013.— The preceding unsigned comment was added by Ocastro (talk) OTAVIO1981 (talk) 20:32, 14 August 2012 (UTC)

I think we should evaluate last semester to build, from last year's results, our portfolio of successful methodologies. OTAVIO1981 (talk) 20:32, 14 August 2012 (UTC)
+1 Jwild (talk) 00:00, 16 August 2012 (UTC)

Professors and universities are receptive to the program!


Should we update the goal for the # of prominent universities, since we met it last semester? Like, make it 3 prominent universities? And/or should we change the goal to include something about diversity? This is a little retroactive, but we wanted to diversify the group of professors we worked with, which is part of the reason we made the open call. Just some thoughts - neither is imperative! Jwild (talk) 21:38, 29 August 2012 (UTC)

Hi! Good thoughts! I think 3 prominent universities is good. About diversity: I think it was a goal from the first to the second semester, but I'm not sure it's a consistent goal. I'm not sure making it more diverse next year would be a goal, so I'm not sure we would have consistency on it after a while - and then whether it's worth creating it. What do you think? I'm not certain, just thoughts, really. --Oona (WMF) (talk) 15:19, 30 August 2012 (UTC)
I agree that 3 prominent universities is good, but I'm not sure about diversifying. What exactly do you mean? I agree diversity is important, but we should stick to our main goal of improving quality by keeping teachers using the wiki as a teaching tool. Another good metric would be that no teacher leaves the program within 1 year, because that means they are still motivated enough to keep working with us. Everything beyond that, like raising the number of editors, is a great bonus, but still a bonus. OTAVIO1981 (talk) 17:13, 30 August 2012 (UTC)
Ok regarding the diversity of teachers: that's fine! Your comment, Oona, makes me wonder: are these all consistent/long-term goals? I thought we were modifying some of them semester-to-semester?
Re: Otavio: I think 100% professor retention is a bit high and unrealistic, given schedules, but I do think that we should shoot for a higher retention rate. We have the goal "More than 50% of the teachers indicate that they would use Wikipedia as a teaching tool again", but what they say is different from what they do :) Maybe we could add something like "25% of professors from past semesters participating again." Maybe we can check with the US Education Program about a realistic retention rate. Jwild (talk) 14:37, 31 August 2012 (UTC)
You may be right; we're changing somewhat, and it's fine to have specific goals. My only doubt is: shouldn't we focus on the consistent goals and reserve the specific ones for things we want to achieve in a given semester? It would be ok to include diversity; my only doubt is whether this isn't something we have already achieved (would it be good then to set it as a goal?)--Oona (WMF) (talk) 16:27, 31 August 2012 (UTC)
Agree 100% is unrealistic, and something like "25% of professors from past semesters participating again" is good. Oona, I don't believe that we have already achieved any diversity, and I insist that it is a tricky goal. Diversity could mean gender, courses, universities, subjects or Brazilian states, and we don't have the human resources to deal with all of them. I believe this problem of diversity will solve itself when the current courses work by themselves and we are training others. But if you believe it is possible to write something plausible to achieve, please go ahead. OTAVIO1981 (talk) 20:29, 31 August 2012 (UTC)
I thought Jessie meant diversifying the professors involved. Compared to last semester, I would say we achieved that (different courses, different universities, different cities). But then, this single debate shows how difficult it is to define 'diversity' in this context. Jessie, what did you have in mind when you raised this? How do you think we should define and track it?--Oona (WMF) (talk) 20:36, 31 August 2012 (UTC)

Feedback and proposed changes on the WikiProjects measurements and work


I think you're on the right track with these measures. I'm going to go ahead and break down the ones you had into different categories: "Quality", "Activity" and "Retention". Measures of quality will track improvements in articles. Measures of activity will track who is participating, how much, and where, during the study period. Measures of retention will look at how many of these participants continue editing after the study period is over, and what their editing rate is. I'm also going to make some of these measures more specific, link to some external resources that you might find useful, and try to answer the questions you have at the bottom. I'll probably also leave some questions of my own :) Jmorgan (talk) 18:25, 19 December 2012 (UTC)

Thanks, Jonathan! These categories make sense. --Tom (talk) 18:27, 19 December 2012 (UTC)
I will also try to think about how to divide it this way, and I can start editing the main page here. --Tom (talk) 19:17, 19 December 2012 (UTC)
Okay, I've added in some new impact measures for you to consider. Here are a few of my basic questions.
  • do you have a list of Medicine articles that you want to focus on? I can see here that there are many un-assessed medicine articles. It might be worth it to focus on a small set of these at first. Maybe ask community members to 'nominate' a set of 50-100 medicine articles for project members to start on? Then list those articles on the Wikiproject's main page.
  • How long will your study period be? For the Teahouse, our original 'pilot period' was 3 months.
  • I came up with this idea of having experts use the Article Feedback Tool to rate some Medicine articles before and after the study. Do you think this is realistic? I was imagining that you could email a list of links to some of your contacts in universities and hospitals, and ask each person to rate a few articles. I think this could be the best way to measure article quality. But I don't know if it has been done before, or if it sounds realistic to you! I have confirmed with some of the Analytics folks that it is possible to get access to the Article Feedback ratings for Portuguese Wikipedia.
  • This is a bigger question: what is your single, biggest goal for this project? For example, is it more important for you that you get 10 experts to participate, or that you get 100 new editors to participate? Is it more important that you increase the quality of 100 articles during the study period, or that you increase the number of editors who are still editing three months after the study period by 5%? The numbers I'm using here are arbitrary, but you get the idea: deciding which of these goals you are most committed to will help you decide how to spend your time & energy.
This is exciting! Let me know when you want to meet next. Jmorgan (talk) 20:36, 19 December 2012 (UTC)

Article Feedback Tool?


Should we use this? Does someone here know if this project was considered useful? --Tom (talk) 12:59, 8 April 2013 (UTC)

As we discussed in our meeting: I think that if we had independent subject matter experts evaluate articles that project participants have worked on it could be useful. But we would probably have to have them look at a large number of articles in order to be able to show a difference, because the AFT rating scale is very coarse-grained. To throw out a wild estimate: if we had at least 10 experts evaluate 20 articles apiece, we have a chance of identifying an overall change in quality (if there is one). But that may still be too small a sample, and we don't yet know how many articles project participants will work on.
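A quick way to sanity-check that wild estimate is to simulate it. The sketch below is illustrative only: the baseline rating, the spread of ratings, and the size of the quality shift are invented numbers, and the "significance" check is a crude normal approximation rather than a proper test.

```python
import random
import statistics

def detection_power(n_ratings, true_shift, sd=1.0, trials=2000):
    """Estimate the chance that a before/after shift of `true_shift` in mean
    AFT-style ratings shows up as 'significant' (crudely: the observed
    difference exceeds 1.96 standard errors) with n_ratings per group."""
    hits = 0
    for _ in range(trials):
        before = [random.gauss(3.0, sd) for _ in range(n_ratings)]
        after = [random.gauss(3.0 + true_shift, sd) for _ in range(n_ratings)]
        se = (statistics.pvariance(before) / n_ratings
              + statistics.pvariance(after) / n_ratings) ** 0.5
        if abs(statistics.mean(after) - statistics.mean(before)) > 1.96 * se:
            hits += 1
    return hits / trials
```

Under these made-up assumptions, a modest quality shift is detected far more reliably with a couple of hundred ratings than with a few dozen, which supports the "may still be too small a sample" worry above.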
Another consideration: we should probably only look at articles that have had a significant amount of content contributed by project participants; if someone adds a single source or a sentence or two, it's unlikely that their contribution will make a big enough difference that an evaluator would rate the article differently.
And a third consideration: AFT doesn't provide the kind of rating scheme that an expert would find most useful. The versions I've seen (not sure what the final one looked/looks like?) had scales for 'trustworthiness', 'completeness', etc. Experts would probably want to use different criteria.
The more I think about this, the less useful it seems :) However, it may have been tried or considered before, and we should look into that: folks like LiAnna, Jami, Sage or Frank may know more about whether this is useful. Jmorgan (WMF) (talk) 03:21, 12 April 2013 (UTC)

Period to measure the WikiProject


The bot was ready at the end of last week, 4 April. I have just called the person in charge of communication at the Ministry of Health, and she is on vacation until 15 April. We need to finish the folders and e-mails that will be sent to the 60+ school hospitals and medicine associations by 15 April, I think.

That said, my estimate for when we can start measuring is from 22 April to 22 July (3 months). Small note: this is a project I believe is worth trying, so it is possible I will have to continue helping in my spare time. --Tom (talk) 13:20, 8 April 2013 (UTC)

Unfortunately I could not get an answer from the person in charge of helping us with outreach until last week, and she missed our meeting last Friday. I will hopefully meet her today, 29 April, and will let you know the results. --Tom (talk) 09:45, 29 April 2013 (UTC)

Some questions I've asked myself (to be improved/answered)

  1. How can we measure whether the group's article improvements are successful?
  2. Can we identify the critical-mass threshold needed to form a self-organized group?
  3. What if the group formed focuses only on maintenance tasks?
  4. How can we better stimulate the group?
  5. How can we stimulate the community to develop outreach activities?
  6. What are the best ways to use bots to reduce the manual work of volunteers?

--Tom (talk) 13:32, 8 April 2013 (UTC)

Teahouse for some inspiration


--Tom (talk) 13:42, 8 April 2013 (UTC)

Does average bytes added per article mean something?


Initially I proposed adding an average of 10k bytes per article, inspired by the thoughtful WEP goals. But I am really unsure how relevant this is and how we can use bytes added as a way to measure success and quality. For instance, we can have an article where lots and lots of references are added, but that does not necessarily mean many bytes. Jonathan has also noted: "I agree that bytes added is not a very good measure. # edits and # articles edited per editor might be better. Also, see my idea above for using AfT ratings to measure quality."

Any thoughts on this? --Tom (talk) 14:28, 8 April 2013 (UTC)

Bytes is a very rough measure, and it may be inaccurate because the 'byte count' of an edit presented in the page edit history (and in the database) only compares the number of bytes the page had before the edit with the number it had after. So, if someone removes 10 bytes of content and then adds 10 bytes of content in the same edit, that edit will be recorded as having added 0 bytes! Evan Rosen has developed a script for the Wikipedia Education Program that looks at other measures, such as characters added, and it also provides a more accurate count of the number of bytes someone contributed to an article with each edit. He's still working out some of the kinks in the script, and all of these calculations are time-consuming and prone to errors, but hopefully when it comes time to run this analysis several months from now we will have a tool that quantifies the amount of content contributed by an editor to a page. Let's revisit this idea after launch. Jmorgan (WMF) (talk) 03:10, 12 April 2013 (UTC)
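A small illustration of the problem described above (the strings here are made up): the history's byte count is just size-after minus size-before, while a diff-based count of inserted characters, closer in spirit to what a characters-added script would compute, still sees the replaced content.

```python
import difflib

def net_byte_delta(old: str, new: str) -> int:
    """What the page history records: page size after minus size before."""
    return len(new.encode("utf-8")) - len(old.encode("utf-8"))

def chars_inserted(old: str, new: str) -> int:
    """Count characters actually inserted, via a diff.

    Characters added as part of a replacement count as insertions."""
    matcher = difflib.SequenceMatcher(a=old, b=new)
    added = 0
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op in ("insert", "replace"):
            added += j2 - j1
    return added

# Removing 10 characters and adding 10 different ones in a single edit:
old = "aaaaaaaaaa some stable text"
new = "bbbbbbbbbb some stable text"
```

For this edit `net_byte_delta` reports 0, exactly the failure mode described above, while the diff-based count reports the 10 characters that were really contributed.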

Tool to measure WikiProjects


Please, see here. --Tom (talk) 17:28, 8 April 2013 (UTC)

I've re-posted the table here (where it belongs!) and added some new datapoints and better description. Jmorgan (WMF) (talk) 04:38, 12 April 2013 (UTC)

Questions about WikiprojectBot and metrics


I would like to get some information about how the bot will select people to invite, and how the data will be stored and reported. We should make sure we're doing this the right way now, so we don't realize we're missing something important later on. There's not a whole lot of documentation for the code yet (which is understandable, but should be rectified), so some of this may have been addressed already in the script.

1. Does the bot invite blocked editors?

We should avoid this. It looks bad to the community if a bot is sending cheery welcome messages to a known vandal. I had this issue with HostBot. You can get an editor's block status through the API.
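A sketch of how that API check might look. The query shape follows the MediaWiki `list=users` module with `usprop=blockinfo`; field names such as `blockid` should be verified against the live API before the bot relies on them.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://pt.wikipedia.org/w/api.php"

def block_query_url(usernames):
    """Build a list=users query asking for block info (up to 50 users)."""
    params = {
        "action": "query",
        "list": "users",
        "ususers": "|".join(usernames),
        "usprop": "blockinfo",
        "format": "json",
    }
    return API + "?" + urlencode(params)

def blocked_users(api_response):
    """Return the users the API reports as blocked.

    In list=users output, a blocked user's entry carries block fields
    such as 'blockid' (assumed here; check against the live response)."""
    return {u["name"] for u in api_response["query"]["users"] if "blockid" in u}

def fetch_blocked(usernames):
    """Live check (needs network); the bot would skip everyone returned."""
    with urlopen(block_query_url(usernames)) as resp:
        return blocked_users(json.load(resp))
```

The bot would filter its candidate list through `fetch_blocked` before sending any welcome message.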

It's not functional yet but will be soon. That point is important and has a high priority.
2. How does the bot distinguish between new editors and veteran editors?

We don't want to send invitations to veteran Wikipedians. I suggest that the bot only invite editors who have fewer than 100 total edits (all namespaces). That's a threshold which has been used in previous pilots to distinguish 'newbies' from veterans. You can also use registration date: in that case, the bot would invite new users who had been around for less than 3 months and had edited a medicine article. This may be addressed in the code and/or the bot request, but I'd appreciate it if someone could post an answer here as well.
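Put together, the eligibility rules discussed in questions 1-3 might look like the sketch below. The user-record field names (`editcount`, `registration`, `blocked`, `anon`) are hypothetical, and the 100-edit and 3-month thresholds are the ones suggested above.

```python
from datetime import datetime, timedelta

MAX_EDITS = 100              # 'newbie' threshold suggested above
MAX_AGE = timedelta(days=90) # roughly 3 months since registration

def is_invitable(user, now=None):
    """Decide whether a user should receive a WikiProject invite.

    `user` is a dict with hypothetical fields: 'editcount' (int),
    'registration' (datetime), 'blocked' (bool), 'anon' (bool)."""
    now = now or datetime.utcnow()
    return (
        not user["anon"]                        # no IP editors
        and not user["blocked"]                 # no blocked editors
        and user["editcount"] < MAX_EDITS       # newbies only
        and now - user["registration"] <= MAX_AGE
    )
```

However the bot actually stores user records, the decision should stay in one small function like this so the thresholds can be tuned in one place during the pilot.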

A period of 3 months seems short for the activity observed. At first view, over a 4-month period there is a group of 108 editors; it can be greater if we add medicine-related categories. In this group the average number of revisions is ~3.5.
Jonas, who are these 108 editors? Anyone who has made an edit on a WikiMedicina article in the past 4 months? HAndrade (WMF) (talk) 06:28, 19 April 2013 (UTC)
Exactly, Henrique. I'm not sure yet how many articles we are talking about; ~300 pages. --Jonas (WMF) (talk) 17:41, 19 April 2013 (UTC)
3. Does the bot invite IP editors?

I imagine the answer to this question is "of course not!" but it doesn't hurt to check :)

The bot doesn't invite anonymous editors; there are many vandalism messages left for such users, and it is unclear how effective these invites would be.
4. Excluding IPs, veterans and blocked editors, how many people are eligible to be invited every day?

This is the most time-critical question. If our current criteria (whatever they are!) only get us a few potential invitees every day, we need to find other sources of invitees. I think that if we have fewer than 50, we need to get more! One way to get more new editors to invite is to also invite all new editors who have made a certain number of edits: for example, all editors who joined within the past 4 days, have made at least 10 edits, and have not been blocked. We also need to know how opening up our criteria affects our daily average, so we will need to do some experiments. This is easier to do through the database than the API: if you want my help, just ask!

It's an important aspect, which I'm including in the next version. Yes, I saw how much easier it can be through the database.
5. How will we generate incremental reports?

At various points during the pilot, we will want to know the answers to basic questions that we can only get at by looking at our data. Questions like: how many people have we invited so far? How many of them have shown up? And even more sophisticated questions that require us to compare our data with the data stored in the pt.wikipedia database. It's easy to query and join MySQL tables, or to dump a MySQL table into a spreadsheet and look at it. With Shelve and other flat-file data storage engines, this functionality will have to be scripted. Ideally, someone other than the bot owner should be able to run queries or generate these reports, but as long as the bot owner is willing to provide reports or simple counts when someone asks for them, that should be fine. Jmorgan (WMF) (talk) 05:06, 12 April 2013 (UTC)
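As a sketch of what the relational route buys us, here is the same idea with sqlite3 standing in for MySQL; the table layout and column names are invented for illustration, not taken from the bot.

```python
import sqlite3

# In-memory stand-in for the bot's log; a MySQL table would look the same.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE invites (username TEXT, invited_on TEXT);
CREATE TABLE project_edits (username TEXT, edited_on TEXT);
INSERT INTO invites VALUES ('Ana','2013-04-22'), ('Bruno','2013-04-23'),
                           ('Carla','2013-04-23');
INSERT INTO project_edits VALUES ('Ana','2013-04-24'), ('Ana','2013-04-25'),
                                 ('Carla','2013-04-26');
""")

# How many people have we invited so far?
invited = db.execute("SELECT COUNT(*) FROM invites").fetchone()[0]

# How many of them have shown up (made at least one WikiProject edit)?
showed_up = db.execute("""
    SELECT COUNT(DISTINCT i.username)
    FROM invites i JOIN project_edits e ON e.username = i.username
""").fetchone()[0]
```

Each incremental report becomes a one-line query, and anyone with database access can run it, which addresses the "someone other than the bot owner" point.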

I can create the scripts that will generate the reports on a daily basis. I'm just not sure how the bot is logging data, but I think Jonas can easily explain it to me. Also, we need to decide what we are going to measure. I'll put the ideas of Jmorgan, Tom and Henrique in bullets so we can discuss them.
  • how many have we invited so far?
  • how many of the invitees have shown up?
    Clickthroughs/pageviews are difficult to measure. Let's look at edits to the WikiProject page as an indication that someone 'showed up.' Jmorgan (WMF) (talk) 16:53, 15 April 2013 (UTC)
  • how many of the email invitees created a username?
  • how many of the email invitees subscribed to the WikiProject?
  • how many of the email invitees made at least 1 edit on the WikiProject?
  • how many users have made > 5 edits per month to the WikiProject pages
  • how many new users have made > 5 edits per month to the WikiProject pages
  • how many invited users have made > 5 edits per month to the WikiProject pages
  • how many email invited users have made > 5 edits per month to the WikiProject pages
  • how many email invited users have made > 5 edits per month to Wikipedia
  • how many users have made > 100 edits per month to the WikiProject pages
  • how many new users have made > 100 edits per month to the WikiProject pages
  • how many invited users have made > 100 edits per month to the WikiProject pages
  • how many email invited users have made > 100 edits per month to the WikiProject pages
  • how many email invited users have made > 100 edits per month to Wikipedia
  • how many Medicine articles edited
  • how many Medicine articles edited by WikiProject members
  • how many Medicine articles edited by invitees
  • how many topics being discussed on WikiProject talk page
  • how many topics being discussed by invitees on WikiProject talk page
  • how many topics being discussed by email invitees on WikiProject talk page
  • how many articles exist on WikiProject
  • how many new articles were created on WikiProject
  • how many old articles were categorized on WikiProject
  • how many good articles on WikiProject
  • how many featured articles on WikiProject
  • how many invited users are watching the WikiProject page?
  • how many email invited users are watching the WikiProject page?

HAndrade (WMF) (talk) 18:58, 12 April 2013 (UTC)
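Several of the bullets above are the same computation with different filters. As a sketch, here is one of them, "how many invited users have made > 5 edits per month to the WikiProject pages", computed over a hypothetical in-memory edit log rather than the real database.

```python
from collections import Counter

def active_invited(edits, invited, threshold=5):
    """edits: iterable of (username, 'YYYY-MM') pairs, one per
    WikiProject-page edit. invited: set of invited usernames.
    Returns {month: number of invited users with > threshold edits}."""
    per_user_month = Counter(
        (user, month) for user, month in edits if user in invited
    )
    result = Counter()
    for (user, month), n in per_user_month.items():
        if n > threshold:
            result[month] += 1
    return dict(result)
```

Swapping the `invited` filter for "email invited", "new users", or no filter at all, and the threshold for 100, covers most of the bullet list with the same function.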
