Learning patterns/Coordinating Amongst Student Courses

A learning pattern for education
Problem: Most English-language Wikipedia articles related to aquatic science topics are difficult to read, poorly organized, and/or missing information. Indeed, >70% of articles assessed by WikiProject Limnology and Oceanography (WP L&O) have only a rough collection of information, lack relevant citations, and/or are deficient in key elements such as diagrams or information boxes. Aquatic Wikipedia pages are often the first hit of a search engine query, and given that people who are curious about science topics use Wikipedia to learn more about natural phenomena they observe, there is an open need to increase the availability and readability of aquatic-related information on Wikipedia.
Solution: Foster a natural synergy among our target populations (scientists, educators, and motivated learners) that will increase quality aquatic-related information on Wikipedia. To accomplish this we will connect aquatic scientists from WP L&O to instructors and students working on aquatic-related Wikipedia editing projects using the Wiki Education dashboard and associated tools. The program functions similarly to a scientific journal, where a subject matter editor (our proposed coordinator) finds reviewers (professional aquatic scientists from WP L&O) to help improve content contributed by the authors (students participating in our pilot program).
Creator: Jayzlimno
Created on: 18:59, 9 January 2022 (UTC)
Status: DRAFT

What problem does this solve?


This program is designed to provide a peer support network and coordinate among themed student courses to add subject-specific content to Wikipedia. The examples we provide are for the aquatic sciences, but we think the pattern could be applied to similarly scoped subjects (e.g. geology, botany, sociology, etc.).

What is the solution?


Assigning students the task of improving aquatic Wikipedia pages ensures that content will be added, and connecting the students to professional aquatic scientists ensures that the content added is reputable. Volunteer peer review is a service that aquatic scientists already provide for scientific publications, and our recruitment videos will explain the importance of extending this review service to Wikipedia content. Furthermore, crowd-sourced content review orchestrated by our coordinator relieves pressure on the course instructors to be subject matter experts for all aquatic topics. This opens up opportunities for incorporating Wikipedia editing in the classroom to instructors who may be intimidated by the thought of reviewing content across a wide range of aquatic topics.

Things to consider

  • A dedicated project coordinator is essential for coordinating across courses.

Support for Educators

  • Support for the educators included written resources (available via the Google Drive folder), a Slack channel, and on-demand support by email.
  • The project coordinator also sent short weekly emails of resources/links to the educators throughout the term, and put together handouts on helpful topics that were distributed along with the weekly emails.
  • The support that the educators needed varied widely. Some needed little or no support, while others were new to Wikipedia and needed quite a bit of help with certain tasks (e.g. selecting articles, deciding whether to run assignments individually or in pairs, etc.).
  • We recommend an approach similar to the following: create and maintain a set of online resources, then send weekly (or biweekly) check-in emails to maintain contact with the educator team. Educators viewed the project coordinator as more of a resource when the coordinator was the one actively maintaining contact with them.

Support for External Reviews

  • External reviews were quite time consuming, and if this is something that will be offered to multiple courses, a dedicated point person is a necessity. There are a lot of moving parts, and having one person who acts as a point of contact and keeps an eye on all of the day-to-day details is really important – otherwise, the reviews can start to get messy (e.g., reviewers not completing assignments, reviews not ending up in the right place, etc.).
    • There’s an “economy of scale” that comes from having one person do this for a bunch of classes, but an alternative in future might be to have each instructor find one point-person for their class – it could be the instructor themselves, or could be a graduate student or TA – who recruits reviewers and coordinates the reviews for all of the students in that class.
  • One semester, our reviewer pool was small, and we asked each reviewer to do three articles on average. This was a pretty big ask of the reviewers: three reviews could take quite a bit of time to complete.
  • When we had more reviewers available, we asked reviewers to do one article each, although we leaned on more experienced Wikipedians to do two reviews if needed (or, when articles were very short, grouped them into pairs). This seemed to be a much more reasonable time commitment for the reviewers.

Other Information Regarding External Reviewers

  • Advertising for external reviewers via Twitter was very successful (we got about 40 responses!), but reviewer expertise doesn't always match well with the topics of the students' articles. Especially at the end of the review period, reviewers were asked to go outside of their comfort zones/expertise.
  • Balancing the number of reviewers against the number of reviews got tricky. We didn't want so many reviewers that anyone went unused, but we also wanted to make sure that we didn't have to ask anyone to do a bunch of extra work. Think carefully about how many articles you will have and estimate how many reviewers are needed (within a range).
  • Some classes had delays with their Wikipedia assignments, so their external review period was rescheduled (sometimes several times, and sometimes at the last minute). Due to changes in the schedule, we lost some reviewers who were no longer able to participate, and also had to ask some reviewers to wait quite a bit longer than they’d anticipated for their assignments.
  • Because of the delays, the review period was quite long (~1.5 months). Throughout the review period, our project coordinator would spend about an hour every couple of days sending review assignments, answering emails, posting reviews, tracking review status, etc. – and some days would take longer due to questions from reviewers, rescheduling/reassigning, and so on. As a result, the review period consumed a lot of the project coordinator's hours, and it was tricky to anticipate when extra time would be needed.
  • Reviewer attrition. This is unavoidable, but a total pain. Most reviewers were just great and went above and beyond, all before the deadline – but a few never responded to emails, or never sent in a final review. So, there can be a bit of a scramble to reassign reviewers and get feedback to students before their class deadlines. Having a small pool of people on standby to help out in a pinch was a big help – some reviewers said they were happy to do another review if needed, so our project coordinator would respond that they were putting them on the standby list – and in the end, our coordinator had to draw on all of them.
  • Reviewers who didn't have a Wikipedia account required some extra work on our project coordinator's end. Instead of posting their comments directly to a Wikipedia talk page, they'd send their feedback to our coordinator, who copied and pasted it into the talk page for them. Some reviewers completed their reviews in Word, using tracked changes and/or comment bubbles that wouldn't copy over to Wikipedia very easily, so in those cases our coordinator would send the feedback to the instructor by email instead. We recommend that all reviews be returned in a plain-text document with no tracked changes to make the copy/paste process more straightforward for whomever handles the returned reviews.
  • Because of the above notes, the number of hours needed to run an external review is somewhat unpredictable! Estimate on the high end: our project coordinator spent about 100–130 hours coordinating external reviews across two semesters and 16 courses (284 students total, 173 unique Wikipedia articles).
  • Also, if you decide to go the route of one coordinator managing the reviews for a number of classes, capping the number of reviews that are supported is a good idea (we did 64 in the busier semester, which felt busy but manageable, so a cap of about 80 per semester seems reasonable).
  • A number of reviewers have expressed an interest in participating again in future semesters, and we received a lot of very complimentary feedback on the project from the reviewers as well!

When to use


Endorsements


See also


References
