Looking back at the Classroom Program in 2015

Helaine Blumenthal

Wiki Ed supported more students in 2015 than ever before. With improvements in our tools and resources, we’ve been able to maintain quality work from those student editors.

It was a year of rapid growth and considerable change. In one term, the courses we supported rose from 117 (spring 2015) to 162 in fall 2015 — an almost 40% increase. The number of students enrolled in those courses rose from 2,290 in spring to 3,710 in the fall — a 60% increase. Closing the gender content gap, one of our major initiatives, has also seen great success. We supported 34 women’s studies classes in 2015. In those classes, 907 students improved 696 articles and created 89 new ones.

Those numbers tell a story of incredible growth supported by more resources available to classes than we’ve ever been able to offer. These students contributed 5.15 million words to Wikipedia, improved 10,200 articles, and created 1,019 new entries. Printed and bound as a book, that would be just over 7,000 pages, or 13 days of silent reading.

But there’s a quirk in that story.

When we compared Fall 2015 to Fall 2014, we saw that there were nearly 1,000 more students in Fall 2015. That’s great news. But the weird thing was that these students didn’t seem to be contributing as many words as their cohort from Fall 2014.

We scratched our heads. Are our new resources causing a reduction in student contributions? We’ve always known that growth alone does not a success make. To keep quality on pace with quantity, we launched our Dashboard in the spring. That’s helped instructors create better Wikipedia assignments and track student work. Moving from Wikipedia to our Dashboard was a major change to the Classroom Program. It has been a big improvement for instructors, students, and the community.

And yet, students contributed less content.

We wondered if the course creation tool was making it so easy for instructors to design courses that they were building assignments that didn’t ask students to contribute as much as we know they can.

We also changed the way students take, and instructors track, the online training. As a result, many of our courses have added the training as a graded requirement. In the spring, 52% of the students we supported completed the training, and 74% completed it in the fall. We think that’s led to higher-quality contributions and fewer issues involving student work. But has it led to fewer contributions?

It might have remained a mystery, but luckily, Wiki Ed brought on Kevin Schiroo, our resident data detective. His first case was to examine what happened to content this term. After all, our chief focus is on improving Wikipedia in ways that tap students’ abilities and give students the confidence to make bold contributions of their knowledge to the public sphere. We want to make sure students in these courses are challenged to contribute the quality and quantity of work we know is possible.

What Kevin told us was kind of amazing. It wasn’t that students were asked to contribute less this term. It wasn’t that we had discouraged bold editing through our online training or classroom resources.

The issue was that their content was being challenged less often, leading to fewer wholesale revisions of content. We always encouraged students to contribute to Wikipedia in the spirit of a peer review. That’s one of the great learning experiences the assignment carries. In the past, students made contributions, which were then questioned by other editors. In the fall of 2014, our students were more likely to revert those changes without discussing the reasoning behind them. This resulted in stressful back-and-forth reversions, and even edit wars.

For each term, Kevin made a list of all the articles students had worked on. From that list, he pulled all the revisions that were made to those articles during the term to find and remove reverted edits. With a list that was clear of any unproductive contributions, he was able to tally all of the words that were added to the page by students, knowing that anything that remained was a productive edit. Counting in this way, the difference in words added between the two terms became significantly smaller. Kevin concluded that the reverted content had been inflating the productivity of Fall 2014 compared to Fall 2015.
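In code, the approach Kevin used might look roughly like the sketch below. This is not Wiki Ed’s actual analysis script: the fetch_revisions helper, the identity-revert heuristic, and the bytes-to-words conversion are all assumptions made for illustration.

```python
# A rough sketch (not Wiki Ed's actual code) of counting "productive" words:
# pull every revision to the articles students worked on, drop reverted edits,
# then tally the words added by student accounts.
# Assumes a helper fetch_revisions(title, start, end) returning a chronological
# list of revision dicts with 'revid', 'user', 'sha1', and 'size' fields
# (as the MediaWiki API provides), with the last pre-term revision first so it
# can serve as a size baseline.

def productive_words_added(articles, student_usernames, start, end, fetch_revisions):
    """Count words students added during the term, excluding reverted edits."""
    total_words = 0
    for title in articles:
        revisions = fetch_revisions(title, start, end)
        if len(revisions) < 2:
            continue

        # Identity-revert detection: if a later revision restores an earlier
        # revision's content hash, everything in between (and the restoring
        # edit itself) is treated as unproductive.
        first_seen = {}
        unproductive = set()
        for i, rev in enumerate(revisions):
            sha = rev["sha1"]
            if sha in first_seen:
                for undone in revisions[first_seen[sha] + 1 : i + 1]:
                    unproductive.add(undone["revid"])
            else:
                first_seen[sha] = i

        # Tally positive size changes made by students on edits that stuck.
        prev_size = revisions[0]["size"]  # baseline: last revision before the term
        for rev in revisions[1:]:
            delta = rev["size"] - prev_size
            prev_size = rev["size"]
            if rev["revid"] in unproductive:
                continue  # skip contributions that did not stick
            if rev["user"] in student_usernames and delta > 0:
                total_words += delta // 6  # rough bytes-to-words estimate
    return total_words
```

Counted this way, content that was reverted and then re-added no longer shows up twice, which is the effect that had been inflating fall 2014’s numbers relative to fall 2015.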

We heard and responded to those concerns from 2014 throughout 2015. We created a series of red flags for onboarding classes, improved our student and instructor training, and created tools to track student contributions more efficiently. We’re serious about making sure Wikipedia assignments benefit Wikipedia, as well as student learning outcomes.

This term, we’re seeing the fruits of those efforts to improve contributions. Students are getting their contributions right, and when they aren’t, they’re more likely to discuss community edits appropriately. The result: a whopping 40% of the drop in student content contributed to Wikipedia this term comes from students following better Wikipedia etiquette. We’ve seen a real drop in reversions and problematic edits.

The content they’ve contributed may fill fewer books than students wrote in 2014, but the books they’re writing require less revising from long-term Wikipedia editors. And those books would hold some incredible, and diverse, content: a detailed description of the surface of Venus, a history of Western Canada, lots of information about women scientists, and coverage of Japanese theater (some of it drawing on sources translated from Japanese). They would also include a lot of information about bees and mass spectrometry.

In hindsight, it’s been a great year for student contributions on Wikipedia. In fact, we were surprised to see just how much the support efforts have paid off. It makes us even more confident that we can continue to grow through 2016 while maintaining excellent student contributions. We’re constantly expanding tools and resources to make sure this trend continues.

We’re still looking for courses in all subjects. Our Year of Science is especially aimed at supporting courses in STEM and other science-related fields. We’d love to hear from you.

Helaine Blumenthal
Classroom Program Manager


3 thoughts on “Looking back at the Classroom Program in 2015”

  1. Helaine:

    Great article! I enjoyed reading about how the team analyzed the data, and then looked more closely to better understand the teaching and learning patterns.

    Is there a way to revisit the earlier data recording student contributions and separate out student contributions vs. subsequent reversion contributions? Maybe the category we are really trying to measure is something like “non-conflict contributions”?

    Yours,
    Bob

    1. Hi Bob,

      Helaine asked me to answer your question since how we measure impact is my area of responsibility.

      It’s a good question, but I’m hesitant to re-run numbers from past terms. Here’s why: We’ve always known that the “characters added” measurement is imperfect. It over-counts when students get reverted and add their work back in, as this post mentions, but it also under-counts in very common situations: student editors aren’t logged in when they make their edits, they never actually enroll on the course page so we don’t have their username in our cohort, or they lose their password and end up creating a new account that is one character off from the username listed on the course page, so it’s obviously them to their instructor but not recognized by our system. It also doesn’t acknowledge at all the work that goes into media assignments, since they add very few characters to Wikipedia but a lot of *value* to the article. There’s also the question of where you draw the line on productive student contributions: is an infobox with only one or two fields filled out worth the characters that dropping it into the article adds? Adjusting one element of this still leaves a flawed measurement. Note that we’re still comparing apples to apples: we’re still measuring the number of characters added to the article namespace by student editors that term.

      If we were to change what we measure, I’d like to think about it more holistically to address all the challenges in the current measurement. We have discussed creating some sort of “content improvement index” that provides a much less flawed measure of student impact. We’d need to do some significant research to figure out the right formula, but it would assign points for contributions based on factors like how frequently references are added, how much content sticks, images or videos created, adding existing images to articles, how many views the article gets so there’s also a measurement of impact on readers, etc. If we can figure this formula out, that’s when I think we should re-run past terms’ data so we get a better comparison.
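      Just to make the idea concrete, here is a purely hypothetical sketch of what such an index could look like as a weighted score. Every factor name and weight below is made up for illustration; the real formula would have to come out of that research.

      ```python
      # Hypothetical sketch of a "content improvement index" as a weighted score.
      # Factor names and weights are invented for illustration only.

      def content_improvement_index(contribution):
          """Score one student contribution as a weighted sum of quality signals."""
          weights = {
              "references_added": 3.0,   # how frequently references are added
              "words_retained": 0.01,    # content that sticks after community review
              "media_created": 5.0,      # images/videos: few characters, lots of value
              "media_placed": 2.0,       # adding existing images to articles
              "monthly_views": 0.001,    # impact on readers, not just on the article
          }
          return sum(weight * contribution.get(name, 0)
                     for name, weight in weights.items())

      # Example: 4 new references, 800 retained words, one uploaded image,
      # and 12,000 monthly views on the improved article.
      print(content_improvement_index({"references_added": 4, "words_retained": 800,
                                       "media_created": 1, "monthly_views": 12000}))
      ```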

      LiAnna

      1. Hi LiAnna and All:

        A “content improvement index” sounds like a very compelling idea! I would think that the index could apply to both classroom and non-classroom Wikipedia contributions. It would be very interesting to compare those values, to present a more complete picture of the thought processes and working habits of both sets of contributors.

        Yours,
        Bob
