A few months ago at PubComm, on a whim, I attended a workshop titled “Avoiding the Seven Deadly Sins of Wikipedia: Understanding and Working with Wikipedia Culture” by Mary Mark Ockerbloom, the Wikipedian-in-Residence at the Chemical Heritage Foundation. While I had only contributed to an informal school-specific wiki once, for a class, I was swept up in her descriptions of writing and citation standards, the vigilance of power users, and especially the unbalanced demographics of Wikipedia's editors and how those demographics affect which topics get covered, and to what extent. Ockerbloom showed statistics indicating, for example, that 85-90% of Wikipedia editors who report their gender are men. The number of articles about men versus people of other genders, and their respective lengths and depths, reflects this imbalance.
I think this issue looms over any crowdsourcing project: how do the users shape what work gets done? To be fair, this affects any project; people naturally gravitate toward work that is more relevant or interesting to them. At the scale of crowdsourcing, though, these biases become far more visible. How can this be fixed, so that the products of Wikipedia editing are distributed more evenly?
One solution is Wikipedia edit-a-thons, such as those organized by the Wikipedia Rewriting Project, which runs drives to write about underrepresented subjects, such as women and people of color. These events have gained remarkable traction in the past few years and have contributed a wealth of new articles to Wikipedia; however, they ultimately make only a small dent. Could there be another model, beyond simply encouraging people to write about underrepresented topics or recruiting more women and people of color as editors?
I was struck by the simplicity of other crowdsourcing projects, such as Building Inspector. Through this site, which seeks to improve map-reading AI, users can identify colors on a map, fix building footprints, and transcribe addresses. Users choose the task they want to do, and the website presents a small area of a map on which to complete it. Because the site automatically serves up different areas of the map, user preference for location does not factor into which work gets done. It is a fairly mindless activity: users can click through and contribute to the digital humanities without much deliberation or energy.
Could some of this model be translated to Wikipedia? Perhaps it could, for minor edits such as proofreading and finding citations. A user could log on and be presented with a random paragraph or “[citation needed]” marker, then proofread the paragraph for comprehension or attempt to find a citation for the claim. This would lower the barrier to contributing, particularly for those who lack the time, energy, or knowledge to edit Wikipedia more fully. It would, of course, take more effort than clicking on a map, and it does not solve the problem of who submits content in the first place.
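To make this concrete, here is a minimal sketch of what such a micro-task dispatcher might look like. Everything in it is invented for illustration: the task data, the function names, and the review queue are all hypothetical, not part of any real Wikipedia tool.

```python
import random

# Hypothetical pool of flagged snippets, analogous to Building Inspector
# handing out random map tiles instead of letting users pick locations.
TASKS = [
    {"article": "Ada Lovelace",
     "snippet": "She anticipated general-purpose computing.[citation needed]"},
    {"article": "Mary Somerville",
     "snippet": "Her books were widely used as textbooks.[citation needed]"},
]

def next_task(rng=random):
    """Serve a random flagged snippet, so that user preference for topic
    does not determine which claims get checked."""
    return rng.choice(TASKS)

def submit_citation(task, url):
    """Record a proposed citation. A real system would queue this for
    review by experienced editors rather than applying it directly."""
    return {"article": task["article"],
            "snippet": task["snippet"],
            "citation": url}

# Usage: a contributor gets a random claim and proposes a source for it.
task = next_task()
proposal = submit_citation(task, "https://example.org/source")
```

The key design choice, borrowed from Building Inspector, is that the system, not the contributor, chooses what to work on, which is exactly what keeps individual preferences from skewing the coverage.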
Of course, the fact that Wikipedia contains a significantly larger share of articles about white men is a problem much larger than Wikipedia itself: the patriarchy, white supremacy, and other forms of oppression all play a role in Wikipedia's editors both being, and writing primarily about, white men. These hegemonies must ultimately be dismantled, but in the meantime, let's all go join edit-a-thons!