Wikipedia’s volunteer editors are fleeing online abuse. Here’s what that could mean for the internet (and you)
We’re now sadly used to seeing toxic exchanges play out on social media platforms like X (formerly Twitter), Facebook and TikTok.
But Wikipedia is a reference work. How heated can people get over an encyclopedia?
Our research, published today, shows the answer is very heated. For example, one Wikipedia editor wrote to another:
i will find u in real life and slit your throat.
That’s a problem for many reasons, but chief among them is if Wikipedia goes down in a ball of toxic fire, it might take the rest of the internet’s information infrastructure with it.
The internet’s favourite encyclopedia
In some ways, Wikipedia is both an encyclopedia and a social media platform.
It's the fourth most popular website on the internet, behind only such giants as Google, YouTube and Facebook.
Every day, millions of people worldwide use it for quick fact-checks or in-depth research.
And what happens to Wikipedia matters beyond the platform itself because of its central role in online information infrastructure.
Google search relies heavily on Wikipedia, and the quality of its search results would decrease substantially if Wikipedia disappeared.
But it’s not just an increasingly authoritative source of knowledge. Even though we don’t always lump Wikipedia in with other social media platforms, it shares some common features.
It relies on contributors to create the content that the public will view and it creates spaces for those contributors to interact. Wikipedia relies solely on the work of volunteers: no one is paid for writing or editing content.
Moreover, no one checks the credentials of editors — anyone can make a contribution. This arguably makes Wikipedia the most successful collaborative project in history.
However, the fact that Wikipedia is a collaborative platform also makes it vulnerable.
A 2015 survey found 38% of surveyed Wikipedia users had experienced harassment on the platform.
What if the collaborative environment deteriorates, and its volunteer editors abandon the project?
What effect do toxic comments have on Wikipedia’s editors, content and community?
Abusive comments lead to disengaging
To answer this question, we started with Wikipedia's "user's talk pages". A user's talk page is a place where other editors can interact with the user. They can post messages, discuss personal topics, or extend discussions from an article's talk page.
Every editor has a personal user's talk page, and the majority of toxic comments made on the platform are on these pages.
We collected information on 57 million comments made on the user’s talk pages of 8.5 million editors across the six most active language editions of Wikipedia (English, German, French, Italian, Spanish and Russian) over a period of 20 years.
We then used a state-of-the-art machine learning algorithm to identify toxic comments. The algorithm looked for attributes a human might consider toxic, like insults, threats, or identity attacks.
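The article does not name the specific classifier, but one widely used tool that scores exactly these attributes (toxicity, insults, threats, identity attacks) is Google's Perspective API. The sketch below shows how a single comment could be scored this way; the API key and the 0.8 "toxic" threshold are placeholders, not values from the study.

```python
# Illustrative sketch only: scoring one talk-page comment with Google's
# Perspective API. The study's actual classifier and cut-off are not
# specified here; API_KEY and THRESHOLD are placeholders.
from googleapiclient import discovery

API_KEY = "YOUR_PERSPECTIVE_API_KEY"  # placeholder, not from the study
THRESHOLD = 0.8                        # assumed cut-off for labelling a comment "toxic"

client = discovery.build(
    "commentanalyzer",
    "v1alpha1",
    developerKey=API_KEY,
    discoveryServiceUrl="https://commentanalyzer.googleapis.com/$discovery/rest?version=v1alpha1",
    static_discovery=False,
)

def toxicity_scores(text: str) -> dict:
    """Return attribute scores between 0 and 1 for one comment."""
    request = {
        "comment": {"text": text},
        "requestedAttributes": {
            "TOXICITY": {}, "INSULT": {}, "THREAT": {}, "IDENTITY_ATTACK": {}
        },
    }
    response = client.comments().analyze(body=request).execute()
    return {
        attr: data["summaryScore"]["value"]
        for attr, data in response["attributeScores"].items()
    }

scores = toxicity_scores("Example talk-page comment to score.")
is_toxic = scores["TOXICITY"] >= THRESHOLD
```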
We compared the activity of editors before and after they received a toxic comment, as well as with a control group of similar editors who received a non-toxic rather than toxic comment.
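As a rough illustration of that kind of comparison (not the study's actual code), the sketch below assumes a pandas table of daily editing activity, where each row records whether an editor edited on a given day relative to the day they received either a toxic or a control comment. The column names, the 30-day window and the input file are all assumptions.

```python
# Rough illustration of the before/after comparison, not the study's code.
# Assumes a DataFrame with one row per editor per day:
#   editor_id, day (days relative to the comment, negative = before),
#   edited (0/1), group ("toxic" or "control") -- all names are assumptions.
import pandas as pd

WINDOW = 30  # assumed number of days on each side of the comment

def active_days(df: pd.DataFrame, lo: int, hi: int) -> pd.Series:
    """Days with at least one edit per editor within [lo, hi)."""
    mask = (df["day"] >= lo) & (df["day"] < hi)
    return df[mask].groupby("editor_id")["edited"].sum()

def activity_change(df: pd.DataFrame) -> float:
    """Mean change in active days per editor, after versus before the comment."""
    before = active_days(df, -WINDOW, 0)
    after = active_days(df, 0, WINDOW)
    # Editors missing from one window are counted as zero active days there.
    return after.sub(before, fill_value=0).mean()

edits = pd.read_csv("talk_page_activity.csv")  # hypothetical input file

toxic_change = activity_change(edits[edits["group"] == "toxic"])
control_change = activity_change(edits[edits["group"] == "control"])

# The effect attributed to the toxic comment is the difference between the
# two groups' changes (a difference-in-differences style estimate).
effect = toxic_change - control_change
print(f"Estimated short-term effect: {effect:.2f} active days")
```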
We found receiving a single toxic comment could reduce an editor’s activity by 1.2 active days in the short term. Considering that 80,307 users on English Wikipedia alone have received at least one toxic comment, the cumulative impact could amount to 284 lost human-years.
Moreover, some users don’t just contribute less. They stop contributing altogether.
We found that the probability of leaving Wikipedia’s community of contributors increases after receiving a toxic comment, with new users being particularly vulnerable. New editors who receive toxic comments are nearly twice as likely to leave Wikipedia as would be expected otherwise.
Wide-ranging consequences
This matters more than you might think to the millions who use Wikipedia.
First, toxicity likely leads to poorer-quality content on the site. A diverse editor cohort is crucial for maintaining content quality, yet the vast majority of Wikipedia's editors are men, and that imbalance is reflected in the content on the platform.
There are fewer articles about women, and those that exist are shorter than articles about men, more likely to centre on romantic relationships and family-related issues, and more often linked to articles about the opposite gender. Women are often described as the wives of famous people rather than on their own merits, for example.
While multiple barriers confront women editors on Wikipedia, toxicity is likely one of the key factors contributing to the gender imbalance. Although men and women are equally likely to face online harassment and abuse, women experience more severe violations and are more likely to be affected by such incidents, for example by self-censoring.
This may affect other groups as well: our research showed that toxic comments often include not just gendered language but also ethnic slurs and other biases.
Finally, a significant rise in toxicity, especially targeted attacks on new users, could jeopardise Wikipedia’s survival.
Following a period of exponential growth in its editor base during the early 2000s, the number of active editors has been largely stable since 2016, apart from a brief spike in activity during the COVID pandemic. Currently, roughly as many editors join the project as leave it, but that balance could easily be tipped if more people left because of online abuse.
That would damage not only Wikipedia, but also the rest of the online information infrastructure it helps to support.
There’s no easy fix to this, but our research shows promoting healthy communication practices is critical to protecting crucial online information ecosystems.
This article was first published on The Conversation and was written by Ivan Smirnov, Research Fellow, University of Technology Sydney.