23 March 2018 | doi: 10.5281/zenodo.1205982

No, YouTube’s “trolls” won’t destroy Wikipedia

YouTube’s decision to embed Wikipedia articles under controversial videos caused a huge uproar. Commentators and scholars chastised YouTube for being lazy, for outsourcing its responsibility, or for “crushing” the online encyclopedia. The last allegation in particular, however, ignores one important aspect: the audience.

Amidst allegations that YouTube radicalises people by pushing them towards ever more extreme content, and threats by Sen. Mark Warner to take action, the platform announced that it would show Wikipedia articles alongside videos of conspiracy theories and hoaxes. This was criticized by many, with verdicts ranging from lazy and outsourcing responsibility to endangering or “crushing” Wikipedia. The last claim in particular was echoed by several commentators and scholars on Twitter. They forget, however, about the most important aspect: the audience.

The train of thought about endangering Wikipedia goes roughly like this: a) There are a lot of “trolls” all over the web (e.g. 4chan, Reddit, Twitter, Facebook, YouTube) who have shown that they are both willing and able to game the system. b) YouTube is a home for conspiracy theories and extreme content and thus attracts “trolls”. The conclusion drawn from a) and b) is that by embedding Wikipedia articles in YouTube, the platform has angered the trolls and will send them straight to Wikipedia, the open online encyclopedia that can be edited by anyone.

A confession of failure

Granted, it is unarguably irresponsible, lazy, and ignorant of Alphabet Inc., the company that owns YouTube and is valued at over $570 billion by Forbes, to implement these changes without even asking Wikimedia. It is a confession of failure. But the people criticizing YouTube for sending over the “trolls” forget three things:

First, “trolls” are not new to Wikipedia. It is not as if “trolls” were unaware that Wikipedia existed and had never tried before; most likely, they are trying again and again. Contropedia, a tool by the Digital Methods Initiative at the University of Amsterdam, analyses Wikipedia edits. Researchers who used this tool highlighted in their analyses of climate change and Gamergate how contested these issues were. Indeed, Wikipedia has been a hotly contested place to begin with. That is why there are rules, policies, and guidelines in place, which even differ from country to country. And although the traffic might increase, the Wikipedia community has, for now, shown itself to be very resilient.

Further reading: Hate speech and fake news – how two concepts got intertwined and politicised

Second, many of the commenters seem to ignore that there is a demand for conspiracy theories, hoaxes, and extreme content. There is an audience that wants to watch this content. The assumptions underlying the discussions about misinformation, extreme content, and conspiracy theories tend to focus on the supply side: Who is creating the content? What is YouTube recommending? Who is trying to manipulate whom? You could also say that this position assumes a unidirectional perspective: it strips the agency away from the audience by not acknowledging it. But we know that the audience is not simply there, passively receiving. In the 1970s, Stuart Hall described the different ways in which people “read” media content: while some read it the way it was intended, others do not. The audience is stubborn and unruly.

Extreme positions and the mainstream

Third, there’s no reason to assume that the “trolls” are unaware that their extreme political opinions or conspiracy beliefs are not part of the mainstream. That is why they are on YouTube in the first place: because they don’t see those topics and talking points being discussed in the mainstream. Do you really believe that a person who thinks the world is flat or that a New World Order exists doesn’t know that these positions are not shared by most? Or do you really believe that they don’t think of Wikipedia as part of the mainstream? There are several Wikipedia alternatives for conspiracy believers and people from the far-right for a reason. This point is crucial, because it both highlights why YouTube’s idea is futile and explains why “trolls” won’t suddenly flock to Wikipedia in a united uproar. Imagine believing in reptiloids and YouTube suddenly wanting to teach you that you’re wrong. For them, it’s just not new.

It is, of course, likely that YouTube’s implementation will lead to an influx of people trying to edit Wikipedia entries so that they reflect their position, at least initially, and more out of protest than out of any intention to change Wikipedia. But this won’t crush or endanger Wikipedia. The claims that it would rest on the assumptions that the “trolls” are not aware of Wikipedia, that the “trolls” are not, in fact, an audience, and that this audience does not know that it is a minority not represented in the mainstream. A few years ago, the Hans Bredow Institute for Media Research in Germany ran a project called “(Re-)Discovering the Audience”. Understanding the “trolls” as an audience re-frames the issue in a way that highlights societal issues and fractures, and that shows us a way forward.


Jonas Kaiser is an associate researcher at HIIG, DFG postdoctoral fellow, and affiliate at the Berkman Klein Center for Internet & Society at Harvard University. He is currently working on a book for Oxford University Press on the far-right in Germany and the U.S. and how they (ab)use social media platforms including YouTube.


This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Jonas Kaiser

Associated Researcher: Knowledge & Society
