No, YouTube’s “trolls” won’t destroy Wikipedia
YouTube’s decision to embed Wikipedia articles beneath controversial videos was met with widespread outrage. YouTube was accused of laziness, of outsourcing responsibility, and even of destroying Wikipedia. The last accusation in particular overlooks one thing: the audience.
Amidst allegations that YouTube radicalises people and pushes them towards ever more extreme content, and threats by Sen. Mark Warner to take action, the platform announced that it would show Wikipedia articles alongside videos promoting conspiracy theories and hoaxes. Many criticized the move, with verdicts ranging from lazy to outsourcing responsibility to endangering or even “crushing” Wikipedia. The last claim in particular was echoed by several commentators and scholars on Twitter. Yet they forget the most important aspect: the audience.
The train of thought behind the endangerment claim goes roughly like this: a) there are a lot of “trolls” all over the web (e.g., 4chan, Reddit, Twitter, Facebook, YouTube) who have shown that they are both willing and able to game the system; b) YouTube is a home for conspiracy theories and extreme content and thus attracts “trolls”. The conclusion drawn from a) and b) is that by embedding Wikipedia articles into YouTube, YouTube has angered the trolls and will send them straight to Wikipedia, the open online encyclopedia that anyone can edit.
A confession of failure
Granted, it is unarguably irresponsible, lazy, and ignorant of Alphabet Inc., the company that owns YouTube and that Forbes values at over $570 billion, to simply implement these changes without even consulting Wikimedia. It is a confession of failure. But the people criticizing YouTube for sending over the “trolls” forget three things:
First, “trolls” are nothing new for Wikipedia. It’s not as if “trolls” were unaware that Wikipedia existed and had never tried before (most likely, they are trying again and again). Contropedia, a tool by the Digital Methods Initiative at the University of Amsterdam, analyses Wikipedia edits. Researchers who used it have highlighted in their analyses of climate change and Gamergate how contested these issues are. Indeed, Wikipedia has been a hotly contested place from the beginning. That is why there are rules, policies, and guidelines in place, which even differ between language versions. And although traffic might increase, the Wiki community has, so far, shown itself to be very resilient.
Second, many commenters seem to ignore that there is a demand for conspiracy theories, hoaxes, and extreme content. There is an audience that wants to watch this content. The assumptions underlying the discussions around misinformation, extreme content, and conspiracy theories are focussed on the supply side: who is creating the content? What is YouTube recommending? Who is trying to manipulate whom? One could say this position assumes a unidirectional perspective: it strips the audience of its agency by failing to acknowledge it. But we know the audience is not simply there, passively receiving. In the 1970s, Stuart Hall described the different ways people “read” media content: while some read it the way it was intended, others do not. The audience is stubborn and unruly.
Extreme positions and the mainstream
Third, there is no reason to assume that the “trolls” are unaware that their extreme political opinions or conspiracy beliefs are not part of the mainstream. That is why they are on YouTube in the first place: because they don’t see those topics and talking points discussed in the mainstream. Do you really believe that a person who thinks the world is flat, or that a New World Order exists, doesn’t know that most people do not share these positions? Or that they don’t regard Wikipedia as part of the mainstream? There is a reason several Wikipedia alternatives exist for conspiracy believers and the far right. This point is crucial, because it both highlights why YouTube’s idea is futile and explains why “trolls” won’t suddenly flock to Wikipedia in a united uproar. Imagine believing in reptilians and suddenly YouTube wants to teach you why that’s wrong. For them, it’s simply nothing new.
It is, of course, likely that YouTube’s implementation will lead to an influx of people trying to edit Wikipedia entries to reflect their positions, at least initially, and more out of protest than out of any real intention to change Wikipedia. But this won’t crush or endanger Wikipedia. The claims that it would rest on the assumptions that the “trolls” are unaware of Wikipedia, that the “trolls” are not, in fact, an audience, and that this audience does not know it is a minority unrepresented in the mainstream. A few years ago, the Hans-Bredow-Institute for Media Research in Germany ran a project called “(Re-)Discovering the Audience”. Understanding the “trolls” as an audience reframes the issue in a way that highlights societal issues and fractures, and shows us a way forward.
Jonas Kaiser is an associate researcher at HIIG, DFG postdoctoral fellow, and affiliate at the Berkman Klein Center for Internet & Society at Harvard University. He is currently working on a book for Oxford University Press on the far-right in Germany and the U.S. and how they (ab)use social media platforms including YouTube.
This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact firstname.lastname@example.org.