17 December 2019 | doi: 10.5281/zenodo.3753022

TikTok: A kaleidoscope of images, data, and legal questions

Do social media platforms enable people to connect across borders, or are they the gatekeepers of digital communication spaces in which users must comply with strict, content-related rules? The latest example, TikTok, shows that platforms still struggle to reconcile user-friendly services with the principles of freedom of expression.


TikTok is the new social media platform “en vogue”. It hosts chatoyant user-generated pictures, GIFs, music, and videos, underlined by filters and effects that make the user experience different from the usual services. TikTok’s image is young, fun, and easy, and its pace is commonly described as particularly fast, leaving users feeling somewhat dizzy. Currently, the TikTok application leads the download charts among teenagers and young adults. This confirms a trend towards increasingly audio-visual content dissemination and, subsequently, towards visual communication. As one user describes it: “The service is supposed to help you create and to use your cinematographic imagination to make others smile”. But this enthusiasm among users is not fully shared by experts and freedom of expression advocates, and here is why.

A platform like an amusement park

One of TikTok’s most important features is that it provides not only the tools to create exciting imagery and the platform infrastructure to publish it, but also a recommender system, based on human moderation and machine learning, that constantly supplies new content. If other large social media platforms are sometimes compared to a newsstand or a marketplace, TikTok is more like an amusement park: users have constant access to young, fun, and entertaining content. On its trending page, they will find suggestions on what they should watch and what they should post: the system nudges them in both directions.

The app and its recommender system have come under scrutiny mainly over questions of data protection, also because the service is owned by the Chinese tech company ByteDance. Experts fear that user data from the U.S. or the E.U. might be collected and later analysed by Chinese authorities. This speculation would become even more troubling if the data were somehow used in combination with the Chinese social scoring system. There is a general concern about data protection on TikTok, especially because the app collects data from everyone, not only from registered users. The German newspaper Süddeutsche Zeitung reported that TikTok uses the ‘fingerprinting’ method to track users. U.S. authorities have now started investigating the company with regard to data protection, and experts confirm that the Chinese authorities might indeed be looking into user data from abroad as they expand their media control globally.
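
To make the ‘fingerprinting’ point concrete: the technique combines properties that every browser exposes anyway into a stable identifier, so no cookie or account is needed to recognise a returning visitor. The following is a minimal, hypothetical sketch in browser-side TypeScript; the function name and the chosen signals are illustrative assumptions, not a description of TikTok’s actual implementation.

```typescript
// Hypothetical illustration of browser fingerprinting (not TikTok's actual code):
// a handful of device and browser properties are combined and hashed into an
// identifier that can re-identify a visitor without cookies or a login.
async function computeFingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,
    navigator.language,
    `${screen.width}x${screen.height}x${screen.colorDepth}`,
    Intl.DateTimeFormat().resolvedOptions().timeZone,
    String(navigator.hardwareConcurrency ?? ""),
  ].join("|");

  // Hash the combined signals so only an opaque identifier needs to be stored.
  const bytes = new TextEncoder().encode(signals);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// The identifier stays stable across visits as long as the device configuration
// does not change, which is exactly what makes it useful for tracking non-users.
computeFingerprint().then((id) => console.log("visitor id:", id));
```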

How user-generated content is perceived in China could be the reason for the company’s criticised content moderation policy. The community guidelines sound similar to those of other large social media platforms, but their wording is much vaguer and more generic. In addition, their enforcement is opaque to users. All in all, TikTok’s content moderation policy replicates the mistakes of previous platforms, both in what is removed and in how. First, TikTok’s community guidelines forbid the usual categories of unwanted content without further describing what is meant. There are a few inconsistencies, such as forbidding copyright infringements without providing any guidance on how to legally include third-party content, although the app is based on the dissemination of visuals and music snippets (and users rarely compose the music themselves). The guidelines ban any type of content showing nudity without mentioning exceptions such as breastfeeding or art. For the sake of space, I will not go into every possible point of critique, but so far TikTok’s community guidelines are simplistic and lead to a shallow treatment of the issue. As to the German peculiarity, the Network Enforcement Act (NetzDG), TikTok provides an explanation which unfortunately contains misleading information: according to TikTok, content that was the subject of a complaint but does not fulfil the criteria for immediate removal within 24 hours shall be removed within seven days. In reality, under the NetzDG, the content may also not be removed at all if it is not punishable under German criminal law. In addition to providing a “NetzDG complaint” option next to each post (similar to YouTube), users can also access the complaint form via TikTok’s privacy and safety information page.

A secret change in moderation

A recent report by Netzpolitik.org gives unique insights into how content is moderated on TikTok. In an online spreadsheet, a confidential source describes how the content moderation guidelines were changed after a critical piece by the Guardian (which, on a side note, demonstrates the importance of reporting bad practices), according to which the company instructed moderators to ‘censor’ content related to ‘Tiananmen Square, Tibetan independence, or Falun Gong’. Indeed, several users were locked out of their accounts for talking about detention camps in China. Another important take-away of Netzpolitik’s report is how reviewers ‘censor’ content, i.e. not only by removing it but mainly by downgrading it. According to the community guidelines, content that violates the rules shall be removed or the account closed. Decreasing the visibility of user-generated content is not mentioned as a form of sanction, but it is apparently used as an easier way of hiding unwanted content without being exposed to the allegation of censorship. Investigations have shown that TikTok systematically downgraded images of people with disabilities or who are overweight. The company tried to justify this behaviour as a way of protecting users from cyber-bullying. But for a platform that describes itself as an ‘inclusive community’, this argument seems, again, contradictory. It also conflicts with its (presumed) legal status as a neutral platform: curating and ranking content to this extent increasingly resembles a publisher’s activities.
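
To illustrate the difference between removal and downgrading, here is a small, hypothetical ranking sketch in TypeScript. The Post structure, the effectiveScore function, and the 0.01 visibility factor are assumptions made for this example only; they do not describe TikTok’s actual system, merely the general mechanism of suppressing content by scoring it down instead of deleting it.

```typescript
// Hypothetical sketch of visibility "downgrading": instead of deleting a post,
// a moderation flag scales its recommendation score down, so the post silently
// vanishes from feeds while technically remaining online.
interface Post {
  id: string;
  baseScore: number; // recommender score (engagement, freshness, ...)
  moderationFlag?: "none" | "reduced_visibility" | "removed";
}

function effectiveScore(post: Post): number {
  switch (post.moderationFlag) {
    case "removed":
      return 0; // taken down entirely
    case "reduced_visibility":
      return post.baseScore * 0.01; // still online, but almost never recommended
    default:
      return post.baseScore; // unaffected content
  }
}

// Example feed: post "b" is flagged and drops to the bottom of the ranking,
// without its author ever being notified of a sanction.
const feed: Post[] = [
  { id: "a", baseScore: 0.9 },
  { id: "b", baseScore: 0.95, moderationFlag: "reduced_visibility" },
  { id: "c", baseScore: 0.5 },
];

const ranked = [...feed]
  .sort((p, q) => effectiveScore(q) - effectiveScore(p))
  .map((p) => p.id);

console.log(ranked); // ["a", "c", "b"]
```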
To conclude, it seems impossible to keep politics out of the TikTok equation. Be it the alarming situation in Hong Kong, the creeping social scoring system, or the ongoing human rights violations against minorities in mainland China, one cannot help but wonder whether TikTok will be a space for more freedom of expression or a censorship agent, possibly controlled by a government (or its laws as a proxy). Its vague and yet restrictive content moderation policy is a source of concern for users, and not only because of the alleged link to the Chinese state. However, the company is noticeably pressured by public opinion as well as by a rapidly changing market: if TikTok is unable to convince its users that the app treats users worldwide respectfully, without removing and hiding unwanted content on the basis of opaque community standards, it might get outpaced by other services. For instance, Instagram (not to say that its policies are exemplary) has already launched an in-app feature very similar to TikTok called Reels (Cenas in Brazil), and history might repeat itself, much as Instagram Stories once took over Snapchat’s format.

This post reflects the opinion of the authors and neither necessarily nor exclusively the opinion of the institute. For more information about the content of these posts and the associated research projects, please contact info@hiig.de

Amélie Heldt

Former Associated Researcher: Platform Governance
