17 December 2019 | doi: 10.5281/zenodo.3753022

TikTok, a kaleidoscope of visuals, data and legal questions

Are social media platforms empowering people to connect beyond borders, or are they the custodians of digital communication spaces where users have to comply with strict content moderation rules? The recent example of TikTok shows that platforms still struggle to combine user-friendly services with freedom of expression principles.


TikTok is the new social media platform “en vogue”. It hosts chatoyant user-generated pictures, gifs, music, and videos, overlaid with filters and effects that set the user experience apart from the usual services. TikTok’s image is young, fun and easy, and its pace is commonly described as so fast that it leaves users feeling slightly dizzy. Currently, the TikTok application leads the download charts among teenagers and young adults. This confirms a trend towards increasingly audio-visual content dissemination and, consequently, towards visual communication. As one user describes it: “The service is supposed to help you create and to use your cinematographic imagination to make others smile”. But this enthusiasm among users is not fully shared by experts and freedom of expression advocates, and here’s why.

A platform like an amusement park

One of TikTok’s most important features is that it not only provides the tools to create exciting imagery and the platform infrastructure to publish it, but also runs a recommender system, based on human moderation and machine learning, which constantly supplies new content. If other large social media platforms are sometimes compared to a newsstand or a marketplace, TikTok is more like an amusement park: users have constant access to young, fun, and entertaining content. On its trending page, they will find suggestions on what they should watch and what they should post: the system nudges them in both ways.

The app and its recommender system have come under scrutiny mainly over data protection, also because the service is owned by the Chinese tech company ByteDance. Experts fear that user data from the U.S. or the E.U. might be collected and later analysed by Chinese authorities. This speculation becomes even more unsettling if the data were somehow used in combination with the Chinese social scoring system. There is a general concern about data protection on TikTok, especially because the app collects data from everyone, not only from registered users. The German newspaper Süddeutsche Zeitung reported that TikTok uses device fingerprinting to track users. U.S. authorities have now started investigating the company over data protection, and experts confirm that the Chinese authorities might indeed be looking into user data from abroad as they expand their media control globally.

How user-generated content is perceived in China could explain the company’s much-criticised content moderation policy. The community guidelines sound similar to those of other large social media platforms, but their wording is much vaguer and more generic. In addition, how the guidelines are enforced is opaque to users. All in all, TikTok’s content moderation policy repeats the mistakes of earlier platforms, both in what is removed and in how. First, TikTok’s community guidelines forbid the usual categories of unwanted content without further describing what is meant. There are inconsistencies, such as forbidding copyright infringements without providing any guidance on how to legally include third-party content, even though the app is built on the dissemination of visuals and music snippets (and users rarely compose the music themselves). The guidelines also ban any type of content showing nudity without mentioning exceptions such as breastfeeding or art. For the sake of space, I will not go into every possible point of critique, but so far TikTok’s community guidelines are simplistic and lead to a shallow treatment of the issue.

As to the German peculiarity, the Network Enforcement Act (NetzDG), TikTok provides an explanation which unfortunately contains misleading information: according to TikTok, content that is the subject of a complaint but does not meet the criteria for immediate removal within 24 hours shall be removed within seven days. In reality, under the NetzDG, such content need not be removed at all if it is not punishable under German criminal law. In addition to providing a “NetzDG complaint” option next to each post (similar to YouTube), users can also access the complaint form via TikTok’s privacy and safety information page.

A secret change in moderation

A recent report by Netzpolitik.org gives unique insights into how content is moderated on TikTok. In an online spreadsheet, an anonymous source describes how the content moderation guidelines were changed after a critical piece by the Guardian (which, on a side note, demonstrates the importance of reporting bad practices), according to which the company instructs moderators to ‘censor’ content related to ‘Tiananmen Square, Tibetan independence, or Falun Gong’. Indeed, several users were locked out of their accounts for talking about detention camps in China. Another important take-away of Netzpolitik’s report is how reviewers ‘censor’ content, i.e. not only by removing it but mainly by downgrading it. According to the community guidelines, content that violates the rules shall be removed or the account closed. Decreasing the visibility of user-generated content is not mentioned as a form of sanction, but it is apparently used as an easier way of hiding unwanted content without exposing the platform to allegations of censorship. Investigations have shown that TikTok systematically downgraded content by people with disabilities or who are overweight. The company tried to justify this as a way of protecting users from cyber-bullying. But for a platform that describes itself as an ‘inclusive community’, this argument seems, again, contradictory. It also conflicts with its (presumed) legal status as a neutral platform: curating and ranking content to this extent increasingly resembles a publisher’s activities.
To conclude, it seems impossible to keep politics out of the TikTok equation. Be it the alarming situation in Hong Kong, the creeping social scoring system or the ongoing human rights violations against minorities in mainland China, one cannot help but wonder whether TikTok will be a space for more freedom of expression or an agent of censorship, possibly controlled by a government (or by its laws as a proxy). Its vague and yet restrictive content moderation policy is a source of concern for users, and not only because of the alleged link to the Chinese state. However, the company is under noticeable pressure from public opinion as well as from a rapidly changing market: if TikTok is unable to convince its users that the app treats users worldwide respectfully, without removing and hiding unwanted content on the basis of opaque community standards, it might be outpaced by other services. For instance, Instagram (not to say that its policies are exemplary) has already launched an in-app feature very similar to TikTok, called Reels (or Cenas in Brazil), and history might repeat itself when one recalls how Instagram Stories took over Snapchat’s format.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Amélie Heldt

Former Associated Researcher: Platform Governance
