Inside Content Moderation: Mensch, Maschine und unsichtbare Arbeit.

Authors: Stenzel, M., Mosene, K., & Efferenn, F.
Published in: Digital Society Blog
Year: 2025
Type: Other publications
DOI: 10.5281/zenodo.17415009

Who decides what we see online and what we don’t? Moderating content on social media platforms is a complex process. It is shaped not only by platform-specific rules and technical infrastructures, but also by legal frameworks at national and international levels. Closely linked to this is the question of social responsibility. Content moderation goes far beyond simply deleting problematic posts: every decision directly affects platform users, determining which voices remain visible and which are silenced. The division of labour between algorithmic systems and human moderators repeatedly reaches its limits. Platform companies outsource large parts of this moderation work to countries such as the Philippines or Kenya, where people review highly distressing content under precarious conditions. Meanwhile, the algorithms and guidelines that shape their work are largely developed in the Global North. This shifting of responsibilities reproduces or even amplifies existing inequalities, for instance, along the lines of gender, origin, or ethnicity. This article presents research approaches that critically examine these power asymmetries and incorporate intersectional as well as decolonial perspectives. The goal is to make digital spaces and the way they are governed fairer and more inclusive.

Connected HIIG researchers

Maurice Stenzel, Dr.

Senior Researcher: Human in the Loop? & Data, Actors, Infrastructures

Katharina Mosene

Research Associate: AI & Society Lab

Frederik Efferenn

Head of Science Communication
