Making Audits Meaningful

Authors: Bloch-Wehba, H., Fernandez, A., Morar, D.
Year: 2020
Type: Working paper

While platforms use increasingly sophisticated technology to make content-related decisions that affect public discourse, firms remain tight-lipped about exactly how the technologies of content moderation function. Such laconic industry disclosure about the use of algorithmic content moderation is unacceptable, given that regulators need to understand the platform ecosystem in order to design evidence-based regulations and monitor the risks associated with the use of AI in content moderation. This white paper explains how and why audits, a specific type of transparency measure, should be mandated by law and grounded in four clear principles: independence, access, publicity, and resources. We first unpack the types of transparency, then contextualize audits within this framework while describing their risks and benefits. The white paper concludes by explaining the four principles as they derive from the preceding sections.

Connected HIIG researchers

Friederike Stock

Former Student Assistant: Ethics of Digitalisation | NoC