31 October 2012

1st Berlin Colloquium – Data Privacy and Battle Trolls

by Theresa Züger


Ge Chen gave a presentation on “Copyright in Developing Countries: from normative interoperability to institutional inclusivity in global copyright governance”. He illustrated the complex interplay between international copyright law and Internet governance, focusing on the special role of developing countries amid the tension between competing national interests in copyright conflicts. He argued that, in the future as in the past, the global realisation of copyright law will be the result of contrary international positions. According to Chen, copyright law will be shaped by two main aspects: norms of access on the one hand and a new balance of institutional conditions on the other. The view emerged that there cannot be a single global solution for copyright law. Furthermore, the right to development and copyright law stand in mutual dependency and therefore require a balancing that leads to different solutions in different countries.

Ge Chen – by videobuero.de

A perhaps telling side remark was that the interests of developing countries are largely ignored in international legal conflicts, as this case illustrates. Examples such as copyright law show that the internet creates a need for political responses in fields that pose great challenges to international relations.

Ben Kamis & Thorsten Thiel – by videobuero.de

“The Original Battle Trolls” was the creative as well as provocative title of the second presentation, subtitled “why states want the internet to be a violent place”. Ben Kamis and Thorsten Thiel developed the thesis that the state is the ultimate “troll” on the internet, because it describes the internet in metaphors of violence and war. In the presenters’ view, the state establishes itself and its rules through violence and takes a hegemonic view of the internet, constantly producing a picture of an online society on the edge of chaos. Questions of security are therefore not evoked by the system itself but constructed by states. Their research is based on an analysis of published state statements from five different nations. Surprisingly (?), Germany tops the list of “troll states”, using violent metaphors rather often in comparison to the other states.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

