24 January 2024

More Power to the People: How Platform Councils Can Make Online Communication More Democratic

Social media platforms have become an integral part of public and private opinion-forming. The decisions platforms make through their terms of use and algorithmic moderation practices shape the protection of human rights. This has triggered an important discussion: How can the public interest be integrated into these digital spaces? One potential answer to this complex challenge is the concept of platform councils, also called social media councils (SMCs). But how can they ensure that public interests and democratic values are taken into account in platforms' regulatory processes? In our research, we explored precisely this question together with 30 researchers from every continent. This blog post provides a brief overview of the topic.

Public discourse in digital spaces

Private companies essentially determine the framework conditions for the public exchange of opinions on online platforms. This is done through content moderation. The term describes the process by which, for example, users’ posts, images and videos are reviewed and evaluated to ensure that they comply with the platform’s community guidelines. This moderation therefore decides what content is deemed inappropriate and what is deleted, edited or authorised on the platforms. Inappropriate or harmful content includes, for example, insults, hate speech, glorification of violence, pornographic material, misinformation or spam.

Through these terms of use and algorithmic moderation of content, social media platforms also influence public discourse. They ultimately decide which user contributions remain (and are disseminated) and which are removed (or hidden). As a result, they significantly regulate how public opinion is formed in digital spaces. This, in turn, is crucial for the democratic communication rights of citizens. Companies act as rule-makers, enforcers and ultimately judges of their own decisions.

Expanding corporate responsibility

Companies are not democracies. They are not run by democratically elected representatives. They essentially follow their profit interests. However, as the European Court of Human Rights emphasized in its 2015 Cengiz judgment, “the internet has become one of the most important means of exercising the right to freedom of information and expression by providing (…) essential tools for participation in activities and discussions on political issues and topics of general interest.” In the course of this social change, responsibility for inclusivity and the protection of human rights can therefore not be located exclusively at state level. Companies also have a decisive role to play. The interplay of state laws and private community standards, among other things, has created a hybrid regulatory framework for digital platforms.

Experts agree that companies in such powerful positions should also be held socially responsible for exercising that power appropriately and sustainably. This creates a potential conflict: such decisions may run counter to the company's profit interests. One possible way to resolve this conflict of interest is to lend the decision-making process more legitimacy through platform councils.

Platform councils as a means of improving legitimacy

The idea behind platform councils is to make decision-making and the design of the communication space more inclusive. By involving people who do not act in the company's interests, fundamental rights and other important values are to be strengthened on the platforms. Meta's Oversight Board is seen as the first significant step towards external control of a commercial platform's decision-making processes. However, many other platforms remain reluctant to introduce similar governance structures.

However, the exact, effective design of these councils has not yet been uniformly clarified. Platform councils are conceivable at different levels of regulation (national, regional, and global) and can be set up in different constellations. There are also different approaches as to what the platform councils should decide on. Should they only set the broad lines of moderation practice through precedents, or should they act as a kind of court and review every decision that users question?

Inclusiveness as an important factor

The composition of the councils is a key question. Technical experts and elected representatives of users could form part of the membership. Including marginalized groups is particularly important, as their interests would otherwise go unrepresented. However, a platform council's inclusivity can be at odds with its effectiveness: larger, more heterogeneous councils could strengthen legitimacy but at the same time struggle with inefficient decision-making. The more interests that have to be taken into account, the more time-consuming the decision-making process becomes.

Meta's Oversight Board was also set up with this in mind: in designing it, great importance was attached to inclusivity. However, our complex modern society creates representation problems that are almost impossible to solve. As a result, Meta's Oversight Board continues to be criticized for not taking cultural or social perspectives sufficiently into account.

Other potential drawbacks of such councils include the weakening of state regulatory authorities, unclear responsibilities, a dilution of ethical standards, a normative cover-up effect, and an overly global approach to language rules that disregards local practices and decisions better made regionally.

Learning from others

A possible model for the complex task of establishing such councils could be the European Commission for Democracy through Law (the so-called "Venice Commission"). This independent advisory body within the Council of Europe provides expertise on issues of constitutional law and democratic institutions, with a focus on best practices and minimum standards.

References

Kettemann, Matthias C. and Schulz, Wolfgang: Ground Rules for Platform Councils (https://graphite.page/platform-democracy-report/#read-full-article)

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Matthias C. Kettemann, Prof. Dr. LL.M. (Harvard)

Head of Research Group and Associate Researcher: Global Constitutionalism and the Internet

