24 January 2024

More Power to the People: How Platform Councils Can Make Online Communication More Democratic

Social media platforms have become an integral part of public and private opinion-forming. The decisions platforms make through their terms of use and algorithmic moderation practices shape the protection of human rights. This has triggered an important discussion: how can the public interest be integrated into these digital spaces? One potential answer to this complex challenge is the concept of platform councils, or social media councils (SMCs). But how can they ensure that public interests and democratic values are taken into account in platforms' regulatory processes? In our research, we explored precisely this question together with 30 researchers from every continent. This blog post provides a brief overview of the topic.

Public discourse in digital spaces

Private companies essentially determine the framework conditions for the public exchange of opinions on online platforms. They do this through content moderation: the process by which users' posts, images and videos are reviewed and evaluated to ensure that they comply with the platform's community guidelines. This moderation therefore decides which content is deemed inappropriate and which is deleted, edited or approved on the platforms. Inappropriate or harmful content includes, for example, insults, hate speech, glorification of violence, pornographic material, misinformation and spam.

Through these terms of use and algorithmic moderation of content, social media platforms also influence public discourse. They ultimately decide which user contributions remain (and are disseminated) and which are removed (or hidden). As a result, they significantly regulate how public opinion is formed in digital spaces. This, in turn, is crucial for the democratic communication rights of citizens. Companies act as rule-makers, enforcers and ultimately judges of their own decisions.

Expanding corporate responsibility

Companies are not democracies. They are not run by democratically elected representatives. They essentially follow their profit interests. However, as the European Court of Human Rights emphasized in its 2015 Cengiz judgment, “the internet has become one of the most important means of exercising the right to freedom of information and expression by providing (…) essential tools for participation in activities and discussions on political issues and topics of general interest.” In the course of this social change, responsibility for inclusivity and the protection of human rights can therefore not be located exclusively at state level. Companies also have a decisive role to play. The interplay of state laws and private community standards, among other things, has created a hybrid regulatory framework for digital platforms.

Experts agree that companies in such influential positions should also bear social responsibility for exercising their power appropriately and sustainably. But does this create a conflict, since they may have to make decisions that run counter to their own profit interests? One possible way to resolve this conflict of interest is to lend the decision-making process more legitimacy through platform councils.

Platform councils as a legitimacy-enhancing component

The idea behind platform councils is to increase inclusivity in decision-making and in the design of the communication space. By involving people who do not act in the company's interests, fundamental rights and other important values are to be strengthened on the platforms. Meta's Oversight Board is seen as the first significant step towards external control of a commercial platform's decision-making processes. However, many other platforms remain reluctant to introduce similar governance structures.

However, there is not yet agreement on how exactly to design these councils effectively. Platform councils are conceivable at different levels of regulation (national, regional, and global) and can be set up in different constellations. There are also different views on what platform councils should decide. Should they only set the broad lines of moderation practice through precedents, or should they act as a kind of court and review every decision that users contest?

Inclusiveness as an important factor

The composition of the councils is a key question. Technical experts and elected representatives of users could be part of it. Including marginalized groups is particularly important so that their interests, which are otherwise easily overlooked, are taken into account. However, a platform council's inclusivity can be at odds with its effectiveness: larger, more heterogeneous councils could strengthen legitimacy but at the same time struggle with inefficient decision-making. The more interests that have to be taken into account, the more time-consuming the decision-making process becomes.

Meta's Oversight Board was also set up with this in mind: great importance was attached to inclusivity in its design. However, our complex modern society creates representation problems that are almost impossible to solve. As a result, Meta's Oversight Board continues to be criticized for not taking cultural and social perspectives sufficiently into account.

Other potential drawbacks of such councils include the weakening of state regulatory authorities, unclear responsibilities, a dilution of ethical standards, a normative cover-up effect, and an overly global approach to rules that should be set regionally, disregarding local language practices.

Learning from others

A possible model for the complex establishment of the councils could be the European Commission for Democracy through Law (the so-called “Venice Commission”). This independent advisory body within the Council of Europe provides expertise on issues of constitutional law and democratic institutions, with a focus on best practices and minimum standards.


Kettemann, Matthias C. and Schulz, Wolfgang – Ground Rules for Platform Councils

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact:

Matthias C. Kettemann, Prof. Dr. LL.M. (Harvard)

Head of Research Group and Associate Researcher: Global Constitutionalism and the Internet


