22 June 2021 | doi: 10.5281/zenodo.4773130

One Council to Rule Them All: Can Social Media Become More Democratic?

Parliaments set the rules for democracies. Platforms rule their private online spaces. But as online spaces become ever more important for democratic discourse, we ask ourselves: Can we make platforms more democratic? We believe Social Media Councils may be the solution to make platform rule-making and rule-enforcement more accountable, transparent, and legitimate.

An article by Matthias C. Kettemann & Martin Fertmann

Few court decisions were as highly anticipated as the decision of the Oversight Board on the question of whether Donald Trump should be allowed to post on Facebook again. In the end, the Oversight Board tossed the ball back into Facebook’s half and told the platform to take another look at the Trump ban and, on this occasion, clarify its rules and the intended sanctions. In addition, the Board members told Facebook to investigate what impact the company’s own recommendation algorithms and user design had on the increased polarization of the American public and the events of 6 January 2021, the storming of the U.S. Capitol.

These are all sensible mandates that amount to importing societal values into platform governance. Yes, corporations must apply their rules fairly, especially when they are powerful. Yes, companies must not endanger the rights of others, especially with their products. In Germany, it is the courts (as in the “III. Weg” case) or the legislature (for example, through the NetzDG) that regularly push platforms to uphold fundamental social responsibilities. But does it make sense for a private advisory body like the Oversight Board to do so as well?

Not an invention of Facebook

The Oversight Board is not a brainchild of Zuckerberg created out of thin air, but an implementation (admittedly: à la Facebook) of long-discussed institutional concepts to democratically legitimize the private orders of large digital companies, or at least to subject them to control by independent bodies. These institutions, known as social media councils (in German, we use the term “Plattformräte”), have been advanced in recent years by organisations such as ARTICLE 19 and Global Partners Digital, as well as by the former UN Special Rapporteur on Freedom of Expression, David Kaye.

Properly understood, platform councils are not a self-regulatory utopia: they should not replace existing models of private and government regulation of social networks, but merely complement them, providing additional impetus to improve private rule (enforcement) systems below the threshold of the justiciable.

What can platform councils achieve?

While platform councils can help examine possible violations of terms and conditions or community standards in individual cases, their true added value lies in the systemic improvement of companies’ governance systems beyond the individual case.

By formulating requirements for terms of service, enforcement practices, algorithms, and review and complaint procedures that go beyond individual cases, and by generating implementation pressure on companies, including through public criticism, platform councils can provide valuable impetus. At the same time, they must keep a constructive debate with companies about such improvements going.

How should platform councils be structured?

Reliable empirical data on the design of platform councils is still lacking. At present, a combination of a (quasi-judicial) complaints body and (quasi-legislative) participation in shaping the rules appears optimal: it would allow systemic improvement impulses to be set on a broad basis, while their success could still be verified in specific cases.

In view of the legitimacy deficit of private standard-setting, democratic experiments such as staffing the councils with randomly selected users or citizens as democratic “mini-publics” seem necessary and appropriate. Even a panel of experts can be valuable, insofar as the goal is ensuring consistent and non-discriminatory practices, fair procedures, and human rights-compliant rules. Unlike the Oversight Board, an industry-wide body would help prevent structural dependencies on the companies it oversees.

In this respect, well-known forms of non-state media supervision by industry-wide self-regulatory bodies such as press and advertising councils can serve as a source of inspiration. Experiences with the supervisory bodies of the state media authorities or the broadcasting councils of public broadcasters should also be used in the design of platform councils, but should not be transferred schematically in view of the widely differing control requirements.

What are the dangers of platform councils?

The accusation that such councils merely shift responsibility is understandable: If inadequately designed, platform councils run the risk of concealing actual power structures without initiating real change. They must therefore not only meet high transparency requirements themselves, but must also be equipped with information rights and linked to data access initiatives, so that different actors can understand the extent to which changes urged by them actually occur. In its current form, the Facebook Oversight Board has only very limited resources for this purpose. Whether it can successfully monitor the implementation of its decisions and recommendations in the future depends largely on its willingness to become institutionally self-empowering. However, a positive first step is that the Board refused to take a decision on the Trump ban itself but rather shifted the responsibility back to Facebook. 

What requirements should be placed on platform councils?

The discussion on platform councils is closely intertwined with demands that platforms should align the formulation and enforcement of their often international private rules with international human rights standards.

At least insofar as the idea of an international or regional (European) platform council is pursued, these standards provide a framework not only for informing the decision-making practices of such institutions, but also for developing requirements for their design.

If, on the other hand, the concept of a national, German platform council is pursued, the decades-long engagement of case law and scholarship with the supervisory bodies of public broadcasting offers a normative treasure trove. Such a constitutional reference could, for example, help to balance the independence of the council, which is necessary for effective control, with the cooperative relationship with the company, which is necessary for the effective implementation of its decisions.

Where do we go from here?

Admittedly: Platform councils represent only a gradual improvement. However, in the intricate regulatory triangle of states, platform companies and users, little more than gradual improvements can be expected. Should platform councils become established, they could resemble their institutional forebears such as the press and broadcasting councils, which have faced inevitable criticism as compromise solutions, but have held their own for decades due to the lack of feasible alternatives.

Despite its shortcomings, the Oversight Board is an important first example of a platform council and provides good material for analysis, both in terms of advantages and disadvantages. Not only with the decision in the Trump case, but also with its nearly 20 initial decisions in the past few months, it has confidently defined its position in Facebook’s regulatory structure and initiated first changes, whose permanence remains to be seen.

At the same time, this implementation should not be exaggerated into an ideal, nor should it monopolize the discussion about platform councils terminologically (“oversight boards”) or conceptually. On the contrary, platform councils are a starting point for discussing the future shape of digital governance. To keep the marketplace of opinions open, fair competition is needed; and competition, as is becoming apparent, is now also needed between different approaches to the institutional safeguarding of this market on the Net.


Social media companies have become exceedingly powerful. They set rules that influence how online communication takes place. Discussions on how to improve the democratic legitimacy of platform rules have recently gained momentum. Social Media Councils, as this article shows, are a powerful tool to bring people and their problems into platform norm-making processes – if they are implemented properly.

Martin Fertmann (@MFertmann) is Junior Researcher for Content Moderation and International Law at the Leibniz Institute for Media Research | Hans-Bredow-Institut and a fellow at the Doctoral Research Group “Law and its Teaching in the Digital Transformation” of the Center for Law in the Digital Transformation at the University of Hamburg.

Matthias C. Kettemann (@MCKettemann) is Head of the Research Group Global Constitutionalism and the Internet at the Humboldt Institute for Internet and Society, Berlin, Research Program Head at the Leibniz Institute for Media Research | Hans Bredow Institute. He is currently visiting professor of international law at the University of Jena. This contribution is based on a study by the authors commissioned by the Friedrich Naumann Foundation. See also the authors’ interview summarizing their study, and Matthias’ analysis of the Oversight Board. This blog is based on an op-ed for the Tagesspiegel Background Digitalisierung&KI (10 March 2021).

This post represents the view of the authors and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact Matthias C. Kettemann, Prof. Dr. LL.M. (Harvard), Head of Research Group and Associate Researcher: Global Constitutionalism and the Internet.
