Social Media Councils can help push digital platforms toward responsibility
29 September 2022

Social Media Councils: An effective means of holding digital platforms accountable?

How can the decisions and orders of digital platforms like Meta, Twitter, and others be made more accountable to their users and the public interest? A recent answer is: through more participation. A number of platforms have started experimenting with Social Media Councils (SMCs) to gain civil society input for aligning platform rules with human rights. But do they work? And are they really that new? This post takes readers on a journey from press councils to today’s platforms and teases out which incentives are crucial for more responsible platform rules.

Introduction

In recent years, Social Media Councils (SMCs) have emerged as a model of, or add-on to, platform (self-)regulation. SMCs are generally designed to open up the black box of corporate platform structures to civil society input and public values. Policymakers have long underestimated social media platforms and their impact on society (douek, 775-6). As people, especially younger people, spend more and more time online, platform-based communication plays an increasingly substantial role in shaping societal discourses (Hölig/Behre/Schulz, 5-7). #metoo and #blacklivesmatter allowed the discourse on persistent sexist and racist violence to scale into general public consciousness; by contrast, the spread of disinformation (as seen in the Russo-Ukrainian war or US election campaigns) undermines public trust and may even culminate in events as large as the attack on the US Capitol. In this context, recommendation algorithms on social media also influence which information and content people do and do not see in their online experience.

Thus, by enabling online participation, these communication structures have the potential to support and expand inclusive social discourses. On the other hand, they can also exacerbate social frictions. In extremis, platform speech can lead to, and contain, violations of human rights: privacy rights infringed by data exploitation, inequality exacerbated by opaque algorithms, threats to personal integrity through hate speech. Disinformation and polarisation tendencies continue to challenge democratic resilience and societal cohesion (e.g., Balkin, 1151 ff.; Rau/Simon). But what are the options for building better platforms? Are SMCs an effective alternative to top-down, command-and-control governance? As research on platform councils demonstrates, incentives have to be considered carefully: why, and to what extent, would a social media corporation feel bound by the council’s recommendations and assessments, given its primarily economic interests?

The rise of Social Media Councils

SMCs constitute “external governance structures tasked either with formulating and/or applying rules or determining the discoverability or visibility of content on social networks in addition to or instead of the platforms; or tasked with monitoring the platform’s activities relating thereto” (Kettemann/Fertmann, 7). Examples of industry self-regulation include the prominent Meta Oversight Board, the Twitter Trust and Safety Council, TikTok’s Content Advisory Council, European Safety Advisory Council and Diversity and Inclusion Council, the Spotify Safety Advisory Council, and Twitch’s Safety Advisory Council. Meta’s Oversight Board, while also fulfilling advisory functions, is often framed as having a quasi-judicial function, adjudicating selected content moderation cases with binding effect on Meta (Cowls et al.); the other councils are exclusively advisory in nature. So far, the members of SMCs have mainly been experts in the field, for example human rights practitioners and people with expertise in platforms and platform governance, but also users, as in the TikTok Creator Council.

Press councils as historic origins

Some lessons can be drawn from the history of press councils. Since the 1950s, they have mushroomed across the globe (Blum, 77). These ethics-focused institutions consist of journalists and publishers and were built to preserve media and press freedom, both by shielding it against outside influences and by securing responsibility, quality and independence from within (Puppis, 65-6). The complaint mechanism of the German Press Council, for example, allows citizens to criticize unsound journalistic practices (Puppis, 213, 215, 245-6). Its effectiveness, however, is questionable: one example is the German “BILD Zeitung”, which famously ignored many of the Press Council’s reprimands (Klausa). While many journalists and magazines pledge to serve the public’s informational interest with their work, the production of news is still an economic endeavour – and thus subject to economic interests: how can news stories generate the most appeal for the relevant target group at the lowest possible cost? Naturally, these incentives carry the potential to conflict with public interests (McManus). Tabloid newspapers – like BILD Zeitung – are read by many not for their journalistic scrutiny, but rather for entertainment and the subsequent social activity of exchanging stories and opinions (Johansson). The (dis)incentive of condemnation by the Press Council is simply not strong enough to secure compliance.

More than ethics-washing?

Press councils are self-regulatory, yet from the perspective of the magazines the regulation is still imposed from outside. Is effectiveness any different when regulation rests on voluntary pledges?

Meta publicly pledged to adhere to the decisions of its Oversight Board. The Board is independent and bases its decisions not only on Community Standards, but also on human rights principles (Oversight Board, 5-6). Vaidhyanathan criticises that the OB addresses only a small fraction of complaints and none of the platform’s systemic concerns. Still, the first OB decisions do cover important aspects, like satire vs. free speech, algorithmic moderation practices, information on deletions, and the clarity of sanctions (the famous Trump case). Bietti (276) raises the concern that the OB, through its focus on content moderation, detracts attention from other sensitive practices, such as the lucrative algorithmic curation of the news feed.

In another self-regulation endeavour, IBM made transparency concessions to combat racial profiling by publishing anti-bias data sets for its facial recognition technology (FRT). Zalnieriute (143-4) notes that at the very same time, IBM stayed silent on the privacy infringements associated with the release, as well as on its major role in advancing racial profiling in the first place.

Regulation is messy. Business self-regulation in particular often balances multiple interests under the overarching aim of the best possible cost-benefit calculation. Corporate ethics collide with user interests and immediate financial interests. Without caution, ethics can easily be instrumentalized into ethics-washing.

Incentives matter

While Social Media Councils are a rather recent invention, the underlying idea is not. Self-regulatory projects of past and present – press councils, the Oversight Board, IBM’s transparency releases – can serve as cautionary examples that call for precision in the design and promotion of SMCs. To avoid superficial solutions and ethics-washing when aligning public values and platform governance, a cost-benefit analysis may provide an appropriate lens for assessing the economic incentives for compliance and sincere cooperation in any self- or co-regulatory enterprise.

One such incentive is public trust: without trust, platforms lose their users and the marketplace may break down (Cusumano et al., 1277). SMCs must therefore be built in a way that fosters public trust – for example through feedback-receptive designs and radical transparency.

Aggressive regulation might also be effective

Another is the threat of aggressive government regulation (id., 1278-9). The forthcoming EU Digital Services Act already provides several co-regulatory mechanisms, such as trusted flaggers (Art. 19), the Digital Services Coordinators (Art. 38) and out-of-court dispute settlement bodies (Art. 18), but also harsh fines of up to 6% of total worldwide annual turnover (Art. 42, 59). While not the only factor, the emergence of several SMCs correlates with the tightening of legal regulation. The threat of severe enforcement (fines or even bans) may push social media companies to ensure the effectiveness of self-regulatory SMCs themselves.

In the words of Vaidhyanathan: “Self-regulation is an excellent way to appear to promote particular values and keep scrutiny and regulation to a minimum. When self-regulation succeeds at improving conditions for consumers, citizens, or workers, it does so by establishing deliberative bodies that can act swiftly and firmly, and generate clear, enforceable codes of conduct.” Governing online communication well requires corporations, civil society and state structures to pull together in a shared effort.

This blog post is part of the project platform://democracy, which investigates whether and how Social Media Councils hold the potential to effectively align public values and online private ordering structures.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Josefa Francke

Former Associated Researcher: Global Constitutionalism and the Internet

