20 January 2021 | doi: 10.5281/zenodo.4452615

Trump’s very own platform? Two scenarios and their legal implications

Should we applaud social media platforms for finally restricting Donald Trump’s accounts? May we hold them responsible for allowing incitement to violence to spread? Should it be up to private actors to decide whether or not to ban the US President from the digital public sphere? Most people probably have a clear opinion on these questions, but in fact, they aren’t as easy to answer as they may seem.


The storming of the Capitol on January 6th has shown dramatically, once again, that the spread of lies and hateful speech leads to real-life harm. But protecting freedom of expression and the free formation of opinion online while setting rules for a civilized communicative space is a complicated endeavor. It raises many questions of constitutional law and of power structures in democratic societies, especially in the US but with repercussions for the rest of the world. What would change if Trump either launched his own service or shifted to another one, assuming there was no content moderation in either case?

Scenario 1: Trump starts his own social network.

After being “indefinitely” (Facebook) and “permanently” (Twitter) suspended from the largest social media platforms, Trump suggested he could start his own service.

Two factual aspects make this scenario rather unlikely (I’ll get to the legal questions afterwards): First, Trump isn’t interested in dialogue. He has been using Twitter as a “typewriter”, not as a place to exchange viewpoints; he wants a channel to broadcast unilaterally, not a forum. Secondly, he would still need the infrastructure necessary to operate. Usually, web infrastructure providers are invisible to the public and do not interfere, but when they do, it makes a difference. After the attack on the Capitol last Thursday, host providers could be reluctant to support a Trump-Twitter or a Trumpbook, just as Amazon suspended Parler (also because public awareness of these matters has greatly increased). App stores, too, could ban the new social network’s app (they have already banned Parler’s). The lack of infrastructure doesn’t make such a venture impossible for Trump, of course, but it does make it less probable.

Assuming Trump starts his own service, he will be very free to express his viewpoints – even more so than before. Without Twitter’s or Facebook’s community guidelines and standards, there will be no control over what he decides to share. The protection of free speech under the First Amendment is very broad and allows only very few exceptions. Trump will be even more untamed after Joe Biden’s inauguration because his social media profile will no longer meet the requirements of a governmental public forum. According to the court decision in Knight First Amendment Institute v. Trump, the tweets sent by the President qualified as a designated public forum, and he was therefore not allowed to exclude people (by “blocking” them) from accessing his Twitter account. Governmental communication via private digital actors is a whole other aspect of the underlying question, but what counts here is this: once Trump is no longer a government official, he will no longer have to allow people into his bubble. His private social network could become yet another refuge for extremists and conspiracy ideologists. Authorities will have almost no legal means to restrict it without violating the First Amendment.

Scenario 2: Trump uses a conservative platform to continue spreading his viewpoints without the barrier of content moderation.

He could turn to a platform such as Parler to avoid “strict” content rules. Indeed, Trump and other conservatives have been criticizing platforms such as Facebook, YouTube, and Twitter for propagating a left-wing perspective and “censoring” them (Trump, just before his supporters stormed the Capitol: “We will finally hold big tech accountable.”).

Under Sec. 230 CDA, social media platforms don’t have to moderate: they can simply act as mere “pipes”, as opposed to editors, according to the Supreme Court (Smith v. California, 361 U.S. 147). Thanks to this immunity from liability, it is up to social networks whether to moderate content, for instance by banning hate speech and misinformation, or not (as we have witnessed over the past four years). Because, under the First Amendment, the State is, in principle, not allowed to regulate speech (“Congress shall make no law…abridging the freedom of speech”), this power over what may or may not be said is reserved to non-state actors, i.e. to legal relationships between private parties. Under this legal regime, Twitter is allowed to moderate or ban content because social media platforms are private actors and therefore not bound by the First Amendment. Under the state action doctrine, private parties are not bound by the third-party fundamental rights enshrined in the Bill of Rights. The rationale behind the state action doctrine is to preserve private autonomy, leaving the relationship between private parties immune to the application of the Constitution. Private parties may only be subject to the same obligations as the government if they fall under the public function or the entanglement exception.[1]

One might argue that social media platforms are not mere private actors like other market participants. Instead, they could be considered state actors, which would require them to respect their users’ right to free speech (under the public function exception). This has been a constant discussion among First Amendment scholars, but courts are reluctant to treat social media platforms as state actors even though, according to the Supreme Court, they provide “the most important places for the exchange of views” (Packingham v. North Carolina, 582 U.S. ___ (2017)). It is, therefore, at the platforms’ discretion whether or not to remove certain types of speech.

Could Trump then behave the way he has so far if he were to use another platform? Most likely yes, but that is not connected to the type of platform: it has to do with the First Amendment’s protection. While incitement to violence is one of the very few exceptions to the scope of protection, the criteria established by the case law are hardly applicable to online speech. Incitement to violence may only be forbidden if it leads to “imminent lawless action” (Brandenburg v. Ohio, 395 U.S. 444) and the point in time may not lie at “some indefinite future time” (Hess v. Indiana, 414 U.S. 105). According to these criteria, online hate speech isn’t concrete enough in most cases because its effects can unfold at any future point in time. Hence, even if constant incitement fuels violent actions and can have real consequences, as we have seen (not only at the Capitol), it has to be more specific. This leads me to the conclusion that Trump’s speech just before the storming of the Capitol (“we are going to walk down Pennsylvania Avenue, and we are going to the Capitol, (…) and give them the kind of pride and boldness that they need to take back our country.”) would probably have been concrete enough (if posted on a conservative network) to justify an exception to the First Amendment. But for the removal of other, less specific incitements via social media, we would depend on the respective platform to act.

What’s next?

Overall, the whole situation would become even worse if Trump turned away from the platforms he has been using so far. This result leaves no room for doubt: adapting First Amendment doctrines to the digital public sphere is a pressing need, not merely a matter of scholarly opinion. It needs to be addressed by the courts (because Congress has only very limited options under the First Amendment) in order to overcome the current obstacles and to provide a democratically legitimate answer, instead of leaving these substantial matters to the goodwill of the biggest social media platforms. One option would be a new interpretation of the criteria mentioned above, allowing for a contemporary protection of First Amendment values.


[1] For a comparison of the Drittwirkungslehre and the US public forum theories: Heldt, Merging the Social and the Public: How Social Media Platforms Could Be a New Public Forum, Mitchell Hamline Law Review, Vol. 6, Issue 5, Art. 1.


This article was first posted on JuWiss:

Amélie Heldt, Trump’s very own platform? Two scenarios and their legal implications, JuWissBlog Nr. 3/2021 v. 11.01.2021, https://www.juwiss.de/03-2021/

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Amélie Heldt

Former Associated Researcher: Platform Governance
