Could fake news annul the Brazilian elections?
Fake news affecting polls – a possibility feared for this year's Brazilian presidential elections. Supreme Court Justice Luiz Fux has even claimed that the election results could be annulled. But how can courts measure the impact of fake news? Clara Iglesias Keller, visiting researcher at HIIG, deals with this question.
According to Supreme Court Justice Luiz Fux, they could. As reported, the President of the Superior Electoral Court stated that "[article] 222 of the Electoral Code provides that if the election's result is influenced by massively spread fake news, it could even be declared annulled".
During the 2018 elections, the challenge posed by the algorithmic efficiency of spreading false information takes center stage in the Brazilian context for the first time. It becomes crucial to understand this phenomenon and the mechanisms available – and necessary – to deal with it. Annulling an election by judicial decision is a complex move, both from a constitutional and an electoral perspective, that in this case brings together issues of freedom of expression, procedural democracy and the functioning of artificial intelligence. It is therefore necessary to reflect on the meaning and impact of the Justice's words.
We know fake news as false stories massively disseminated by people, organizations and, above all, armies of bots. Regarding the implications for freedom of expression, it is important to keep in mind that distorted information is not a novelty of digital communications. From ancient times – when Greek sophists were accused of building an art of discourse out of information manipulation – to more recent regulatory frameworks on media ownership, this possibility has inspired legitimate concerns about ensuring a minimally impartial public debate. Speech distortion, whether argumentative or purely untruthful, is a natural contingency of freedom of expression.
What raises concern at present is the possibility of mass manipulation of facts through technology, which lends a new scale to the effects of this sort of distortion. It is natural to the democratic process that what echoes in the community is mirrored, for better or for worse, in the results of political procedures. But what if this echo is artificial? Can we compare personal or institutional argumentation with the automated and covert replication of the same voice? Should these two situations be treated in the same way in a democracy?
It seems intuitive that they should not, and it was in this context that Justice Luiz Fux raised the possibility of annulling the elections, depending on the evidence and the effects of the false news on the final result. Electoral legislation already addresses the disclosure of facts about candidates that are known to be untrue, but the wording of art. 323 of the Electoral Code does not cover the technological apparatus behind an artificial echo. Also, as the Justice himself mentioned, annulment would depend on the effects on the final result. How does one calculate that? Do courts have the necessary tools to measure these effects?
Unlike other electoral frauds – such as vote buying – the scope of false news is not quantifiable. The number of shares, views and the reach of a post could, in theory, be traced for these purposes (although they should not necessarily be). But even so, this would contribute little to investigating the effect of false news at the ballot box. Between access to false news and voting lies a whole path of will formation. One must access the content, lend it credence or not, identify with its perspective, and take it into consideration for the final decision. It is not possible to tell, for instance, whether a piece of information merely reinforced readers' previous beliefs and interpretations, which would by themselves have been enough to determine their vote – or whether, on the contrary, it indeed swayed their convictions, affecting final decisions. That is not to say that false news has no impact on public debate. Recognizing and understanding its sociopolitical implications is a necessary task for democracies; declaring elections annulled without research input is one step beyond that.
Addressing the spread of fake news demands more than the unbridled race the Brazilian branches of government usually run after new technologies, eager to squeeze them into logics established for different contexts. In addition to using the existing apparatus, it is necessary to inform this process with the political notions of freedom of expression and democratic consensus, and with knowledge about how technology operates, what it allows us to do and how it can help us. A possible court ruling on the influence of false news on elections needs to recognize these different layers and, along with them, their limits in reversing the outcome of a popular poll. Fake news may produce an artificial echo that can outweigh the people's legitimate voice – but a court ruling that nullifies an election could end up running the same risk.
Supreme Court Justices in Brazil also sit on the country's Superior Electoral Court.
According to Section 323 of Law 4.737/65, the dissemination of facts about candidates that are known to be untrue is punishable by detention (two to twelve months) or a fine.
This article was originally published in Portuguese at Jota.
This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact email@example.com.