Colonised by data: the hollowing out of digital society
What exactly is colonial about the economic use of data? On November 20, Nick Couldry continued the lecture series Making Sense of the Digital Society. In his lecture, he focused on the social consequences of data collection by private companies. He characterised the current tendencies as a new form of colonisation that has serious consequences for society. In this blog post, Marc Pirogan takes a look back at Couldry’s lecture.
Ever since the Snowden revelations in 2013, it has been clear that something big is going on with data. But the real story behind the data collection by surveillance agencies such as the NSA is how much data private companies collect. In a “public-private surveillance partnership,” government agencies profit from a private sector that leads the way. Couldry emphasised that this development is not only a new phase of capitalism but also a new phase of colonialism.
What is colonial about data colonialism?
Colonialism’s function was historically the appropriation of resources on a vast scale. Territory was acquired, and with it the bodies, for a long time those of enslaved people, that extracted value from those resources. The resource being appropriated under digital capitalism is human life itself, in the form of data. Humans are therefore the target of this new phase of colonialism.
Every time we join a platform, we accept its terms of service, usually without reading them. By doing so, we enter a set of ‘data relations’ that we do not fully understand. Couldry compared this to the Spanish conquest of Latin America, where local populations were forced to accept the Spaniards’ demands in a document called the Requerimiento. The locals were forced to listen to and accept the demands, even though they did not even understand the language of the document.
Nowadays, according to Couldry, we are not threatened with violence in order to accept the terms of service, but there are still decisive parallels to historical colonialism. First, resources are being appropriated, now in the form of information about human experience and action. Second, social relations are being colonised, as they become “data relations” that aim to maximise data extraction for value. Third, the value created is concentrated in the hands of a few corporations. And fourth, there is an ideology that disguises the process and gives it a progressive, “civilising” frame.
Yet today’s new colonialism builds on the already existing order of capitalism; it is not a precondition for it, as historical colonialism was. Therefore, Couldry claimed, data colonialism does not need violence to be effective. Capitalism works through social relations and the social order built around them. It turned almost all social transactions into money transactions, until a counter-movement emerged, as Karl Polanyi famously described. Social reforms were introduced to alleviate the social effects of industrial capitalism.
Now, we seem to be at a similar point. Interactions are being transformed into data transactions that serve the purpose of maximising value extraction. These data relations do not yet face much resistance. In order to develop such resistance, Couldry emphasised, our understanding of the datafication process has to be altered.
The changing status of knowledge
If we look at the datafication process, it is changing the status of knowledge in our society. In previous eras, knowledge was, in principle, separate from economic value. In response to the early horrors of capitalism, social knowledge in the form of public statistics began to be produced in the 19th century. This knowledge was publicly funded and collected, and then publicly analysed and put to use by governments and by civil society organisations in order to achieve social reforms.
Moreover, this knowledge was publicly debated and was more or less publicly accountable. Couldry does not wish to idealise the governments of the time, as he stated in his lecture. But back then, the knowledge produced could, in principle, be contested by the public. Commercial actors had to buy this knowledge and depended on governments to share it, with the exception of the insurance industry. But even the insurance industry could be legally challenged by people who felt disadvantaged, as happened in the case of African Americans in the United States. Hence, the old form of social knowledge challenged market forces and their effects on human lives.
The emerging form of social knowledge, in the form of data, is privately collected, analysed and debated, and is therefore very hard to contest. The reason is its private ownership, and also the complex and often opaque processes it depends on. When ShotSpotter, a data analytics company whose gunshot-detection evidence is used in criminal proceedings, was asked by a US judge to provide details of its proprietary algorithm, its CEO refused. Couldry insisted that this knowledge about our shared social world must be publicly accountable, accessible, and debatable.
Surveillance and inequality
Whether through smart devices, self-tracking or facial recognition, data colonialism is expanding into every domain of life. But the effects will not play out equally for everyone. As various researchers have shown, already vulnerable populations are the most likely to be harmed by the hidden judgements of algorithms. At the same time, these people have fewer options to resist. In order to get a job, they are more likely to have to accept surveillance than people in higher-status jobs. In this way, Couldry argued, surveillance is connected to inequality.
As we depend more and more on the way algorithms make decisions, we risk losing certain things as a society. Older rationales, such as giving the poor more favourable credit terms, are being replaced by a model based on one’s prior credit-related behaviour. Further, older forms of expertise and judgement will be lost if public servants such as judges lose the capacity to make decisions. Perhaps most dangerous of all, if we blindly trust data, we lose the habit of listening to what people say and how they interpret the world. If machines and not people interpret the world, we will lose touch with the value of democracy.
The loss of freedom
Couldry insisted that we are about to lose the space human subjects need to build their social world and interactions. When we are constantly tracked, we lose our freedom in a Hegelian sense. Hegel understands freedom as “the freedom to be with oneself in the other”. To be with digital tracking devices and their external infrastructures means that individuals are no longer with themselves.
To stop these developments, we need to question their inevitability, challenge their necessity, and imagine possibilities of connecting with each other on other terms. The costs of connection can still be renegotiated. An alarming example is China, where the government has made the purpose of its artificial intelligence programme clear. It is a “market improvement of the social and economic order,” in which freedom has no place. As we install the same technological system of computer-based connection in the West, we run the risk of ending up with a similar version of society.
Hence, we are entering a historic battle for the values of freedom on which our democracies were built. The urgent task is to imagine different futures for the digital society. As Yuval Noah Harari pointed out recently, opposing the ideology of dataism “is not only the greatest scientific challenge of the 21st century, but also the most urgent political and economic project”. The challenge is so great because the social transformation goes on largely hidden. Couldry ended his lecture with a plea not to let it become a silent catastrophe. We need to work together to face these profound challenges, and time is short.
Nick Couldry is Professor of Media, Communications and Social Theory at the London School of Economics and Political Science (LSE). As a sociologist of media and culture, he approaches media and communications from the perspective of the symbolic power that has been historically concentrated in media institutions. His lecture drew on his forthcoming book with Ulises Mejias, The Costs of Connection: How Data Colonizes Human Life and Appropriates It for Capitalism (Stanford University Press, 2019).
This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact email@example.com.