15 December 2016

Datafication and Consumer Trust

We are already living in a highly data-driven society. Algorithms shape the content we see: the results returned by search engines, the news we read on Facebook and Twitter, the offers we receive through loyalty programmes, the movies we watch on Netflix and the music we listen to on Spotify. Our behaviours are logged, collected and analysed by a growing number of operators for a growing number of reasons, and ever more parts of society are becoming data-driven and automated.

Seen from a consumer perspective, this naturally provides a number of services and innovations that we appreciate and are often prepared to pay for. But this data-driven development also challenges consumer protection, and it forces the supervisory authorities to rethink their methods and role. Digital consumer profiling is in many ways the foundation of the digital economy: an economy that includes mega-operators such as Google and Facebook, but also media houses that own numerous newspapers and media websites, relative newcomers in the sharing economy, the marketing industry, and both e-commerce and traditional “brick-and-mortar” stores through, for example, loyalty cards and programmes. In a new overview of digitalisation and consumer interest that I have written for the Swedish Consumer Agency, I highlight some emerging trends that need to be examined more closely. In short, how consumer data is managed, and which operators and tools are used for moderating, analysing and trading in that data, will be of crucial importance for the consumer’s standing in the digital economy.

The need for critical studies

First of all, we can conclude that in committing to data-driven innovation we also need to encourage a critical perspective and knowledge about how to manage data-driven processes, algorithmically controlled processes and data analyses. We need to be able to recognise when consumers need protection and empowerment, to evaluate who wins and who loses, and to strive for transparency regarding both how the machines work and the regulations we would like to see guiding their work. This places demands on legislators, politicians and supervisory authorities as well as on industry organisations and academia. In brief:

  • Seen from a consumer protection policy perspective, datafication entails a growing information asymmetry between consumers and market operators. We need to develop consumer protection, but the supervisory authorities also need to develop their supervisory and collaborative roles, both within and between authorities. This is as much an issue of power as of integrity and privacy.
  • We, as consumers, enter into hundreds of agreements in the course of our daily digitalised lives. What are the implications of agreements that can be understood and influenced by only one of the parties? Seen from a consumer perspective, user agreements should certainly be shorter, clearer and easier for individuals to influence. But are there other ways of managing informed consent in a time of information overload? We need to figure this one out.

Consumer power also requires insight. In 2015, for example, the Norwegian Data Inspectorate (Datatilsynet) studied all the parties present when a user visited the front pages of six Norwegian newspapers. It found that between 100 and 200 cookies were downloaded to the user’s computer, that the user’s IP address was forwarded to 356 servers, and that on average 46 third parties were present at each of the automated ad trades taking place on the newspapers’ front pages. None of the six newspapers provided public information about the presence of such a large number of third-party companies. How can consumers be expected to choose safe services if they are unaware of which parties are present, what information is collected or what it is used for?
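To make that kind of audit concrete: a rough approximation of what the Norwegian study measured is to take a HAR file (the network log a browser’s developer tools can export for a page load) and count the distinct third-party hosts contacted. A minimal Python sketch, with a tiny hand-made capture standing in for a real export; all domain names here are invented for illustration:

```python
from urllib.parse import urlparse

def third_party_hosts(har: dict, first_party: str) -> set:
    """Return the distinct third-party hosts contacted in a HAR capture."""
    hosts = set()
    for entry in har["log"]["entries"]:
        host = urlparse(entry["request"]["url"]).hostname or ""
        # Treat any host that is not the site itself, or one of its
        # subdomains, as a third party.
        if host != first_party and not host.endswith("." + first_party):
            hosts.add(host)
    return hosts

# Tiny hand-made capture standing in for a real browser export.
sample_har = {
    "log": {
        "entries": [
            {"request": {"url": "https://example-news.no/index.html"}},
            {"request": {"url": "https://cdn.example-news.no/style.css"}},
            {"request": {"url": "https://ads.tracker-one.com/pixel.gif"}},
            {"request": {"url": "https://stats.tracker-two.net/collect"}},
        ]
    }
}

print(sorted(third_party_hosts(sample_har, "example-news.no")))
# ['ads.tracker-one.com', 'stats.tracker-two.net']
```

Even this crude host-counting exposes the asymmetry the post describes: the parties are visible in the network log, but nothing on the page itself discloses them to the reader.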

Knowledge, transparency and balance

Addressing the challenges consumers face in a digital economy requires knowledge, insight and balance. We need to improve our knowledge about these relatively new developments and their implications, which means research, preferably across several disciplines that specialise in data, society, law, culture and economics. And these fields need to communicate much better than they generally do; the academic ways of organising, funding and publishing research tend not to help in this regard. HIIG is a good example of such an interdisciplinary venture, but we need more. Much more. It should be the model, not the exception. At the same time, this poses a challenge to supervisory and organisational methods when it comes to recognising the downsides of automated processes. And finally, there is a constant need to maintain an articulated balance between the market and consumer protection, which is both a political and a legal matter. Among other things, there is a risk that consumer trust in digital services will be weakened if the use of personal information is perceived as illegitimate. And with weaker trust, the potential benefits of the data-driven economy will very likely be weakened as well.


Stefan Larsson is an Associate Professor in Technology and Social Change at Lund University Internet Institute (LUii) in Sweden, and a member of the Swedish Consumer Agency’s scientific council. From August to October 2016 he was a visiting researcher at the Alexander von Humboldt Institute for Internet and Society.

Photo: flickr.com, CC BY-NC 2.0

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.
