
Caring about privacy but accepting cookies? Questioning the privacy paradox.

15 September 2022

Have you ever asked yourself why you accept privacy agreements such as cookie banners much faster online, and give them far less thought, than you would offline? This phenomenon is called the privacy paradox. But is it really so paradoxical, or are there perhaps reasonable explanations for our online behaviour?

What is the privacy paradox?

The privacy paradox is understood as a concept attempting to reconcile consumer behaviour with privacy concerns (Martin, 2019, p. 67). In short, it can be defined as follows: “People’s concerns toward privacy are unrelated to their privacy behaviours. Even though users have substantial concerns with regard to their online privacy, they engage in self-disclosing behaviours that do not adequately reflect their concerns” (Dienlin & Trepte, 2014).

Living in an information society

Information and Communication Technologies (ICTs) “are radically transforming devices because they engineer environments that the user is enabled to enter through gateways, experiencing a form of initiation” (Floridi, 2010, p. 5, Ch. 1). ICTs are changing our world as much as they are creating new realities. The distinction between the analogue offline and the digital online is quickly becoming blurred. “This recent phenomenon is variously known as ‘Ubiquitous Computing’, ‘Ambient Intelligence’, ‘The Internet of Things’, or ‘Web augmented things’” (Floridi, 2010, p. 8, Ch. 1). Floridi (2010) calls this the information society, in which we depend on ICTs because they have become part of our reality.

How cognitive barriers influence our privacy behaviour

In the information society, we have to make privacy decisions in a limited amount of time, and companies use this to their advantage. Cranor (2012), for example, estimates “that it would take a user an average of 244 hours per year to read the privacy policies of every website she visits, or 54 billion hours per year for every United States consumer to read every privacy policy she encountered (McDonald & Cranor, 2008)” (Waldman, 2020).
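As a rough plausibility check (not taken from the cited sources, just simple arithmetic on the two figures quoted above), the per-person and the nationwide estimates are consistent with one another if one assumes on the order of 220 million US internet users:

\[
\frac{54 \times 10^{9}\ \text{hours per year}}{244\ \text{hours per person per year}} \approx 2.2 \times 10^{8}\ \text{people.}
\]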

Research has identified numerous cognitive and behavioural barriers to rational privacy and disclosure decision-making (Acquisti, Brandimarte, & Loewenstein, 2015; Camerer, 1998). One of the most pervasive cognitive biases, for example, is hyperbolic discounting (Waldman, 2020): the tendency to overweight the immediate consequences of a decision and to underweight those that will only occur in the future, which makes rational disclosure decisions difficult for the consumer. On top of that, disclosure often carries immediate benefits, such as access to a service. “But the risks of disclosure are usually felt much later. As such, our tendency to overvalue current rewards while inadequately discounting the cost of future risks makes us more willing to share now” (Waldman, 2020). Decision-making in general, and privacy decisions in particular, are thus affected by incomplete information and bounded rationality (Acquisti & Grossklags, 2005). Additionally, we do not have access to all the information necessary to make a fully informed judgement about the trade-offs involved (Kokolakis, 2017, p. 130).

An alternative approach

Kirsten Martin found in her study that “consumers may be consistent in stating privacy concerns and expectations in surveys, while also retaining those concerns and expectations after engaging with a website” (2019, p. 72). She and other scholars urge us to rethink the privacy paradox’s underlying assumption that all privacy claims disappear after disclosure (Dienlin & Trepte, 2014; Martin, 2019; Solove, 2021). This assumption is problematic because it makes it difficult to assign clear responsibility for consumers’ data.

Privacy as a Core Value

When we say that privacy is a core value, we mean that privacy needs to be protected at all times and that firms should be held accountable for what happens with consumers’ data after disclosure. Martin notes that “scholars who make normative claims about privacy have been arguing for privacy as a core value, which is necessary for individual autonomy and development, to foster intimacy and relationships, and for societies to flourish” (2019). Core values are not negotiable; they are positive goals we seek to attain and require in our communities (Donaldson & Walsh, 2015; Martin, 2019). As Martin points out, a positive obligation to identify and respect consumers’ privacy expectations is essential to business ethics (Martin, 2018; Shue, 2020).

Where to go from here?

Evidence suggests that individuals care about their privacy even after disclosing information. Human cognitive biases in the online environment, together with the lack of information needed for a fully informed decision, make the decision-making process difficult and explain the gap between users’ privacy preferences and their disclosure behaviour. This raises the question of whether the privacy paradox really is a paradox at all. The privacy paradox also has problematic implications for responsibility over consumers’ data: it places the primary responsibility for personal data on consumers, which means that companies bear little to no responsibility. An alternative approach would be to see privacy as a core value, which would give firms a positive obligation to identify and respect privacy expectations.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.

Johanna Klix

Student assistant: Knowledge & Society


