Consent Under Pressure and the Right to Informational Self-Determination (2)
This blog entry continues the article published on 17 December 2012.
Is a Revitalisation of Consent in Personal Data Processing Possible?
The question must be asked what significance can still be assigned to consent in data protection law, especially online. As demonstrated, the fundamental idea of an informed, voluntary consent is under pressure: from long, confusing data protection terms on the one side, and from online communication platforms that have become inevitable through market dominance on the other. Does this factual erosion of consent amount to a violation of the state’s duty to protect arising from fundamental rights? Are there alternatives to consent, or can the concept of consent be revived? Solutions are sought to reduce the information deficit and to prevent the predicament outlined above.
Remedies for the information deficit
Given the flood of information users face when accepting data protection terms, it is more than questionable whether (further) peremptory norms can provide a solution. Specific duties to inform often drown in the bulk of data protection terms. A structured and consistent presentation could provide relief here. The projects Terms of Service; Didn’t Read and Wikimarx rely on contributions from internet users. While Wikimarx tries to highlight important and critical terms, Terms of Service; Didn’t Read attempts to inform users about important terms at a glance by using colour coding. A general colour-coded rating of online communication services is planned, to be displayed directly when visiting a website via a browser add-on. It remains to be seen whether these attempts to channel the flood of legal information will succeed and whether the entanglement of broad contractual terms and national laws can be managed. It is at least an option worth considering in order to restore users’ responsibility for the protection of their data.
Ways back to a voluntary decision?
However, even if transparency can be increased, there are few ways to evade the growing power of online communication platforms when it comes to the effective exercise of fundamental rights. In order to be heard, one has to step into the space where one’s potential supporters can be found. The question therefore arises how reasonable access terms can be established and what role the legislator can or must play in this matter.
The state’s constitutional duty to protect
First, it must be examined whether certain legislative actions are predetermined by fundamental rights. Since the aforementioned online communication platforms are operated by private corporations, fundamental rights are not directly applicable between operators and users. In principle, it rests upon citizens to govern their contractual relations – including the use of personal data – themselves by choosing the appropriate contractual terms. By providing the law and instruments of law enforcement, the state offers the parties the means for achieving an equilibrium of interests and adhering to it. But in the interest of the citizens’ freedom, the state in principle refrains from strictly shaping legal relationships. However, where one party dominates another to such an extent that it can unilaterally set the contractual terms, it is the duty of the state to “safeguard the fundamental rights of the parties involved in order to prevent the inversion of self-determination into external determination“1. The state must also comply with this constitutional duty to protect when it comes to the right to informational self-determination.2
At what point in time external determination, and thus an infringement of the duty to protect, is to be assumed cannot be conclusively answered here. The alternatives available to each citizen, and the amount and quality of data that must be disclosed for the effective exercise of fundamental rights, remain to be examined.3
Yet even if a constitutionally relevant lowering of the level of protection were found here, a broad margin of possible actions would, as always, remain for achieving compliance with the constitutional duty to protect. Some possible means of protection and their effectiveness shall be considered now.
Alternatives to consent or consent light?
First, it can be discussed whether there are solutions for securing personality rights other than the concept of consent. The underlying idea is that citizens cannot be coerced into consenting if mandatory law removes the basis for consent altogether.
But is data protection – and, from a constitutional perspective, the right to informational self-determination – even conceivable without consent, or consistent with constitutional doctrine? The right to informational self-determination by its very wording presupposes that the right holder must and can decide upon the disclosure of information himself. Without being able to pursue this line of thought further here, it is at least to be noted that consent to the disclosure of certain information is already precluded by law in some cases. An employee, for instance, cannot validly consent to a genetic test by his employer according to § 18 GenDG (Genetic Diagnostics Act).4 This is a distinct exception to the principle of private autonomy. It could be discussed whether data protection law – at least between consumers and entrepreneurs – should be designed in large parts as mandatory special private law (Sonderprivatrecht), comparable to employment or tenancy law.
However, one should not grasp this supposedly rescuing hand too hastily. Surely, it would not only end the dominance of contractual parties in some areas, but also in many cases limit private autonomy and therefore the freedom of the users themselves.
More importantly: a set of “one size fits all” provisions cannot do justice to the diversity of personal data and the different purposes of data processing. Data collection that constitutes a deep interference with the private sphere, and therefore seems fit for prohibition in some cases, can in other cases be desirable and in the user’s best interest. For example, the disclosure of a user’s age or sexual orientation to an e-mail provider or an ordinary social network is to be viewed differently from disclosure to a dating platform. Communication services, especially online, seem too diverse to be governed by a set of strict, universally applicable provisions. Such provisions would always pose the threat of either providing too low a level of protection or preventing legitimate business models. This troublesome and, in regulatory terms, difficult path should remain a last resort.
Additionally, problems may emerge concerning the applicable law and its effective implementation in transnational cases. When it comes to purely online services, it is often too easy for corporations to escape unpleasant national law – even mandatory law – or at least its effective enforcement.5
What means remain to guarantee an effective exercise of fundamental rights in the future? It is to be noted that some form of consent to data processing must remain the underlying legal principle of data protection. Existing approaches to a simplified presentation of information may help to restore the informational basis of consent. Additionally, attempts can be made to set minimum standards in certain areas through the complementary effect of precise, mandatory law. In this process, an eye should be kept on implementation deficits. A uniform European solution can help to provide an acceptable standard of protection in the long run and enable people to exercise their fundamental rights through communication in the 21st century.
2 Cf. on this BVerfG, 1 BvR 2027/02, decision of 23 Oct. 2006, para. no. 33.
3 The Federal Constitutional Court, for example, decided on insurance terms that allowed a deep interference with the insured’s right to self-determination. Abandoning an occupational disability insurance as the individual’s only way to preserve his right to self-determination was not deemed acceptable by the court, BVerfG, 1 BvR 2027/02, decision of 23 Oct. 2006, para. no. 39.
4 Also cf. Thüsing, Arbeitnehmerdatenschutz und Compliance, 1st edition 2010, margin nos. 389, 396 et seqq.
5 What it means for the state’s duty to protect when such areas evade state authority must remain open here.
This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact firstname.lastname@example.org.