
Review of the Workshop: »Cloud Computing and the EU Draft General Data Protection Regulation«

05 August 2013

About 30 legal practitioners, computer scientists and social scientists came together for the interdisciplinary workshop »Cloud Computing and the EU Draft General Data Protection Regulation. Standards, Design Considerations, and Operations Recommendations for Privacy-friendly Cloud Computing«, held on the premises of Humboldt University, Berlin (HU) on 26 July 2013, ahead of the 87th Meeting of the Internet Engineering Task Force (IETF), which is currently in progress. Jointly organised by the Alexander von Humboldt Institute for Internet and Society (HIIG), the HU and Cisco as part of the »Global Privacy Governance« project, the workshop aimed to establish a common, interdisciplinary understanding of privacy and data protection, particularly with a view to balancing legal requirements against the means of technical implementation. A further aim was to draw up a number of specific »Operational Privacy« requirements for Cloud Computing.

Session 1: Nicolas Dubois and Caspar Bowden

During the first session, Nicolas Dubois from the EU Commission presented the Commission's proposals for reforming European data protection law: data protection is to be brought into line with the Charter of Fundamental Rights in order to meet the challenges of technological development. Apart from introducing Privacy by Design and Privacy by Default into the General Data Protection Regulation, the focus will be on further measures such as revising the obligations of data processors with regard to risk management, standard contractual clauses on data security, and support for binding corporate rules (BCR). Backed up with sound evidence, Caspar Bowden, the former privacy consultant at Microsoft's European branch, not only criticised the ignorance European institutions have long displayed towards covert surveillance by intelligence services, whose existence has been an open secret for years, but also showed that European citizens and organisations are entirely at the mercy of such measures under, for instance, the provisions of the US Foreign Intelligence Surveillance Act (FISA). Instead of protecting their citizens, European countries and the EU Commission had sided with the interests of industry, particularly those of US cloud-computing providers. The least Europe could do under the circumstances, he argued, would be to develop a cloud infrastructure of its own.

The discussion that followed dealt mainly with the possibilities and limits of legal restrictions, as well as with questions of time: How long would it take to build a cloud infrastructure of our own? For how long does encrypted data remain secure? For what period of time is data retained?
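
To illustrate why the question of how long encrypted data remains secure has no simple answer, the following back-of-the-envelope estimate shows how the answer depends entirely on key length and assumed attacker resources. It is purely illustrative: the attacker speed of 10^12 keys per second is an invented figure, and real attacks rarely proceed by exhaustive key search at all.

# Purely illustrative sketch: expected time for an exhaustive key search.
# The assumed attacker speed (1e12 keys per second) is an invented figure.
def brute_force_years(key_bits: int, keys_per_second: float) -> float:
    """Expected years to search half of a key space of size 2**key_bits."""
    seconds_per_year = 60 * 60 * 24 * 365
    return (2 ** (key_bits - 1)) / keys_per_second / seconds_per_year

for bits in (56, 80, 128):
    print(f"{bits}-bit key at 1e12 keys/s: ~{brute_force_years(bits, 1e12):.3g} years")

Under these assumptions a 56-bit key falls within a day, while a 128-bit key stays far beyond exhaustive search, which is precisely why the practically relevant variables are key length, attacker resources and how long the data must remain confidential.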

Session 2: Alexander Dix and Alissa Cooper

Dr. Alexander Dix, Berlin's Commissioner for Data Protection and Freedom of Information, opened the second session with a talk on the legal demands placed on technical standardisation, which has hitherto largely amounted to data processors setting their own standards. This practice lowers the level of security and has to be overturned, he said. Dix also called for an international convention regulating what secret services are, and are not, allowed to do on the Internet. To wind up his talk, he drew attention to the Resolution of the Conference of Federal and State Data Protection Commissioners of 24 July 2013, according to which the authorities will not issue any new licences for the transfer of data to the USA under the terms of the Safe Harbour Agreement. Alissa Cooper from the Center for Democracy and Technology then presented RFC 6973 »Privacy Considerations for Internet Protocols«, which was completed shortly before the workshop. Taking the IETF's goals for designing technical protocols for Internet communication as her starting point, she addressed the very limited extent to which privacy oversight can be exercised within the framework of the IETF, explaining that data security is primarily a political problem, whereas the IETF deals only with technical matters.
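
One kind of mitigation RFC 6973 asks protocol designers to weigh is limiting the correlation that stable identifiers make possible: a persistent identifier lets observers link a user's activity across sessions, whereas ephemeral, per-session identifiers reduce that linkability. The following minimal sketch illustrates the idea; the message layout and function names are invented for the example and are not taken from the RFC or from any real protocol.

import secrets

def new_session_id() -> str:
    # Ephemeral identifier, drawn afresh for every session and discarded
    # afterwards, so separate sessions cannot be linked by an observer.
    return secrets.token_hex(16)

def build_request(payload: str, session_id: str) -> dict:
    # Data minimisation: only what this exchange needs is sent; no persistent
    # account or device identifier travels with the payload.
    return {"session": session_id, "payload": payload}

print(build_request("hello", new_session_id()))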

The subsequent discussion revolved primarily around the question of who should standardise what in terms of privacy and data protection, and how they should go about it. The general consensus was that engineers have meanwhile begun to turn their attention to this topic instead of discussing only security aspects.

Session 3: Fred Baker, Gunter Van de Velde and Jörg Pohle

Fred Baker, a Cisco Fellow and former Chairman of the IETF, opened the third session with a preview of proposed Internet requirements for »Operational Privacy«. Of the two threats to privacy identified by Baker – what people disseminate about themselves and what can be inferred from their conduct and their relationships to other people – the latter poses the greater risk. Technological development should therefore aim at providing those affected with comprehensible options that make it easier to choose between disclosing and withholding information. In his capacity as Chairman of the IETF's »Operational Security Working Group«, Gunter Van de Velde subsequently outlined the working group's mission, pointing out that the draft put forward for an RFC was, in his view, not so much a documentation of best current practices as a problem analysis – a »taxonomy and problem statement«. The draft itself was then introduced by the author of this review, Jörg Pohle from the HIIG, who highlighted in particular the data protection goals that serve as a guideline.
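
Baker's second threat category can be made concrete with a small example: even when message contents are hidden, traffic metadata alone reveals who is in contact with whom, and how often. The call log below is invented illustrative data.

from collections import Counter

# Invented example data: pairs of (caller, contacted party); no content at all.
call_log = [
    ("alice", "bob"), ("alice", "bob"), ("alice", "clinic"),
    ("alice", "bob"), ("alice", "lawyer"), ("alice", "clinic"),
]

# Counting contacts is enough to reconstruct alice's relationships and to
# single out potentially sensitive ones, without reading a single message.
contacts = Counter(peer for caller, peer in call_log if caller == "alice")
print(contacts.most_common())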

This was followed by a discussion of the interplay between law and technology in implementing privacy and data protection requirements, and of particularly important individual demands, such as the call for an independent supervisory structure.

Operational Privacy Outlook

Although the workshop did not achieve its ambitious aim of defining clear »Operational Privacy« requirements specifically for Cloud Computing, it can still be considered a success: firstly because it established a common understanding of the problem, and secondly because this preliminary work, together with the outcome of the workshop discussions, forms a good basis for drawing up a »taxonomy and problem statement« on »Operational Privacy«, which can then be passed on to the IETF's »Operational Security Working Group«, possibly with a view to producing a best-current-practices document in the longer term.

This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.


3 Comments

  1. Stephan Engberg on 6 August 2013 at 10:15 am

    Even though I don't question the intentions, I don't see this workshop even getting near solutions – the problem is that the starting point is the assumption that control is ever transferred to a cloud that can never be secure or trustworthy.

    1) Problems are MUCH bigger than mere "surveillance" – the question is the sustainability of markets and democracy as such.
    https://ec.europa.eu/digital-agenda/sites/digital-agenda/files/Stephan.pdf

    2) Solutions lie in isolation – not in trust and regulation that can never be enforced.
    http://digitaliser.dk/resource/896495/artefact/New+Digital+Security+Models.pdf

  2. Jörg Pohle on 7 August 2013 at 7:17 pm

    Thank you very much for your criticism. Indeed, the problem is much bigger than mere “surveillance”.

    The assumption is that we live in a modern society that is based on the division of labor, with organizations providing services that are necessary for the well-being and even the survival of individuals and society alike. To provide these services, organizations need to make decisions. As information is the raw material for the production of decisions, organizations need information, more often than not information about individuals. As there is a fundamental structural power imbalance between organizations and individuals, the rights of the individual — dignity, freedom, self-determination etc. — must be protected. This structural power imbalance constitutes the threat model for data protection.

    Against this background, isolation would be at most a partial solution. Additionally, non-identifiability (or anonymity) as a solution is not without controversy, as Paul Ohm pointed out in 2009: "Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization" (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1450006).

  3. Stephan Engberg on 7 August 2013 at 9:25 pm

    Politely
    a) A reference to the division of labor does nothing towards arguing for disempowerment.
    There is no reason to assume that any server system needs the ability to identify citizens or link non-related transactions in order to benefit from the division of labor. On the contrary – economic, security and rights arguments all point towards the opposite.

    b) Papers pointing out that de-identification can never work are missing the point. We need to eliminate (redesign) the source, eliminating identification and linkability in the first place, instead of wasting time trying to "de-identify".

    When we are not even aiming for solutions, we are just worsening the problems.
