Legal Hackathon: Privacy and Security by Design for the IoT
Germany’s first Legal Hackathon, “Building Standards of Privacy- and Security-by-Design for the IoT”, took place in Berlin at AoIR 2016: Internet Rules!, the annual conference of the world’s largest network of internet researchers. The Legal Hackathon developed a data protection standard for personal data that is to be collected via a public wifi system and used in a Smart City environment.
On 5 October 2016, the Legal Hackathon “Building Standards of Privacy- and Security-by-Design for the IoT” took place in Berlin. The format of Legal Hackathons originates from the Legal Hackers movement, which started in New York in late 2011 and is now conducted in Germany by the Startup Law Clinic of the Alexander von Humboldt Institute for Internet and Society and the project Innovation and Law. Legal Hackathons adapt the idea of hackathons (collaborative software and/or hardware development events) by concentrating on the implementation of the legal requirements of a particular area of internet technology law. The format thus responds to the fact that, in areas of technological innovation, legal problems can less and less be solved by the classic means of legislation or legal consultancy, and more effectively in processes in which technology enterprises, policy makers, scientists, and legal service providers cooperate.
The Internet of Things (IoT) is such an innovative area: it offers promising economic opportunities but also poses technological and legal challenges for innovative data-driven services and products. As a first example, the Legal Hackathon “Building Standards of Privacy- and Security-by-Design for the IoT” addressed the question of how a public wifi access system can be set up so that it meets the following three requirements: first, the (personal) data collected via the system must be available, in a non-discriminatory way, to all enterprises active in the field of Smart Cities and interested in that data; second, the provision of the data should be as open to innovation as possible; and third, the end users connecting to the wifi access system must be protected effectively against the related risks. The last requirement in particular poses an essential challenge, because a public wifi system makes it possible, in principle, to track every person moving around in Berlin. Over time, this tracking can therefore yield a comprehensive movement pattern for each individual in the city. The technical reason is that the public wifi access points can identify each device by its unique MAC address, the hardware identifier of its network interface. This works irrespective of whether the owner or user of the device ever actively searches for or connects to the network: as long as the device’s wifi is switched on, it periodically broadcasts probe requests containing its MAC address. If such data is combined with further data, such as the name and address of the device owner, the movement patterns can be related to uniquely identifiable individuals. On the one hand, this makes many useful applications possible, such as smart city traffic management (e.g. a parking spot finder) or personalized advertising.
On the other hand, however, access to such data could be the ideal starting point for a society of “total surveillance” (Big Brother is watching you). The question is therefore: how can such a public wifi access system be set up so that it enables innovative services without denying end users the necessary protection against its data protection risks?
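To illustrate the tracking risk described above, the following sketch shows how individual access-point sightings of a device’s MAC address accumulate into a movement pattern. All data and field names here are hypothetical, chosen purely for illustration:

```python
from collections import defaultdict

# Hypothetical sightings recorded by public wifi access points:
# (MAC address, timestamp, access point location)
sightings = [
    ("aa:bb:cc:dd:ee:01", "2016-10-05T08:02", "Alexanderplatz"),
    ("aa:bb:cc:dd:ee:02", "2016-10-05T08:10", "Potsdamer Platz"),
    ("aa:bb:cc:dd:ee:01", "2016-10-05T08:45", "Hauptbahnhof"),
    ("aa:bb:cc:dd:ee:01", "2016-10-05T17:30", "Alexanderplatz"),
]

# Grouping the sightings by MAC address yields a per-device trajectory:
# the raw hardware identifier links all observations of the same device.
trajectories = defaultdict(list)
for mac, timestamp, location in sightings:
    trajectories[mac].append((timestamp, location))

for mac, path in trajectories.items():
    print(mac, "->", [location for _, location in path])
```

Even this handful of records already reveals a daily routine for one device; combined with the owner’s name and address, it becomes a movement profile of an identifiable person.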
One solution to this challenge is to establish a common standard specifying the conditions for processing this specific personal data. Companies must then apply the standard, i.e. comply with these conditions, in order to get access to the data. Such a standard ensures that both the end users of the public wifi system and the companies seeking access to the data can trust that the processing of the data is lawful. The European General Data Protection Regulation, which applies from 25 May 2018 and replaces the national data protection laws of the EU Member States, provides rules on how such standards can be set up. Article 25 of the regulation establishes the so-called “privacy-by-design” principle: “Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.” Article 32 of the regulation contains a similar provision regarding the so-called “security-by-design” principle. However, these provisions do not specify precisely how the data controller has to implement the privacy- and security-by-design principles. To tackle this uncertainty, Articles 40 to 43 regulate how data controllers can set up, together with the data protection authorities, so-called codes of conduct and/or certifications.
Such codes of conduct and certifications are standards that specify, as described above, the legal preconditions for the processing of personal data in a particular case.
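Pseudonymisation, which Article 25 names explicitly as a possible technical measure, can be sketched as follows: each MAC address is replaced by a keyed hash, and the secret key is rotated periodically so that devices can be counted within one period but movement patterns cannot be linked across periods. The function name, key values, and daily rotation interval are illustrative assumptions, not requirements of the regulation:

```python
import hashlib
import hmac

def pseudonymise(mac: str, period_secret: bytes) -> str:
    """Replace a MAC address with a keyed hash (HMAC-SHA256).

    Without the secret, the original MAC address cannot be recovered;
    rotating the secret (e.g. daily) prevents linking pseudonyms
    across rotation periods, limiting long-term movement profiles.
    """
    return hmac.new(period_secret, mac.encode(), hashlib.sha256).hexdigest()

# Illustrative keys only; in practice they would be generated securely
# and discarded after each rotation period.
secret_day1 = b"secret-for-2016-10-05"
secret_day2 = b"secret-for-2016-10-06"

p1 = pseudonymise("aa:bb:cc:dd:ee:01", secret_day1)
p2 = pseudonymise("aa:bb:cc:dd:ee:01", secret_day1)
p3 = pseudonymise("aa:bb:cc:dd:ee:01", secret_day2)

assert p1 == p2  # same device, same day: linkable for day-level analytics
assert p1 != p3  # same device, different day: no cross-day movement profile
```

A standard of the kind discussed above could, for instance, prescribe such a measure together with an agreed rotation period, balancing the usefulness of the data for Smart City services against the tracking risk for end users.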
The organizers of the Legal Hackathon consider the results so promising that the format will be extended to further methods, such as Gamathons, and to further areas, such as Smart Homes, Wearables, and FinTech. Interested parties are invited to participate and/or to propose further areas under:
Maximilian von Grafenstein LL.M.
This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact email@example.com.