Nothing to hide?
In our interconnected and digitised society, personal data is the new gold. As a result, we increasingly have to deal with complex matters like privacy and data protection. But how can these concepts be explained? To find out, developers, scientists, designers and artists came together at “Game Jam: Unveil the Privacy Threat” with the objective of developing serious games that achieve exactly that. In the first part of our review, HIIG researcher Maximilian von Grafenstein summarised all the game concepts. Now, we want to give you a small glimpse of the best #PrivacyJam moments:
Despite my sleepiness, very happy to be at #privacyjam this fine Saturday morning! Good luck gamers, see you tomorrow!
— ghost ravioli (@jilliancyork) October 7, 2017
Before getting started with the development of game concepts, three use cases helped all participants to recollect the many facets of privacy.
Use case 1: I’ve got nothing to hide!
In her keynote, Jillian York (Director of International Freedom of Expression at the Electronic Frontier Foundation) gave five reasons why the statement “I’ve got nothing to hide!” is wrong:
Many people think they have nothing to hide and therefore do not need to protect their privacy. The reasoning behind this is that only people who have done something illegal would want to conceal their behavior – and, so the thinking goes, illegal behavior does not deserve social protection.
However, this only relates to a small part of privacy protection. In fact, privacy law doesn’t just protect those (allegedly) engaged in “illegal” behavior. Privacy also protects a person against the loss of reputation that can occur when information is disclosed and/or used in the wrong context. As early as the 17th century, the French clergyman and statesman Cardinal Richelieu stated: “Give me a letter of six sentences written by the most honorable man, and I will find something sufficient to hang him.” Of course, we don’t hang people anymore. But what this reasoning indicates is that there can always be somebody who wants to use personal information against someone. It is this misuse of personal information that privacy seeks to protect against. Some people even say that it doesn’t actually matter what you may have to hide; what matters is your ability to decide whether to hide something or not. This ability is guaranteed by privacy, which is hence an essential precondition for the development of an autonomous personality.
Use case 2: Oops… wrong recipient!
The second speaker, Michelle Dennedy (Cisco), illustrated how challenging it is to sensitise employees to privacy.
The most common privacy threat in companies arises when an employee accidentally sends personal information about somebody else to the wrong recipient. This may sound like a no-brainer, but in fact, it is one of the biggest challenges in implementing effective privacy protection policies within companies.
There are two typical constellations that pose slightly different challenges for companies trying to mitigate this threat: in the first, an employee uses an email client; in the second, he or she grants access to a file repository. In both cases, the employee typically sends an email, or grants access, to the wrong recipient because the email client or repository autocompletes the address incorrectly from the first few letters. The employee forgets to double-check the name and… oops, the information goes to the wrong recipient. The two cases differ in how the employee can react. If access was granted to a file repository, it can, in principle, still be restricted retrospectively. If an email was sent, however, the information is definitively “gone”, and the employee can only ask the recipient not to open or read the content. In both cases, though, employees often do not react at all, or not appropriately, because they fear negative consequences if their colleagues or superiors find out about the mistake.
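One common technical mitigation for the email constellation is to catch the autocomplete slip before the message leaves the client. The sketch below is a hypothetical illustration, not a description of any product discussed at the event: it flags recipients whose domain is not on an assumed internal allow-list, so a client plugin could demand explicit confirmation before sending.

```python
# Hypothetical sketch: flag recipients outside an allow-listed set of
# domains before an email is sent, so an autocomplete mistake triggers
# a confirmation prompt instead of being discovered afterwards.

TRUSTED_DOMAINS = {"example-company.com"}  # assumed internal domain

def risky_recipients(recipients):
    """Return the addresses whose domain is not on the trusted list."""
    flagged = []
    for address in recipients:
        domain = address.rsplit("@", 1)[-1].lower()
        if domain not in TRUSTED_DOMAINS:
            flagged.append(address)
    return flagged

# A mail-client plugin could require explicit confirmation for these:
print(risky_recipients(["alice@example-company.com", "a.smith@gmail.com"]))
```

Such a check cannot prevent every mistake (a wrong but internal recipient passes silently), but it targets exactly the failure mode described above: personal data leaving the organisation by accident.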
Use case 3: Unraveling the anonymity paradox!
Jonathan Fox (Cisco) demonstrated the challenges of data anonymisation in his keynote “Unravel the anonymization paradox!”.
If personal information were anonymized, all our privacy concerns would be gone! But what does “anonymized” mean? This question is one of the hardest to resolve in the privacy debate. At present, privacy experts are grappling with the paradox that – in the big data era – there is no anonymous data anymore. All data can always be related to an individual by means of data analysis technologies. The reason for this is that data is only considered “anonymized” if it cannot be related to an identified or even to an identifiable individual.
To understand this paradox, imagine that more than three million Berlin citizens – and another million tourists – carry their personal devices around every single day. Imagine a Berlin-wide wifi system that is publicly available to everyone whose device has wifi switched on, and that collects the movement data of all these devices over a longer period of time. Can you imagine how useful this data would be for urban traffic management and many other innovations? But wouldn’t it be creepy if this data could also be misused against an individual later on?

Imagine that the data is therefore anonymized to mitigate these risks. In the process, all personal identifiers of the captured devices (i.e. the MAC address and IMEI) – which could in principle lead to the identification of the owner or even the carrier of a device – are “hashed”, i.e. substituted by a specific hash value for each identifier. This hash value does not, per se, contain information referring to the owner or carrier of the device. However, it is still possible to capture the device’s movement pattern by referring to this hash, and this pattern becomes more and more precise over time. Now imagine a person who gets access to that movement pattern (e.g. an employee of the wifi provider or another data-driven company) and suddenly realises that the device must belong to somebody he knows very well: it “leaves” the building where he lives at the same time every morning and “moves” to an address where only lawyers work. In terms of probability, that person must be his wife!
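The mechanism behind this thought experiment can be sketched in a few lines. The data below is invented and the setup is deliberately minimal (it is not the actual wifi system described above): hashing removes the MAC address itself, but because every ping from the same device maps to the same hash value, the device’s complete daily routine remains linkable under one pseudonym.

```python
import hashlib

def pseudonymize(mac: str) -> str:
    """Replace a MAC address with a truncated SHA-256 hash."""
    return hashlib.sha256(mac.encode()).hexdigest()[:12]

# Invented example pings: (MAC address, time, observed location)
pings = [
    ("AA:BB:CC:DD:EE:01", "08:02", "home district"),
    ("AA:BB:CC:DD:EE:01", "08:35", "law firm district"),
    ("AA:BB:CC:DD:EE:01", "18:10", "home district"),
]

# Group the pings by hash: the identifier is gone, yet every ping from
# the same device still lands under the same key.
trajectories = {}
for mac, time, location in pings:
    trajectories.setdefault(pseudonymize(mac), []).append((time, location))

# One hash, one complete daily routine -- enough context for someone
# with background knowledge to guess who the owner is.
for device_hash, path in trajectories.items():
    print(device_hash, path)
```

This is why hashing alone is usually considered pseudonymisation rather than anonymisation: the direct identifier is replaced, but the linkability that enables re-identification through background knowledge survives intact.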
This risk of re-identifying “anonymized” data exists wherever the data can be combined with further information. It is hard to say which information will be added, what the consequences of an identification would be, and under which conditions this risk is low enough to be socially acceptable.
And the winners are…
Taking the perspective of an activist in a repressive state, the players of “Pieces of Data” learn how data collected by their smartphones can harm them, and even other parties – and how to protect themselves against such threats. “Because this game was so advanced in its development, it won!”, Maximilian von Grafenstein states.
This Game Jam was part of our research project Privacy by Design in Smart Cities.
This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact email@example.com.