24 April 2018

Data Ethics: Facebook and Cambridge Analytica

Facebook is under fire: the social media giant recently had to admit that data from around 50 million users had been passed on to Cambridge Analytica, where it was allegedly used to target voters during Donald Trump's presidential campaign.

According to Jörg Pohle, a researcher at the Humboldt Institute for Internet and Society, the negative press surrounding Cambridge Analytica is, "to say it bluntly, because of Donald Trump."

The data firm is also under scrutiny for its psychometric testing. How far is too far in this new world which gives insight into behavior, attitudes and intentions?

Cambridge Analytica has denied wrongdoing but has suspended its chief executive. Authorities and Facebook are investigating how the company obtained psychometric data on openness, conscientiousness, extraversion, agreeableness and neuroticism (OCEAN) from Cambridge University, which had exploited agreements with Facebook on academic research.
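The OCEAN traits named above are conventionally scored from Likert-scale questionnaire answers. As a purely illustrative sketch (not Cambridge Analytica's actual method; the item keys and reverse-scored items here are hypothetical), Big Five scoring can look like this:

```python
# Hypothetical mapping: trait -> list of (item_id, reverse_keyed).
# Real Big Five inventories use many more items per trait.
TRAIT_ITEMS = {
    "openness":          [("q1", False), ("q6", True)],
    "conscientiousness": [("q2", False), ("q7", True)],
    "extraversion":      [("q3", False), ("q8", True)],
    "agreeableness":     [("q4", False), ("q9", True)],
    "neuroticism":       [("q5", False), ("q10", True)],
}

def score_ocean(answers: dict) -> dict:
    """Average each trait's items; reverse-keyed 1-5 answers are flipped (6 - x)."""
    scores = {}
    for trait, items in TRAIT_ITEMS.items():
        values = [(6 - answers[q]) if reverse else answers[q]
                  for q, reverse in items]
        scores[trait] = sum(values) / len(values)
    return scores

answers = {"q1": 5, "q2": 3, "q3": 2, "q4": 4, "q5": 1,
           "q6": 1, "q7": 4, "q8": 5, "q9": 2, "q10": 3}
print(score_ocean(answers))
```

The point of the controversy is not the scoring itself, which is standard psychometrics, but that the underlying answers and Facebook data were obtained and re-purposed without users' consent.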

Linh Nguyen from WikiTribune spoke to experts to better understand how Cambridge Analytica may have used the personal data of tens of millions of private citizens and to discuss the ethics of “Big Data”.

WikiTribune talked to:

Kaltheuner (Privacy International)
Zwitter (University of Groningen)
Pohle (Humboldt Institute for Internet and Society)
Mittelstadt
McNamee

The “big nudge”

WikiTribune: Profiling of users based on their data is widespread. Why is Cambridge Analytica under fire and when did it cross ethical lines?

Kaltheuner: They’re being investigated for how they obtained data, and that’s why they’re under fire … But micro-targeting ads, also based on psychometric data, is something that is actually done by quite a lot of companies. [Privacy International] is generally concerned about any industry that feeds off exploitation of people’s data.

Zwitter: The ethical problem begins when one profiles for themes that are sensitive regarding individual and group data protection, such as political affiliation, sexual orientation, or health, i.e. information that can be misused for political or criminal ends, and that was collected without the consent of the subject. Cambridge Analytica went beyond profiling and employed what is called “Big Nudging.”

The term comes from Nudge Marketing. One uses “nudges” that can be conscious or subconscious triggers to elicit a certain response in people. It’s one thing when you know that you’re being nudged. It’s another if it’s used to manipulate people by using unconscious desires and psychological mechanisms undermining their personal freedom of choice.

‘This eruption now is pretty much the product of double standards’ – Pohle, Humboldt Institute

Mittelstadt: Profiling … it’s sort of a shortcut to know what sort of person you are, what person you’re being perceived as. Whereas with [psychometrics], it seems to be going a step further, in the sense that companies are going to actively try to change your behaviour for a very, very specific end. It’s pushing you to take an action that would undermine one of your fundamental rights as a citizen to elect who represents you. And so I think it’s what it’s trying to push you towards, what it is they’re trying to influence, that makes this distinct from just normal profiling.

Pohle: [on why CA is under fire] To say it bluntly: because of Donald Trump. When Barack Obama and his team started … using social media data, user-generated content and meta-data on personal interactions and social relationships for targeting people in their election campaigns, almost everyone was cheering and applauding them for using these very data … to better political micro-target voters. So, all in all, I would say, this eruption now is pretty much the product of double standards.

Can we really expect politicians to crack down on companies such as Facebook when they too rely on such data to get ahead?

Pohle: Not really. But on the other hand, that very much depends on the amount of public uproar. If the public outcry is loud enough, politicians might be pressured into cracking down on these companies even against their own interests.

‘It’s hard not to use these tools, if other politicians use them too’ – Zwitter, Groningen

Kaltheuner: It’s really important that politicians set an example by promoting data protection. They should also be using data in a way that complies with the law — the fact that the UK’s Information Commissioner’s Office is looking into political parties and campaigns shows that this is, sadly, not a given. It’s also important to keep in mind that this isn’t about the big social media platforms — data brokers, data analytics companies should all be part of this conversation [too]. Because these are not consumer-facing, they get less attention than they should.

Zwitter: I think the incentive comes from the fact that social media providers like Facebook and Twitter increasingly fill a public service in … communication. However, this public service is not organized by the state, nor is it regulated. It is guided by the companies' own "terms of use" … Theoretically speaking, this should be enough reason to at least put some obligations on the service provider. Practically, of course, politicians increasingly seem to rely on these new and disruptive techniques (including fake news and Twitter bots, which are also part of the digital political toolkit). It's hard not to use these tools, if other politicians use them too.

At the same time we are experiencing the devastating effects of this development already: the gradual loss of credibility of information, news and of scientific facts. How can we still make democratic choices if we cannot trust the information we receive?

Mittelstadt: Of course, political campaigning will be much more data-driven now: a new standard has been set. That doesn't mean that the politicians will be directly reliant on Facebook to supply the relevant data. There's also the question of what stand will be taken by governments and politicians towards big tech companies. When a bad story comes out about a tech company, then everybody's happy to jump on the bandwagon and vilify that particular company. But to effect real change in the sector, you need a concerted, broad-reaching approach, not ad hoc reactions. It will be interesting to see what role the GDPR plays in all of this going forward.

Will people change their online behaviour because of scandals like this?

Kaltheuner: I think it’s very good that this case is making people think about things like privacy and data protection and how companies are monitoring, tracking and profiling you without your knowledge or consent. However, I strongly believe that we shouldn’t place the burden on individuals. Companies need to protect privacy by default.

Mittelstadt: People will often not care about privacy … until they have a reason to. A perceived lack of concern for privacy is not a justification to violate it. There’s terminology being thrown around here [with the Cambridge Analytica issue] – whether this was manipulation or persuasion of voters. If it’s presented as persuasion, that comes off as much more acceptable. If it’s manipulation, there’s a psychological component to it, where something about you is being exploited in order to get you to take a preferred action … it depends on how people will perceive this sort of profiling.

I think the more interesting response to look at, though, will be the people that end up engaging less and less with [Facebook] over time, or sharing less and less information about themselves, as opposed to just deactivating their accounts.

McNamee: Yes. According to the National Telecommunications and Information Administration (NTIA), they have already started. This is the start of society understanding what profiling actually means. It is logical for people to think that if they give Facebook five pieces of data, they are just giving Facebook five pieces of data. In reality, profiling means giving Facebook five pieces of data, Facebook comparing those five pieces with similar data from a billion other people and generating five new pieces of data about you that you will never have access to, and then using that data to manipulate your economic, social or political choices.

Scandals like this are crucial for this fundamental and essential change in people’s perception of their privacy, security and protection of their data.
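The "five pieces in, new pieces out" mechanism McNamee describes is essentially look-alike inference: an undisclosed attribute is predicted by comparing a user's few known data points with similar profiles in a much larger pool. A minimal sketch, with entirely made-up profile data and a simple nearest-neighbour vote standing in for far more sophisticated models:

```python
from collections import Counter

# Pool of other users: (known attributes, an attribute they also disclosed).
# All values here are invented for illustration.
POOL = [
    ({"age": 24, "likes_hiking": 1, "urban": 1}, "party_A"),
    ({"age": 27, "likes_hiking": 1, "urban": 1}, "party_A"),
    ({"age": 58, "likes_hiking": 0, "urban": 0}, "party_B"),
    ({"age": 61, "likes_hiking": 0, "urban": 0}, "party_B"),
    ({"age": 30, "likes_hiking": 1, "urban": 0}, "party_A"),
]

def infer(user: dict, k: int = 3) -> str:
    """Predict an undisclosed attribute from the k most similar pool users."""
    def distance(profile: dict) -> float:
        # Squared difference over the attributes the user did share.
        return sum((user[key] - profile[key]) ** 2 for key in user)
    nearest = sorted(POOL, key=lambda entry: distance(entry[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# The user never disclosed a party preference; it is inferred anyway.
print(infer({"age": 26, "likes_hiking": 1, "urban": 1}))  # → party_A
```

The inferred attribute is new data the user never provided, which is exactly why McNamee argues that "five pieces of data" understates what is actually being given away.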

Pohle: No. The amount of people who changed their online behaviour after the Snowden revelations was really small, and the same can be expected after this Cambridge Analytica case.

How will Europe’s data protection rules, the GDPR, change this?

Kaltheuner: So in this particular case, the only reason we're talking about data protection is that Cambridge Analytica is based in the UK (which is why the Data Protection Act 1998 applies to them). The U.S. doesn't have data protection laws in the sense that we have them in Europe, and the GDPR won't change that. What the GDPR does change is its extra-territorial reach, and its consent and transparency requirements are stronger.

Mittelstadt: On the GDPR, I think it is very much a step in the right direction. If all this had happened after the GDPR had been in force, then it would be a much more interesting conversation about how supervisory authorities in Europe respond, for example by levying significant fines against Facebook for, essentially, letting users’ data be re-purposed without explicit and specific consent. The extent to which the stronger powers given by the GDPR will actually be used by regulators to fine or control tech companies is a very interesting question, too. We just don’t know yet.

This article was first published on WikiTribune.

This post reflects the opinion of the authors and neither necessarily nor exclusively the opinion of the institute. For more information on the content of these posts and the associated research projects, please contact info@hiig.de

Jörg Pohle, Dr.

Head of Research Programme: Data, Actors, Infrastructures
