Siri's evil sister: When the Dutch public service steals your data
The “System Risk Indication” (SyRI) served the Dutch government as a tool for the automated detection of social benefit fraud. However, the program was banned and terminated due to a lack of transparency and disproportionate data collection. The case demonstrates how the automation of public services fails when the implementation goes wrong.
In 2014, the Dutch Ministry of Social Affairs and Employment initiated a project to detect social benefit fraud. The program SyRI (“System Risk Indication”) was meant to serve this goal through automation and large-scale data collection. In early 2020, however, the district court of The Hague ordered an immediate stop to the program, and the Ministry followed the judgment a few months later. While nobody questioned SyRI’s purpose, its actual implementation and practice were heavily criticized, first by NGOs, civil rights organizations, and UN representatives, and later by the court. The case of SyRI is representative of how automated decision-making in the public sector, and therefore presumably in the public interest, can fail if it lacks appropriate transparency and attention to privacy.
All they need is all your data
In 2014, SyRI’s legal basis was passed in the form of an amendment to an act from 2001. This amendment regulated which data could be used and how a SyRI project proceeded. It listed 17 categories of data permitted for proactive risk evaluation: SyRI was allowed to cross-reference data about work, fines, penalties, taxes, properties, housing, education, retirement, debts, benefits, allowances, subsidies, permits and exemptions, and more. This data could come from a wide range of public authorities, including the Dutch Tax and Customs Administration, the Labour Inspectorate, and the Public Employment Service. Even though the Dutch Data Protection Authority (DPA) raised concerns in 2012 and again in 2014, the amendment was passed. One key concern of the DPA was a possible conflict with Article 8 of the European Convention on Human Rights (ECHR), according to which everyone has the right to respect for their private and family life, home and correspondence. This concern already foreshadowed the court’s decision some years later.
SyRI in action
It is difficult to say how often SyRI was actually applied. One difficulty is that there were SyRI-like projects before 2014, such as project Waterproof. While there is information on some of the projects, the government withholds many details. According to the court’s investigation, there were 22 projects between 2008 and 2014 in which SyRI or its precursors were used, and five more SyRI projects from 2015 on.
A typical SyRI project started with a request by two or more administrative bodies to the Ministry. The request had to specify the purpose, the required data, and the relevant risk indications. Once the request had been accepted, the data was collected and personal information was replaced with pseudonyms. After the data had been checked against the risk profile, the decrypted information about the flagged persons was sent to the Ministry for a second analysis. However, the SyRI legislation did not oblige anyone to notify the data subjects individually that a risk report had been submitted. There was only an obligation to announce the start of a SyRI project beforehand by way of publication in the Government Gazette. The Ministry granted access only if the data subject explicitly asked for it.
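The pseudonymize-match-report workflow described above can be sketched in a few lines of code. This is a purely illustrative reconstruction under stated assumptions: the field names, the salted-hash pseudonymization, and the risk-profile predicate are my own simplifications, not the actual SyRI implementation, which was never disclosed.

```python
import hashlib

# Hypothetical sketch of the workflow: pseudonymize identifiers, match the
# pooled data against a risk profile, and hand over only the flagged records.
# All names and the salt are illustrative assumptions.

SALT = "project-specific-secret"  # assumed: a secret kept per SyRI project


def pseudonymize(citizen_id: str) -> str:
    """Replace a personal identifier with a stable pseudonym."""
    return hashlib.sha256((SALT + citizen_id).encode()).hexdigest()[:16]


def run_project(records: dict, risk_profile) -> list:
    """Cross-reference pseudonymized records against a risk profile.

    `records` maps citizen_id -> dict of data points pooled from the
    participating authorities; `risk_profile` is a predicate. Only the
    pseudonyms of flagged persons are returned; re-identification would
    happen in a separate second-analysis step at the Ministry.
    """
    pseudonymized = {pseudonymize(cid): data for cid, data in records.items()}
    return [p for p, data in pseudonymized.items() if risk_profile(data)]


records = {
    "NL-001": {"benefits": "single", "cars_registered": 3},
    "NL-002": {"benefits": "single", "cars_registered": 0},
}
flagged = run_project(records, lambda d: d["cars_registered"] >= 2)
print(len(flagged))  # one flagged pseudonym
```

The point of the sketch is the asymmetry the court criticized: the data subjects never appear in the loop, while the flagged records travel onward for further analysis.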
Better watch your water use
On what basis were people flagged? What indicated a risk for SyRI? Again, this question is difficult to answer due to a lack of transparency; even the court noted that the government provided almost no information, allegedly because knowledge of the indicators could be used to game the system. Nevertheless, some indicators could be identified. One SyRI application was detecting cohabitation fraud: people received benefits for singles but were in fact living together, which, if registered, would have resulted in less money. Several types of information served SyRI as indicators for this kind of fraud. One was the registration of multiple cars under one name within a short period of time. Another was when the waste-disposal taxes paid by a single individual seemed typical for multiple persons. Furthermore, we know from SyRI’s precursors, such as project Waterproof, that low water use was taken to indicate that the person in question lived with a partner in another flat. It is possible, even likely, that such indicators were in place again for SyRI.
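The three indicators mentioned above amount to simple rules over administrative data. The following sketch makes that concrete; the field names and thresholds are invented for illustration, since the real rules were never published.

```python
# Illustrative rule-based indicators for cohabitation fraud, reconstructed
# from the examples in the text. Field names and thresholds are assumptions,
# not the undisclosed SyRI risk model.

def cohabitation_risk_indicators(data: dict) -> list:
    """Return the names of the (hypothetical) indicators a record triggers."""
    hits = []
    # multiple cars registered under one name within a short period
    if data.get("cars_registered_last_90_days", 0) >= 2:
        hits.append("multiple_car_registrations")
    # waste-disposal tax payments typical for a multi-person household
    if data.get("waste_tax_persons_estimate", 1) > 1:
        hits.append("waste_tax_atypical")
    # very low water use may suggest the person mainly lives elsewhere
    if data.get("annual_water_use_m3", 50) < 15:
        hits.append("low_water_use")
    return hits


print(cohabitation_risk_indicators({
    "cars_registered_last_90_days": 3,
    "annual_water_use_m3": 10,
}))  # → ['multiple_car_registrations', 'low_water_use']
```

Even in this toy form, the problem the critics raised is visible: ordinary behavior such as frugal water use becomes evidence against a person, without the person ever learning which rule was triggered.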
It is worth mentioning that SyRI might not only have checked for discrepancies in the data, as the government claims. According to the plaintiffs, there are hints that SyRI may have deployed artificial intelligence, more precisely machine learning, to automatically analyse data points and detect “suspicious” behavior. This, however, could not be proven, since the government did not allow a sufficiently thorough investigation of the software itself.
Automate Public Service – but not like this!
The lack of transparency and data protection led the court to its judgment on SyRI, based mostly on Article 8 of the ECHR and the GDPR. Furthermore, the court shared a concern that had already been made public by the UN Special Rapporteur: SyRI was predominantly used in the poorer neighborhoods of, for instance, Capelle aan den IJssel, Eindhoven, Haarlem, and Rotterdam.
However, the judgment was not negative in all its aspects:
“[The] court shares the position of the State that those new technological possibilities to prevent and combat fraud should be used. The court is of the opinion that the SyRI legislation is in the interest of economic wellbeing and thereby serves a legitimate purpose as adequate verification as regards the accuracy and completeness of data based on which citizens are awarded entitlements is vitally important.”
The judgment casts light on a topic that is more general than the case of SyRI. Digital welfare states are emerging around the world, i.e., states are increasingly using new technologies to perform public services. Done correctly, this can help prevent fraud or render public services more accessible and effective. However, as the case of SyRI shows, when the actual implementation has severe flaws – be it on the legal or the technical level – public services can quickly shift to surveillance and biased targeting.
This post reflects the opinion of the authors and neither necessarily nor exclusively the opinion of the institute. For more information about the content of these posts and the associated research projects, please contact email@example.com