On 5 February, the District Court of The Hague delivered a historic ruling halting SyRI, the system used to collect data on and build risk profiles of (poor) Dutch citizens in order to detect social security fraud. Notably, the Court based its analysis on Article 8.2 ECHR, not on the GDPR.
First of all, what is SyRI? It is a legal instrument that the Dutch government uses to prevent and combat fraud in the areas of social security and taxes. To that end, it collects personal data and creates citizen risk profiles through proprietary (opaque) algorithms.
According to the legislator, the data can be linked and analysed anonymously in a secure environment, so that risk reports can be generated. Under SyRI, every citizen of the Netherlands is, in effect, a suspect.
How is SyRI activated? It is deployed by the Minister at the request of municipalities, the national tax authorities, the Immigration and Naturalization Service and regulators.
These authorities can enter into a partnership in which they exchange data. When SyRI is deployed, the files held by these government bodies are linked in order to identify fraud in the aforementioned areas and thus increase the chances of fraudsters being caught.
That is, citizens' personal data are exchanged.
What is the legislation that legitimizes the use of SyRI?
The SUWI Act, elaborated by the SUWI Decree, together referred to as 'the SyRI legislation'.
The starting point is that the designated bodies participating in a partnership are obliged to provide each other with the necessary information. In that case, they are joint controllers of the data processing.
If there is a partnership in which the participating authorities want to use SyRI, they submit a request to the Minister for this purpose. The necessary information must then be provided to the Minister, who becomes the controller responsible for the processing of the data.
What data can be processed by SyRI? Well, here begins the most controversial part (Article 5a.1 paragraph 3 of the SUWI Decree):
a. Work data, being data with which work performed by a person can be determined;
b. Information on administrative measures and sanctions, being information that shows that a natural person or a legal person has received an administrative fine or that another administrative measure has been taken;
c. Tax data, which is data used to determine the tax obligations of a natural or legal person;
d. Data on movable and immovable property, being data with which the possession and use of certain goods by a natural or legal person can be determined;
e. Data on grounds for exclusion from assistance or benefits, being data showing that a person is not eligible for benefits;
f. Trade data, being data that can be used to determine the nature and activities of a legal person;
g. Accommodation data, being data with which the (actual) place of residence or location of a natural or legal person can be determined;
h. Identifying data, being, for a natural person: name, address, place of residence, postal address, date of birth, gender and administrative characteristics; and, for a legal person: name, address, postal address, legal form, place of business and administrative characteristics;
i. Integration data, being data that can be used to determine whether integration obligations have been imposed on a person;
j. Compliance data, being data with which a natural or legal person's history of compliance with laws and regulations can be recorded;
k. Educational data, being data with which the financial support for the funding of education can be determined;
l. Pension data, being data with which the pension rights can be determined;
m. Reintegration data, being only the data that can be used to determine whether reintegration obligations have been imposed on a person and whether these are complied with;
n. Indebtedness data, being data with which any debts of a natural or legal person can be determined;
o. Benefit, allowance and subsidy data, being data that can be used to determine the financial support of a natural or legal person;
p. Permits and exemptions, being data that can be used to determine the activities for which a natural or legal person has requested or obtained permission;
q. Health insurance data, being exclusively the data that can be used to determine whether a person is insured under the Health Insurance Act.
Seventeen categories of data that, in turn, encompass even more information.
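To get a sense of the breadth involved, the seventeen categories above can be sketched as a simple data model. This is purely an illustration for the reader; the names are mine, and the actual SyRI data schemas were never made public (which, as we will see, is part of the problem):

```python
from enum import Enum, auto

class DataCategory(Enum):
    """The seventeen data categories of Article 5a.1 paragraph 3 SUWI Decree.

    Names are illustrative shorthands for the lettered items (a) to (q).
    """
    WORK = auto()                     # a. work performed by a person
    ADMINISTRATIVE_MEASURES = auto()  # b. fines and other administrative measures
    TAX = auto()                      # c. tax obligations
    PROPERTY = auto()                 # d. movable and immovable property
    EXCLUSION_GROUNDS = auto()        # e. grounds for exclusion from benefits
    TRADE = auto()                    # f. nature and activities of a legal person
    ACCOMMODATION = auto()            # g. (actual) place of residence or location
    IDENTIFYING = auto()              # h. name, address, date of birth, gender...
    INTEGRATION = auto()              # i. integration obligations
    COMPLIANCE = auto()               # j. history of compliance with regulations
    EDUCATION = auto()                # k. financial support for education
    PENSION = auto()                  # l. pension rights
    REINTEGRATION = auto()            # m. reintegration obligations
    INDEBTEDNESS = auto()             # n. debts
    BENEFITS = auto()                 # o. benefits, allowances and subsidies
    PERMITS = auto()                  # p. permits and exemptions
    HEALTH_INSURANCE = auto()         # q. insurance under the Health Insurance Act

# The SyRI legislation allows ALL of these categories to be linked
# for every citizen within a project area.
print(len(DataCategory))  # → 17
```

Seeing the categories side by side makes the data minimisation problem concrete: almost every administrative trace a person leaves can, in principle, be linked.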
The processing of personal data consists of two phases: the linking and pseudonymisation of the files (phase 1) and the risk analysis (phase 2).
In the first phase, the IB brings the files together and pseudonymises them. Among other things, personal and company names, social security numbers and addresses are replaced by a code (a pseudonym).
After this, the processor takes the first step in risk selection on the pseudonymised data: the source file is automatically checked against the risk model with all its indicators.
This generates potential hits. A potential hit indicates an increased risk of fraud.
The IB creates a key file that indicates which personal or company name, social security number or address belongs to a specific pseudonym.
When certain natural persons, legal persons or addresses are classified as higher-risk by the risk model, they are de-pseudonymised using the key file.
Then, they are transferred to the Minister for the second phase of the risk analysis. If a natural or legal person with a higher risk is not subject to a risk report, their data will be destroyed within four weeks after the end of the analysis.
The information in the register of risk reports is subject to a retention period of two years after the registration of the risk report.
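The two-phase flow described above can be sketched roughly as follows. This is a minimal illustration with invented record fields, an invented indicator and a toy pseudonymisation scheme; the real risk model, its indicators and its parameters were never disclosed, which is precisely one of the Court's objections:

```python
import hashlib

# Hypothetical source records; real SyRI source files are not public.
records = [
    {"bsn": "123456782", "name": "J. Jansen", "address": "Example St 1",
     "receives_benefit": True, "undeclared_income": True},
    {"bsn": "987654329", "name": "P. de Vries", "address": "Example St 2",
     "receives_benefit": True, "undeclared_income": False},
]

def pseudonymise(record, secret="ib-secret"):
    """Phase 1: replace identifying fields with a code (pseudonym)."""
    pseudonym = hashlib.sha256((record["bsn"] + secret).encode()).hexdigest()[:12]
    data = {k: v for k, v in record.items()
            if k not in ("bsn", "name", "address")}
    return pseudonym, data

# The IB keeps a key file mapping each pseudonym back to the identity.
key_file = {}
pseudonymised = []
for rec in records:
    p, data = pseudonymise(rec)
    key_file[p] = {"bsn": rec["bsn"], "name": rec["name"],
                   "address": rec["address"]}
    pseudonymised.append((p, data))

def risk_indicator(data):
    """Invented indicator: benefit plus undeclared income → potential hit."""
    return data["receives_benefit"] and data["undeclared_income"]

# Potential hits are de-pseudonymised via the key file and transferred
# to the Minister for the second-phase risk analysis. Non-reported data
# must be destroyed within four weeks; risk reports are kept two years.
hits = [key_file[p] for p, data in pseudonymised if risk_indicator(data)]
print([h["name"] for h in hits])  # → ['J. Jansen']
```

Note that even in this toy version, whoever defines `risk_indicator` effectively decides who becomes a suspect; with the real indicators secret, that decision could not be scrutinised.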
Who supervises this entire process? The Dutch Data Protection Authority, which ensures compliance with the GDPR and acts as the external privacy regulator overseeing, among other things, compliance with the SyRI legislation.
And where is the conflict? The Court must assess whether the SyRI legislation meets the requirements of Article 8.2 ECHR (right to respect for private life) so as to justify the mutual exchange of personal data.
Specifically, the following points:
1. The mutual exchange of personal data between administrative bodies.
2. The provision of personal data to the Ministry.
3. The provision of personal data to the IB (the Inlichtingenbureau foundation), designated as responsible for the processing that links the files in SyRI.
4. The processing of personal data by the IB, including profiling.
5. The provision of personal data by the IB to the Minister.
6. The production of risk reports and the registration of notifications.
7. The fact that data subjects are only informed about the processing of their personal data in SyRI if they are subject to a risk report, and then only at their own request.
8. The regulation of the supervision of the deployment of SyRI, in particular the fact that the Minister is the only party that oversees the deployment of SyRI.
The Court held that the SyRI legislation does not meet the requirement of Article 8.2 ECHR: interference with the exercise of the right to private life in a democratic society must be necessary and PROPORTIONAL to the intended purpose.
The Court affirmed that the SyRI legislation does not strike the 'fair balance' required under the ECHR between the social interest the legislation serves and the violation of private life it produces.
As you can see in the previous paragraph, the Court takes into account the Principles on which the GDPR is based. Specifically, the Principles of Transparency, Purpose Limitation, and Data Minimization.
These last two principles together make up the PROPORTIONALITY Principle applied in data collection, the MOST IMPORTANT PHASE in the processing of personal data.
And, as we saw earlier in this article, a huge amount of personal data is collected and processed — an amount totally disproportionate to the very purpose of the processing.
The Court once again insists on the importance of the right to respect for private life, and establishes that it also protects the right to personal autonomy, to personal development and self-development, and the right to establish relationships with others and with the outside world.
Finally, in the case of personal data processing, the right to respect for private life also encompasses the right to equal treatment in equal cases and the right to protection AGAINST DISCRIMINATION, stereotyping and stigmatisation.
And these are precisely the rights violated by the indiscriminate use of personal profiling and automated decision-making. This is why this judgment is so important and sets a huge precedent.
The type of technology used is also discussed in the judgment. This part is extremely interesting: the fact that the algorithm is proprietary works against the State, because the Court considers that it cannot verify the State's position on what exactly SyRI is, since neither the model nor the risk parameters that make up the risk model have been made public.
The same goes for the SyRI legislation: it does not show how the SyRI decision model works or which indicators the SyRI projects use.
The Court does not demand full disclosure of the algorithm, but it did expect much more solid information about the objective criteria on which the model and the risk scores were developed, and about the way in which particular risks for individuals were addressed.
Moreover, the SyRI legislation explicitly states that SyRI can be used for predictive analysis, deep learning and data mining. It explicitly provides for the possibility of adapting a risk model on the basis of an EVALUATION (profile), and new risk models with new indicators can also be developed.
And the Court considers it implicit in the SyRI instrument that it uses risk profiles based on existing facts.
In addition, the SyRI legislation establishes no obligation to notify data subjects whose data are processed in SyRI, nor any obligation to inform them individually about their risk profiles.
Another surprising factor is how the Court deals with Article 22 GDPR, which applies to decisions based solely on automated processing. Here too, the Court presents its reasoning directly under Article 8.2 ECHR.
And what about the Data Protection Impact Assessment established in Article 35 GDPR? The Court considers that one should have been carried out under Article 35.1 GDPR.
The State argued in its defence that it was not necessary to carry out a Data Protection Impact Assessment, relying on the exception in Article 35.10 GDPR.
The only Data Protection Impact Assessment that was carried out concerned the SyRI legislation that enables specific SyRI projects, and it was conducted before the adoption of the GDPR.
But the Court again sets aside the questions of GDPR compliance, reasoning directly on the basis of Article 8 ECHR.
Therefore, the Court considers that the SyRI legislation, as regards the deployment of SyRI, is contrary to Article 8.2 ECHR.
This is therefore a great step forward in combating discrimination caused by automated decision-making and personal profiling. We also see that the Court limits the creation of profiles, demanding significantly stronger safeguards in line with the principles of Article 8 ECHR.
The fact that the Court relied not on data protection law but on the European Convention on Human Rights is worth noting. Could it be a slap on the wrist for the GDPR for leaving citizens unprotected? Perhaps, indirectly, it is.
The good news is that there is now a precedent that can mark a before and after, and that precedent rests on Article 8.2 ECHR, the right to respect for private life.