Policy Monitor

Court of Justice of the EU – Case regarding credit scoring and automated decision-making (Schufa I & II, C-26/22 and C-64/22, C-634/21)

In December 2023, the Court of Justice issued two judgments. First, it examined whether a form of credit scoring falls under Article 22 of the GDPR. Second, it addressed the lawfulness of an extended retention period for personal data drawn from a public register. These judgments are among the first by the Court concerning automated decision-making as defined in Article 22 of the GDPR.

What: Case of the CJEU

Impactscore: 1

For whom: Businesses using algorithms or other automated processes to produce risk scores or similar outputs, and businesses creating such algorithms

URL: https://curia.europa.eu/juris/document/... (C-26/22 and C-64/22)

https://curia.europa.eu/juris/document/... (C-634/21)

Key takeaways for Flanders

Although credit scoring is an infrequent practice in Flanders (and, by extension, in Belgium as a whole), the Schufa judgments show that both the users and the providers of such algorithms need to think carefully about the role algorithms play in impactful decisions concerning a data subject. Indeed, according to the Court of Justice, the prohibition contained in Article 22 of the GDPR enjoys a broad interpretation. If an algorithm affects decision-making in a significant way, the accompanying information obligations and the data subjects’ ability to contest the outcome must always be taken into account.

In 2023, the Court of Justice of the EU issued several rulings concerning an automated credit scoring system operated by a German company, Schufa Holding AG. An individual was denied a loan by a bank, and the decision was (partly) based on a credit score assigned by Schufa. The individual not only contested this outcome but also requested more transparency about the factors considered in the scoring process.

One factor in the scoring algorithm was a court decision discharging the individual’s remaining debts. The two complainants in the cases had received such a discharge, after which this information remained available in public registers for six months. Schufa, however, retained this information in its own database for three years, significantly longer than the public retention period of six months. The complainants argued that if the government limits the availability of this information in order to facilitate a fresh start and prevent prolonged consequences, Schufa should not retain it for longer and base credit scores on it for three years. The Court agreed, and Schufa has since reduced its retention period for this type of information to six months, in line with the data’s public availability in the registers.
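By way of illustration, the sketch below (in Python, with invented names and dates) shows how a private database could align its retention of discharge records with the six-month window of the public register. This is an assumption-laden example, not a description of Schufa’s actual implementation:

```python
from datetime import date, timedelta

# Hypothetical sketch: purge a debt-discharge record from a private database once
# the public insolvency register would no longer show it. The six-month window and
# all names are illustrative assumptions drawn from the facts of the case.
PUBLIC_REGISTER_WINDOW = timedelta(days=182)  # roughly six months

def must_be_erased(discharge_published_on: date, today: date | None = None) -> bool:
    """Return True once the public register's six-month window has elapsed,
    meaning the record should be deleted and excluded from scoring."""
    today = today or date.today()
    return today - discharge_published_on > PUBLIC_REGISTER_WINDOW

# A discharge published on 1 March 2023 may still be held in July 2023,
# but must be purged by October 2023.
print(must_be_erased(date(2023, 3, 1), today=date(2023, 7, 1)))   # False
print(must_be_erased(date(2023, 3, 1), today=date(2023, 10, 1)))  # True
```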

Article 22 of the GDPR grants individuals the right not to be subjected to decisions based solely on automated processing, including profiling, that produce legal effects concerning them or similarly significantly affect them. In other words, this article prohibits such automated decision-making. It is therefore essential to clarify what constitutes a ‘decision’ in automated processing and when such a decision has significant legal implications for the individual.

Prior to the rulings, Schufa had suggested that credit scoring itself might not be covered by Article 22 of the GDPR, because the score alone does not constitute the decision to actually grant an individual a loan, for example; it is the bank employee who later makes a decision based on the assigned score. Thus, Schufa argued that the credit score was not a ‘decision’ as defined by Article 22 of the GDPR.

However, the Court took a different stance, concluding that the credit score itself could indeed be considered a decision that significantly impacts the individual, as the score played a ‘determining role’ in the decision whether to grant credit. The Court adopted a broad interpretation of the term ‘decision’, finding that it could encompass ‘a number of acts which may affect the data subject in many ways’.

Notably, Schufa’s own press release continues to argue that as long as the credit score is used only as an “advisory” tool and direct legal consequences do not depend solely on it, the practice remains permissible, despite the Court’s rejection of this argument in the case. The company advises that where the score is clearly linked to the outcome of the procedure, one should rely on one of the exceptions in Article 22(2) GDPR, in particular the necessity for entering into or performing a contract between the data subject and a data controller, or the data subject’s consent. Article 22(2) also allows an exception to be laid down in Union or Member State law.

Schufa urges the German legislator to establish a clear law on this matter, given that the legitimacy of the existing legislation has been questioned. The question remains whether the business model of such companies is sustainable now that the Court has effectively challenged it. After all, the Court did conclude that ‘scoring’ is permissible, but only under certain conditions. A balancing act will thus be required.

Lastly, the judgment emphasises the transparency obligations outlined in the GDPR (Article 13(2)(f) and Article 14(2)(g) GDPR). It must be clear to individuals how an algorithm reaches its conclusions (Article 15(1)(h) GDPR), and they must be able to contest the outcome of an algorithm. This obligation is mirrored in Article 86 of the AI Act: where a high-risk AI system is used in certain individual decision-making and the decision produces legal effects or similarly significantly affects a person in a way that they consider to have an adverse impact on their health, safety or fundamental rights, the affected person has a right to a clear and meaningful explanation of the role of the high-risk AI system in the decision-making procedure and of the main elements of the decision taken.

It can be concluded from the judgments that there is a tension between the data subject’s right to be provided with sufficient information and the business model of such companies, for which the elements taken into account and the ‘weights’ assigned to them in the assessment can be crucial to remaining at the forefront of the market. How an algorithm is written may, moreover, fall under the protection of intellectual property law. It will therefore be necessary to find a way of providing information that is sufficiently transparent yet does not simply lay the inner workings and design decisions of an algorithm on the table.
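One conceivable approach, sketched below with an invented linear scoring model and made-up factor names, is to disclose which factors drove an individual’s score and in which direction, ranked by their contribution, while keeping the exact weights out of the disclosure. This is purely an illustrative assumption about how such a balance might be struck, not a description of Schufa’s system:

```python
# Hypothetical sketch of a 'meaningful information' response under Article 15(1)(h)
# GDPR: it names the factors that most influenced this person's score and whether
# each raised or lowered it, without exposing the model's raw weights.
WEIGHTS = {"payment_defaults": -0.45, "account_age_years": 0.10,
           "recent_credit_inquiries": -0.20, "open_credit_lines": -0.05}

def explain(features: dict[str, float], top_n: int = 3) -> list[str]:
    """Rank factors by their contribution to this individual's score and
    describe each qualitatively, keeping exact parameters undisclosed."""
    contributions = {name: WEIGHTS[name] * value for name, value in features.items()}
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    return [f"'{name}' {'lowered' if c < 0 else 'raised'} your score"
            for name, c in ranked[:top_n]]

print(explain({"payment_defaults": 2, "account_age_years": 8,
               "recent_credit_inquiries": 1, "open_credit_lines": 3}))
# e.g. ["'payment_defaults' lowered your score",
#       "'account_age_years' raised your score",
#       "'recent_credit_inquiries' lowered your score"]
```

The design choice in this sketch is that the data subject receives a personalised, meaningful account of the logic involved, while the commercially sensitive parameters that such companies regard as trade secrets remain undisclosed.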