
CJEU rules that a credit score constitutes automated decision making under the GDPR

On 7 December 2023, the Court of Justice of the European Union (CJEU) issued a landmark judgment on Article 22 of the General Data Protection Regulation (GDPR), focused on decision making based solely on automated processing that produces legal effects concerning the data subject, or similarly significantly affects the data subject (ADM).  The case involved a leading German credit reference agency, Schufa (Case C‑634/21).  The judgment has significant implications for organisations using any kind of ADM, including scoring systems and systems that use Artificial Intelligence (AI). Allen & Overy previously published an update on the Advocate General’s Opinion on the case back in March 2023.

In a further GDPR judgment concerning Schufa, the CJEU also ruled that Schufa could not lawfully retain information from a public register relating to debt for a period extending beyond the public register’s retention period. Please see our blog on this judgment here.

In this blog we unpack the key findings of the Schufa ADM judgment and take a look at the wider implications.

Background to the Schufa ADM judgment

The case emerged under the following circumstances. A loan application was refused based on the data subject’s Schufa score, which was used as a core component in the loan decision. Following a data subject access request (Article 15 GDPR), Schufa provided the data subject with the score and explained, in high-level terms, the methods used to calculate it. However, Schufa cited trade secrecy as justification for not providing information about the weighting that lay behind the scoring system.

The data subject then took a complaint to the Data Protection and Freedom of Information Commissioner for the Federal State of Hesse, Germany (the HBDI). The HBDI found that it had not been established that Schufa’s credit score processing breached Section 31 of the German Federal Data Protection Act (Bundesdatenschutzgesetz, BDSG), which governs the requirements for calculating scores. It also confirmed that Schufa did not need to share the details or mathematical formula showing how information about an individual is weighted to create the individual’s Schufa score. The data subject appealed the HBDI’s decision to the Administrative Court, Wiesbaden, Germany. That court then made a reference to the CJEU.

Reference to the CJEU

The key question of the reference was as follows: “whether Article 22(1) of the GDPR must be interpreted as meaning that the automated establishment, by a credit information agency, of a probability value based on personal data relating to a person and concerning his or her ability to meet payment commitments in the future constitutes ‘automated individual decision-making’ within the meaning of that provision, where a third party, to which that probability value is transmitted, draws strongly on that probability value to establish, implement or terminate a contractual relationship with that person”.

The pivotal issue in the case was whether, under Article 22 GDPR, the Schufa score constituted a decision based solely on automated processing, whether that decision produced legal effects concerning the data subject or similarly significantly affected them, and therefore whether Schufa should have shared more details of the logic behind the decision.

Under Article 22 GDPR, the data subject has the right not to be subject to such ADM. There are exceptions under Article 22(2), specifically where: (a) the decision is necessary for the performance of a contract between the data subject and a data controller; (b) the decision is authorised by Union or Member State law to which the controller is subject and there are safeguards to protect the data subject; or (c) the decision is based on the data subject’s explicit consent. If relying on Article 22(2)(a) or (c), controllers will have to offer human intervention and a way for the data subject to express a view or contest the decision.

The key elements of the ruling 

The CJEU found: 

  1. The broad scope of the concept of ‘decision’ is confirmed by Recital 71 GDPR. That concept is broad enough to encompass calculating a credit score based on a probability value.
  2. A credit score based on a probability value in this context, at the very least, similarly significantly affects the data subject.
  3. By calculating a credit score, a credit reference agency makes an automated decision within the terms of Article 22 GDPR when a third party draws strongly on the probability value or score to establish, implement or terminate a contractual relationship with the data subject. The CJEU noted a risk of circumventing Article 22 of the GDPR and a lacuna in protections if a narrow approach was taken and the Schufa score was only regarded as preparatory.
  4. Article 22(1) GDPR provides for a prohibition in principle, meaning that a data subject does not need to invoke the infringement individually by making a request.
  5. The question of whether Section 31 BDSG constitutes a legal basis in conformity with EU law will now have to be examined by the referring Administrative Court of Wiesbaden.

Wider implications 

The judgment is part of a trend towards broad and data subject-weighted interpretations of the GDPR by the CJEU, turning it into consumer rather than privacy law. The CJEU’s judgment on Article 22 GDPR was set out in broad terms and can be applied to other situations involving scoring systems. The judgment refers to a contractual relationship with the data subject in general terms rather than just the specifics of the loan agreement process, indicating the potential breadth of its relevance.

The judgment could therefore have implications for a range of scoring processes and decision making; for example, credit reference agencies also provide services for employment checks, and anti-money laundering (AML) providers can also offer digital services based on scores.

Although the judgment has broad implications, it does not follow that a large number of automated scoring systems are immediately caught by Article 22 GDPR, or are immediately unlawful. Much will depend on how the score relates to the final decision made and what role it plays as a factor under consideration. The controller may also have a legal basis to carry out that ADM and safeguards in place as specified under Article 22 GDPR. There is now a significant expectation that data protection authorities will provide guidance on what “draws strongly” means in practice.

What comes next from data protection authorities? 

A number of German Data Protection Authorities (DPAs) have issued statements following the judgment. The Hamburg DPA issued a statement hailing the judgment’s “ground-breaking importance for AI-based decisions” and noting that it “has consequences far beyond the scope of credit agencies – because it is transferable to the use of many AI systems”. The DPA also gave the example of AI analysing which patients are particularly suitable for a medical study as an illustration of where the judgment may make a difference.

The Data Protection Authority for Lower Saxony has also issued a statement and indicated: “The CJEU's view on the interpretation of the term 'automated decision' could also have further implications, for example on systems that prepare decisions with the help of algorithms or artificial intelligence or make them 'almost alone', so to speak".  

Given the hype around, and importance of, AI, particularly generative AI, we can therefore expect further decisions and guidance from DPAs in 2024, setting out how the judgment will apply in a range of scenarios. 

The European Data Protection Board (EDPB) may also need to update its guidelines on automated decision making and profiling. The guidelines were adopted in 2018 and will soon be six years old. While the Schufa judgment does not contradict the EDPB’s guidance (for example, the EDPB had already found that Article 22 operates as a prohibition in principle), the EDPB may want to expand on the wider implications and explain in more detail what the judgment means in practice, including the concept of “draws strongly” in decision making.

Controllers will need to work through the following steps:

  1. Identify any processes and systems using Schufa scores or any other scores or probability values, including when deploying AI systems.

  2. Assess whether these systems, including when contracted to third parties, make any decisions that “draw strongly” on the score and would thus now be caught by Article 22(1). If so, can adjustments be made to ensure that reliance on the score in the final decision falls below the “draws strongly” threshold? If the answer remains that Article 22(1) applies, the controller will need to find a specific legal basis under Article 22(2) to carry out the ADM and apply the Article 22(3) safeguards. If the controller is using special category data, the further conditions of Article 22(4) also apply.

    It may currently be the case that certain organisations do not have a contract in place and have not obtained explicit consent as a relevant legal basis.  Some EU Member States could seek to provide new national legislation as a legal basis alongside appropriate safeguards. 

  3. If Article 22(1) applies, the controller will need to meet the transparency requirements of Articles 13 or 14 and 15 GDPR.  The controller may need to update existing privacy notices and information pop-up windows. The information must include, at least, meaningful information about the logic involved, as well as the significance and the envisaged consequences of the processing for the data subject.

    The guidance provided by the UK data protection regulator, the Information Commissioner’s Office (ICO), on explainability and AI is likely to be helpful in this situation.

  4. Contracts may also need to be re-examined in a number of different contexts – both the contract between the data subject and the organisation making the final decision, and the contract between that organisation and the score provider.

  5. Following the judgment, the controller may need to review its data protection impact assessments and other documentation, such as legitimate interest assessments and records of processing activities.

Schufa has issued a press release about the judgment. It welcomes the clarification that the judgment provides and notes that: “The overwhelming feedback from our customers is that payment forecasts in the form of the SCHUFA score are important for them, but are usually not the only decisive factor for concluding a contract”. This indicates that the “draws strongly” test will be a key consideration for companies.

Litigation risk

Lastly, in light of the EU Representative Actions Directive ((EU) 2020/1828) and the increasing trend for data litigation, there is a risk that compensation claims may be launched not just against Schufa but also against organisations using the Schufa score. Allen & Overy’s blog from May 2023 assesses the recent CJEU jurisprudence on compensation, including the finding that mere infringement of the GDPR does not confer a right to compensation. Allen & Overy’s blog on collective redress actions for consumers in Germany from March 2023 considers the incoming implementation of Directive (EU) 2020/1828 on representative actions. A key question will be whether a de minimis threshold of equal damage across a class action claim is likely in Article 22 GDPR cases.

Looking ahead

As we look ahead to 2024 we can expect the Schufa judgment to play an important role in how AI is used in automated decision making and where the boundary falls between automated and partially automated decisions.  Companies should look out for new guidance and enforcement decisions from DPAs in the year ahead.
