
Several cases pending in Germany have led the European Court of Justice (ECJ) to deal with individual questions concerning the admissibility of score calculations by credit agencies and their downstream use by companies. The questions referred to the ECJ by the Administrative Court of Wiesbaden (VG Wiesbaden) concern the admissibility of storing information on residual debt discharges (§§ 286 et seq. InsO), the classification of score values as “prohibited profiling”, and transparency requirements in connection with score value calculations. The last two points are dealt with below.

Facts of the case and questions raised to the ECJ

A data subject had not received a desired bank loan – presumably also because of an insufficient credit score that the bank had previously queried from Schufa. The data subject first approached Schufa with requests for information and erasure and, after Schufa had, in his view, not sufficiently fulfilled these requests, turned to the Hessian Data Protection Commissioner (HBDI). The HBDI refused to take action against Schufa, citing the legal conformity of Schufa’s procedure. The Wiesbaden Administrative Court stayed the subsequent court proceedings against the HBDI and referred, among others, the following question to the ECJ for a preliminary ruling:

Is Article 22 (1) of Regulation (EU) 2016/679 to be interpreted as meaning that the automated establishment of a probability value concerning the ability of a data subject to service a loan in the future already constitutes a decision based solely on automated processing, including profiling, which produces legal effects concerning the data subject or similarly significantly affects him or her, where that value, determined by means of personal data of the data subject, is transmitted by the controller to a third-party controller and the latter draws strongly on that value for its decision on the establishment, implementation or termination of a contractual relationship with the data subject?

Or to put it briefly: Is a credit score determined by a credit agency and used by a company as the basis for a decision to grant credit already a prohibited automated decision within the meaning of Article 22 (1) of the GDPR?

The Advocate General at the ECJ delivered his Opinion on this question, among others, on March 16, 2023. Beyond the questions referred, the Advocate General also felt compelled to comment on the transparency requirements for score calculations.


Score calculation = prohibited automated decision based on profiling?

At the heart of the reference proceedings is the classification of score values under data protection law. According to Article 22 (1) of the GDPR, a data subject has the right “not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.” Profiling that automatically leads to a decision adverse to the data subject is thus prohibited.

In his Opinion, the Advocate General concludes that a score calculation by a credit agency is to be classified as such “prohibited profiling” if inquiring companies, “according to established practice”, base their decisions, inter alia on the conclusion of a contract, “decisively” on this score.

The quoted passages already make it clear that the Advocate General had a concrete, very narrowly defined set of facts in mind when formulating his Opinion. In the derivation of the Opinion, too, the facts as presented by the Administrative Court of Wiesbaden are emphasized strikingly prominently at several points: “According to the referring court”, the decision on entering into a contractual relationship is “practically determined to such a considerable extent by the […] score value […] that the latter, as it were, filters through to the decision of the third party responsible”. “According to the referring court”, the decision on whether and how to enter into a contract is “actually […] determined by the score value”. Companies do not have to make their contractual decisions dependent on a score value, but they would “as a rule […] significantly [do so].” And so on and so forth.

The picture painted by the referring court, however, corresponds to only a small subset of practice. This is problematic for the legal assessment. Even the Advocate General seems uneasy in the face of such a distorted portrayal by the referring court, as he explicitly emphasizes that his assessment is “subject to the assessment of the facts, which is incumbent on each national court in each individual case.”

And the reality? Score values are an important factor in contract decisions, but by no means the only one. The sheer range of industries that make the conclusion of contracts (such as loans or installment purchases) or the selection of payment methods (e.g., purchase on account) dependent on score values rules out any simplistic view. Anyone who has ever bought a property on credit knows that, for banks, numerous factors beyond the credit score are decisive for the credit decision.

And even if score values were the “essential” factor here, this still would not make Article 22 of the GDPR applicable. Under that provision, the decision must be based “solely” (sic!) on automated processing. The European Parliament’s proposal during the GDPR legislative process to also bring merely “predominantly” automated decisions within the scope of Article 22 of the GDPR did not prevail. If a decision is only partially, or even predominantly, based on a machine evaluation, Article 22 of the GDPR does not apply. On this point, the Advocate General’s statements are legally incorrect.

Significant impairment in the case of score-based decisions?

Profiling is prohibited only if it automatically leads to a decision that produces legal effects concerning the data subject or similarly significantly affects him or her. The Advocate General addresses the associated, extremely complex legal issues in just two paragraphs. That is too simplistic.

On the facts of the case, it is indeed plausible that the rejection of a credit request is in any event accompanied by a considerable impairment for the person concerned. In the numerous other constellations in which score values influence business decisions, however, legal or otherwise significantly impairing effects are regularly absent. The mere refusal to enter into a contract, or to enter into it only on certain terms and conditions, neither produces a legal effect nor significantly impairs the person concerned in a comparable way. The principle of private autonomy, and in particular the freedom to conclude contracts only on unilaterally defined terms, would be effectively undermined if a prohibition to this effect were introduced via the detour of Article 22 of the GDPR. Outside the provision of essential services, freedom of contract applies. Prohibitions directed at it and at the drafting of contracts are generally contrary to fundamental rights.

It is also far-fetched to assume a significant impairment in cases where, on the basis of a score, consumers are offered the conclusion of a contract only with certain payment methods, namely those that entail no credit risk for the merchant (cash on delivery or payment in advance, but not purchase on invoice). In such constellations, the consumer’s offer is neither rejected nor is the consumer discriminated against; rather, the merchant makes a minor modification to the offer within the scope of his entrepreneurial freedom, and also of his duty of care, which is in the interest of both contracting parties. As long as the consumer is left with other reasonable payment methods, the subjectively perceived potential for annoyance does not reach the threshold of “significant” impairment required by Article 22 (1) of the GDPR; consequences perceived merely as inconvenient are precisely not covered. The Advocate General also explicitly points this out: the provision covers “only serious effects” (para. 34 of the Opinion).

No rule without exception

Even if one were to agree with the Advocate General’s comments that the creation of a score by a credit agency already constitutes a “decision” within the meaning of Article 22 (1) of the GDPR, it would in any case have to be examined in a second step whether the general prohibition on the use of such a “decision” is not justified by way of exception pursuant to Article 22 (2) of the GDPR. The wording of the provision is clear: The prohibition of use of Article 22 (1) of the GDPR “does not apply” if one of the exceptions mentioned in paragraph 2 of the provision applies.

The Advocate General says nothing about this, presumably also because the referring court, for whatever reason, does not want to hear anything about possible exceptions to the prohibition of use in Article 22 (1) of the GDPR. Before the ECJ, it is like in real life: the answer one receives is largely determined by the question asked.

In any case, with regard to the loan cases to be decided by the ECJ, Article 22 (2) (a) of the GDPR virtually imposes an exception to the prohibition on use in paragraph 1. According to this, automated decision-making may be used if the decision is “necessary for the conclusion or performance of a contract between the data subject and the controller”.

The automated decision on the granting of a loan to be considered by the ECJ, based on a credit scoring carried out in the context of the initiation of a loan agreement, is likely to be a paradigm case of this exception. Such a decision is obviously necessary within the meaning of Article 22 (2) (a) of the GDPR, since it determines, for example, the maximum loan amount and the level of the installments for interest and repayment on the basis of the applicant’s actual ability to pay. The Advocate General points in this direction himself by noting that “the processing of a loan application is a step preceding the conclusion of a loan agreement” (para. 35 of the Opinion), but ultimately does not pursue it. At the latest, the Wiesbaden Administrative Court will have to deal with this issue.

Pimp your credit score

Finally, the Advocate General comments – here without being asked – on transparency obligations in connection with the creation of score values. This duty of transparency is enshrined in law in Article 15 (1) (h) of the GDPR, according to which the data subject must be provided with information about the existence of automated decision-making within the meaning of Article 22 (1) of the GDPR, including “meaningful” information about the logic involved and the scope and intended effects of such processing.

In the original proceedings, Schufa had reportedly made available only general information on its score calculation and otherwise invoked trade and business secrets. However, if the Advocate General has his way, the obligation to provide “meaningful information about the logic involved” must be understood as requiring “sufficiently detailed explanations about the method for calculating the score value and the reasons […] that led to a certain result”. The controller should provide the data subject with information “in particular on factors taken into account in the decision-making process and their weighting at an aggregated level” (para. 58 of the Opinion).

Although rather marginal in the Opinion, this point has considerable explosive potential. Until now, it was recognized that, due to the rather abstract concept of “logic”, only the underlying mechanics of a score calculation had to be disclosed, but not the weighting of individual factors, and certainly not the concrete calculation formula. But if everyone knows in the future that, for example, ordering excessively on invoice has a negative effect while holding several credit cards tends to influence one’s score positively, what significance do such scores still have?
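The concern can be made concrete with a deliberately artificial sketch: once factor weights are known, even at an aggregated level, a score becomes steerable by the person being scored. All factor names, weights, and numbers below are invented for illustration and have nothing to do with any real credit agency’s formula:

```python
# Toy model of a weighted score. Factors and weights are purely
# hypothetical; real credit-agency formulas are proprietary.
FACTOR_WEIGHTS = {
    "invoice_orders_last_year": -2.0,  # frequent purchase-on-invoice: negative
    "num_credit_cards": 1.5,           # several credit cards: slightly positive
    "years_at_address": 0.5,           # residential stability: slightly positive
}
BASE_SCORE = 500.0

def toy_score(profile: dict) -> float:
    """Aggregate the weighted factors into a single score value."""
    return BASE_SCORE + sum(
        FACTOR_WEIGHTS[factor] * value for factor, value in profile.items()
    )

# If the weights are public, a consumer can steer the score directly
# by adjusting behavior that feeds the negatively weighted factors:
before = toy_score({"invoice_orders_last_year": 20,
                    "num_credit_cards": 1, "years_at_address": 3})
after = toy_score({"invoice_orders_last_year": 0,
                   "num_credit_cards": 3, "years_at_address": 3})
```

In this toy setup, simply dropping invoice orders and adding credit cards lifts the score from 463 to 506 without any change in actual creditworthiness, which is precisely the risk the article describes.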

If everyone were able to model their own credit score in the future, the importance of the credit bureau system as an indispensable part of the economic order, repeatedly emphasized by the (German) legislator, would probably suffer a significant rupture. The determination of creditworthiness and the provision of credit reports form “the foundation of the German credit system and thus also of the economy’s ability to function.” This, and also the protection of consumers against over-indebtedness, which is “in the interests of both consumers themselves and the economy,” would be worth nothing if everyone could control their score at will. If objectively processed information on the creditworthiness of market participants no longer existed, however, broad sections of the population would be virtually excluded from lending because lenders could no longer calculate their risks. Remote mass transactions, which have gained in importance particularly during the recent pandemic, for example in e-commerce or telecommunications, would be impossible, or at least considerably more difficult, without objectively processed information on potential customers. The courts have always shared this classification of credit agencies as particularly important for the economic order. And according to the ECJ itself, credit information systems support credit institutions in fulfilling their tasks (ECJ, C-565/12) and are suitable for preventing the over-indebtedness of borrowers (ECJ, C-238/05).

It remains to be seen whether and, if so, how the ECJ will reconcile this specially propagated appreciation for the credit bureau system with the poorly crafted standard of Article 15 (1) (h) of the GDPR.


Even if the ECJ were to follow the Advocate General’s bumpy reasoning, this would by no means imply that the Schufa score violates EU law. The media, including the public service broadcasters, which ran lurid headlines to that effect as soon as the Advocate General’s Opinion was presented, should remember their journalistic duty of care.

In other respects, too, many expectations will probably be disappointed: the current Schufa proceedings are not suitable for bashing either the Schufa score or the credit agency system as a whole. The facts underlying the questions referred are narrow and reflect, if at all, only a small section of practice. And if the referring Administrative Court of Wiesbaden ultimately rules against Schufa, the proceedings will certainly not be over.