Individuals have the right to receive meaningful information about solely automated decisions with significant effects under the General Data Protection Regulation (GDPR).  This includes decisions that will impact an individual’s finances or employment.  But how much information are individuals entitled to receive?  Should they be given the underlying algorithm, or merely a high-level explanation, or something in between?  And what if the information requested is a trade secret, or includes another individual’s personal data?

In Case C-203/22 Dun & Bradstreet Austria GmbH, the Court of Justice of the European Union (CJEU) addressed these questions.  The court confirmed that:

  • the data subject is entitled to an explanation of the procedure and principles actually applied in using their personal data with a view to obtaining a specific result; and
  • where that information includes trade secrets or third-party personal data, the controller must provide it to the competent supervisory authority or court, which will carry out a balancing exercise weighing the relevant rights and interests.

The case

An individual was refused a pay-monthly mobile phone contract on the basis that their creditworthiness was insufficient.  The individual submitted a request for information about the logic involved in the automated decision on their credit score under Article 15(1)(h) GDPR.  The Austrian data protection authority (DPA) ordered Dun & Bradstreet Austria GmbH (D&B), who had calculated the score, to disclose meaningful information about the logic involved.

Austrian law included an exemption from the right of access under Article 15 GDPR where the access would compromise a business or trade secret.  D&B appealed, relying on this exemption.  In a subsequent action on whether the information had been disclosed as required by the Austrian courts, the Vienna Administrative Court referred questions to the CJEU.  These covered the extent of the information that must be disclosed under Article 15(1)(h) GDPR and the position on trade secrets and third-party personal data.

Disclosure of an algorithm not necessary (or helpful) – but controllers must provide information on how the individual’s personal data was used in the automated decision

The CJEU emphasised that Article 15(1)(h) exists primarily to allow the data subject to exercise their right under Article 22(3) to express their point of view on the decision and contest it.  The controller therefore needed to describe the procedure and principles actually applied in such a way that the data subject could understand which of their personal data had been used in the automated decision-making at issue.  The information must be sufficiently transparent and intelligible.  In a credit reference context, it might be sufficient to inform the data subject of the extent to which a variation in the personal data taken into account would have led to a different result.

Following the Advocate General’s Opinion and the Article 29 Working Party’s guidance on automated decision-making, the CJEU found that disclosure of an algorithm was not necessary.  The court even went a step further and suggested that providing either the algorithm or a detailed description of all the steps in automated decision-making would not constitute a sufficiently concise and intelligible explanation.

But what about black box models?

With many of the algorithms currently in use in financial services, controllers will have the information to provide to data subjects (although they may not wish to disclose their trade secrets).

However, it is not clear at this stage what controllers would need to provide where using a “black box” model, where even those developing it may not fully understand how a decision is reached.  It may be time- and resource-intensive to provide the kind of local or counterfactual explanation that the CJEU envisages in this scenario, and even then, explanations may still be approximate.  In the UK, the ICO has provided further context on providing a meaningful explanation when using a complex AI system (but, naturally, this is not applicable in the EU).
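To illustrate the kind of counterfactual explanation the CJEU describes – telling the data subject the extent to which a variation in their data would have led to a different result – the sketch below uses an entirely hypothetical linear scoring model with invented weights and threshold; real credit models and decision logic will differ.

```python
# Illustrative sketch only: a hypothetical linear credit-scoring model and a
# brute-force counterfactual search.  The weights, features and threshold are
# invented for demonstration and do not reflect any real scoring system.

WEIGHTS = {"income": 0.5, "existing_debt": -0.8, "years_at_address": 2.0}
THRESHOLD = 50.0  # hypothetical approval cut-off


def score(applicant: dict) -> float:
    """Weighted sum of the applicant's features."""
    return sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)


def counterfactual(applicant: dict, feature: str, step: float,
                   max_steps: int = 1000):
    """Smallest change to one feature that would flip a refusal to approval.

    Returns the changed feature value, or None if no change within the
    searched range flips the decision.  This mirrors the idea of telling the
    data subject how much a variation in one input would have changed the
    outcome, without disclosing the model itself.
    """
    candidate = dict(applicant)
    for _ in range(max_steps):
        candidate[feature] += step
        if score(candidate) >= THRESHOLD:
            return candidate[feature]
    return None


applicant = {"income": 60, "existing_debt": 30, "years_at_address": 2}
print(score(applicant))                        # below the threshold: refused
print(counterfactual(applicant, "income", 1))  # income needed for approval
```

Even for this toy model, the search only varies one feature at a time; for a genuine black box model, generating faithful counterfactuals across many interacting features is considerably harder, which is the practical difficulty noted above.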

Trade secrets and third-party personal data

The CJEU emphasised that, as set out in recital 4 GDPR, the right to the protection of personal data is not an absolute right and must be balanced with other fundamental rights.  Specifically, the data requested might contain the data of a third party, or trade secrets.

In both cases, the CJEU found that it would be necessary to disclose the disputed information to the competent court or supervisory authority, which would carry out a balancing exercise on a case-by-case basis. 

The Austrian exemption on the disclosure of trade secrets was not permitted, as Member States cannot prescribe the result of what must be a case-by-case balancing exercise.

Our take

The confirmation that the underlying algorithm need not – and indeed should not – be disclosed will be welcome to controllers.  Interestingly, a research note from the UK Financial Conduct Authority on explaining AI’s role in credit decisions supported the view that more information will not necessarily be more meaningful.  The researchers found that participants who were given more information about the inner workings of an algorithm’s decision-making displayed worse judgement on average.

However, the level of information that the CJEU expects where no trade secrets or third-party personal data are involved is extensive.

Of course, frequently there will be trade secrets involved, and many uncertainties remain about how the balancing exercise the CJEU requires would play out.  These include who the competent authority or court will be for a global business, the scope of disclosure required, whether the competent authority or court is equipped to safeguard the trade secret, how they will do this and what assurances can be given – if any – prior to disclosure. 

If a decision ordering disclosure is made, it is also unclear whether or how a data controller resisting disclosure of the trade secret could appeal the decision, and how the rights of the data subject would interact with any such appeal.  Finally, there is also a question as to whether the competent authority or court has the technical expertise to understand the trade secret and undertake an informed assessment.

At this stage, there is also no clarity on the extent to which using black box algorithms will make compliance with Article 15(1)(h) challenging. 

Pending judgments and DPA decisions may provide more clarity on Article 15(1)(h) – pressure group noyb has already made a complaint to the Swedish DPA concerning an access request made to Swedbank on how mortgage rates are calculated.  The first AI Act CJEU referral was also made in relation to a similar right to a meaningful explanation about the role of a high-risk AI system in decision-making.

However those cases play out, this is likely to be an impactful judgment for financial services firms and corporates.