Last year, we wrote about the A-G opinion in Case C-203/22 of Dun & Bradstreet Austria. The case centres on how much information companies must provide to individuals about how their commercially sensitive algorithms work, in a situation where an individual exercises their right to request “meaningful information about the logic involved” in such algorithms.
The question is now more relevant than ever, as companies increasingly look to automate processes using AI (think CV screening for recruitment, or chatbots for customer service). But the promised efficiencies look rather less attractive if a company must provide very detailed information about how its proprietary algorithms work to individuals making GDPR data subject access requests, or DSARs – especially if doing so may compromise its trade secrets. Happily, following the CJEU’s judgment, we now have further clarity on how much information should be provided to individuals about the logic of such systems.
What happened?
A consumer, CK, was refused a mobile phone contract costing just €10 a month on the basis of an automated credit assessment. CK submitted a DSAR to the agency concerned, Dun & Bradstreet Austria, asking for more information about how the algorithm had arrived at this assessment.
The Austrian court initially held that Dun & Bradstreet had infringed the GDPR by failing to provide the customer with ‘meaningful information about the logic involved’ in the automated decision-making in question, but the agency still failed to provide any further information to CK. CK therefore brought a further action before the Administrative Court in Vienna, which referred two key questions to the CJEU for consideration:
- What constitutes “meaningful information about the logic involved” in automated decision-making?
- Is a data controller required to provide information relating to company trade secrets as part of a DSAR?
What did the CJEU say?
The CJEU held that “meaningful information” under Article 15(1)(h) of the GDPR means an individual is entitled to request an “explanation of the procedure and principles actually applied in order to use, by automated means, the [individual’s personal data] with a view to obtaining a specific result”.
Here, the information provided must meet the GDPR’s all-important requirements of transparency and intelligibility, i.e., it should be easy for a consumer to understand how their personal data was used by an algorithm to reach a particular decision.
This requirement is not satisfied by “the mere communication of a complex mathematical formula”, such as an algorithm, or “by the detailed description of all the steps in automated decision-making, since none of those would constitute a sufficiently concise and intelligible explanation”.
While it might frustrate some to learn of the level of detail expected to be provided to data subjects in this context, the flip-side of this statement is that there is no requirement to provide the algorithm itself. The CJEU held this was not only because providing such information as part of a DSAR may compromise the controller’s vital trade secrets, but also, importantly, because such information is not particularly helpful to consumers from a transparency perspective.
Interestingly, the CJEU held that the explanation provided to individuals may set out “the extent to which a variation in the personal data taken into account would have led to a different result”. This looks like a fairly onerous requirement to meet, as data controllers would have to explain how hypothetical “other” personal data might have changed the outcome for the data subject.
Trade secrets – do I have to reveal the “secret sauce”?
Returning to trade secrets, businesses will be glad that the judgment confirms companies do not need to disclose trade secrets to individuals who make an access request under Article 15 of the GDPR. However, pointing to a trade secret does not exempt a business from explaining to the individual the logic behind any automated decision-making, unless it can demonstrate that this information is itself a trade secret. It will be interesting to see whether this point is tested – for example, in the context of social media companies, where the logic behind a recommender algorithm may be viewed as the “secret sauce” behind the company’s success.
Companies also cannot simply refuse to provide information to a data subject on the basis that a trade secret is involved. Instead, the company must provide the commercially sensitive information to a data protection authority or court, and it will be for that authority or court to determine whether the information is a trade secret or should be disclosed: this may cause some concern for industry.
So, what does all this mean operationally?
Firstly, this is not the only recent case to focus on automated decision-making: in 2023 we saw the CJEU’s judgment in SCHUFA Holding (Scoring). In that case (which also centred on credit scoring), the court held that preparatory acts and information used by a lender to come to a credit decision constituted automated decision-making under Article 22 of the GDPR. That judgment, however, did not consider the scope of data subjects’ right of access to “meaningful information about the logic involved” in automated decision-making, as this case does.
Importantly, both cases stemmed from data subject complaints rather than regulatory investigations. This highlights that, when it comes to AI and other automated decision-making, individuals are particularly interested in understanding how systems came to a result that impacted them, and companies should expect to receive more data subject requests on this issue. Companies will need a clear process for understanding how specific decisions have been reached by a system. For example, if you are using AI-based software to screen CVs, it will be important to understand how the software is classifying applications. This is easier said than done, especially where you are using an off-the-shelf AI system bought in from a third-party provider. Businesses should carefully consider what kind of contractual measures they might need here.
Secondly, front-load your GDPR compliance efforts in your privacy notice to help discharge your obligations under Article 15(1)(h). It is important to explain how your algorithms and other automated decision-making processes work in a way that makes sense to your audience. Template wording in a privacy notice that makes generic reference to Article 22 of the GDPR is unlikely to pass muster.
Finally, trade secrets need not be revealed to data subjects as part of access requests, but that does not exempt you more broadly from your obligations under Article 15 of the GDPR – so the judgment does not represent a get-out clause from data subject rights more generally!