The French data protection authority, the CNIL, has issued a fine of €40m against French adtech giant Criteo. The penalty is in respect of alleged breaches of the GDPR concerning consent, transparency, access rights and joint control.
The final penalty represents a reduction from the originally proposed €60m. It seems that the CNIL took heed of Criteo’s representations on the scale of the fine (at least in part).
The origin of these enforcement proceedings is a complaint issued by advocacy group Privacy International back in 2018. Privacy International in fact issued three separate complaints (covering data brokers, adtech and credit reference agencies) with data protection authorities in France, Ireland and the UK. The Criteo fine represents the second significant outcome from these original complaints, with the ICO also investigating and taking action against Experian (albeit that is now the subject of appeals).
The Privacy International complaint was also followed by a further complaint issued by Max Schrems’ privacy group (NOYB) in 2019.
The CNIL took up the complaints as lead supervisory authority in 2019 and engaged in a multi-year investigation and enforcement proceeding, culminating in last week’s fine.
Essentially the investigation focussed on Criteo’s processing of personal data in the course of providing ‘advertising retargeting’ services to its publisher and advertiser clients, including through the placement of its cookies on users’ browsers.
Criteo did process personal data
The CNIL found that Criteo collects various pieces of data when operating its services and through the use of its cookies, including a Criteo unique ID, the user’s internet browsing history, IP address, partner identifiers and (in some cases) hashed email addresses.
Criteo argued that these data points constituted mere technical ‘browsing events’ which did not allow the direct identification of the individuals concerned, and as such that the risk of re-identification was low. Criteo did not appear to argue that the data was truly anonymised, but rather that it was processing pseudonymous data only, which had only a very limited impact on data subjects.
The CNIL did not agree. It made clear that pseudonymous data remained subject to the GDPR, but also found that the richness of the data collected by Criteo was, in any event, sufficient to make the re-identification of the individuals concerned reasonably likely.
The CNIL has long been one of the more ‘absolutist’ supervisory authorities when it comes to issues of anonymisation and re-identification and so its decision here is unsurprising. Its decision refers to anonymisation in terms of entirely removing ‘the possibility of re-identifying the natural person’, which potentially goes further than a consideration of the level of re-identification risk, although it is possible there could be a translation issue here.
Consent
As Criteo has no direct relationship with the data subjects whose personal data it processes, it relies on its publisher clients to obtain consent on its behalf.
Criteo does this by placing contractual commitments on the publishers to obtain valid consent. It also has a right to audit the publishers’ compliance with that commitment. This is a fairly standard approach taken by adtech intermediary vendors.
Criteo also pointed to an enhancement to its verification process initiated in October 2022, whereby publishers’ consent practices were assessed prior to Criteo entering into a contract with them.
However, the CNIL carried out its own assessment of the consent practices of the publishers with whom Criteo worked, and found that more than half of them were not complying with the GDPR.
The CNIL found that reliance on a contractual commitment alone was not sufficient, which is unsurprising and consistent with the EDPB’s consent guidance. It also found that Criteo’s approach to auditing publishers did not go far enough, noting that Criteo had never terminated a publisher’s contract for consent-related failures and also that its new October 2022 procedure was adopted too late.
Criteo also sought to argue that, as it was a joint controller with its publisher clients, it was the publishers who were best placed to obtain consent and therefore who ought to have primary responsibility for compliance with the consent obligation. The CNIL rejected this argument, making clear that in a joint control arrangement both controllers must still establish their respective lawful bases for processing. It also found that French e-Privacy law (and therefore a separate consent requirement) applied to Criteo’s activities, regardless of any joint control arrangement.
Transparency
The CNIL also took aim at Criteo’s approach to transparency, finding that its privacy notice did not indicate all the purposes for which personal data would be processed and lacked clarity as to the legal basis relied on for the processing.
Criteo’s arguments were: (1) that Article 13 GDPR ought not to apply to it given its indirect collection of personal data; (2) that it was not required to explain the product improvement purpose in detail as this formed a core part of the personalised advertising related services it offered; and (3) that any lack of clarity in relation to legal basis had no impact on the data subjects as they could continue to exercise their rights.
Again, the CNIL rejected Criteo’s arguments. It found that:
- Article 13 (as opposed to Article 14) did in fact apply because the operation of the Criteo cookie on a publisher’s website results in a direct collection of personal data by Criteo.
- There was clear uncertainty within the Criteo privacy notice as to whether Criteo relied on consent or legitimate interests for its activities.
- Improvement of Criteo’s technologies through machine learning constituted a separate purpose from the general provision of advertising services, and as such needed to be called out in the privacy notice.
Whilst Criteo has recently updated its privacy notice to address some of the points above, the CNIL found that this came too late to be taken into account in this enforcement proceeding.
Access requests
The CNIL also found that Criteo took an unduly restrictive approach to responding to subject access requests, for example by excluding information about Criteo’s ID matching process. It also found that Criteo did not give sufficient explanations to data subjects about the technical information it did provide, such that the provision of this information was unintelligible to the requester.
Erasure requests
The CNIL found that Criteo’s approach to users who sought erasure of their data did not meet the requirements of the GDPR.
On receiving an erasure request, Criteo’s approach was to undertake a ‘deactivation procedure’ which prevented the ongoing linking of the user’s identifier in such a way that it could no longer be used for personalised advertising. However, the identifiers were not actually deleted from Criteo’s databases. For the CNIL this was not sufficient for compliance with Article 17.
Joint control
As stated above, the CNIL found that Criteo was a joint controller with its publisher and advertiser clients.
However, its agreements were found to be insufficient as they did not properly allocate responsibility between the various joint controllers for all aspects of GDPR compliance. In particular, the agreements did not cover core obligations concerning data breach notification, the exercise of data subject rights and the completion of DPIAs.
Amount of fine
Criteo did have some success in bringing the fine down from €60m to €40m.
It pointed out that the proposed fine amounted to nearly 3% of global turnover (approaching the GDPR’s 4% maximum) whereas, by contrast, previous decisions by the CNIL against Google and Facebook amounted to only 0.07% and 0.06% of global turnover respectively.
The CNIL maintained its findings that the breaches were serious, but agreed to a reduction to something closer to 2% of Criteo’s global turnover.
One final notable point is that Criteo did not receive any reasoned objections from any other concerned supervisory authorities. This is in stark contrast to recent decisions taken by the Irish DPC which have been the subject of vociferous objections and EDPB dispute proceedings.
Comment
Whilst the CNIL’s findings are not particularly surprising given its previous approach to adtech related enforcement, there are a number of points here which will be read with interest by other adtech companies. In particular:
- There is a very high bar set by the CNIL when it comes to verification and audit of partner consent practices. The expectation is that intermediaries who rely on consent must actively check the validity of such consents.
- Similarly, when it comes to transparency the CNIL was not satisfied with muddled wording on lawful bases which did not allow data subjects to discern which basis applied to which purpose. Partly as a result of this regulatory posture (which we have also seen elsewhere), there is an increasing trend towards tabular privacy notices, which can be helpful in setting out the interaction between data categories, purposes and lawful bases.
- In some cases the CNIL refers to improvements made by Criteo (e.g. a new privacy notice and a new approach to auditing publishers). There is always a risk in enforcement proceedings that improvements can be used against controllers as a means of demonstrating earlier non-compliance. We usually look to push back hard against this type of regulatory position by maintaining (if possible) that the earlier approach was appropriate at that time and that ‘continuous improvement’ is an ongoing obligation for all controllers.
- When it comes to DSARs, regulators are clearly expecting data subjects to be able to understand the data provided to them. This can be extremely difficult in an adtech context where sometimes the data is only intelligible with a broader understanding of how the ecosystem operates. However, the onus is on controllers to try and go further in being helpful to data subjects by providing as much background and context as possible.
- Joint control does seem to be found here, there and everywhere when it comes to adtech. Even if controllers don’t want to acknowledge joint control in their agreements, such agreements should include some allocation of responsibility when it comes to core issues such as data subject rights (even if only to say that both parties should be responsible for the rights requests they receive).