Can you read minds yet? The ICO publishes its report on the use of neurotechnology

22.06.2023

In an attempt to stay ahead of the game, the ICO has prepared a report on the use of neurotechnology, a technology that is proliferating rapidly and one that the ICO expects to become commonplace in due course. For example, Neuralink, the brain implant company founded by Elon Musk, received FDA approval to conduct clinical trials on humans in May 2023.[1]

The report has been published as part of the ICO25 strategic plan to share opinions on emerging technologies before they become widely available, in the hope that this will encourage privacy by design and build the public’s trust in such technologies.

Are we talking about mind reading technology here?

Not quite. In the absence of any agreed definitions in law or industry, the ICO has defined neurodata as:

‘first order data gathered directly from a person’s neural systems (inclusive of both the brain and the nervous systems) and second order inferences based directly upon this data’

Neurotechnology includes devices and procedures that directly record and process neurodata.

Unfortunately for any fans of science fiction, the ICO has made it clear that first order data captured from neural systems is currently limited to binary responses; ‘mind reading’ is therefore expressly carved out of the scope of the report.

The report goes on to discuss the various forms of neurotechnology currently available, for example invasive versus non-invasive technologies, active versus passive devices, and closed versus open loop systems. For those without a background in neurotechnology (i.e. most lawyers), the report is a useful starting point for getting your head around the technology.

I’m not in the medical sector. Should I still care?

Yes. While the most obvious applications of neurotechnology might be in the medical sector, for example where it is proposed as a treatment for conditions such as epilepsy or Parkinson’s disease,[2] there are a number of other areas that the ICO considers likely to be impacted by the technology. These include:

  • as a tool for analysing athlete performance, including concentration levels and reaction times;
  • in the gaming industry, as a means of offering hands-free game control;
  • in order to obtain insights for neuromarketing, allowing organisations to analyse how consumers react to adverts and tailor their campaigns accordingly; and
  • to monitor employees, for example by tracking levels of alertness when driving HGVs.

The ICO also suggests some further, more ‘futuristic’ applications, including the development of neuroenhancement technology (technology used to improve or enhance brain functionality, such as improving reaction times or concentration levels) and the ability to scan candidates in recruitment campaigns for desirable traits. While these examples might seem fanciful at this stage, the pace of development means that the ethics and legality of such applications may need to be addressed sooner than we think. Notably, where an individual’s behaviour can be modulated or altered, there are heightened risks concerning the processing of such sensitive personal data, as well as a risk that individuals do not understand how or why an organisation is using their data.

What are the key privacy issues?

The ICO has pulled together a list of seven key regulatory issues in this area. These cover fundamental compliance matters, such as when neurodata will be special category data (whether categorised as health data or, less commonly, biometric data), which legal bases could be relied on, and challenges surrounding accuracy, data minimisation and data subject rights, as well as broader issues such as the risk of neurodiscrimination. We consider some of these issues in more detail below.

1. Consent and the appropriate basis for processing

While some groups are calling for explicit consent to be the only available legal basis for processing an individual’s neurodata, the ICO rightly points out that it may be very challenging to ensure that individuals are made fully aware of, and understand, the nature of the processing (and what it will reveal) so that they can provide the informed and freely given consent required under the UK GDPR. The collection of neurodata is essentially involuntary, meaning individuals may have limited control over what information they are consenting to have processed. This may make consent an unattractive legal basis to rely on. Further, in certain sectors, such as recruitment, consent is unlikely to be freely given: the inherent power imbalance in the recruitment process generally renders consent invalid as a legal basis.

Where, for example, neurodata is collected and processed for medical purposes, it will constitute health data (one of the special categories of personal data under the UK GDPR), so a condition under Article 9 of the UK GDPR will also need to be met in order to process that data. In the absence of explicit consent, it can be challenging to satisfy the requirements of the other Article 9 conditions, so this is an issue that will need to be considered carefully.

2. Discrimination

Active discrimination, such as treating neurodivergent individuals less favourably in a recruitment campaign, is an obvious risk. Further, where the technology uses AI and is developed using non-representative or biased datasets, it may lead to unfair results.

The fact that neurotechnology enables organisations to collect large-scale, complex datasets about an individual means that highly detailed inferences about sensitive issues, such as mental health, can be drawn. Organisations should therefore ensure that such risks have been carefully assessed and appropriately mitigated before the technology is put into use.

3. Accuracy and data minimisation

Neurodata captures states, such as emotional responses, that are not permanent and can fluctuate frequently. This creates the risk that organisations collect large amounts of neurodata that very quickly becomes out of date. Where neurodata is used to make decisions about individuals, there is a substantial risk that decisions are based on historic, inaccurate neurodata, or on limited data points that do not tell the full story. This could lead to, for example, an incorrect diagnosis in the medical sector, or unfair disciplinary action in an employment setting.

The data minimisation principle will also need to be factored in, meaning that organisations will need to ensure that their processing is proportionate to the activities they are conducting. Striking the balance between maintaining an accurate and complete set of neurodata and avoiding excessive processing will likely be a challenge for many applications of neurotechnology.

4. Data subject rights

The large-scale, complex nature of the neurodata collected presents some additional challenges when responding to data subject rights requests. For example, in the case of an access request, how will the organisation provide the dataset in a ‘commonly used electronic form’? How will organisations ensure individuals can correct inaccurate neurodata if they do not understand what information is held about them? And what impact will a request to delete a data subject’s neurodata have on the accuracy of any AI system trained on it? Evidently, there is a lot to think about.

Where is this heading?

While the report does not form part of the ICO’s formal guidance, the ICO has committed to preparing specific guidance on the use of neurotechnology in due course. That said, we would be surprised to see this guidance any time soon, given the number of serious legal and ethical questions for the ICO to grapple with. Organisations operating in the neurotechnology space will therefore need to tread carefully on data protection compliance until more substantive guidance becomes available.

============================================

[1] https://www.reuters.com/science/elon-musks-neuralink-gets-us-fda-approval-human-clinical-study-brain-implants-2023-05-25/

[2] https://royalsociety.org/-/media/policy/projects/ihuman/report-neural-interfaces.pdf