The European Commission recently published a detailed report on the ethical issues associated with connected and autonomous vehicles (CAVs). The report’s aim is to get the right parties engaged with these ethical issues as the CAV industry emerges.
The report dedicates one of its three chapters to data and AI ethics, which provides an insight into some of the data protection issues carmakers are expected to overcome. Since the recommendations address technology already in use by the CAVs on the road today, carmakers would be wise to check their current practices against them. This article explores three data protection issues highlighted by the report and the Commission’s practical recommendations for responding to them.
---
Put consent under the microscope
What the report says:
Carmakers need to obtain consent to process personal data for purposes that are not necessary for the proper functioning of the CAV, for example advertising or research and development, especially when these purposes involve sharing data with third parties. The report is emphatic about this, suggesting no leeway to rely on ‘legitimate interests’ for these types of activities. In addition, giving consent cannot be a precondition of using a CAV service.
In this respect, the report aligns with other EU guidance on connected vehicles. Worryingly, though, the report calls for explicit consent. This seems a little over-zealous, and it may be that the authors did not intend the term in the same sense it carries under the GDPR.
Carmakers should also avoid relying on one-off, consent-based user agreements, because it is often too hard for the driver to know exactly what they are consenting to over time. This is due partly to the AI and machine learning techniques employed by CAVs, but also to the external infrastructure with which CAVs interact. Carmakers should therefore request consent on an ongoing rather than a one-off basis.
However, requesting consent may not always be appropriate. If a driver is under duress or needs to make a quick decision, an imbalance of power between the carmaker and the driver might exist and the consent may not be valid. The report does not discuss relying on legitimate interests in these situations, but the sense is that users should have ample opportunity to opt out of such processing.
What the Commission recommends:
- Identify processing activities that should rely on consent. Any use of personal data for marketing purposes should be consent-based (eg user segmentation or similar profiling). Consider the unintended effects that any AI and machine learning have on a driver.
- Re-visit in-vehicle interfaces to ensure these allow for the easy management of consent. Check that withdrawing consent does not result in the denial of an essential CAV service.
- Determine whether current consents are “take it or leave it”, or apply too generally to the services. If so, consider using continuous and agile consent mechanisms when the processing purpose changes or expands.
- Consider allowing drivers to select a general ‘data strategy’ to provide an ethical alternative to consent. Giving drivers the ongoing ability to manage at a high level how their CAV uses their data in certain situations may open the door for carmakers to rely on legitimate interests in situations where consent isn’t appropriate. A minimal sketch of what such a mechanism might look like follows this list.
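To make the idea concrete, here is a minimal Python sketch of how per-purpose consent and a high-level data strategy might be recorded and managed in the vehicle. The names (DriverPrivacyProfile, DataStrategy, ConsentRecord) and the strategy tiers are illustrative assumptions, not anything the report or the GDPR prescribes; the point is simply that consent is captured per purpose, refreshed when a purpose changes, and can be withdrawn without switching off essential services.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class DataStrategy(Enum):
    """High-level data preference the driver can change at any time (illustrative tiers)."""
    MINIMAL = "share only what is essential to operate the CAV"
    BALANCED = "also allow anonymised research and development use"
    OPEN = "also allow marketing and third-party sharing"


@dataclass
class ConsentRecord:
    purpose: str                         # eg "marketing_segmentation" (hypothetical label)
    essential: bool                      # essential purposes never rely on consent
    granted: bool = False                # flipped when the driver accepts in the cabin UI
    last_updated: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class DriverPrivacyProfile:
    strategy: DataStrategy = DataStrategy.MINIMAL
    consents: dict[str, ConsentRecord] = field(default_factory=dict)

    def request_consent(self, purpose: str, essential: bool = False) -> ConsentRecord:
        """Record a fresh consent request whenever a processing purpose changes or expands."""
        record = ConsentRecord(purpose=purpose, essential=essential)
        self.consents[purpose] = record
        return record

    def withdraw(self, purpose: str) -> None:
        """Withdrawing consent must never switch off an essential CAV service."""
        record = self.consents.get(purpose)
        if record is not None and not record.essential:
            record.granted = False
            record.last_updated = datetime.now(timezone.utc)
```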
---
Address transparency concerns
What the report says:
CAVs exchange data with other CAVs, with the carmaker’s servers and with any other connected infrastructure on the road. It is often difficult to explain this continuous multi-party data sharing to drivers, especially the effect this has on their rights.
Providing appropriate transparency notices to data subjects other than the driver presents a particular challenge. These other data subjects can include individuals on the street as the CAV moves through a public place or, in some advanced CAVs, other passengers in the vehicle. There may also be situations where these different classes of data subjects have competing rights.
As CAV infrastructure expands, there will be road zones where potentially intrusive data collection occurs, for example controlled public areas monitored for reasons of safety, security or traffic management. Although the CAV may be legally required to share data with the operator of the zone, it might be the CAV’s route management decisions that have taken the driver there. In such cases, carmakers need to make drivers aware of the consequences of entering that zone (ie, that significant data collection will occur) so the driver can take action to avoid it if they wish.
What the Commission recommends:
- Collateral processing of pedestrians’ images and movements is increasingly difficult to avoid, so carmakers should develop sophisticated anonymisation techniques to lessen the associated risks and use state-of-the-art security measures to keep any data safe. This is especially important where carmakers want to use this data for internal R&D purposes.
- Use novel techniques to deliver transparency notices to drivers in real time, provided this does not interfere with the safe operation of the CAV. For example, when a route will take the driver through a zone where potentially intrusive data collection occurs, the CAV can display a dash warning or use a haptic notification such as a seat vibration before the driver reaches that zone. A minimal sketch of such a geofenced warning follows this list.
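As an illustration of the geofenced warning idea, the sketch below checks the vehicle’s position against a list of known data-collection zones and fires a notification callback before the driver reaches one. The DataCollectionZone structure, the circular zone model and the 500 m warning distance are assumptions made for the example; a production system would draw zone data from the route planner and the road operator.

```python
import math
from dataclasses import dataclass
from typing import Callable


@dataclass
class DataCollectionZone:
    name: str
    lat: float
    lon: float
    radius_m: float          # the zone is modelled as a simple circle for this sketch
    description: str         # what the zone operator collects, in plain language


def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two points (haversine formula)."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def warn_before_zone(position: tuple[float, float],
                     zones: list[DataCollectionZone],
                     notify: Callable[[str], None],
                     warn_distance_m: float = 500.0) -> None:
    """Fire a dash or haptic notification before the vehicle reaches a data-collection zone."""
    lat, lon = position
    for zone in zones:
        if distance_m(lat, lon, zone.lat, zone.lon) - zone.radius_m <= warn_distance_m:
            notify(f"Approaching {zone.name}: {zone.description}. "
                   "You can re-route now to avoid this data collection.")
```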
---
Improve methods for explaining AI and machine learning
What the report says:
Algorithms can create new personal data about the driver, or make automated decisions about them. Drivers need to understand the impact that any algorithm has on them. Explanations need to be clear and easily understood by anyone. Carmakers cannot assume any prior level of understanding.
If the CAV technology results in significant automated decisions affecting an individual, carmakers must ensure such decisions are traceable and explainable. Humans should be capable of examining the circumstances that led to the decision and, where necessary, intervening in it.
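One way to make such decisions traceable is to write an append-only record of each automated decision, including the inputs (the circumstances) and a slot for later human review. The sketch below is a simple illustration of that idea, assuming a hypothetical JSON-lines log; it is not a format prescribed by the report.

```python
import json
import uuid
from datetime import datetime, timezone


def log_automated_decision(decision: str,
                           inputs: dict,
                           model_version: str,
                           log_path: str = "decision_trace.jsonl") -> str:
    """Append an auditable record of an automated decision and return its id."""
    record = {
        "decision_id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "inputs": inputs,            # the circumstances that led to the decision
        "decision": decision,
        "human_review": None,        # filled in later if a person examines or overrides it
    }
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return record["decision_id"]
```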
What the Commission recommends:
- Get creative with explanations of how algorithms work. Test the effectiveness of these explanations with focus groups. In addition, use those focus groups to gauge which areas are especially sensitive.
- Develop an audit programme that regularly assesses algorithms for efficacy and bias. A minimal sketch of one such bias check follows this list.
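By way of illustration, one very simple bias indicator such an audit might track is the gap in favourable-outcome rates between driver groups (often called the demographic parity difference). The sketch below assumes hypothetical decision records with a group label and a favourable flag; a real audit programme would use several metrics and domain-appropriate group definitions.

```python
from collections import defaultdict


def demographic_parity_gap(decisions: list[dict]) -> float:
    """Gap between the highest and lowest favourable-outcome rates across groups."""
    totals: dict[str, int] = defaultdict(int)
    favourable: dict[str, int] = defaultdict(int)
    for d in decisions:
        group = d["group"]                     # eg an age band or region (hypothetical)
        totals[group] += 1
        favourable[group] += 1 if d["favourable"] else 0
    rates = [favourable[g] / totals[g] for g in totals]
    return max(rates) - min(rates)


# Example: flag the algorithm for review if the gap exceeds an agreed threshold.
sample = [
    {"group": "A", "favourable": True},
    {"group": "A", "favourable": True},
    {"group": "B", "favourable": True},
    {"group": "B", "favourable": False},
]
assert abs(demographic_parity_gap(sample) - 0.5) < 1e-9
```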
Looking forward
The report directs many actions toward researchers rather than carmakers. These actions are worth reading because they signal what regulatory bodies may expect or want to see from carmakers in the future. For example, the report repeatedly mentions the development of “industry standards” and “certification mechanisms”, showing a desire for the industry to collaborate when addressing some of the more complex issues covered in the report. The hope is that universally accredited approaches will give carmakers the confidence to undertake a greater volume of useful CAV processing activities without relying on consent. However, certification mechanisms are already available under the GDPR, and to date there has been little uptake by similar industries at the technological frontier. Let’s hope the automotive industry breaks this unhappy trend.
You can read the full report here, or a handy condensed factsheet here.