01/03/2024
In this article we consider the Information Commissioner’s Office’s (ICO’s) recent enforcement notice issued against Serco Leisure, and the ICO’s newly published guidance on the use of biometric data.
Serco Leisure’s use of biometric technologies
Serco Leisure (Serco) runs the day-to-day management of a number of leisure centres on behalf of certain community leisure trusts and local authorities in the UK. For a number of years, Serco has been monitoring employee attendance using facial recognition technology (FRT) at 38 leisure centres, with two of those facilities additionally using fingerprint scanning. Serco is a joint controller with the relevant trust or local authority at 32 of those leisure centres. In total, over 2,200 employees were affected.
The referral to the ICO came from an ICO employee, who observed that FRT was being used at one of the facilities. Serco also received a complaint from an employee regarding the use of FRT. When those concerns were raised, the employee was offered a discussion about their privacy concerns with an FRT software representative, but was not offered an alternative sign-in method and was told they would have to continue using the FRT system.
Serco maintained “biometrics is the sole technology capable of eliminating buddy punching and falsified time cards” and that biometric solutions are “more accurate and secure than cards or keys, because a fingerprint or face scan cannot be lost, stolen or (easily) replicated.”
The ICO disagreed with this analysis, arguing that less intrusive means were available and should have been used.
What does the ICO say?
Serco was ordered to cease processing employees’ biometric data and to destroy all biometric data that it (and its third-party processor) is not otherwise legally obliged to retain.
The ICO issued this enforcement notice the same week in which it published new guidance for organisations on how they can process biometric data lawfully.
Article 6: Lawful basis
For the processing of the biometric data, Serco relied on the lawful bases of contractual necessity (Article 6(1)(b)) and legitimate interests (Article 6(1)(f)).
Serco argued that the processing was necessary to ensure employees were paid correctly for their time worked. The ICO emphasised that processing cannot be considered necessary when less intrusive means of attendance monitoring could have been implemented. Serco contended that without FRT the system was open to abuse, but failed to show that widespread abuse was taking place. Moreover, the ICO pointed out that even if abuse of the system were apparent, Serco failed to show why disciplinary action could not have been taken to tackle the problem, rather than relying on the processing of biometric data.
The legitimate interests argument failed for similar reasons. Legitimate interests will not apply if a controller can reasonably achieve the same result in another, less intrusive way; in other words, Serco failed to show that the processing was necessary to achieve the intended purpose. In carrying out the balancing test, Serco also failed to give appropriate weight to the intrusive nature of biometric processing and its risks to data subjects.
The ICO also criticised Serco because employees were not given clear information about how they could object to the processing, nor about what other methods of attendance monitoring could be used. Furthermore, the imbalance of power between employer and employee meant that employees may not have felt able to object, even if they had been told that they could object to how their data was being processed.
Article 9: Special category data
Serco relied on Article 9(2)(b), processing necessary for carrying out obligations in the field of employment, to process special category data. At the point at which FRT was introduced, Serco failed to identify the relevant right conferred by law and did not have an appropriate policy document in place, as required by Schedule 1 of the Data Protection Act 2018.
As with the lawful basis, Serco failed to demonstrate that the processing of biometric data was necessary to comply with the relevant laws (which were identified later), and so Article 9(2)(b) was not an appropriate processing condition: the processing was not a necessary means by which Serco could give effect to the contract between it and its employees.
Key takeaways
If you are considering processing biometric data in your organisation, remember:
- The threshold for the processing of biometric data is high
Due to the unique nature of biometric data, the risks of harm to a person in the event of inaccuracies or a data breach are extremely high: unlike a password or a PIN, a face or fingerprint cannot be changed if it is compromised. In terms of legal basis, it is difficult to identify circumstances in which any legal basis other than explicit consent (Article 9(2)(a)) for processing special category biometric data could be considered a feasible option. Unless there is a clear substantial public interest in processing using such techniques, data controllers are likely to need to rely on the explicit consent of the data subject to process biometric data such as FRT. This requires an affirmative “opt in” to a clear statement of consent, which is a high bar for data controllers to meet.
- Get your governance in place before you start processing
Serco was criticised for not having carried out a DPIA or a legitimate interests assessment, and for not having the correct policy documents in place, before commencing use of biometric technologies. The ICO’s guidance is clear that organisations must mitigate the potential risks that come with using biometric data, such as errors in identifying people accurately and bias where a system detects some physical characteristics better than others.
- Make sure to consider all possible alternatives
If there is a less intrusive means of achieving your purpose, you should be using that instead. If you are processing special category data and are unable to secure the data subject’s “opt in” consent (and there is no other applicable processing condition), you will have to make an alternative option available, unless you can clearly demonstrate in your DPIA that the chosen method of processing is strictly necessary.
- Ensure your employees are able to object
If you are processing biometric data, ensure data subjects are informed how to object to that processing and offer them an alternative. Remember the power dynamic between employer and employee: the employee must feel able to object to the processing even if they are offered an alternative.
- Consider if you are a joint controller
Joint controllership occurs where two or more controllers jointly determine the purposes and means of processing. Be aware of this if you are outsourcing services, and ensure that your service providers are using any biometric technologies lawfully and in compliance with the ICO’s guidance. In the Serco case, the trusts and local authorities were considered negligent in their decision to allow the implementation of the technology, and so each received an enforcement notice in respect of the processing.
- This is now firmly on the ICO’s radar
The ICO, in its enforcement action against Serco, has put data controllers on notice that biometric technologies cannot be deployed lightly. Going forward, the ICO will scrutinise organisations and act decisively if it believes biometric data is being used unlawfully.
ICO resources
Biometric data guidance: Biometric recognition
Can we use biometric data for time and attendance control and monitoring?
If you would like to discuss how any of these issues might affect your organisation, please get in touch with our Information Law & Privacy team.