The technologies used to monitor employees have changed rapidly in recent years. Employers can deploy not only basic tools such as keystroke monitoring and webcams, but also increasingly sophisticated options, including biometric and facial recognition technology. The obligation to comply with data protection law when monitoring workers applies regardless of the technology used; however, Facial Recognition Technology (FRT) and biometric data raise further considerations.
On 12 October 2022, the Information Commissioner's Office (ICO) published draft guidance on monitoring employees at work, which was open for public consultation until 11 January 2023. Following this consultation, the ICO published its final guidance on monitoring in the workplace on 3 October 2023.
The guidance covers a wide range of concerns in relation to employee monitoring and includes helpful checklists for organisations. It does not cover the sharing of personal data by employers with law enforcement. The guidance applies both to systematic monitoring (where an employer monitors all employees in the same way) and to occasional monitoring (for example, where an employer implements a short-term monitoring strategy in response to a particular event or concern).
This article focuses on the sections of the guidance which relate to the use of biometric data and FRT to monitor workers.
Biometric data
At a high level, biometric data for (UK) GDPR purposes is personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of an individual, which allows or confirms the unique identification of that individual. Examples include fingerprints, iris scans, retinal analysis, facial recognition templates and voice recognition templates. Biometric data is special category data whenever it is processed "for the purpose of uniquely identifying a natural person". Even when biometric data is not being used to uniquely identify someone, it may still reveal other special category data – for example, an individual's racial or ethnic origin. Biometric data is therefore a higher-risk category of personal data, and its use is considered particularly invasive.
The ICO recently published draft guidance on the use of biometric data which we have written about here.
What do employers need to do if they want to use biometric data?
The new monitoring guidance looks specifically at whether employers can use biometric data for time and attendance control and monitoring. Traditionally this has been done using swipe cards, passwords and PIN codes. As technology has developed, some organisations now use biometric data to grant workers access to their workplace (one example being fingerprint access to offices). Any use of biometric data by employers poses a risk to workers' data protection rights and freedoms: because biometric data is so closely linked to a specific person, the risk of harm in the event of inaccuracies or a security breach is much greater.
Before using biometric data for the purpose of uniquely identifying a worker, employers:
- must conduct a data protection impact assessment (DPIA), before the processing begins, covering the risks of processing biometric data and possible mitigations
- should consider whether there are any alternatives to using biometric data in order to achieve the desired objectives
- must have in place security measures which are appropriate to the risks of unauthorised access or disclosure of workers' biometric data (in addition it is good practice to consider whether enhanced security measures are required)
- must document (in the DPIA) the rationale for choosing to rely on biometric data, including any consideration of other less intrusive means and why the employer believes they are inadequate
- should tell workers how the system works, what personal data the employer is collecting, how the personal data will be used and the nature and purposes of the monitoring.
Workers can object to the use of their biometric data for time and attendance-related purposes where the lawful basis relied on by the employer is:
- public task (for the performance of a task carried out in the public interest);
- public task (for the exercise of official authority vested in the employer); or
- legitimate interests.
Lawful basis for processing biometric data
Biometric data processed for the purpose of uniquely identifying an individual, as it is in FRT, is special category personal data. To process special category personal data, employers must identify a lawful basis under Article 6 of the (UK) GDPR and, in addition, a condition for processing under Article 9. This may prove challenging: the ICO's draft guidance on biometric data states that in most cases explicit consent may be the only valid condition for processing special category biometric data, yet relying on consent is difficult in an employer/employee context because of the imbalance of power in the relationship. Where such an imbalance exists, the controller should carefully consider whether relying on explicit consent is appropriate.
Where employees are not provided with an alternative option to the processing of their biometric data, consent will not be an appropriate lawful basis for the employer to rely upon.
Where the employer is relying on consent as a lawful basis, workers can withdraw that consent at any time. In these instances, employers should provide workers with an alternative access method, such as traditional swipe cards or PIN codes, and ensure that workers who choose the alternative method suffer no detriment.
Facial Recognition Technology
FRT identifies or otherwise recognises a person from a digital facial image. Typically, cameras capture an image and facial recognition software measures and analyses it to produce a biometric template, which enables the user to identify individuals. The software may incorporate elements of artificial intelligence and machine learning. By and large, FRT uses biometric data, and where biometric data is used in FRT to uniquely identify individuals it will be special category personal data.
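For readers who want a concrete sense of what a "biometric template" is, the minimal sketch below illustrates the typical pattern: an image is converted into a numerical face encoding, which is then compared against a stored encoding. This is an illustrative example only, assuming the open-source Python face_recognition library; the file names are hypothetical and no particular system referred to in the guidance works exactly this way.

```python
# Illustrative sketch: how FRT typically turns a facial image into a biometric
# template (a numerical "face encoding") and compares it with a stored template.
# Assumes the open-source `face_recognition` library; file names are hypothetical.
import face_recognition

# Enrolment: derive a template from a worker's reference photo.
enrolment_image = face_recognition.load_image_file("worker_reference_photo.jpg")
enrolment_encodings = face_recognition.face_encodings(enrolment_image)

# Verification: derive a template from a frame captured by an entry camera.
camera_image = face_recognition.load_image_file("entry_camera_frame.jpg")
camera_encodings = face_recognition.face_encodings(camera_image)

if enrolment_encodings and camera_encodings:
    # Each encoding is a 128-dimensional vector; matching compares the distance
    # between vectors against a tolerance threshold.
    match = face_recognition.compare_faces(
        [enrolment_encodings[0]], camera_encodings[0], tolerance=0.6
    )[0]
    distance = face_recognition.face_distance(
        [enrolment_encodings[0]], camera_encodings[0]
    )[0]
    print(f"Match: {match} (distance {distance:.2f})")
else:
    print("No face detected in one of the images.")
```

It is the encodings produced and compared here, not merely the captured photograph, that constitute the template referred to above.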
The monitoring guidance acknowledges that using FRT carries higher risks to data protection rights and freedoms than standard video technology, particularly where an employer draws inferences about a person's likely behaviour from the use of FRT. There are also concerns about the accuracy of facial recognition technologies, in particular for individuals from ethnic minority groups.
If an employer uses FRT it must carry out a DPIA, as this processing presents a high risk to workers' rights and freedoms. The ICO has produced an FRT checklist setting out the steps employers should consider, including (among other requirements) conducting a DPIA, camera resolution and placement, and the effectiveness of the training data used by the FRT.
Key takeaways from the guidance
Employers must take care when using biometric data and/or facial recognition technology to monitor workers. This data and technology create a high-risk processing profile, and employers should take note of the enhanced obligations that apply to this type of monitoring.