14 August 2024
Facial recognition technology (FRT) can be immensely useful for efficient building operations, from easing security queues to making canteens cashless, but a recent reprimand from the Information Commissioner's Office (ICO) highlighted the importance of following the UK GDPR.
Chelmer Valley High School, in Essex, narrowly avoided an ICO fine after it deployed FRT to make its canteen payments system cashless without following due process, and its experience holds important lessons for other real estate owners and operators.
Typically, FRT involves using cameras to capture an individual's image and then facial recognition software to produce a biometric template from that image. This enables the easy identification of individuals, but it also generates the sort of biometric data that constitutes special category personal data under the UK GDPR. Processing this data is likely to result in high data protection risks, as those individuals' rights and freedoms could be affected, particularly given the threat of bias and discrimination.
The formal ICO reprimand is very clear that the school breached the UK GDPR when it failed to carry out a data protection impact assessment (DPIA) before capturing images of students and using FRT. A DPIA is required under the UK GDPR whenever organisations conduct any high-risk processing activities. The ICO Head of Privacy Innovation, Lynne Currie, stressed that a DPIA is "required by law – it's not a tick-box exercise". In fact, the ICO had already published a list of processing activities that require a prior DPIA, and this clearly included processing the biometric data of vulnerable individuals (such as children) and using innovative technology.
As it happened, the school was relying on assumed consent as its lawful basis for processing. It had sent a letter to parents and asked them to return an "opt out" slip if they did not want their child to participate. The ICO confirmed, however, that this was not adequate: consent for this processing should be "opt in" – freely given, specific, informed, unambiguous and involving an affirmative action. A DPIA conducted at the outset of the technology's implementation could have identified and managed these higher risks.
Building owners and operators considering using FRT will be keen to avoid such a high-profile reprimand from the ICO. Without the mitigating factors in Chelmer Valley's case, a UK GDPR fine could reach up to £8.7 million or (if higher) 2% of global annual turnover. Speak to us whenever you want to adopt such technology, for clear and practical advice on the requirements of the UK GDPR and the rigours of relying on consent as the lawful basis for processing. No one wants a detention.