We have seen the impact of the cost-of-living crisis on our wallets, but have we also considered the impact it is having on our privacy?
Consider the retail sector. One consequence of higher prices has been a rise in retail theft and crime. The British Retail Consortium (BRC) published research last month revealing that levels of shoplifting and retail theft across ten of the UK's largest cities have risen by 27%, with the increase in some cities as high as 68%. The BRC Crime Survey, covering 1 April 2021 to 31 March 2022, separately reported more than 850 incidents of violence and abuse against UK retail workers.
The retail industry has called for greater police prioritisation of retail crime across the UK. A letter to the Home Secretary, Suella Braverman, signed by 91 retail leaders on 29 September 2023, highlighted the police's own records for one major retailer, showing that the police had failed to respond to 73% of the serious retail crimes reported to them. The BRC's own survey reported 44% of retailers rating the police response as 'poor' or 'very poor'. The absence of a reliable police response also appears to have emboldened some individuals, and recent press reports describe organised gangs targeting and looting retailers without apparent fear of arrest.
Retailers concerned to protect their staff and their stores find themselves turning to technology solutions to help with these difficulties, including the use of facial recognition camera technology.
Facial Recognition Technology (FRT) allows a person to be recognised from a biometric template created from a digital facial image. FRT can be used in beneficial ways that provide a helpful layer of identity security, such as when unlocking our phones or accessing our online bank accounts. In these scenarios we are aware of the functionality, and it is our choice, and under our control, whether we allow our image to be used to create a biometric template for matching purposes.
However, CCTV camera systems incorporating live FRT in a retail context do not typically operate on the basis of prior knowledge, choice or control. The cameras scan the faces of passers-by, creating a digital template and comparing it, within a fraction of a second, against images on a watch list of, for example, known shoplifters.
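To make that matching step concrete, the sketch below shows, in Python, how a live FRT system might compare a freshly computed facial template (an embedding vector) against stored watch-list templates. It is purely illustrative: every name is hypothetical, the embedding model that turns a camera frame into a template is assumed rather than shown, and a real deployment would also involve face detection, quality checks and alerting logic.

```python
from __future__ import annotations

import numpy as np

rng = np.random.default_rng(seed=0)

# A watch list maps an identifier to a stored biometric template: a
# fixed-length embedding vector produced by a face recognition model.
# These random vectors are placeholders for illustration only.
WATCHLIST: dict[str, np.ndarray] = {
    "subject-001": rng.standard_normal(128),
    "subject-002": rng.standard_normal(128),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two templates in [-1, 1]; higher means more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray, threshold: float = 0.6) -> str | None:
    """Return the watch-list identifier whose template best matches the
    probe above the threshold, or None if no template matches.

    The threshold trades false matches against missed matches; its choice,
    and the fairness of the underlying embedding model across demographic
    groups, are precisely the accuracy concerns raised in this article.
    """
    best_id, best_score = None, threshold
    for subject_id, template in WATCHLIST.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = subject_id, score
    return best_id

# In a live system the probe would be computed from a camera frame by a
# face-embedding model; here we simulate one with another random vector.
probe_template = rng.standard_normal(128)
print(match_against_watchlist(probe_template))  # prints None: no match
```

The point the sketch makes is that a passer-by is enrolled in this comparison automatically, with no step at which consent is sought; the outcome turns entirely on a threshold and a model the individual never sees.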
Some people may be prepared to accept a level of surveillance to help prevent and detect crime. But without clear controls over how our biometric data is collected and used, over the accuracy and fairness of the algorithms forming part of the matching process, and over the circumstances in which a person ends up on a watch list, the impact on our personal privacy will be lasting and severe.
It is worth noting here that the UK has no law or regulations specific to the use of FRT. The current government has resisted calls to provide legislative clarity and controls; rather, its policy has been to regulate the use of FRT within the existing regulatory framework. In practice this means that general UK data protection law is the main law of relevance, although public bodies considering such matters (such as the UK data protection regulator, the Information Commissioner's Office (ICO), and the courts) must also interpret and have due regard to the objectives of Human Rights and Equality legislation.
The UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA) regulate the processing of data relating to an identified or identifiable living individual, whether the person is identified directly or indirectly ('personal data'). In addition, certain data is classified as special category data, for which further protections and lawful justifications for processing are required. Biometric data is treated as special category data under the UK GDPR, which defines it as:
"personal data resulting from specific technical processing relating to the physical, physiological characteristics of a natural person, which allow or confirm the unique identification of that natural person such as facial images or dactyloscopic [fingerprint] data".
The ICO has previously expressed significant concern about the use of FRT in public space settings, which it sees as presenting a higher scale of risk to individuals, particularly because of the lack of choice or control individuals have over such surveillance. In a 2021 blog post, published on the same day as the ICO's Opinion on the use of FRT in public spaces, the ICO raised deep concerns about the potential for misuse of FRT, stating:
"We should be able to take our children to a leisure complex, visit a shopping centre or tour a city to see the sights without having our biometric data collected and analysed with every step we take."
In particular, the ICO flagged the lack of clear grounds under UK data protection law permitting the lawful use of FRT by private and commercial organisations to monitor spaces accessible to the public. This is separate from the use of FRT by the police: notwithstanding the Court of Appeal's recognition that the technology infringed individuals' privacy, certain lawful and public interest grounds were considered to permit limited, appropriate and responsible use of the technology by law enforcement.
Early pre-pandemic adopters of FRT within the private retail and property sectors found themselves subject to serious ICO investigation and real threats of enforcement. Unsurprisingly, those under investigation chose to stop trials or halt FRT roll-outs in the face of critical ICO scrutiny. Fast forward a few years, however, and the messaging from the ICO seems to have shifted. Earlier this year the ICO concluded an investigation into FRT provided to the retail sector by a security company. In a blog post referencing that investigation, published in March 2023, the ICO's Deputy Commissioner for Regulatory Supervision appeared to take a more nuanced approach, stating:
"Innovative solutions helping business prevent crime is in the public interest and a benefit to society. Data Protection law recognises this, allowing personal information – in this case facial images – to be used if there is a legitimate interest, such as for the detection and prevention of crime. However, these benefits must be balanced against the privacy rights of the individual".
The ICO has recently consulted on the first phase of draft biometric data guidance and will consult again early next year on biometric classification and data protection. The phase 1 draft guidance assumes that in most cases explicit consent is likely to be the only valid basis for the lawful processing of special category biometric data. It recognises, however, that explicit consent may not be required where the use of biometric data is "necessary" for crime prevention or detection purposes and for reasons of substantial public interest, provided the use is targeted and proportionate to the specific purposes and those purposes cannot be achieved in a less intrusive way. It is worth noting that none of the compliance case studies in the draft guidance addresses the use of FRT systems by a retailer, with the result that the high-level guidance raises more questions than it answers.
In October 2023, the ICO published a blog post on how data protection law can help retailers tackle shoplifting. This focuses less on the use of FRT and more on what information can be shared, and with whom, in the context of necessary and proportionate data processing to detect and prevent crime. The blog post coincided with the government's launch of its action plan to tackle shoplifting, but that plan, too, focuses on CCTV use rather than the use of FRT.
So, what should a retailer thinking of introducing FRT cameras do? Well, tread very carefully. Despite a possible shift in the ICO's regulatory positioning, the messaging appears mixed, and any legal argument justifying such an intrusion into the privacy of the public looks very thin in the absence of robust technical and organisational safeguards. It remains the case that a Data Protection Impact Assessment (DPIA) and an appropriate data policy will be essential first steps in any proposed use of FRT. In addition, unless and until the UK Data Protection and Digital Information (No. 2) Bill is finalised (potentially removing the current obligation to consult the ICO on certain high-risk processing DPIAs), it remains a legal requirement to consult the Information Commissioner prior to the introduction of any such technology in retail spaces accessed by the public.