18 September 2019
Facial recognition technology (FRT) is developing rapidly, although it remains imperfect. Increasingly, governments, public authorities and private companies are looking to use it, particularly for security and law enforcement, but also potentially for research and marketing purposes. While the applications of the technology are many, there are restrictions on its use under a framework of legislation and guidelines, the usefulness of which is beginning to be tested in the courts as the debate around the impact of FRT on privacy continues.
The Divisional Court has turned down an application for judicial review in R (Bridges) v Chief Constable of South Wales Police and Others [2019] EWHC 2341 (Admin). The application was brought by Mr Bridges in relation to the use by South Wales Police of 'AFR Locate', a pilot project which used Automated Facial Recognition (AFR) technology to capture digital images of the public and compare the biometric data from those images against a database of 'persons of interest'. If there was no match, the data would be immediately deleted, with the source CCTV footage retained for 31 days as permitted.
The Court held that while the use of the technology interfered with the Claimant's Article 8(1) right to privacy under the European Convention on Human Rights, that interference was justified, and that South Wales Police had complied with both the then-applicable and the current data protection law in its use of the technology.
This decision involved a public authority so it does not transfer directly to commercial use of FRT. There are, however, a number of useful points which are relevant to wider uses of FRT:
Following the judgment, the ICO said it welcomed the Court's conclusion that the use of LFR systems involves the processing of sensitive personal data and must comply with the Data Protection Act 2018 (DPA18). The ICO has finished its investigation into the first police pilots of LFR and will consider the Court's judgment when finalising its recommendations and guidance. In the meantime: "any police forces or private organisations using these systems should be aware that existing data protection law and guidance still apply". The ICO may also take the Court's reasoning into account when conducting its recently announced review into the use of LFR in the King's Cross area.
While this judgment involves the use of LFR by a public authority for law enforcement purposes, it is relevant to the use of FRT more widely in its consideration of what constitutes personal and sensitive personal data, its assessment of the application of the data protection principles when using FRT, the use of DPIAs and the relevance of retention periods. We can expect much more on this issue for businesses as well as in relation to the public sector and law enforcement. Additional policies and guidance from the ICO will be an essential aid to compliance.
Mr Bridges' application, which was supported by Liberty and others, was made on the grounds that the use of the AFR technology:
There was also a claim under the Equality Act 2010 which is beyond the scope of this article.
The Court concluded that Article 8(1) ECHR was engaged. AFR is significantly intrusive and involves processing of biometric data which is "intrinsically private". The use of the Claimant's biometric data went beyond the expected and unsurprising (a test set out in S v United Kingdom). The fact that the biometric data was retained for a very short period unless there was a match was not relevant; even momentary processing of the data would be sufficient. Accordingly, the use of AFR Locate did interfere with the Claimant's Article 8 rights.
The Court then considered whether the use of AFR was in accordance with the law under Article 8(2) ECHR and concluded that it was because:
The Court went on to conclude that the use of AFR Locate by the police struck a fair balance and was not disproportionate (using the test in Bank Mellat). In particular, AFR Locate was used:
In addition, any interference with Article 8(1) rights would have been limited owing to the prompt deletion of biometric data and its use in accordance with granular data retention periods set out in the associated DPIA.
The Court first considered whether the use of AFR Locate involved processing personal data as defined in the DPA98. It concluded that it was not personal data under the 'indirect identification' test in Breyer, but that it did involve the processing of personal data using the test discussed in Vidal-Hall – in other words, that the person was sufficiently identified by the processing as to be 'individuated'.
Given that personal data was processed, it had to be processed in accordance with the first data protection principle that personal data be processed fairly and lawfully. The Court held that the processing did comply with that principle and that it was in the legitimate interests of the police to process it taking into account the common law obligation to prevent and detect crime.
The Court found that the use of AFR involves the processing of biometric data which means it is sensitive processing within the meaning of s35(8) DPA18. The processing in this case could be justified on the basis that it was strictly necessary for law enforcement purposes and for the common law duty of preventing and detecting crime and was, therefore, in the public interest. The police also satisfied the s35(5) requirement of having an "appropriate policy" in place which satisfied the s42(2) requirements. While the Court commented that the document was brief and lacking in detail, it did not give a view on the adequacy of the policy.
The Court also concluded that the police had satisfied the requirement under s64 DPA18 to carry out a DPIA. Interestingly, the Court said it was not for it to determine whether or not the assessment met the requirements of s64 where the data controller had exercised reasonable judgment based on reasonable enquiry and consideration: "When conscientious assessment has been brought to bear, any attempt by a court to second-guess that assessment will overstep the mark".