Author

Debbie Heywood

Senior Counsel – Knowledge


19 July 2021

Radar - July 2021 – 3 of 3 Insights

Using live facial recognition technology in public places

What's the issue?

The use of live facial recognition technology is extremely controversial. Its use, both in law enforcement and for other purposes, is regulated and stands to become even more so in the EU, which has published a draft AI Regulation.

Facial recognition technology clearly involves the processing of personal and often special category data. As such, its use comes within the remit of data protection regulators. 

The new EC draft AI Regulation categorises the use of real-time biometric identification systems in publicly accessible spaces for law enforcement purposes as 'unacceptable risk', except when used for certain restricted purposes. However, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) are calling for an outright ban on facial recognition technology, whatever its purpose.

What's the development?

The ICO has published a Commissioner's Opinion on the use of live facial recognition (LFR). It focuses on the use of LFR in public places by private companies and public organisations, and builds on the earlier Opinion on the use of LFR by police forces, which covers use of LFR in law enforcement.

The ICO investigated or assessed 14 examples of LFR deployments and proposals and conducted wider research. The Opinion focuses on the use of LFR for the purposes of identification and categorisation. It does not address verification or other one-to-one uses. It defines public places as any physical space outside a domestic setting. It does not cover the online environment.

Following an analysis of applicable law, key risks and use cases, the ICO concludes that the key requirements for controllers are, at a broad level, that any use of personal data must be lawful, fair, necessary and proportionate. These requirements are magnified where biometric data is processed automatically and where there is a broader risk to the rights and freedoms of individuals.

What does this mean for you?

Essentially, this means there is a high bar to the lawful, automatic and indiscriminate use of LFR in public places. Of the 14 examples of LFR deployments and proposals analysed by the ICO, not one organisation was found to be fully compliant with data protection law in its use of LFR, and all either chose to stop or did not proceed with its use.

This does not mean that LFR can't be used.  It means that care needs to be taken and the guidance followed before it is deployed and while it, or the data it generates, is used.

Read more

The guidance states:

  • the controller must identify a specified, explicit and legitimate purpose for using LFR in a public place
  • the controller must identify a valid lawful basis and meet it
  • the controller must identify and meet the conditions for processing special category data and criminal offence data where required
  • the use of LFR must be necessary and should be a targeted and effective way to achieve the controller's purpose
  • the controller must consider alternative measures and demonstrate that they cannot reasonably use a less intrusive means of achieving their purpose
  • the use of LFR must be proportionate and the controller's purpose sufficiently important to justify any privacy intrusion
  • the LFR system should be technically effective, sufficiently statistically accurate, and address risk of bias and discrimination
  • the controller must be transparent
  • the controller must conduct a DPIA (there is an annex to the Opinion which goes into detail about how to conduct the DPIA and reach conclusions)
  • the controller's assessment must consider the impact of the use of LFR on the rights and freedoms of individuals
  • the controller must comply with the data protection principles and be accountable for their use of personal data.

When using LFR surveillance, controllers must:

  • ensure the use of watchlists complies with data protection law and meets the same requirements of lawfulness, fairness, necessity and proportionality
  • where there is collaboration with law enforcement, ensure roles and responsibilities (including controllership) are clear with appropriate governance and accountability measures. This is in addition to meeting other legal requirements.

When conducting a DPIA, which must be carried out before the processing begins, controllers:

  • should follow the guidance in the annex to the Opinion
  • must consult the ICO if their DPIA indicates that the use of LFR would result in a high risk which the controller cannot mitigate.

The ICO recommends that technology developers, LFR vendors and service providers, as well as the wider industry:

  • adopt data protection by design and default
  • take steps to address and reduce risks of bias and discrimination in LFR systems and the algorithms that power them
  • be transparent and consider adopting common standards
  • educate and advise controllers on how systems work and be transparent about their data protection obligations.

The ICO will be further investigating the use of LFR and will be primed to investigate complaints, potentially referring to the Opinion in its assessment of issues.

In this series

Technology, Media & Communications (TMC)

A big month for data exports

19 July 2021

By Debbie Heywood

Technology, Media & Communications (TMC)

DCMS guidance on online safety for businesses

19 July 2021

By Debbie Heywood

Technology, Media & Communications (TMC)

Using live facial recognition technology in public places

19 July 2021

By Debbie Heywood
