Obviously there isn't a simple answer to this question but, broadly, facial recognition technology (FRT) is biometric software. It identifies a person by taking a digital image and mapping their features mathematically. The map is saved as a 'fingerprint'. This does not in itself identify an individual; the next step is for deep learning algorithms to compare the fingerprint against others in order to make a match (or rule one out). Alternatively, it may be used to categorise the individual on the basis of, for example, age, gender, ethnicity or behaviour.
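The matching step can be pictured as comparing numeric feature vectors. The sketch below is purely illustrative and not drawn from any particular FRT product: the vectors, the similarity measure and the threshold are all assumptions. It scores two hypothetical 'fingerprints' by cosine similarity and treats scores above a threshold as a match.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two feature vectors ('fingerprints').
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def is_match(stored_fingerprint, candidate, threshold=0.9):
    # Illustrative threshold; real systems tune this to balance
    # false matches against false non-matches.
    return cosine_similarity(stored_fingerprint, candidate) >= threshold

# Two hypothetical fingerprints from the same face should score highly...
print(is_match([0.9, 0.1, 0.4], [0.88, 0.12, 0.41]))  # True
# ...while unrelated faces should not.
print(is_match([0.9, 0.1, 0.4], [0.1, 0.95, 0.2]))    # False
```

Categorisation works in a similar way, except that the fingerprint is compared against learned category models rather than against other individuals' fingerprints.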
Some uses of FRT are still some way off, but current potential commercial uses focus on a number of areas, including:
Security is not just the concern of the law enforcement and intelligence services (whose use of FRT is outside the scope of this article). Private organisations, for example, entertainment venues, retail spaces and business premises owners, are also keen to use FRT to help detect and deter crime. A step on from CCTV, FRT involves pro-active rather than retro-active identification. On one level, this can be seen as creating 'Minority Report' levels of intrusion, where people may be singled out as potential rather than actual criminals. On the other hand, FRT may actually store fewer images than CCTV footage, as images which are not tagged as potential risks may be swiftly deleted.
FRT presents a huge opportunity for advertisers to target not only the placement of ads but their content and frequency. For retailers, technology which recognises an individual as having a particular interest in a certain type of product or as being in a particular age category, can help them sell more effectively and build ever more detailed profiles of their customers and potential customers.
Social media apps and sites like Facebook and Instagram and photo storage apps like iPhoto and Google Photos, store huge numbers of images of individuals together with a large amount of associated metadata (including geographical location and time of photo). Facebook has come under regulator scrutiny as a result of its photo tagging technology. The Article 29 Working Party (now the European Data Protection Board) wrote to Facebook in April 2018, asking for clarification on the consent procedures and privacy settings around the use of facial recognition to support its tag suggestion, photo review and profile photo review functions.
In some cases, for example, in financial services and on Apple's latest smartphone, FRT is used as a means of authentication, a potentially more secure and less labour intensive (and impossible to forget) form of password.
The GDPR came into effect on 25 May 2018, introducing new obligations around the processing of personal data and enhanced rights for individuals. The GDPR definition of personal data is extremely wide, covering any information relating to an identified or identifiable natural person who can be identified directly or indirectly. This essentially means that any data which identifies an individual or which can be combined with other data to identify an individual counts as personal data. In FRT, the digital images, the fingerprint and any outputs (like reports and profiles), will all be personal data.
Much FRT-generated data will also be sensitive or special data to which more stringent rules apply. Biometric data for the purpose of uniquely identifying a natural person, and data revealing racial or ethnic origin, fall into this category. Processing this kind of data is prohibited unless specified conditions are met.
Not all FRT data will be personal data. The technology relies heavily on getting the right camera angle and inevitably many of the images it captures will not identify individuals even when combined with other data. That does not, however, alter the fact that a great deal of sensitive personal data will be captured.
It's also worth noting the geographical reach of the GDPR. It covers the processing of personal data in the context of the activities of an establishment of a controller or processor in the EU, and the processing of personal data of individuals in the EU, where the controller or processor is not in the EU but is offering goods or services in the EU, or is monitoring the behaviour of individuals in the EU.
The GDPR has an all-pervasive influence on use of FRT where it applies and raises a number of particular issues.
Where the GDPR applies you need a lawful basis to process personal data for each purpose for which you process it. There are a number of lawful bases which may be available (set out in Article 6 GDPR) including consent (more on that later), legitimate interests which are not outweighed by the rights of individuals, and processing necessary for performance of a contract or necessary to comply with a legal obligation.
In addition, where the data is special data, the processing also needs to fall within an exception under Article 9(2) GDPR. If none of the Article 9 exceptions apply, you cannot process the special data at all. For commercial FRT data, the exception most likely to be available under the GDPR is the explicit consent of the individual (although other criteria may be available in limited circumstances depending on what the FRT is being used for). Note that according to the UK ICO's guidance on GDPR, your Article 6 basis does not have to match your Article 9 exception.
The Data Protection Act 2018 (DPA) makes further provision around the exceptions which allow the processing of special data in the UK. In addition to explicit consent, you may be able to rely on processing for fraud prevention, preventing or detecting unlawful acts, or for research. If you seek to rely on a DPA exemption, you are also likely to require an "appropriate policy document", in addition to a GDPR record of processing. This will need to set out the grounds for the processing and detail protective measures.
Consent is one of the thorniest issues in the GDPR and, given the enhanced requirements, we are going to take a detailed look at how the changes may impact the use of FRT. Most businesses should, by now, be aware that consent is no longer the 'catch all' it once was and is now one of the harder lawful bases on which to rely. There is no definition of 'explicit consent' in the GDPR but "consent" is a defined term. Consent must be freely given, specific, informed and an unambiguous indication of the individual's wishes, involving a clear affirmative action.
In addition, consent will not be considered as freely given where there is a clear imbalance in power between the individual and the data controller, nor where it is a condition of providing services.
Let's put this in the context of FRT used to deliver targeted advertising in a shopping centre. In order for consent to be "informed", the customer needs to be given certain information prior to the processing taking place. This means the customer will need to be told, before entering the shopping centre, what is going to happen to their data. We are all familiar with CCTV signs, but FRT is potentially more sensitive and so the customer would need to be told more than simply the fact that the technology will be used (see transparency below). Even if the information were delivered in the right way at the right time, can the consent be said to be "freely given" if the customer is unable to enter the shopping centre without consenting? That falls squarely into the remit of making receiving services conditional on consent, so any consent given under these kinds of circumstances will not be valid under the GDPR. Add to that the requirement for explicit consent to the processing of special data, the requirement to record consent and to target it appropriately to the relevant individual, and the difficulties become apparent.
In the context of 'single person' authentication on, say, a smartphone, explicit consent becomes easier to achieve. This is because the image being scanned is only being matched against the fingerprint of that individual. It is easy to explain what is happening to the data (provided it is only being used for authentication purposes) and easy to capture consent. Taking the customer through a few screens on a smartphone and then asking them to indicate consent by ticking a box, and preferably backing it up with an email link to click on, would be one way to do this.
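Since the GDPR requires controllers to be able to demonstrate that valid consent was given, a practical aspect of this flow is keeping a record of what was consented to, when and how. A minimal sketch of such a record follows; the field names and structure are assumptions for illustration, not a compliance template.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    # Illustrative fields only; a real record should be designed with legal advice.
    user_id: str
    purpose: str        # what the data will be used for
    method: str         # how consent was indicated
    confirmed_by_email: bool
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

record = ConsentRecord(
    user_id="user-123",
    purpose="facial recognition for device authentication only",
    method="in-app checkbox after information screens",
    confirmed_by_email=True,
)
```

Recording the specific purpose matters because consent under the GDPR must be specific: consent captured for authentication would not cover a later, different use of the same fingerprint.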
Those seeking to rely on consent, either for the purposes of Article 6 or Article 9, should look at guidance published by the UK's ICO and by the Article 29 Working Party.
One of the foundations of the GDPR is that personal data must be processed "lawfully, fairly and in a transparent manner". Data controllers must not only process personal data in this way, they must be able to demonstrate compliance.
Article 12 GDPR sets out the information requirements on data controllers. Further guidance on transparency has been published by the Article 29 Working Party.
Information has to be given to data subjects in a concise, transparent, intelligible and easily accessible form, using clear and plain language. This needs to be tailored to the individual receiving the information. Again, how easy it is to meet the requirements will depend on the circumstances – reasonably straightforward in relation to authentication technology, but much harder at an entertainment venue where there will be people of different ages (including children) and possibly non-English speakers.
One of the great advantages to those using FRT is the profiling it allows them to do, whether this is for advertising purposes or simply to improve their products and services. The GDPR has plenty to say on the subject.
Article 21 gives individuals the right to object to profiling done on the lawful basis of it being in the data controller's legitimate interests to carry it out. Data subjects also have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects or similarly significant effects concerning them.
In addition, data subjects have a blanket right to object to their personal data being used for direct marketing (which may be targeted as a result of FRT profiling).
If you do think you can fulfil the lawful basis and special data requirements there are a number of steps you can take to minimise risk both to the business and to the individuals whose data is being processed.
Data Protection Impact Assessments (DPIAs) must be carried out prior to beginning high risk processing. The GDPR explicitly states that this applies to processing using new technologies where the processing is likely to result in a high risk to the rights and freedoms of individuals and it is safe to say that FRT will fall within this obligation.
The point of a DPIA is to assess the privacy risk associated with the planned processing and mitigate risk in a structured and recorded format as set out in the GDPR. Where the DPIA indicates that the processing would result in a high risk to individuals without mitigating measures, the data controller must consult their regulator before carrying out the processing. The regulator must reply within eight weeks (which may be extended by a further six in complex situations), with advice about whether the planned processing would infringe the GDPR and how to mitigate the risk. Where the regulator does not think the risk can be mitigated, it can effectively prevent the processing from taking place, or use further enforcement powers if the processing goes ahead contrary to its recommendations.
The more sensitive the data being processed and the purposes for which it is being processed, the more important it is to focus on security. The regulators will surely expect anyone deploying a new technology like FRT to have the very highest levels of security.
Anonymising as much data as possible is one way to minimise risk. It won't be possible to avoid processing personal data altogether but depending on what it is being used for, it may be possible to anonymise much of it, for example, reports derived from FRT data which use categories of information rather than personal information.
For data to be anonymised, it must be impossible to re-identify it, and this may not be workable in practice. An alternative would be to pseudonymise the data. Pseudonymisation is defined in the GDPR as the processing of personal data in such a manner that it can no longer be attributed to a specific data subject without the use of additional information, provided the additional information is kept separately and protective measures are taken to make sure the personal data is not attributed to an identifiable individual. Pseudonymised data is still personal data under the GDPR but its processing is less likely to result in risks to the rights and freedoms of individuals.
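As a rough sketch of the idea (the record layout here is hypothetical): replace the direct identifier with a random token, and hold the token-to-identity map separately, under its own access controls.

```python
import secrets

def pseudonymise(records, id_field="name"):
    # Replace the direct identifier with a random token. The key map
    # must be stored separately, with its own protective measures,
    # for the result to count as pseudonymisation under the GDPR.
    key_map = {}
    pseudonymised = []
    for record in records:
        token = secrets.token_hex(8)
        key_map[token] = record[id_field]
        copy = dict(record)
        copy[id_field] = token
        pseudonymised.append(copy)
    return pseudonymised, key_map

data, keys = pseudonymise([{"name": "Alice", "age_band": "25-34"}])
# 'data' no longer contains "Alice"; re-identification requires 'keys'.
```

Because the key map still allows re-identification, both halves remain personal data; the benefit lies in reducing the risk if the pseudonymised half alone is compromised.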
Another way of minimising risk is to have strict data retention procedures in place. It is a requirement under the GDPR that personal data be retained for no longer than is necessary in relation to the purpose for which it is processed. With some uses of FRT, for example, where it is used to build up category profiles (such as age groups, ethnicity), the personal element of the data may only be needed very briefly; the value to the business using the FRT lies not in identifying particular individuals but in the data derived on classes of individuals. In these cases, the personal data (the fingerprint) can be deleted very quickly.
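The pattern described above – derive the category data the business needs, then promptly delete the biometric element – can be sketched as follows (the field names are illustrative assumptions):

```python
from collections import Counter

def derive_and_purge(detections):
    # Aggregate the non-identifying category data the business wants...
    counts = Counter(d["age_band"] for d in detections)
    # ...then delete the biometric 'fingerprint' as soon as it has served
    # its purpose, in line with the storage limitation principle.
    for d in detections:
        d.pop("fingerprint", None)
    return counts

detections = [
    {"fingerprint": [0.2, 0.7], "age_band": "18-24"},
    {"fingerprint": [0.5, 0.1], "age_band": "18-24"},
    {"fingerprint": [0.9, 0.3], "age_band": "35-44"},
]
counts = derive_and_purge(detections)
```

After the purge, only the aggregate counts survive; the business keeps the value of the data without retaining the personal element.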
Accountability is a core concept of the GDPR. You not only have to comply, you have to be seen to comply. Much has been made of the greatly enhanced enforcement powers regulators now have. Our view, however, is that a business which understands the GDPR and is doing its best to comply, is unlikely to incur the harshest levels of penalties. Being able to demonstrate your GDPR journey will be vitally important in keeping the regulators 'on side'.
There can be no doubt that the GDPR introduces new hoops for the data controller of commercial FRT personal data to jump through. In some circumstances, the law will restrict what the technology is capable of doing. This may be frustrating for businesses but it is surely a good thing for our personal privacy.
Those wishing to use FRT must be careful to comply with all relevant aspects of the GDPR and to minimise risk, including by carrying out a DPIA and having appropriate security and retention procedures in place. There is a lot to consider and legal advice should be taken.
To find out more about the GDPR, visit our Global Data Hub, which has a wide range of checklists, articles, webinars and compliance tools to help you navigate data privacy issues.
If you have any questions on this article please contact us.