Implementing facial recognition technology? What businesses need to consider to comply with UK data privacy laws

Businesses are increasingly turning to facial recognition technology (FRT) in a range of sectors - from retail and transport to financial services - for purposes such as crime prevention, security, convenience and operational efficiency.

Sainsbury's recently announced that it will be trialling FRT in two of its stores as part of efforts to tackle retail crime. This follows several other retailers, including Home Bargains, Farmfoods, Sports Direct, B&M, Flannels, Spar and Iceland, which have adopted FRT for similar purposes.

However, the growing use of FRT in public-facing environments has attracted increased scrutiny from regulators, civil society and the media, and raises a number of legal and ethical questions, particularly from a privacy perspective.

Facial Recognition Technology

Facial recognition technology identifies a person from a digital facial image. Cameras capture an image, and facial recognition software measures and analyses facial features to create a biometric template, which is then compared against stored data to produce an identification. FRT typically relies on complex AI systems to detect, analyse and match facial features with increasing accuracy over time.
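To make the matching step concrete, the sketch below shows a heavily simplified comparison between a stored biometric template and a template derived from a new capture. The 128-dimensional vectors, the cosine-similarity measure and the 0.6 threshold are illustrative assumptions only; real systems derive templates using deep neural networks and tune thresholds to their own accuracy requirements.

```python
import numpy as np

# Illustrative only: real FRT systems derive these vectors ("biometric
# templates") from a deep neural network applied to a face image.
# Random vectors stand in for them here.
rng = np.random.default_rng(0)
stored_template = rng.normal(size=128)  # template enrolled on file
probe_template = rng.normal(size=128)   # template from a new camera capture

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity score between two templates (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

THRESHOLD = 0.6  # assumed operating point; raising it reduces false matches
                 # at the cost of more missed matches

score = cosine_similarity(stored_template, probe_template)
print("match" if score >= THRESHOLD else "no match", f"(score={score:.2f})")
```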

Common uses of FRT, such as unlocking a mobile phone, typically involve a “one-to-one” process: the individual participates directly and is aware of why and how their data is being used.

Live FRT in public spaces is different. Typically deployed in a similar way to traditional CCTV, it is directed towards whole spaces rather than specific individuals and captures data automatically and indiscriminately, often without individuals’ awareness or control. As a result, it has greater potential to be used in a privacy-intrusive way.

Use of FRT under UK data privacy laws

FRT generally involves the processing of personal data and biometric data, which is regulated by the UK GDPR, the Data Protection Act 2018 (DPA 2018) and the recently enacted Data (Use and Access) Act 2025 (DUAA).

There needs to be a lawful basis for processing this data (for example, consent, legitimate interests, or processing which is necessary for performance of a contract or to comply with a legal obligation). The DUAA introduces a new ‘recognised legitimate interests’ lawful basis which allows organisations to process data for specific purposes such as crime prevention, safeguarding, and public security without conducting the usual balancing test. This may be relevant to commercial deployment of FRT once this provision comes into force (via secondary legislation expected in the coming year).

Biometric data is also likely to constitute “special category” data under the UK GDPR, which means organisations are required to identify both a lawful basis for processing under Article 6 and satisfy one of the additional conditions under Article 9 of the UK GDPR. 

Businesses using FRT to detect or prevent crime (for example, to identify suspected shoplifters) are also likely to be processing criminal offence data, in which case additional conditions under Schedule 1 of the DPA 2018 must be met.

As well as identifying a lawful basis for processing, businesses should ensure that any use of FRT aligns with the core principles of data protection, namely: 

  • Transparency: data subjects should be informed about how their data is being used in a concise and accessible manner, for example through privacy notices and signage.
  • Fairness: processing using FRT must be fair and should not create a risk of bias or discrimination.
  • Purpose limitation: data collected for a specific purpose (e.g. crime prevention) should not be used for other purposes incompatible with the original one.
  • Data minimisation: only data that is adequate, relevant and limited to what is necessary for the specified purpose should be collected.
  • Data retention: data should be stored for no longer than is necessary for the purpose for which it is processed; data collected by FRT systems should be retained only as long as required to make an identification, and deleted promptly where no match is made.
  • Security: appropriate and adequate security measures should be in place to protect data from unauthorised or unlawful processing and from accidental loss, destruction or damage.

Other considerations

Beyond data protection concerns, the use of FRT presents broader legal and ethical risks that organisations should carefully consider:

  • Misidentification: data from the Metropolitan Police suggests that although the overall risk of misidentification is low, the risk is much higher once someone has actually been flagged (see the worked example after this list). This can lead to disproportionate and potentially harmful consequences, especially where FRT is used to inform enforcement or exclusion decisions without adequate human oversight.
  • Risk of discrimination: FRT systems may perform less accurately for certain groups based on physical characteristics such as gender and ethnicity. These disparities can result in discriminatory outcomes, undermining the fairness of the technology and potentially putting organisations in breach of their obligations, including under equalities legislation. Public bodies may be subject to additional obligations, such as the public sector equality duty in the UK, which the use of FRT could put at risk of breach.
  • Human rights concerns: the Equality and Human Rights Commission has criticised the use of FRT by the Metropolitan Police, arguing that its policy on the use of FRT and the way it is being deployed breaches human rights law, risking the rights to privacy, freedom of assembly and freedom of expression. 
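A short worked example may help explain why the misidentification risk looks so different once someone has been flagged. The figures below are entirely hypothetical assumptions for illustration, not the Metropolitan Police's published statistics.

```python
# Hypothetical figures, for illustration only.
faces_scanned = 100_000          # people passing a live FRT camera
watchlist_present = 20           # of those, genuinely on the watchlist
false_positive_rate = 1 / 6_000  # wrong alerts per non-watchlist face scanned
true_positive_rate = 0.90        # share of watchlist members correctly flagged

true_alerts = watchlist_present * true_positive_rate                      # ~18
false_alerts = (faces_scanned - watchlist_present) * false_positive_rate  # ~16.7

print(f"Overall misidentification risk: {false_alerts / faces_scanned:.3%}")
print(f"Chance a given alert is wrong:  {false_alerts / (true_alerts + false_alerts):.0%}")
```

On these assumed numbers, fewer than 0.02% of people scanned are misidentified, yet roughly half of all alerts are wrong, because genuine watchlist matches are so rare. This is why human review of each flag matters before any enforcement or exclusion decision is taken.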

Practical steps for compliance

Businesses thinking about using FRT should consider these key practical steps:

  • Understand how the FRT system and underlying technology works, including its technical capabilities and limitations. Document this in writing, including, where applicable, the system's ability to distinguish true from false matches and its false positive and false negative rates (see the illustrative sketch after this list).
  • Carry out a data protection impact assessment (DPIA) prior to any live deployment of FRT to identify an appropriate lawful basis and condition and to assess the risks.
  • Document the justification for using FRT, and the decision-making behind it, if not covered in the DPIA.
  • Clearly inform individuals about the use of FRT, why it is being used and how it works, and ensure that individuals can access relevant privacy information.
  • Display clear signage in all locations where FRT is used.
  • Think about fairness and bias - take steps to understand and mitigate the risk of bias and discrimination in the use of FRT, including ensuring a sufficient volume and variety of training data to support accurate performance.
  • Ensure you have appropriate policies in place covering how you will use the technology, together with suitable governance measures.
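Where system performance is documented as suggested above, false positive and false negative rates are conventionally computed from test counts against ground truth. The sketch below uses entirely hypothetical evaluation figures, not any vendor's published data.

```python
# Hypothetical evaluation counts against ground truth; illustrative
# assumptions only.
true_positives = 180     # watchlist faces correctly flagged
false_negatives = 20     # watchlist faces missed
false_positives = 15     # non-watchlist faces wrongly flagged
true_negatives = 99_785  # non-watchlist faces correctly ignored

false_positive_rate = false_positives / (false_positives + true_negatives)
false_negative_rate = false_negatives / (false_negatives + true_positives)

print(f"False positive rate: {false_positive_rate:.4%}")  # wrongly flagged
print(f"False negative rate: {false_negative_rate:.1%}")  # missed matches
```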

For advice on the proposed changes to UK data protection laws and other changes introduced by the DUAA, please contact Martin Cook, Madelin Sinclair McAusland, Amanda Leiu, or another member of our Commercial & Technology team.

This article was written by Harriette Alcock and Amanda Leiu.