Implementing facial recognition technology? What businesses need to consider to comply with UK data privacy laws

Businesses are increasingly turning to facial recognition technology (FRT) in various sectors - from retail and transport to financial services - for purposes aimed at crime prevention, security, convenience, and operational efficiencies.
Sainsbury's recently announced that it will be trialling FRT in two of its stores as part of efforts to tackle retail crime. This follows several other retailers, including Home Bargains, Farmfoods, Sports Direct, B&M, Flannels, Spar and Iceland, which have adopted FRT for similar purposes.
However, the growing use of FRT in public-facing environments has attracted increased scrutiny from regulators, civil society and the media, and raises a number of legal and ethical questions, particularly from a privacy perspective.
Facial Recognition Technology
Facial recognition technology identifies a person from a digital facial image. Cameras are used to capture an image, and facial recognition software measures and analyses facial features to create a biometric template which is then compared against stored data to produce an identification. FRT typically involves the use of complex AI systems to enable the system to detect, analyse and match facial features with increasing accuracy over time.
Common uses of FRT, such as unlocking our mobile phones, typically involve a “one-to-one” process. This means that the individual participates directly and is aware of why and how their data is being used.
Live FRT in public spaces is different, and is typically deployed in a similar way to traditional CCTV. This means it is directed towards whole spaces rather than specific individuals, and captures data automatically and indiscriminately, often without individuals’ awareness or control. This means it has greater potential to be used in a privacy-intrusive way.
Use of FRT under UK data privacy laws
FRT generally involves the use of personal data and biometric data, which are regulated by the UK GDPR, the Data Protection Act 2018 (DPA 2018) and the recently enacted Data (Use and Access) Act 2025 (DUAA).
There needs to be a lawful basis for processing this data (for example, consent, legitimate interests, or processing which is necessary for performance of a contract or to comply with a legal obligation). The DUAA introduces a new ‘recognised legitimate interests’ lawful basis which allows organisations to process data for specific purposes such as crime prevention, safeguarding, and public security without conducting the usual balancing test. This may be relevant to commercial deployment of FRT once this provision comes into force (via secondary legislation expected in the coming year).
Biometric data is also likely to constitute “special category” data under the UK GDPR, which means organisations are required to identify both a lawful basis for processing under Article 6 and satisfy one of the additional conditions under Article 9 of the UK GDPR.
Businesses using FRT to detect or prevent crime are also likely to be processing criminal offence data (for example, when identifying suspected shoplifters), in which case additional conditions under Schedule 1 of the DPA 2018 must be met.
As well as identifying a lawful basis for processing, businesses should ensure that any use of FRT aligns with the core principles of data protection under Article 5 of the UK GDPR, namely:
- lawfulness, fairness and transparency;
- purpose limitation;
- data minimisation;
- accuracy;
- storage limitation;
- integrity and confidentiality (security); and
- accountability.
Other considerations
Beyond data protection concerns, the use of FRT presents broader legal and ethical risks that organisations should carefully consider:
Practical steps for compliance
Businesses thinking about using FRT should consider these key practical steps:
For advice on the proposed changes to UK data protection laws and other changes introduced by the DUAA, please contact Martin Cook, Madelin Sinclair McAusland, Amanda Leiu, or another member of our Commercial & Technology team.
This article was written by Harriette Alcock and Amanda Leiu.