Live Facial Recognition
The Home Secretary defends the rollout of facial recognition to all police forces – how are the courts dealing with this?

The Evolution of Digital Evidence in UK Courts
In the past year (2024–2025), UK legal cases involving video analysis have focused primarily on the admissibility of conventionally acquired footage. More recently, however, courts have been asked to examine other kinds of video-derived imagery: AI-generated or manipulated evidence, the legality of live facial recognition (LFR) by police, and the authentication of digital footage.
The legal landscape is having to rapidly adapt to "deepfake" technology and the increased use of automated surveillance.
Understanding Live Facial Recognition Technology
Live facial recognition technology captures and analyses the faces of individuals passing in front of CCTV cameras in real time. It extracts unique biometric data from each face and compares it against a "watchlist" of people sought by the police, which often contains thousands of individuals.
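In simplified terms, the comparison stage works by scoring each captured face against every watchlist entry and raising an alert when the score clears a threshold. The sketch below is a hypothetical illustration only: the toy embedding vectors, the cosine-similarity measure and the alert threshold are assumptions made for the example, not details of any deployed police system.

```python
import math

# Assumed value for illustration; real systems tune this threshold,
# and where it is set drives the false-alert rate discussed below.
ALERT_THRESHOLD = 0.92

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def check_against_watchlist(probe, watchlist):
    """Compare one captured face embedding against every watchlist entry.

    Returns (name, score) for the best match at or above the alert
    threshold, or None if no entry scores highly enough.
    """
    best = None
    for name, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score >= ALERT_THRESHOLD and (best is None or score > best[1]):
            best = (name, score)
    return best

# Toy three-dimensional "embeddings" for illustration; real face
# embeddings are produced by a recognition model and are much longer.
watchlist = {"subject_A": [0.9, 0.1, 0.4], "subject_B": [0.1, 0.8, 0.6]}
probe = [0.88, 0.12, 0.42]
print(check_against_watchlist(probe, watchlist))
```

The key design point for the legal questions in this article is the threshold: set it lower and more passers-by trigger alerts (more false positives); set it higher and genuine matches are missed. Accuracy claims about LFR are, in effect, claims about where that line is drawn.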
Legal Challenges and Court Cases
The Metropolitan Police's use of the technology to police major events such as the Notting Hill Carnival sparked challenges in the High Court by the Equality and Human Rights Commission (EHRC). A judicial review, R (Thompson and Carlo) v The Commissioner of Police of the Metropolis, was brought over a wrongful stop in February 2024, raising human rights and data protection issues, with interventions arguing that the "regulatory wild west" of LFR requires a stronger legal framework.
The EHRC claims that LFR triggers disproportionately high alert rates for Black men when compared with the wider London population.
The latest news that the Home Office has embraced the technology and intends to roll it out to additional police forces indicates that this system is moving into widespread operational use before concerns about accuracy, proportionality and potential discrimination have been fully resolved. As automated surveillance becomes more common in policing, the likelihood of misidentification and unreliable alerts increases.
AI-Generated Evidence and Deepfakes
In another matter relating to imagery, a high-profile case of image tampering was appealed before the High Court of Justiciary in His Majesty's Advocate v LM [2025] HCJAC 3. The court considered evidence in which a "paedophile hunter" group used AI to edit a photograph in order to make an adult decoy appear younger. The altered image was then presented in criminal proceedings. The case highlights the increasingly concerning use of AI to create evidential material rather than simply analyse it, and illustrates how digital manipulation poses serious challenges to the integrity and reliability of court evidence.
The Information Commissioner's Office (ICO) has pursued litigation against Clearview AI Inc concerning the mass scraping of facial images and the question of whether such activity falls within the extraterritorial scope of the UK GDPR. The ICO's guidance also warns that AI tools are capable of producing convincing fake material, often referred to as deepfakes.
The "Liar's Dividend" Phenomenon
Judicial guidance increasingly warns that AI tools can produce highly convincing fabricated material ("deepfakes"), meaning courts must take additional steps to verify the authenticity of any digital evidence. This growing awareness has contributed to a surge in applications seeking expert verification of video content. Courts are now navigating the "liar's dividend", a phenomenon where the mere existence of deepfake technology enables parties to cast doubt on genuine video evidence, even when it is authentic.
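One routine verification step is cryptographic hashing: if a file's hash today matches the hash recorded when the exhibit was first acquired, the file is bit-for-bit unchanged, which rebuts a "liar's dividend" claim that the footage was altered after the fact. Below is a minimal sketch of that check, assuming a SHA-256 digest was recorded at the point of acquisition; the function names are illustrative, not any court-mandated procedure.

```python
import hashlib

def sha256_of_file(path, chunk_size=65536):
    """Return the SHA-256 hex digest of a file, read in chunks so that
    large video exhibits do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_acquisition_hash(path, recorded_hash):
    """True only if the file is bit-for-bit identical to the exhibit
    whose hash was recorded when the footage was first seized."""
    return sha256_of_file(path) == recorded_hash.lower()
```

A matching hash proves only that the file has not changed since the hash was taken; it says nothing about whether the content was genuine at that point, which is why hashing complements, rather than replaces, expert content analysis.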
The Need for Expert Verification
With LFR systems expanding across the UK, the need for expert witness services and authoritative digital evidence verification is becoming essential. By instructing Forensic Video Services, investigators and courts can ensure that any video or imagery, whether produced by automated systems or conventional CCTV, has been professionally examined and accurately interpreted.
Forensic Video Services Specialisms
This article has shown why there is a clear need for independent companies such as Forensic Video Services (FVS) that are capable of providing accurate and unbiased assessment of digital evidence. Our specialisms include:
- forensic video analysis
- CCTV enhancement
- forensic video enhancement services that ensure evidential material is presented in its true form and supported by a clearly defined forensic evidence chain.
As a forensic imaging laboratory, we routinely examine factors that can influence identification outcomes, including compression artefacts, poor lighting, pixel distortion and metadata anomalies. Our facial mapping experts, vehicle identification analysis specialists, and tampering and manipulation detection services provide the level of clarity and technical validation required in court. By applying scientific methods and maintaining strict evidential standards, we help legal professionals, investigators and courts reach reliable conclusions based on authentic and properly interpreted digital evidence.
Contact Forensic Video Services
For expert witness services or digital evidence verification, please contact our laboratory directly.
About the Author
The Forensic Video Services team consists of qualified forensic video analysts with extensive experience in digital evidence verification, LFR analysis, and AI-generated content detection. Our experts have testified in numerous cases involving automated surveillance systems and are committed to maintaining the highest standards of forensic practice. With over 25 years of experience, we are the UK's leading specialists in forensic video analysis, imagery analysis, and digital evidence authentication.