Published Date: 8/5/2025
Fans of police facial recognition will know Notting Hill. Not for the timeless romance of Anna and Will, but for the rather less exotic tale of Live Facial Recognition (LFR) in UK policing. It was 2016 when LFR was first trialled in Notting Hill: the technology was patchy, the law untested, and the public largely unaware that there was even a policy fence to be sat upon.
The technology has improved dramatically since then, and the police have put their algorithm through independent testing by the National Physical Laboratory. But policing globally is still haunted by LFR’s shaky start and the statistics from those early days are wheeled out as frequently as George Orwell. As I’ve said before, in AI terms that was the Pleistocene era; challenging police chiefs with data that old is jousting with fossils. When they deploy LFR again this year, the Metropolitan Police will have more reason to trust the technology.
Things have also matured legally. The law – which the Court of Appeal in the Bridges case confirmed includes policy – has developed, though many say not far enough for clarity and certainty. The government and regulators have taken a pragmatic approach to what remains a highly controversial technology. Artificial Intelligence (AI) is transforming our lives, and the impact of tools like facial recognition will ultimately depend on balancing three things: the technological (what can be done), the legal (what must/must not be done), and the societal (what people will support/expect to be done). The third element is the bit that’s often missed: what people support their police doing with AI-enabled capabilities and what they worry about them doing with it. With a democratic policing model that relies on consent, this is a key part of the picture.
The original LFR trial at the Carnival generated enough concern for the Mayor’s Office to assure Londoners it wouldn’t be back the following year. This was a time of tech-exuberance by law enforcement around the world: internet scraping, algorithms designed for earthquakes repurposed to predict criminality, and other actuarial policing ideas more at home in Diagon Alley than on the streets of capital cities. Clumsy experimentation has haunted the adoption of facial recognition technology (FRT) in its several forms – live, retrospective, and officer-initiated – ever since; law enforcement bodies are still hitting resistance to its adoption, which is becoming a proxy measure of public trust generally. Building ever-smarter biometric systems and then telling the citizen it’s good for them is unlikely to produce the conditions that the accountable use of new crime-fighting tools like facial recognition is going to need. The triptych approach calls for versatility from the police and those who hold them to account.
Trying out controversial policing technology on the population has obvious pitfalls (imagine if they had set the voltage for Taser by shoot-and-see deployment), but there’s no substitute for operational wins as a demonstration of potential. With that in mind, the story shifted this year from Notting Hill to Denmark Hill, a very different part of London where, in January, the police had a near-perfect use case. Convicted paedophile David Cheneler was under a court order prohibiting him from having contact with children under 14. A police LFR camera van spotted a man walking along a street with a six-year-old girl and matched his face with its watchlist image of Cheneler. Several features made that case a biometric success for policing; as confidence in the technology and those using it grows, others will surely emerge.
Balancing the issues in the future will need dynamic policy, patient reflection, and constant vigilance from all three perspectives. For example, some might wonder why we have data protection and human rights policies for LFR that make the police tell everyone in advance where and when they’re going to use it. Mandatory advance notice limits the effectiveness of the technology, but what’s the alternative? The appropriateness of such policies will need to be worked through, and if communities and their police are to get the most from biometric capabilities, it is vital that both approach the challenges together. A few things will help. First, a shared understanding of what accountability in the police use of AI really looks like. Second, looking at every situation through the triptych. And third, ditching the “if you’ve done nothing wrong, you’ve nothing to worry about” argument. If you don’t know why, go back to the first one.
When Hugh Grant and Julia Roberts were avoiding the cameras in W11, police use of facial recognition was scarcely talked about – it was 1999 and the great tech fear was the Millennium Bug. Since then, UK policing has successfully introduced technological capabilities that are now everyday kit, like Taser and body-worn video. Faced with street robbery and knife crime on a steep upward curve in the capital, shoplifting and phone theft shattering all previous records, and reduced budgets, policing needs help. Technological support is available, and there are some powerful options, one of which is facial recognition technology.
The Rome2Rio website plots the distance from Notting Hill to Denmark Hill at 7.1 miles: in policing technology terms, it’s much further. The trials are over (though not the litigation), and the technology has proved itself, particularly as a deterrent to the alarming increase in retail crime. Now the police must show that they have evolved with it. As the cameras return to Notting Hill this month, the focus on the police will be intense. They have made progress from all three perspectives – the possible, the permissible, and the acceptable – but whether the readers of Horse and Hound will be relieved remains to be seen.
Q: What is Live Facial Recognition (LFR)?
A: Live Facial Recognition (LFR) is a technology used by law enforcement to identify individuals in real-time by comparing their facial features with a database of known individuals.
Q: Why was the first LFR trial in Notting Hill controversial?
A: The first LFR trial in Notting Hill in 2016 was controversial due to the technology being patchy, the law being untested, and the public being largely unaware of the trial's existence.
Q: What improvements have been made to LFR technology since 2016?
A: Since 2016, LFR technology has improved dramatically, with the police putting their algorithms through independent testing by the National Physical Laboratory to ensure accuracy and reliability.
Q: What is the triptych approach in the context of LFR?
A: The triptych approach in the context of LFR involves balancing the technological, legal, and societal aspects of using facial recognition technology in policing.
Q: What was the significance of the LFR case involving David Cheneler in Denmark Hill?
A: The LFR case involving David Cheneler in Denmark Hill was significant because it demonstrated a near-perfect use case of the technology, where a convicted paedophile was identified and apprehended while violating a court order.