UK police force presses pause on live facial recognition after study finds racial bias

The Register / 3/20/2026


Key Points

  • A UK police force has paused its use of live facial recognition after new research found racial bias in who is flagged by the technology.
  • The study found statistically higher misidentification rates for Black people, a finding that prompted the pause and raised questions about AI-driven surveillance fairness.
  • The move could influence other agencies and policymakers, signaling renewed scrutiny and potential regulatory shifts around facial recognition tech.
  • The report underscores ongoing debates about bias, governance, and the role of AI in public safety.

Cams statistically more likely to ID Black people, says new research

Fri 20 Mar 2026 // 13:35 UTC

A UK police force has suspended its deployment of live facial recognition (LFR) technology after a study revealed it was statistically more likely to identify Black people on a watchlist database.

Essex Police said it had paused use of the technology to update the system with the help of the algorithm software provider. Another similar study identified no bias, it said.

The report from Cambridge University researchers found the Essex police system was more likely to correctly identify men than women and was statistically significantly more likely to correctly identify Black participants than participants from other ethnic groups.

Police forces can use LFR to identify people on a pre-configured watchlist, usually made up of criminals, people of interest, or missing vulnerable individuals.

The study [PDF] used 188 volunteers to act as members of the public in a controlled field experiment during a real police deployment. Because the researchers knew exactly who was present, it was possible to measure both correct and missed identifications.

It found that at the "current operational setting" used by Essex Police, the system correctly identified around half of the people on the watchlist who passed the cameras and that incorrect identifications were "extremely rare."

"Of the six false positive identifications observed in this test, four involved Black individuals. Given that observations of Black subjects constituted 536/2,251 (23.8 per cent) of the sample, the observed imbalance is unlikely to be due to chance alone but this could reflect the limited number of false positive events rather than a true systematic effect," it said.

The finding should be treated as suggestive rather than conclusive, it added.
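The researchers' caveat turns on a simple probability question: if false positives were ethnically neutral, how likely is it that four or more of just six would involve Black individuals, given Black subjects made up 536 of 2,251 observations? A one-sided binomial tail check, sketched below using only the figures quoted from the study, gives a rough sense of why the imbalance looks unlikely to be chance yet remains statistically fragile at such small counts:

```python
# Rough check of the study's chance argument, using only the figures
# quoted in the report: 536/2251 Black subjects, 4 of 6 false positives.
from math import comb

p = 536 / 2251          # observed share of Black subjects (~23.8%)
n, k = 6, 4             # 6 false positives, 4 involving Black individuals

# One-sided binomial tail: P(at least k of n false positives are Black)
# under the null hypothesis that each false positive is Black with
# probability p, independently.
p_value = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
print(f"P(>=4 of 6) = {p_value:.3f}")   # about 0.032
```

A tail probability of roughly 3 per cent sits below the conventional 5 per cent threshold, which supports "unlikely to be due to chance alone", but with only six events the estimate is too noisy to establish a systematic effect, matching the report's "suggestive rather than conclusive" framing.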

A spokesperson for Essex Police said that as part of its commitment to the Public Sector Equality Duty, the force had commissioned two independent studies carried out by academic researchers. "The first of these indicated there was a potential bias in the positive identification rate, while the second suggested there was no statistically relevant bias in the results.

"Based on the fact there was potential bias, the force decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software."

The force added: "We then sought further academic assessment. As a result of this work, we have revised our policies and procedures and are now confident that we can start deploying this important technology as part of policing operations to trace and arrest wanted criminals. We will continue to monitor all results to ensure there is no risk of bias against any one section of the community."

Earlier this year, the British government decided the police in England and Wales should increase their use of LFR and artificial intelligence (AI) under wide-ranging plans to reform law enforcement.

In a white paper [PDF], the Home Office set out plans to fund 40 more LFR-equipped vans in addition to the ten already in use. It said they would be deployed in "town centres and high crime hotspots", with the government planning to spend more than £26 million on a national facial recognition system and £11.6 million on LFR capabilities. ®
