Essex Police pauses facial recognition over bias study
REDDIT · 19d ago · POLICY/REGULATION


Essex Police has paused live facial recognition (LFR) deployments after a Cambridge study found the system was significantly more likely to correctly identify Black participants than participants from other ethnic groups. The report also found that men were matched more often than women, even though false identifications were rare across all groups.

// ANALYSIS

This is the uncomfortable middle ground for AI surveillance: a system can be operationally useful and still uneven enough to be politically and ethically radioactive. The real question is not whether LFR works in aggregate, but whether the state should scale a tool whose error profile shifts by demographic group.

  • The Cambridge team tested 188 actors in live deployments, so this is field evidence, not a toy benchmark.
  • The bias is about true-positive rates, not just false matches: Black participants were matched more often, while women were matched less often.
  • Essex says it paused to update software and procedures, which suggests calibration can help but does not solve the core question of whether scanning crowds at scale is proportionate.
  • With the Home Office expanding LFR vans, this local pause is really a national governance problem.
  • For AI teams, the lesson is to report demographic slices and operating thresholds, not a single aggregate accuracy number.
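The last point above can be made concrete. Below is a minimal, hypothetical sketch (names and data shapes are assumptions, not from the study) of how a team might report true-positive and false-positive rates per demographic group at a fixed operating threshold, rather than a single aggregate accuracy figure:

```python
# Hypothetical sketch: per-group metrics at a fixed operating threshold,
# instead of one aggregate accuracy number. The record format
# (group, match_score, is_genuine_match) is an assumption for illustration.
from collections import defaultdict

def slice_metrics(records, threshold):
    """records: iterable of (group, score, is_genuine_match) tuples."""
    stats = defaultdict(lambda: {"tp": 0, "fn": 0, "fp": 0, "tn": 0})
    for group, score, genuine in records:
        predicted = score >= threshold  # the operating threshold matters
        s = stats[group]
        if genuine:
            s["tp" if predicted else "fn"] += 1
        else:
            s["fp" if predicted else "tn"] += 1

    report = {}
    for group, s in stats.items():
        pos = s["tp"] + s["fn"]  # genuine-match trials for this group
        neg = s["fp"] + s["tn"]  # non-match trials for this group
        report[group] = {
            "tpr": s["tp"] / pos if pos else None,  # true-positive rate
            "fpr": s["fp"] / neg if neg else None,  # false-positive rate
            "n": pos + neg,
        }
    return report
```

A gap in `tpr` between groups at the same threshold is exactly the kind of true-positive-rate disparity the study describes, and it can coexist with uniformly low `fpr`.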
// TAGS
live-facial-recognition · regulation · ethics · research · safety

DISCOVERED

19d ago

2026-03-23

PUBLISHED

20d ago

2026-03-23

RELEVANCE

6/10

AUTHOR

ateam1984