Overview
- Essex Police halted public deployments after a Cambridge field study found the system was statistically more likely to identify Black participants than other ethnic groups and more likely to identify men than women.
- The controlled experiment used 188 volunteers during an operational deployment, measuring both correct and missed identifications and reporting that false positives were extremely rare at the force’s current settings.
- Essex Police said two commissioned academic reviews reached different conclusions on bias, then worked with the algorithm provider, revised policies and procedures, and now say they are confident in resuming use with ongoing monitoring.
- Operational data cited in the coverage show roughly 1.3 million faces scanned between August 2024 and February 2025, leading to 123 interventions and 48 arrests, with one mistaken intervention attributed to the technology.
- The Information Commissioner’s Office noted the pause ahead of its audit and urged forces to routinely test for bias, as the Home Office pursues national expansion including funding for 40 additional LFR vans and investment in facial recognition systems.