The United Kingdom’s Defence and Security Accelerator (DASA) has launched a new Market Exploration, on behalf of the Home Office, focused on Facial Recognition (FR) technologies for policing and security applications. The Home Office is leading the exploration to identify innovative solutions that use facial recognition capabilities for the benefit of law enforcement and security stakeholders. Submissions are due by midday on October 12, 2023.
The importance of FR technology in law enforcement continues to grow, with applications aimed at preventing and detecting crime, enhancing security, identifying wanted individuals, safeguarding vulnerable populations, and ensuring public safety. The deployment of ethical and effective FR technologies is a key priority for the Home Office, and the government seeks to collaborate with industry to advance this mission.
The Home Office is specifically interested in higher Technology Readiness Level (TRL) capabilities that can establish identity using facial features and landmarks. Proposed solutions should also support algorithm development, integration, and analytics, and the use of secure, accurate, explainable, and unbiased technologies is crucial. The scope of the exploration covers the main modes of FR deployment: Retrospective FR (RFR), in which captured images are compared against a database after an event; Operator-initiated FR (OIFR), in which an officer initiates a search, typically from a mobile device; and Live FR (LFR), which compares faces in real time against a watchlist.
The call for submissions aligns with the Home Office’s commitment to responsible FR technology development. One of the central challenges in FR technology has been demographic bias, in which certain systems show disparities in accuracy across factors such as ethnicity and gender. A significant development in assessing this issue came earlier this year, when the United Kingdom’s National Physical Laboratory (NPL) completed a comprehensive study. The NPL examined NEC’s NeoFace solution along with the HD5 Face Detector and found that, at the specific settings maintained by the Metropolitan Police, there was no statistically significant difference in accuracy across demographic groups. The finding suggests that demographic bias can be effectively mitigated when the technology is developed, configured, and implemented with care.
“It is essential to acknowledge the concerns surrounding FR technology, particularly those relating to privacy and potential biases,” commented Professor Paul Taylor, National Policing Chief Scientific Adviser. “However, responsible development and implementation of FR systems can address these concerns effectively. By establishing robust governance frameworks, implementing strict data protection protocols, and ensuring transparency and accountability, we can strike the right balance between public safety and individual privacy rights.”
Sources: GOV.UK, Financial Times
August 31, 2023 – by the FindBiometrics Editorial Team