A coalition of civil society organizations has urged the UK Home Secretary to implement safeguards for AI systems in policing, advocating for a complete ban on predictive policing and biometric surveillance. The #SafetyNotSurveillance coalition includes 17 groups such as the Open Rights Group, Big Brother Watch, Liberty, and the Network for Police Monitoring.
In its letter, the coalition argues that AI systems using data and algorithms to identify and profile individuals or locations for predicting criminal acts should be banned. The group contends that these systems perpetuate and amplify discrimination based on racial and ethnic origin, nationality, socio-economic status, disability, gender, and migration status. The coalition highlights the need for transparency, accountability, and legislative regulation for all other AI uses in policing.
“AI and automated systems have been proven to magnify discrimination and inequality in policing,” said Sara Chitseko, Pre-crime Programme Manager for the Open Rights Group. “Of particular concern are so-called ‘predictive policing’ and biometric surveillance systems which are disproportionately used to target racialised, working class and migrant communities.”
The letter specifically addresses the contentious use of facial recognition technology by police, which has faced sustained opposition from civil liberties groups, and echoes the House of Lords Justice and Home Affairs Committee's call for a clear legal framework governing its use.
The adoption of facial recognition technology by UK police has expanded significantly in recent years, with several forces moving towards permanent deployments. Essex Police, following a trial in 2023 that the force deemed successful, recently decided to use live facial recognition (LFR) technology on a permanent basis to apprehend serious offenders. This decision mirrors moves by other forces, such as the Metropolitan Police in London and North Wales Police, to integrate similar technologies.
In addition to LFR, the UK Home Office has allocated £55.5 million to expand police facial recognition capabilities, with a particular focus on retail crimes such as shoplifting. This funding supports the Retail Crime Action Plan, under which CCTV footage is matched against the Police National Database to identify offenders. Despite these developments, concerns persist about potential biases in the technology and the proportionality of deploying it so widely.
Source: UKAuthority
July 30, 2024 – by Ali Nassar-Smith