The Court of Appeal has ruled the use of automatic facial recognition (AFR) technology by South Wales Police to be unlawful, overturning a ruling by London’s High Court in a case that dates back to September 2019.
As the BBC reports, this latest decision was the result of a challenge to the original ruling brought by the claimant, Ed Bridges of Cardiff, and civil rights group Liberty, on the basis that Mr. Bridges’ human rights had been violated and he had been caused distress by having his facial biometric data captured and stored on two separate occasions without his knowledge or consent.
The court upheld three of the five grounds raised on appeal, ruling that there was no clear guidance on where South Wales Police could use the AFR technology or who could be placed on a watch list by the force, that the data protection impact assessment was deficient, and that the force had not taken reasonable steps to ensure the technology did not exhibit gender or racial bias.
“I’m delighted that the court has agreed that facial recognition clearly threatens our rights,” said Bridges. “For three years now, South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge,” he added.
“[This ruling is a] major victory in the fight against discriminatory and oppressive facial recognition,” said Liberty lawyer Megan Goulding. “It is time for the government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it has no place on our streets.”
The ruling comes during a turbulent time in the evolution of facial recognition, with a widening gap between the controversial surveillance applications of the technology – such as the AFR used in the South Wales case – and the increasingly essential, privacy-forward authentication use cases. The mainstreaming of face authentication has accelerated in recent months as the COVID-19 pandemic has forced people and enterprises around the globe to opt for contactless forms of authentication both in remote work and physical office situations. Meanwhile, the surveillance applications are facing increasing scrutiny in the shadow of new and emerging regulations.
Responding to the ruling, South Wales Police said it would not appeal the findings, and highlighted the usefulness of the technology in helping apprehend suspects.
“The test of our ground-breaking use of this technology by the courts has been a welcome and important step in its development,” said South Wales Police Chief Constable Matt Jukes. “I am confident this is a judgment that we can work with.”
The police force has reportedly been using AFR for three years, trialing it mostly at large events such as pro sports games, concerts, and other large gatherings within its jurisdiction.
The force said that its use of AFR has led to sixty-one arrests for offences ranging from robbery and violence to court warrants, and stated that it was “proud of the fact there has never been an unlawful arrest as a result of using the technology in South Wales,” and that it is still “completely committed to its careful development and deployment.”
Liberty barrister Dan Squires QC countered by arguing that it is the potential use of the technology that is the issue, not how it has been employed in practice thus far, saying that there are insufficient safeguards in place within the current legal system to protect individuals from having their biometric data captured unlawfully.
“It’s not enough that it has been done in a proportionate manner so far,” he said.
Though the ruling is expected to impact the use of facial recognition technology beyond South Wales Police – London’s Met has been using similar technology – it does not eliminate facial recognition as a policing tool, and it is still unclear exactly what impact it will have.
The ruling does, however, indicate that going forward officers will be required to clearly document who they are looking for and any evidence that indicates their target is likely to be in an area being monitored.
Additionally, any software that is used will need to be tested to ensure it doesn’t exhibit any racial or gender biases, though no further details have been provided thus far on how exactly that will be done.
Source: BBC
August 12, 2020 – by Tony Bitzionis