The Los Angeles Police Department (LAPD) is coming under fire for its widespread, and repeatedly denied, use of facial recognition. The news comes after a public records request prompted the department to release 368 pages of emails in which officers discussed the technology.
While the LAPD does not have its own facial recognition system, it does have access to a regional database run by the Los Angeles County Sheriff’s Department. The database contains more than 9 million mugshots that officers can search using Los Angeles County Regional Identification System (LACRIS) software from South Carolina’s DataWorks Plus. The company has installed facial recognition engines from Rank One, Cognitec, and NEC.
After the release of the emails, an LAPD spokesperson confirmed that department officers used the facial recognition software 29,817 times between 2009 and 2020, and that 330 officers currently have access to the LACRIS system, down from a high of 525. Approximately 3,750 of those searches have occurred since February.
Privacy advocates expressed concern about those numbers, and about the LAPD’s repeated attempts to hide its use of facial recognition. In 2019, an LAPD spokesperson explicitly told the LA Times that “We actually do not use facial recognition in the Department,” and the department later blocked a public records request related to facial recognition on the grounds that it does not use the technology.
Given those revelations, the decision to mislead the public could further erode trust in the police and exacerbate fears about the potential abuse of facial recognition. The LAPD claims that it only uses the technology to generate leads, and that it does not make arrests based solely on a facial recognition match. To that point, DataWorks Plus has indicated that its system does not produce a single definitive match, but instead gives an officer multiple potential leads to inform a follow-up investigation.
However, racial bias is still a pressing issue, especially for a database built primarily with mugshots. The algorithms in the DataWorks system were among those tested in an NIST study that found that facial recognition tech is far more likely to misidentify Black and Asian individuals than white subjects.
The LAPD indicated that it has used facial recognition to identify suspects in cases where there were no witnesses, or no witnesses willing to talk. The Safe LA Task Force also used the technology to identify people involved in the anti-police-brutality protests over the summer, but claims to have limited its scope to those suspected of burglary, arson, and other crimes.
The state of California has banned the use of facial recognition in police body cameras, while several municipalities (including San Francisco) have taken that a step further and banned the use of the technology for law enforcement purposes.
Source: Los Angeles Times
September 23, 2020 – by Eric Weiss