A federal judge has dismissed a pair of nearly identical Biometric Information Privacy Act (BIPA) lawsuits against Amazon and Microsoft in a ruling that could prove consequential for the biometrics industry as a whole.
The cases revolved around the companies’ use of a dataset compiled by IBM called the “Diversity in Faces Dataset”. The dataset comprises about a million images uploaded to the public photo-sharing platform Flickr.
Initially planned as a 36,000-image dataset on which to train facial recognition algorithms, the collection was ultimately expanded by IBM to deliver better results, with the aim of curating a dataset that comprehensively spanned a representative range of demographics such as race, gender, and age – and of offering it for free to other developers.
It was a serious effort to address the issue of racial bias in particular, researchers having detected disparities in the accuracy of various facial recognition systems when they are applied to different ethnic groups. These disparities are widely thought to result from inadequate training of AI algorithms, and in particular from face image datasets that are insufficiently representative of different ethnicities.
In using the dataset, however, Amazon and Microsoft were alleged to have violated Illinois’s Biometric Information Privacy Act, which requires organizations to disclose their use of biometric technologies and to obtain explicit consent from subjects. Plaintiffs Steven Vance and Tim Janecyk focused on the latter requirement in their complaint, arguing that the subjects in the Diversity in Faces Dataset had not consented to being subjected to Amazon’s and Microsoft’s biometric technologies.
Judge James L. Robart of the US District Court for the Western District of Washington was skeptical of the argument, however. His dismissal of the Amazon lawsuit is sealed as he works with counsel on potential redactions, but his Microsoft judgment presumably offers an indication of his line of thinking in both cases.
With respect to that lawsuit, Judge Robart noted that Microsoft’s activities did not occur “primarily and substantially” in the state of Illinois. And while it was possible that some of the biometric data ended up being stored in cloud servers in the state, BIPA is primarily concerned with the acquisition of biometric data and not its storage, the judge added.
With BIPA’s expansive scope having ensnared a number of high-profile defendants – not just tech giants like Google and Facebook but everything from restaurant chains to cosmetics companies – Judge Robart’s ruling concerning its limits could prove important to a range of businesses, and to biometrics vendors in particular.
A senior executive with a major facial recognition company, requesting anonymity, told FindBiometrics the ruling “helps clear up the use of public data in training sets to FRT algorithms, and allows the industry to move forward.”
The executive also observed that this is “the first ruling on the extraterritoriality of BIPA,” and was essentially a ruling against the idea of extraterritorial scope, as the judgment focused on what Microsoft and Amazon were doing with the dataset within the state. “This helps give the industry more certainty around the jurisdiction of a state law such as BIPA,” the executive said.
Indirectly, the ruling could also help to ensure that facial recognition specialists continue to refine their algorithms to reduce and, ideally, eliminate demographic bias, reassuring them that the use of datasets like Diversity in Faces is not de facto illegal – at least not outside of Illinois.
Sources: Law360, Bloomberg Law
October 21, 2022 – by Alex Perala