The UK’s Information Commissioner’s Office (ICO) has published its final report for the Regulatory Sandbox project with Onfido. The digital identity specialist entered the Sandbox in the summer of 2019 in an effort to mitigate racial bias in its identity verification algorithms while still respecting individuals’ privacy.
The problem, according to Onfido, is that UK law limits how and when companies can process data that identifies people based on their race and ethnicity. That creates complications for researchers, who need to be able to analyze racial data in order to identify bias. They also need to use images of people from different racial and ethnic backgrounds to train their algorithms to recognize people from those backgrounds.
With that in mind, the ICO concludes that Onfido should be able to move forward with its research, on the grounds that projects that reduce bias in biometric algorithms serve the public interest, and are therefore permitted under the latest GDPR guidelines. However, the ICO still placed certain restrictions on that research, emphasizing that a project can only proceed if it does not require the identification of unique individuals. Onfido’s anti-bias research meets that standard because it can process racial data while keeping all other personal data anonymous.
In that regard, the ICO determined that biometric data does not constitute a special category of personal data in anti-bias research projects, since it is not being used to uniquely identify individuals. Onfido will nevertheless continue to work with the ICO to make sure that it does not violate GDPR guidelines, and has committed to transparency when collecting personal information for research.
Onfido announced back in August that it would be retraining its algorithm to reduce racial bias. At the time, the company noted that it had initially trained the algorithm on a data set of predominantly lighter-skinned faces, an oversight it is now working to address.
NIST has found evidence of racial bias in many of the world’s top facial recognition algorithms, and the issue has taken on greater urgency in recent months due to false arrests and concerns about discriminatory law enforcement practices. The problem is also present in the UK, where the Home Office deployed a facial recognition system with known racial biases.
November 10, 2020 – by Eric Weiss