In the wake of another study pointing to demographic bias in commercial facial recognition technologies, SensibleVision is highlighting its 3DVerify solution, which the company says shows no such bias.
The latest study comes by way of MIT and Stanford University. Assessing three commercially available facial recognition systems, researchers found significant differences in accuracy between demographic groups: while the overall error rate came to just 0.8 percent for white men, it spiked to 34.7 percent for black women. That’s a serious issue: with facial recognition on the rise in applications such as border control, there’s a real risk that certain groups of travelers could face much more intensive – and unwarranted – scrutiny from authorities.
Responding to these findings in a statement, SensibleVision, whose technology was not tested by the researchers, asserted that its 3DVerify system “is immune to this kind of bias” because it focuses primarily on face contours, mapping them out in three dimensions, and “has been developed with extensive databases featuring people of both genders and a wide range of complexions.” The solution, first announced last June, can operate in both bright sunlight and complete darkness, according to the company.
Of course, other companies are working to address this issue, too. Apple, for example, has emphasized that the iPhone X’s Face ID facial recognition system was trained on an extensively diverse dataset to ensure that it operates effectively for all of its users. But the latest study from MIT and Stanford clearly points to the need for more work across the industry if other players want to keep up with rivals like Apple and SensibleVision, let alone ensure that their technology can be applied ethically in the real world.
Source: MIT News
March 2, 2018 – by Alex Perala