A General Services Administration study has found mixed results in the performance of five commercially available digital identity verification systems, highlighting the issue of demographic bias.
The study, “A Large-Scale Study of Performance and Equity of Commercial Remote Identity Verification Technologies Across Demographics,” investigated the performance and equity of remote identity verification (“RIdV”) technologies. It evaluated five commercial RIdV solutions and their ability to equitably verify identities across a diverse range of demographic groups, including age, gender, race/ethnicity, and skin tone. A total of 3,991 participants took part, providing a robust dataset for assessing the equity of these systems.
The research centered on the false negative rate (FNR) as its measure of system performance. A false negative occurs when a legitimate user is wrongly rejected by the system, which can prevent individuals from accessing services. The study found that two of the five RIdV systems performed equitably, with no statistically significant differences across demographic groups. The other three exhibited disparities, particularly with respect to race and skin tone.
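To make the metric concrete, the sketch below shows how an FNR can be tallied per demographic group from verification outcomes. The data, group labels, and structure are purely illustrative assumptions, not the study’s actual methodology or code.

```python
# Illustrative sketch only: hypothetical data, not the GSA study's methodology.
# Computes a false negative rate (FNR) per demographic group, where a
# "false negative" is a legitimate user whom the system wrongly rejects.

from collections import defaultdict

# Hypothetical records: (demographic_group, was_accepted) for legitimate users only.
attempts = [
    ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", True), ("group_b", False), ("group_b", False),
]

totals = defaultdict(int)
false_negatives = defaultdict(int)

for group, accepted in attempts:
    totals[group] += 1
    if not accepted:  # legitimate user rejected -> false negative
        false_negatives[group] += 1

for group in sorted(totals):
    fnr = false_negatives[group] / totals[group]
    print(f"{group}: FNR = {fnr:.2%}")
```

In an equity analysis like the one described, the per-group rates would then be compared for statistically significant differences rather than simply printed.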
One of the most concerning findings was the higher FNR for Black/African American individuals and those with darker skin tones (Monk Skin Tone scale 7/8/9/10) when using certain RIdV systems. In contrast, another system performed more favorably for Asian American and Pacific Islander (AAPI) individuals, raising questions about uneven performance across racial and ethnic groups.
The study does not name the specific commercial applications or vendors that were assessed. Instead, it anonymizes them, referring to the five vendors as “Dingo,” “Hedgehog,” “Wombat,” “Marmot,” and “Badger.” But, as Nextgov/FCW’s Natalie Alms notes, a 2023 GSA privacy impact assessment for the study named TransUnion, Socure, Jumio, LexisNexis, Incode and Red Violet as vendors that would be used “to collect and analyze the needed data for the equity study.”
Previous research has shown that certain biometric systems exhibit differential performance across demographics, particularly for darker-skinned females and younger users. This study went beyond algorithmic fairness, incorporating the full end-to-end process of RIdV systems, including user interfaces, document verification, and liveness detection. By assessing the entire system, the researchers provide a more comprehensive understanding of how demographic factors influence performance in real-world scenarios.
The study’s findings suggest that while some RIdV systems are equitable, others may create barriers for marginalized groups, potentially excluding them from accessing essential services. That conclusion underscores the need for further refinement and testing of RIdV technologies to address these disparities and improve fairness across all demographic groups.
Source: arXiv, Nextgov/FCW
September 26, 2024 – by the ID Tech Editorial Team