The Washington Post has obtained a report indicating that Huawei and Megvii worked together to build a surveillance system that could identify members of China’s Uighur Muslim minority group. The report bears the signatures of Huawei representatives and was found on the Huawei website, though it was removed after the two companies were asked for comment.
The document itself was labeled an “interoperability test report,” and it details the results of a series of tests conducted in 2018 to evaluate the performance of a surveillance system built with Huawei and Megvii technology. The system integrated Megvii’s facial recognition software with Huawei’s hardware and software infrastructure, with Huawei providing the cameras, servers, networking equipment, and other technology needed to run the system. The report indicates that the system passed those tests. Huawei and Megvii have since announced three commercially available surveillance systems, although it’s unclear how those systems relate to the one tested in 2018.
The most eye-catching section of the report concerns a “Uighur alarm,” which was also evaluated as part of the testing process. The feature would automatically send a notification to the police whenever the system spotted a member of the minority group.
Privacy watchdogs expressed particular concern about that last detail. Historically, facial recognition algorithms have displayed racial bias, but that bias has usually not been deliberate. It occurs because the algorithms are trained on non-representative data sets, which leaves them struggling to identify people who fall outside the dominant group.
The Uighur alarm, on the other hand, treats that bias as a feature rather than a bug, insofar as it deliberately seeks to isolate people based on membership in an ethnic group, and then gives administrators the means to use that information as the basis for state action. As a result, the technology would automate discriminatory practices and give oppressive regimes sweeping abilities to target and profile vulnerable minority populations.
“There are certain tools that quite simply have no positive application and plenty of negative applications, and an ethnic-classification tool is one of those,” said Clare Garvie, a senior associate at the Georgetown Law Center on Privacy and Technology. “Name a human rights norm, and this is probably violative of that.”
“This is not one isolated company. This is systematic. A lot of thought went into making sure this ‘Uighur alarm’ works,” said IPVM founder John Honovich. IPVM, which reviews video surveillance equipment, is the company that found the document on the Huawei website.
The Huawei-Megvii system can categorize people based on age and sex in addition to ethnicity. Megvii claimed that its technology was not designed to target ethnic groups, while a Huawei spokesperson said that the work detailed in the report “is simply a test and it has not seen real-world application.”
The report will likely add to growing concerns about the Chinese surveillance state. The country has dramatically expanded its use of facial recognition technology in the past few years, despite mounting concerns about privacy and data security. Both Huawei and Megvii have faced sanctions in the United States over their potential involvement in government surveillance schemes, including the ongoing persecution of Uighur Muslims in Xinjiang.
Source: The Washington Post
December 11, 2020 – by Eric Weiss