Researchers at Georgia Tech have developed an AI system called Chameleon that creates invisible digital masks to protect personal photos from unauthorized facial recognition systems while maintaining image quality. This development comes amid growing concerns about unauthorized data scraping from social media platforms, which privacy regulators from multiple countries have identified as a significant threat.
The system generates a personalized privacy protection mask from a few user-submitted facial photos. The mask is designed to keep facial images scraped from the web from being matched by unauthorized facial recognition systems, a common route to identity fraud and other malicious uses. Unlike traditional privacy protection methods that blur or pixelate images, Chameleon’s approach preserves visual quality while still blocking AI-powered facial recognition.
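The article does not describe Chameleon’s implementation, but the general idea of an imperceptible privacy mask can be sketched as a small, bounded perturbation added to a photo: the change is too subtle for people to notice, yet it interferes with the features a facial recognition model extracts. The snippet below is an illustrative sketch only; the epsilon budget, function names, and random placeholder mask are assumptions for demonstration, not part of the published system, which learns a personalized mask rather than sampling one at random.

```python
# Illustrative sketch only: not Chameleon's actual method.
# Shows the general concept of a "privacy mask" -- a small, bounded
# perturbation added to a photo so it looks unchanged to humans but
# disrupts automated face recognition. All values here are hypothetical.
import numpy as np
from PIL import Image

EPSILON = 8  # hypothetical per-pixel budget (0-255 scale), kept small to stay invisible

def apply_privacy_mask(photo_path: str, mask: np.ndarray, out_path: str) -> None:
    """Add a bounded perturbation mask to a photo and save the result."""
    image = np.asarray(Image.open(photo_path).convert("RGB"), dtype=np.int16)

    # Clip the mask to the perturbation budget so the edit stays imperceptible,
    # then add it and clamp back to valid pixel values.
    bounded_mask = np.clip(mask, -EPSILON, EPSILON)
    protected = np.clip(image + bounded_mask, 0, 255).astype(np.uint8)

    Image.fromarray(protected).save(out_path)

# Example usage with a random placeholder mask (a real system would
# optimize the mask against face-recognition models, not sample it randomly):
# img = np.asarray(Image.open("portrait.jpg").convert("RGB"))
# mask = np.random.randint(-EPSILON, EPSILON + 1, size=img.shape)
# apply_privacy_mask("portrait.jpg", mask, "portrait_protected.jpg")
```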
According to testing results, Chameleon outperformed three leading facial recognition protection models in both visual quality preservation and protective effectiveness. It also ran faster and used fewer computing resources than those models, making it practical for widespread adoption across social media platforms and personal photo collections.
The development team includes Professor Ling Liu from Georgia Tech’s School of Computer Science, PhD students Sihao Hu and Tiansheng Huang, and Ka-Ho Chow, an assistant professor at the University of Hong Kong who previously studied under Liu. The team’s work builds on growing efforts to protect personal biometric data, particularly as investment in generative AI continues to rise and concerns about unauthorized data collection intensify.
The researchers are exploring additional applications for the technology, including protecting images from being used without consent to train AI generative models. This addresses a crucial gap in current privacy protections, as many existing AI models have been trained on scraped images without explicit user permission. The team plans to release Chameleon’s code on GitHub to enable further development and promote responsible AI adoption.
The research was detailed in a paper titled “Personalized Privacy Protection Mask Against Unauthorized Facial Recognition,” which was presented at ECCV 2024.
Source: Georgia Tech Research
November 13, 2024 – by Cass Kennedy