Australia’s privacy regulator has released detailed guidelines for private sector organizations using facial recognition technology (FRT) in commercial and retail environments. The guidance from the Office of the Australian Information Commissioner (OAIC) follows several high-profile enforcement actions against major retailers. It addresses the handling of biometric data, which is classified as sensitive information under the Privacy Act and subject to heightened protection requirements.
Under the Privacy Act, organizations must secure explicit consent before collecting sensitive information, unless a specific exemption applies. The collection must also be demonstrably necessary for the organization’s functions or activities. The OAIC emphasizes a “privacy by design” methodology, requiring organizations to verify that less invasive alternatives cannot achieve equivalent objectives before deploying FRT.
The guidelines emphasize transparency and informed consent requirements. Organizations must provide explicit, proactive notifications about biometric data collection and usage. Generic references to “video surveillance” are deemed insufficient without specific disclosure of FRT deployment and its implications. This clarification follows a previous OAIC investigation into 7-Eleven’s FRT implementation, where inadequate disclosure was a key concern.
Addressing technical considerations, the guidance stipulates requirements for accuracy and bias mitigation in FRT systems, as outlined in Australian Privacy Principle 10. Systems must maintain data accuracy and address potential discriminatory outcomes through regular testing and review protocols. This is consistent with Australia’s broader National Strategy for Identity Resilience, which emphasizes the responsible deployment of biometric technologies.
The framework mandates comprehensive governance structures, including privacy risk management protocols and regular compliance reviews. Organizations are directed to conduct Privacy Impact Assessments (PIAs) before implementing FRT, or at minimum, document justifications for not performing such assessments.
A recent determination by the Privacy Commissioner examined a retailer’s FRT implementation that converted facial images from CCTV footage into vector sets for comparison against a threat database. Despite near-immediate deletion of non-matching data, the system’s operation was classified as “collection” under the Privacy Act. The retailer’s failure to provide explicit notification, obtain consent, and conduct a PIA resulted in a mandate to destroy collected data.
The Commissioner rejected the retailer’s invocation of “serious threat” and “unlawful activity” exceptions, determining that scanning hundreds of thousands of individuals to identify 448 potential risks constituted disproportionate privacy infringement. This decision follows similar determinations, including the OAIC’s ruling against Clearview AI for privacy violations in Australia.
The determination establishes precedent for organizations considering FRT deployment, emphasizing compliance with Australian Privacy Principles, consent requirements, notification protocols, and privacy policy implementation. The evolving regulatory landscape requires organizations to maintain vigilance regarding privacy obligations when implementing biometric technologies, particularly as Australia continues to develop its framework for digital identity and biometric data protection.
Sources: Norton Rose Fulbright
December 12, 2024, by Cass Kennedy