Retailers may back away from virtual try-on features to head off potential privacy lawsuits. Virtual try-on systems use computer vision and augmented reality to show shoppers, within an app, how they would look wearing a particular item. Such solutions gained traction during the pandemic, since they gave people a way to try on items remotely without a trip to a physical store.
Critics, however, have zeroed in on the fact that many of those systems analyze the user’s facial features to ensure a proper fit. Apps do not always disclose that they are using that technology, which could violate privacy laws if the companies behind them store or share the resulting biometric data. As Bloomberg Law points out, Estée Lauder is already learning that first-hand: the cosmetics giant has been hit with a lawsuit under Illinois’ Biometric Information Privacy Act (BIPA) alleging that it stored the facial images of people who used its Virtual Try-On tool to preview different make-up products.
BIPA has made Illinois the most popular jurisdiction for this kind of litigation, to the point that some companies have floated the possibility of discontinuing virtual try-on services in the state while continuing to offer them elsewhere. However, other states (including California and New York) have data privacy laws of their own and are already fielding lawsuits tied to virtual try-on features, so retailers probably won’t be able to skirt the issue entirely.
Businesses that want to move forward with any confidence will likely need to craft better privacy policies. Those policies should detail exactly what technology is being used and whether any personal information is collected in the process. The eyewear brand Warby Parker, for instance, acknowledges that it analyzes “data points on the user’s face” to virtually apply a pair of glasses, but stresses that it does not store any facial data or share any information with other parties.
The fact that Warby Parker does not harvest that data should provide a degree of legal protection, since it means the virtual try-on feature functions more like a social media filter than a true biometric data service (though it is worth noting that social media platforms face similar privacy scrutiny). Organizations that do want to capture biometric data will need to obtain users’ informed consent before collecting it.
Sources: Input Mag, Bloomberg Law
July 14, 2022 – by Eric Weiss