Biometrics UnPlugged is nearly upon us. The interactive C-level executive summit will kick off September’s festivities in Tampa, Florida, immediately before the IBIA Opening Reception at the Tampa Museum of Art.
This year, Biometrics UnPlugged is themed around Mobility at the Crossroads of Commerce and Privacy. All summer at FindBiometrics, we have been examining current topics on those themes, with July focusing on next generation commerce and August on privacy.
When it comes to privacy, the discussion of biometrics turns from authentication and security to identification and trust. Today, we are going to take a look at three biometrics that you broadcast to the world in your everyday life. These are features that can be used to identify you and over which you have very little control, thus necessitating a discussion of standards and best practices.
It is important to keep in mind that no biometric technology is inherently invasive. This is a conversation about ethical use. Privacy, when it comes to the following biometric modes, boils down to permissions and transparency. Just as, in a pre-biometric world, people needed to know when they were being surveilled (and were thus given the opportunity to avoid surveillance), they need to know when their biometrics are being measured, by whom, and why.
Face
Nothing has become quite as ubiquitous in the age of the Internet as the human face. Even if you are not on Facebook and have never taken a selfie, doing business online requires a certain amount of trust: a picture on your LinkedIn profile lets people know you’re not a bot, while a headshot next to your bio on a website’s About Us section gives your content credibility and authority. In the anonymous ocean of the Internet, what you do sinks or swims based on how well you are humanized.
Here is the thing: the fact that your picture is available to the rest of the world also means that it can be compared to other faces and photos. This is a primitive description of facial recognition, but it’s enough to make the point: the Internet has made your face public, meaning that wherever you go, anyone surveilling you has the potential to identify you without your knowledge.
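To make that primitive description slightly more concrete, the matching step of a facial recognition system can be boiled down to comparing numerical templates extracted from two photos. The Python sketch below is a minimal illustration using hypothetical template vectors and an arbitrary threshold; it is a stand-in for the far more sophisticated models real systems use, not any vendor’s actual pipeline.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-template vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(template_a, template_b, threshold=0.8):
    """Declare a match when two templates are similar enough.

    The 0.8 threshold is purely illustrative; real systems tune it to
    balance false matches against false non-matches.
    """
    return cosine_similarity(np.asarray(template_a), np.asarray(template_b)) >= threshold

# Hypothetical templates: one extracted from a public profile photo, one
# from a street camera frame (in practice these come from a face-encoding
# model, not random numbers).
profile_photo = np.random.rand(128)
street_camera = profile_photo + np.random.normal(0, 0.05, 128)  # noisy re-capture
print(same_person(profile_photo, street_camera))  # likely True
```

The point of the sketch is simply that once a template exists for your public photo, any new image can be scored against it automatically and silently.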
There are applications for invisible facial recognition that can greatly improve the human experience. With a comprehensive facial database of pre-cleared travellers, the airports of the future could be barrier-free, while digital billboards can already measure the demographics of crowds to better curate advertising.
Unfortunately, without regulations and standards in place, the pendulum of potential can also swing in the opposite direction. A facial recognition app exists for Google Glass that allows the wearer to perform real-time searches on the faces of people she sees. Given the wealth of biographic information available on the Internet, being able to pin that information to a stranger’s face is not only scary but dangerous.
Voice
Passive voice authentication is starting to be used in call centers and on IVR systems. Because, for the most part, we exist in a vision-based society, there is not a massive record of voiceprints that can be cross-referenced any time the average person makes a call. That doesn’t mean you aren’t broadcasting biometric information whenever you use your voice as a communication tool.
To illustrate the privacy concern, consider the application of forensic voice biometrics to identify the once-anonymous philanthropist Mr. Hidden Cash. In June of this year, Kent Gibson, owner of Forensic Audio in Los Angeles, used SpeechPro’s SIS II Voice ID module to match the voice from an interview sample with audio from a podcast in order to identify a man who was anonymously hiding bundles of money around California cities. His name and profession were then reported online by Inside Edition.
This is a perfect illustration of how a good and important technology, forensic-grade voice matching software that can serve public safety, can also be used to limit a person’s freedom of anonymity. The last thing anyone should want is for the average person to be afraid to use their voice for fear of being publicly exposed by strangers.
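For readers curious about the mechanics, the kind of one-to-many voice matching described above can be caricatured as ranking candidate voiceprints against a questioned recording. The Python sketch below uses made-up acoustic features and a simple average-and-distance scheme; it is an illustrative stand-in, not a description of SpeechPro’s SIS II or any other forensic tool.

```python
import numpy as np

def voiceprint(frame_features: np.ndarray) -> np.ndarray:
    """Collapse frame-level acoustic features (one row per audio frame)
    into a single fixed-length 'voiceprint' by averaging."""
    return frame_features.mean(axis=0)

def rank_candidates(questioned: np.ndarray, gallery: dict) -> list:
    """Rank known speakers by how closely their voiceprints match the
    questioned recording (smaller distance = closer match)."""
    scores = {name: float(np.linalg.norm(questioned - vp))
              for name, vp in gallery.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])

# Hypothetical 40-dimensional features for a podcast clip and for
# interview clips from three candidates. Real systems derive these from
# the audio itself; candidate_b is seeded as the "same speaker" here.
rng = np.random.default_rng(0)
signature = rng.normal(0.0, 1.0, size=40)
podcast = voiceprint(rng.normal(signature, 1.0, size=(500, 40)))
gallery = {
    "candidate_a": voiceprint(rng.normal(rng.normal(0.0, 1.0, 40), 1.0, (500, 40))),
    "candidate_b": voiceprint(rng.normal(signature, 1.0, (500, 40))),
    "candidate_c": voiceprint(rng.normal(rng.normal(0.0, 1.0, 40), 1.0, (500, 40))),
}
print(rank_candidates(podcast, gallery))  # candidate_b should rank first
```

Even this toy version makes the concern obvious: anyone with a few labeled recordings and one anonymous clip can start narrowing down who the speaker is.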
Behavioral
The last of the broadcasting biometrics on this list is just starting to emerge. Behavioral biometrics, also sometimes known as invisible biometrics, are exactly what they sound like: measurements of your behavior. Keystrokes, mouse clicks, words per minute, walking gait, and quotidian gestures can all be measured, allowing a profile to be built around the way you interact with computers and the physical world.
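As a simple illustration of how such a profile might be built, keystroke timing alone already yields a measurable signature. The Python below is a deliberately crude, hypothetical sketch; real behavioral biometric systems track many more features (key hold times, digraph latencies, mouse dynamics) and use proper statistical models rather than a fixed tolerance.

```python
import statistics

def keystroke_profile(press_times):
    """Build a tiny behavioral profile from keystroke timestamps
    (in seconds): the mean and spread of the gaps between key presses."""
    gaps = [b - a for a, b in zip(press_times, press_times[1:])]
    return statistics.mean(gaps), statistics.stdev(gaps)

def looks_like_owner(enrolled, observed, tolerance=0.05):
    """Crude check: does an observed typing rhythm fall within a fixed
    tolerance of the enrolled profile?"""
    return all(abs(e - o) <= tolerance for e, o in zip(enrolled, observed))

# Hypothetical timestamps captured while typing the same passphrase on
# two occasions.
enrolled_session = [0.00, 0.18, 0.35, 0.55, 0.71, 0.90]
later_session = [0.00, 0.17, 0.36, 0.54, 0.73, 0.89]
print(looks_like_owner(keystroke_profile(enrolled_session),
                       keystroke_profile(later_session)))  # True
```

The same measurements that let a bank quietly confirm it is really you can, in the wrong hands, quietly confirm who you are.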
Behavioral biometrics, similar to voice, can be used passively to scale online security and to detect fraud. Websites outfitted with BioCatch, for instance, use behavioral biometrics to ensure that you are a human when accessing accounts or shopping online.
It’s not difficult to imagine malicious software that measures these invisible traits in order to learn things about users without permission. Browsing habits might not seem like the most vital information for a stranger to know about you, but they are yours to give to a trusted service and should not be taken from you without permission.
*
Have something to add? Follow us on Twitter and use the hashtag #BUP2014. Haven’t registered for Biometrics UnPlugged? Don’t worry, registration is still open through the event’s website.
—
August 20, 2014 – by Peter B. Counter