ID R&D has released a survey suggesting that the American public may have too much faith in its own anti-spoofing capabilities. The survey found that 36 percent of Americans were confident that they could distinguish a real human voice from a computer-generated fake.
Of course, the fact that Americans believe they could tell the difference doesn’t necessarily make it so. ID R&D notes that deepfake technology has become increasingly convincing, and is now being used to commit fraud and other forms of cybercrime with growing frequency. The company also cites research indicating that human ears (and the human brain) may not be able to separate synthetic voices from real ones. That study was carried out by the University of California, Riverside, the University of Alabama at Birmingham, and Syracuse University.
“Those of us in the biometric industry have a responsibility to educate consumers about the risks of deepfakes and synthetic voice,” said ID R&D President Alexey Khitrov. He went on to frame biometric tech as a potential solution to the problem, arguing that ID R&D’s voice and face anti-spoofing technology can spot differences that are imperceptible to human senses.
“Just as the consumers in the early 1990s were suspicious about online commerce, we believe that once users learn how biometrics can protect their data, voice technology will see exponential growth,” said Khitrov.
On that note, ID R&D recently took the top spot in the 2019 ASVspoof Challenge, which evaluates solutions based on their ability to detect fake speech. The company also released a face-based passive liveness detection system earlier in the year.
Despite that apparent overconfidence, 66 percent of the survey respondents were worried that deepfake tech could be used to impersonate their voice in an account takeover attack. Somewhere between a quarter and a third would be willing to use a chatbot or a smart speaker to access their account information if they were sure the platform was secure.
December 19, 2019 – by Eric Weiss