More customers are asking for digital services that are as convenient as they are contactless. In 2021, for example, as many as 41% of US retail banking customers said they only used digital channels to interact with their banks.
Along with this opportunity, however, come a number of risks. Security and fraud prevention remain ongoing challenges in an expanding digital landscape: last year, 57% of companies saw a surge in fraud attacks. One of the most advanced ways to fight fraud is through voice biometric solutions. These use voiceprints—as unique as a person's fingerprint—to automatically identify someone and authenticate their identity.
Done well, voice biometry offers a fast and friction-free alternative to traditional authentication methods like personal passwords, one-time passwords, PINs and tokens. There is no need for people to remember long strings of numbers and letters, or to go through a series of identity verification questions with a service rep. All they need is their voice.
As no two voices are identical in pattern, rhythm and speed, a voiceprint is difficult to forge. And by the nature of the technology, it is also impossible to transfer. Voice biometric solutions are therefore a secure and reliable way to positively identify and authenticate individuals—or flag potential criminals.
But there is a crucial caveat: biometric traits belong in the category of sensitive personal information. That means their use falls under the provisions of various privacy and data protection laws, both in the US and abroad. Companies must be confident that the methods used to collect, store and process voiceprints are fully compliant with all relevant legislation that protects their users. Part of that process involves ensuring customers provide informed biometric consent before any of their data is collected.
How to avoid compliance failures
A biometric data privacy suit was recently filed implicating the voice biometrics company Pindrop. According to the plaintiffs, informed consent requirements were not followed when voiceprints were collected, stored, and used to authenticate customers for Amazon Web Services—conduct the plaintiffs claim violates the Illinois Biometric Information Privacy Act (BIPA).
For any company that wants to follow best practices and ensure watertight compliance in the area of biometric privacy, one of the most important steps is choosing a trustworthy voice biometrics technology partner.
Search Google for "voice biometrics" and a selection of providers comes up. The ideal voice biometry partner is one that can offer both expertise and accountability in the field of biometric data privacy. They should also store data in a way that decouples Personally Identifiable Information (PII) from the voiceprint. This means a consumer's voiceprint is matched to a string of 0s and 1s—numbers that would mean nothing to a malicious actor.
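To make the idea concrete, here is a minimal sketch of what decoupled storage can look like. This is a hypothetical illustration, not any vendor's actual implementation: the biometric template is stored under a random opaque token, while the record linking that token to a customer's PII lives in a separate store, so neither store alone reveals whose voiceprint is whose.

```python
import secrets

class VoiceprintVault:
    """Holds only anonymous templates: raw bytes keyed by a random token."""
    def __init__(self):
        self._templates = {}  # token -> voiceprint bytes

    def enroll(self, voiceprint: bytes) -> str:
        # The token is opaque and reveals nothing about the user.
        token = secrets.token_hex(16)
        self._templates[token] = voiceprint
        return token

    def matches(self, token: str, candidate: bytes) -> bool:
        # Real systems score similarity between voice embeddings;
        # exact byte equality stands in for that here.
        return self._templates.get(token) == candidate

class CustomerDirectory:
    """Separate store mapping customer PII to vault tokens."""
    def __init__(self):
        self._tokens = {}  # customer_id -> vault token

    def link(self, customer_id: str, token: str) -> None:
        self._tokens[customer_id] = token

    def token_for(self, customer_id: str) -> str:
        return self._tokens[customer_id]

# Enrollment: PII and biometric data never sit in the same record.
vault = VoiceprintVault()
directory = CustomerDirectory()
sample = b"\x01\x02\x03"  # placeholder for a real voice embedding
directory.link("alice@example.com", vault.enroll(sample))

# Authentication: look up the token, then match against the vault.
token = directory.token_for("alice@example.com")
assert vault.matches(token, sample)
```

An attacker who obtains only the vault sees tokens and templates with no names attached; an attacker who obtains only the directory sees names and meaningless tokens. Compromising either store alone yields nothing usable.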
How LumenVox meets your security and compliance needs
Since LumenVox is headquartered in California—home to the biggest names in customer data, such as Facebook and Google—and operates internationally, we understand that customers demand privacy. Stringent laws such as the California Consumer Privacy Act and the GDPR in the EU now require that as a company's technology evolves, so must its strategy for correctly collecting, handling and storing sensitive consumer data.
The Pindrop scenario is exactly what LumenVox works hard to avoid. For any biometric modality, informed consent is key. LumenVox advises our customers and partners to include a clear information flow that gains informed consent for both active and passive voice biometrics.
We are a company that always takes a collaborative approach. Education, training, and guidance are three things we pride ourselves on as we work closely with our clients on voice biometric deployments. Get in touch to discuss your options and learn more about our products.