A biometric data privacy suit was recently filed implicating the voice biometrics company Pindrop. According to the plaintiffs, informed consent requirements were not followed when voiceprints were collected, stored, and used to authenticate customers for Amazon Web Services, which the plaintiffs claim violates the Illinois Biometric Information Privacy Act (BIPA).
This scenario is exactly what LumenVox works hard to avoid. Headquartered in California (home to customer data giants such as Facebook and Google) and operating internationally, LumenVox understands that customers demand privacy. Stringent laws such as the California Consumer Privacy Act (CCPA) and the GDPR now require that as a company's technology evolves, so must its strategy for correctly collecting, handling, and storing sensitive consumer data.
When it comes to any biometric modality, informed consent is key. LumenVox advises our customers and partners to build a clear information flow that gains informed consent for both active and passive voice biometrics. As a voice biometrics provider, we pride ourselves on education, training, and guidance as we work closely with our clients on voice biometric deployments.
Biometric solutions have been around for decades, but consumers and businesses understandably still have reservations. What is particularly reassuring about voice biometrics is the way voiceprints are stored: Personally Identifiable Information (PII) is decoupled from the voiceprint itself. A consumer's voiceprint is reduced to a string of 0s and 1s, numbers that would mean nothing to a malicious actor.
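To illustrate the decoupling idea in general terms, here is a minimal sketch of one way a system could keep voiceprint templates separate from PII. All names here (`enroll`, `verify`, the two stores) are illustrative assumptions for this post, not LumenVox's actual API or architecture: the template is held under a random opaque token, and the mapping from customer identity to token lives in a separate store.

```python
import secrets

# Hypothetical sketch: two separate stores, so the template side
# never holds customer identity.
voiceprint_store = {}  # opaque token -> voiceprint template bytes (no PII)
identity_store = {}    # customer ID -> opaque token (PII side only)

def enroll(customer_id: str, template: bytes) -> str:
    """Store a template under a random token, decoupled from PII."""
    token = secrets.token_hex(16)
    voiceprint_store[token] = template
    identity_store[customer_id] = token
    return token

def verify(customer_id: str, candidate: bytes) -> bool:
    """Look up the customer's token, then compare templates.

    A real engine would score acoustic similarity; byte equality
    stands in for that comparison here.
    """
    token = identity_store.get(customer_id)
    if token is None:
        return False
    return voiceprint_store[token] == candidate
```

In a design like this, compromising the voiceprint store alone yields only anonymous numeric templates with no customer names attached, which is the property the paragraph above describes.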
This news inevitably raises questions. Is voice biometrics safe? Yes. Can it be used to protect customer data? Yes. Is informed consent key when using it? Absolutely. At LumenVox, we strive to put customers first, every step of the way.
Click here to learn more about our products and solutions.