AI Voice Cloning Is Coming to Your Phone—Here's Why You Need to Be Careful

Scammers could use the sound of your voice to fool people

  • Samsung phones are getting an upgrade that will let you clone your voice. 
  • Voice cloning can be used to help people with speech disabilities. 
  • The FBI warns that scammers are using AI voice clones to trick people.
[Image: Someone adjusting voice data on a computer while using headphones to hear the results. Kelly Sikkema / Unsplash]

Get ready for your phone to sound just like you, even when you're not the one doing the talking. 

Samsung is tweaking its voice assistant, Bixby, to clone a user's voice to answer calls. It's part of a larger movement toward creating artificial intelligence (AI) voice clones that could lead to authenticity problems.

"The biggest red flag is that with any deep fake, it is hard to tell what's real and what's not," Rijul Gupta, the CEO of the AI platform company DeepMedia told Lifewire in an email interview. 

Your AI Voice Clone

Samsung's Bixby upgrade now lets English speakers answer calls by typing a message, which Bixby converts to audio and relays directly to the caller on their behalf. You can also use the new Bixby Custom Voice Creator to record a set of sentences that Bixby analyzes to create an AI-generated copy of your voice and tone.

AI voice cloning, when done ethically, can be helpful in many ways, especially in the entertainment industry, Gupta said. Studios can save hundreds of thousands of dollars because they no longer need to pay actors and sound technicians. Until recently, cloning a voice required a large dataset of recorded speech, but the technology is evolving so quickly that a few minutes of audio is now enough.

Voice cloning technology can also be used to create custom synthetic voices for individuals with speech disabilities, Gupta said. "This can help them communicate more effectively and express themselves with a voice that is uniquely their own," he added.

For example, many patients with laryngeal cancer undergo removal of the larynx, a procedure that costs them their natural voice. Voice cloning technology can enhance an artificial larynx to make patients sound more like themselves.

The Dark Side of Voice Cloning

Despite its useful applications, voice cloning is fraught with potential problems. Mohamed Lazzouni, the chief technology officer of the biometric software company Aware, said in an email that voice cloning technology raises issues of privacy and consent. 

"The use of the replica of someone's voice without their permission or the exploration of the voice for malicious activities is a serious violation of identity and privacy," he added. "One needs to be cautious with this promising technology; although it has made significant strides, it remains a technology in the early stages and still under development. As such, the accuracy and quality have not fully matured. The most significant liability of voice cloning is legal. Voice cloning could be used to defame, deceive, or incriminate people."

Voice cloning is also becoming a misinformation tool available to almost anyone. ElevenLabs' voice synthesis platform lets users generate realistic audio of any person's voice by uploading a few minutes of audio samples and typing in text for it to say.
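
To illustrate how low the barrier to entry has become, here is a minimal sketch of what calling a cloud voice-cloning service from a short script might look like. The endpoint, request fields, and API key below are hypothetical placeholders for illustration only, not ElevenLabs' actual API.

```python
import requests  # third-party HTTP library: pip install requests

# Hypothetical voice-cloning service; the URL, fields, and key below
# are illustrative placeholders, not any real provider's API.
API_URL = "https://api.example-voice-service.com/v1/clone-and-speak"
API_KEY = "your-api-key-here"

def synthesize(text: str, voice_sample_path: str) -> bytes:
    """Upload a short voice sample plus a script; get back cloned audio."""
    with open(voice_sample_path, "rb") as sample:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"voice_sample": sample},  # a few minutes of recorded speech
            data={"text": text},             # the words the cloned voice will say
            timeout=60,
        )
    response.raise_for_status()
    return response.content  # raw audio bytes, e.g., an MP3 file

# A handful of lines is all it would take to produce convincing fake audio.
audio = synthesize("Hello, it's me. I need your help.", "voice_sample.wav")
with open("cloned_output.mp3", "wb") as f:
    f.write(audio)
```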

Voice cloning tech is already being used to promote hate. In a recent news report, President Joe Biden discussed tanks. A doctored version of that video uses AI voice cloning to make it appear he gave a speech attacking transgender people, and it amassed hundreds of thousands of views on social media.

[Image: Someone recording their voice on a phone while wearing headphones. Soundtrap / Unsplash]

"Because voice cloning is accurate and convincing, we're seeing a lot more created simply for political gain," Gupta said. "Anyone could take a political figure like President Biden or [Vladimir] Putin and make them say anything—this only adds fuel to the misinformation fire. For example, hearing a world leader attack a marginalized group during a presumed speech could lead to real-world harm."

Scammers are also using AI voice cloning to call vulnerable people and trick them into transferring large amounts of money into the scammers' accounts, Gupta noted. The FBI has warned that criminals are using voice cloning to mimic your voice or a loved one's, pretending to be kidnapped or in trouble, to scam you out of money.

"These hackers have the ability to clone anyone's voice in a matter of minutes, so they can sound exactly like a family member crying out for help, forcing you into a dangerous situation," Gupta said. "In the heat of the moment, how could anyone question the validity of that call when the deep fake is just that good?"
