How Apple's Personal Voice Feature Could Be Used for Voice Cloning Scams

It could easily become a disinformation tool

  • Apple’s new software will let you clone your voice on iPhones and iPads.
  • The technology is intended to help users who are nonspeaking or at risk of losing that ability.
  • But experts say voice cloning technology can aid fraud and cause confusion. 

Your iPhone and iPad will soon let you clone your own voice, and experts say the new feature could lead to more deep fakes. 

The new Personal Voice feature will create a voice that sounds like the user. It's intended for users who are nonspeaking or at risk of losing that ability. But the technology could also create confusion. 

"This could very quickly become a swamp of deep fakes everywhere," Vinod Iyengar, AI expert and head of product at ThirdAI, told Lifewire in an email interview. "There are many voice biometric-based authentication systems that are going to be in trouble. What if someone uses a voice clone to gain unauthorized access to a bank account? Or uses it to spread misinformation." 

Your Voice, Recreated

You can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on an iPhone or iPad. The feature uses on-device machine learning to keep users' information private. 
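For developers, Apple gates any third-party use of a Personal Voice behind an explicit permission request. The Swift sketch below is a rough illustration of that opt-in flow, assuming the iOS 17 additions to AVSpeechSynthesizer; the PersonalVoiceSpeaker wrapper is a hypothetical name, and it only finds a voice if the user has already created and shared one.

```swift
import AVFoundation

// A sketch of how a third-party iOS 17 app might ask permission to use a
// Personal Voice the user has already created, then speak with it.
// Assumes the device runs iOS 17+ and the user has recorded a Personal
// Voice in Settings > Accessibility.
final class PersonalVoiceSpeaker {
    // Keep a strong reference so speech isn't cut off mid-utterance.
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard status == .authorized else {
                print("Personal Voice access not granted: \(status)")
                return
            }
            // Personal voices appear alongside the system voices,
            // flagged with the .isPersonalVoice trait.
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }
            guard let voice = personalVoice else {
                print("No Personal Voice found on this device")
                return
            }
            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = voice
            self?.synthesizer.speak(utterance)
        }
    }
}
```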

"At the end of the day, the most important thing is being able to communicate with friends and family," Philip Green, who has experienced significant changes to his voice since receiving his ALS diagnosis, said in Apple's news release. "If you can tell them you love them in a voice that sounds like you, it makes all the difference in the world—and being able to create your synthetic voice on your iPhone in just 15 minutes is extraordinary."

A Tool for Voice Cloning Scams?

Voice cloning technology like Personal Voice raises ethical questions regarding consent and the potential misuse of someone's voice without their permission, Tyler Sweatt, the chief revenue officer at the tech company Second Front, said in an email. 

“Voice cloning can be used to create fabricated audio content that appears to be genuine, making it harder to discern between real and fake audio recordings,” he added. 

Voice cloning scams are on the rise. Experts warn the technology can be used for malicious purposes such as identity theft, fraudulent phone calls, and phishing. Criminals use AI to impersonate someone the victim trusts and trick them into handing over money or personal information. 

Scammers can clone the voice of a friend, family member, boss, or authority figure and call the victim with an urgent request for money or help. Alternatively, a scammer could clone the voice of a customer, supplier, or partner and send fake invoices or payment requests by phone or email. In 2021, at least eight senior citizens in Canada lost a combined $200,000 in an apparent voice cloning scam.

The new voice cloning tech on iPhones could lead to unintended consequences, Iyengar said. He pointed out that iPhone calls are often used as evidence in legal cases. 

"That whole area is now a minefield, and pretty soon, courts might not accept voice recordings as evidence since it could have easily been faked,” he added. 

Personal Voice might blur the line between fiction and reality. Iyengar suggested that voice clones could be paired with AI chatbots to act as a "personal stunt double” for everyone. 


The cloned voice could "represent the person and perform tasks on their behalf like making appointments, answering phone calls, and even video recordings that need to be read from a script,” Iyengar said. 

If your voice is cloned and used by someone else, it’s uncertain whether the legal system will protect you. “Right of Publicity” laws in the US give celebrities and other well-known people some control over the use of their name, image, and voice, Heidi McKee, a professor at Miami University, said in an email. And, she noted, it is illegal to impersonate a law enforcement official or a federal employee.

"But do those protections extend to all US citizens and in a way that would be easy for an individual to seek redress?” McKee said. “The law and regulations are not so clear on that. Already we are awash in misinformation and deep fakes, and the use of voice cloning will only expand those problems, particularly when coupled with video fakes as well.”
