Voice cloning is one of the newest additions to the technology landscape, and it comes with significant pros and cons. Rapid progress in voice cloning technology is making it harder to tell real voices from synthetic ones. But while audio deepfakes — which can trick people into giving up sensitive information — are a growing problem, there are some good and legitimate uses for the technology as well, a group of experts told an FTC workshop this week.
“People have been mimicking voices for years, but just in the last few years, the technology has advanced to the point where we can clone voices at scale using a very small audio sample,” said Laura DeMartino, associate director in the FTC’s division of litigation technology and analysis. At its first public workshop on audio cloning technology, the FTC enlisted experts from academia, government, medicine, and entertainment to highlight the implications of the tech and the potential harms.
FTC spokesperson Juliana Gruenwald Henderson said after the workshop that impostor schemes are the number one type of complaint the agency receives. “We began organizing this workshop after learning that machine learning techniques are rapidly improving the quality of voice clones,” she said in an email.
Audio cloning can be weaponized just like the internet can be weaponized, one panelist noted. “That doesn’t mean we shouldn’t use the internet, but there may be things we can do, things on the front end, to bake into the technology to make it harder to weaponize voices.”
For voice actors and performers, the concept of audio cloning presents a different set of problems, including consent and compensation for use of their voices, said Rebecca Damon of the Screen Actors Guild - American Federation of Television and Radio Artists. A voice actor may have contractual obligations around where their voice is heard, or may not want their voice to be used in a way not compatible with their beliefs, she said.
And for broadcast journalists, she added, the misuse or replication of their voices without permission has the potential to affect their credibility. “A lot of times people get excited and rush in with the new technology and then don’t necessarily think through all the applications,” Damon said.
“Social media platforms are the front line, that is where messages are getting conveyed and latched on to and disseminated,” said Neil Johnson, an advisor with the Defense Advanced Research Projects Agency (DARPA).
Patrick Traynor of the Herbert Wertheim College of Engineering at the University of Florida said the sophistication of phone scams and audio deepfakes was likely to continue to improve. “Ultimately, it will be a combination of techniques that will get us there,” he said of efforts to combat and detect synthetic or faked voices.