There have been numerous news reports of scammers using AI voice cloning to trick people into thinking that their kid has been kidnapped:
- An Arizona mom claims that scammers used AI to clone her daughter's voice so they could demand a $1 million ransom from her as part of a terrifying new voice scheme.
- The FTC is sounding the alarm on artificial intelligence being used to simulate someone's voice in imposter scams, which was the most commonly reported fraud in 2022. NBC News' Emilie Ikeda spoke to one father who got a call that sounded like his daughter and said she was being held hostage.
- Scammers are using artificial intelligence to sound more like family members in distress. People are falling for it and losing thousands of dollars.
I'm skeptical: not that people are getting scammed, but that AI voice generation is actually involved. It seems like a lot of trouble to get hold of audio of someone's kid and feed it into an AI model (in "terrified kidnapped child" mode?) when you could just play a generic terrified-kid soundbite and rely on the power of suggestion. None of the sources I've found present any actual evidence that AI was used; they just repeat the victims' claims that the voice on the phone absolutely sounded like their child.
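For what it's worth, the cloning step itself wouldn't be the bottleneck. As a rough sketch of what it looks like with an off-the-shelf open-source model (here Coqui's XTTS v2; the file names are made up for illustration), a few seconds of reference audio is all these tools claim to need:

```python
# Sketch: zero-shot voice cloning with the open-source Coqui TTS library
# (XTTS v2). "reference.wav" is a hypothetical short clip of the target
# speaker; a few seconds of clean speech is the advertised requirement.
from TTS.api import TTS

# Download and load the multilingual XTTS v2 model.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize arbitrary text in (roughly) the reference speaker's voice.
tts.tts_to_file(
    text="This is a test of voice cloning.",
    speaker_wav="reference.wav",  # hypothetical reference clip
    language="en",
    file_path="cloned.wav",
)
```

If anything, that supports the simpler explanation: running the model is the easy part, but a scammer still has to track down a usable clip of one specific kid, while a generic scream plus a panicked parent gets the same result for free.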
Even the FTC is claiming this is a thing, but I wonder whether they have actual evidence or are just echoing the news reports. No one's ever going to get in trouble for telling people to watch out for scams, after all.
So: are scammers really using AI voice cloning to fake kidnappings? Show me the evidence.