You’ve no doubt heard of the scam in which the perpetrator calls up an elderly person and pretends to be their grandchild or another relative. The usual routine is to sound distressed, claim to be in a difficult situation, and request an urgent cash transfer to resolve it. While many grandparents will realize the voice isn’t their grandchild’s and hang up, others won’t notice and, all too eager to help their anxious relative, go ahead and send money to the caller’s account.
A report from the Washington Post on Sunday revealed that some scammers have taken the deception to a whole new level by deploying artificial intelligence technology capable of reproducing voices, making it more likely that a target will fall for the ruse.

To launch this more sophisticated version of the scam, criminals require “a voice sample of just a few sentences,” according to the paper. The sample is then run through one of the many online tools that use the original voice to create a replica, which can be made to say whatever the scammer wants simply by typing in the phrases.
Data from the Federal Trade Commission indicates that in 2022 alone there were more than 36,000 reports of this kind of impostor fraud, with more than 5,000 of them carried out over the phone. Reported losses amounted to $11 million.
The fear is that as AI tools become more effective and more widely available, more people will fall for the scam in the coming months and years.
However, the scam still requires some planning, as a determined perpetrator needs to find an audio sample of the relative’s voice, as well as the victim’s phone number. Voice samples can be found online via popular sites like TikTok and YouTube, while phone numbers are also readily available on the web.
The scam can take many forms, too. The Post cites an example in which someone posing as a lawyer called an elderly couple, telling them that their grandson was being held for an alleged crime and that more than $15,000 was needed to cover legal costs. The fake lawyer then pretended to hand the phone over to the grandson, whose reproduced voice asked for help paying the fee, which the couple duly sent.
They only realized they had been conned when their grandson called them later that day to chat. It’s believed the fraudster may have reproduced his voice from YouTube videos the grandson had posted, though it’s difficult to be sure.
Some are calling for the companies that make AI voice-reproduction technology to be held responsible for such crimes. But before that happens, it seems certain that many more people will lose money through this nefarious scam.
To hear an example of reproduced audio and judge how close it is to the original, check out this Digital Trends article.