New phone scam using AI-generated voices

Artificial intelligence (AI) can make our lives easier in many ways. But technological advances also bring risks. A new phone scam relies on AI-generated voices that are deceptively similar to those of real people. Criminals in the USA, posing as relatives in distress, have already deceived numerous unsuspecting victims.

One reason scams via SMS or WhatsApp succeed so often is that almost anyone could be behind a text message. The same goes for the so-called grandchild trick, in which scammers pose in writing as close relatives. Older people in particular, whose memory may be fading and who often struggle with technology, are relatively easy to unsettle. If such a person receives a message from a supposed grandson who claims to be in an emergency, the wallet comes out quickly. Telephone fraud, on the other hand, can fail because the victim notices that the voice of the alleged relative sounds unfamiliar. It is precisely this hurdle that criminals can now overcome, with the help of AI-generated voices.

Voice imitation with AI – how does it work?

AI refers to the imitation of human intelligence by machines. Trained on data fed into them, AI programs can answer questions and draw conclusions of their own. Widespread applications include the recommendation systems of Netflix and Co., which are based on users' streaming behavior, and the face and voice recognition features of smartphones. A Colombian judge recently delivered a ruling that he had drafted with the AI chatbot ChatGPT.
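To make the idea of a recommendation system a little more concrete, here is a minimal sketch in Python. All names and numbers in it are invented for illustration; it shows one simple form of collaborative filtering, in which a service suggests titles watched by the user with the most similar viewing history.

```python
import numpy as np

# Toy example of a similarity-based recommender (all data invented).
# Rows: users, columns: titles; values: hours watched.
titles = ["Drama A", "Thriller B", "Comedy C", "Documentary D"]
watch_hours = np.array([
    [5.0, 0.0, 3.0, 0.0],   # user 0
    [4.0, 1.0, 2.0, 0.0],   # user 1 (similar taste to user 0)
    [0.0, 6.0, 0.0, 4.0],   # user 2
])

def recommend(user: int) -> list[str]:
    """Suggest titles the most similar other user watched but `user` has not."""
    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
    others = [u for u in range(len(watch_hours)) if u != user]
    nearest = max(others, key=lambda u: cosine(watch_hours[user], watch_hours[u]))
    return [titles[t] for t in range(len(titles))
            if watch_hours[nearest, t] > 0 and watch_hours[user, t] == 0]

print(recommend(0))  # -> ['Thriller B']
```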

AI is now capable of internalizing and faithfully imitating a person's voice from short audio samples. Software developed for this purpose analyzes what makes a voice unique, according to a recent article in the daily newspaper "Washington Post". It also takes into account the speaker's estimated age and gender as well as any peculiarities (e.g. speech impediments or accents). On this basis, the AI selects, from an enormous database of voices, the one that most closely resembles the sample.
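The matching step described above can be illustrated with a short, hedged Python sketch. It assumes voices have already been reduced to fixed-length embedding vectors; the vectors and database below are invented placeholders rather than any real product's data, and cosine similarity stands in as one plausible distance measure.

```python
import numpy as np

# Illustrative sketch only: real systems derive embeddings with trained
# speaker-encoder networks; the vectors below are invented placeholders.
voice_database = {
    "voice_0012": np.array([0.9, 0.1, 0.3]),
    "voice_0345": np.array([0.2, 0.8, 0.5]),
    "voice_0789": np.array([0.4, 0.4, 0.4]),
}

def closest_voice(sample_embedding: np.ndarray) -> str:
    """Return the ID of the stored voice most similar to the sample embedding."""
    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
    return max(voice_database,
               key=lambda vid: cosine(sample_embedding, voice_database[vid]))

# A short audio sample, already embedded (hypothetical values):
sample = np.array([0.85, 0.15, 0.25])
print(closest_voice(sample))  # -> "voice_0012"
```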

Numerous fraud cases in the USA

As the newspaper further reports, fraudulent calls with AI-generated voices are currently on the rise in the USA. It describes the case of the Cards, a couple aged 73 and 75. They received a call from their supposed grandson Brandon, allegedly in jail and urgently needing bail money. The frightened grandparents are said to have visited several banks and withdrawn as much money as possible from their accounts. In this case, the manager of one of the banks was able to prevent worse: he had already heard the same far-fetched story from other customers and recognized it as a scam. Another elderly couple, by contrast, transferred the roughly 15,000 US dollars demanded by the scammer, who posed as their son, without hesitation.

Such individual cases add up to a remarkable big picture: last year, fraudulent calls of this kind are said to have caused financial losses totaling around 11 million US dollars in the USA.


Protect your voice from imitation attempts

Corresponding cases are not yet known in Germany. But it is probably only a matter of time before the scam with AI-generated voices finds its way here as well. So if you don't want to become an unwitting accomplice, you should be careful, for example when you receive a call from an unknown or suppressed number: voice-imitation software needs only very short samples (e.g. a "Hello, who is this?") to generate a realistic imitation. Users of TikTok and Co. who share their voice in reels or videos are also a target for scammers.
