The biggest deepfake scam of all time? 24 million euros were taken from a Hong Kong company in an incredible way

The employee did not realize that the person giving the instructions was not the company's CFO. As a result, he transferred almost 24 million euros of the company's money to criminals.

An international company headquartered in Hong Kong has fallen victim to an exceptionally sophisticated deepfake scam, according to the South China Morning Post, via Ars Technica.

The authorities have not revealed the name of the company, but the fraudsters' haul was significant.

The scam used AI-generated deepfake content in an unprecedented way. An employee in the company's finance department received an email invitation to a video conference, where he was met by the company's CFO and other employees, or so he thought.

Secret mission

During the video conference, the employee was instructed to transfer the company's funds to certain bank accounts, and he carried out the work as ordered. He made a total of 15 transfers to five different Hong Kong bank accounts. The total amounted to no less than 200 million Hong Kong dollars, or just under 24 million euros.

The scam only came to light a week later. The investigation revealed that the other participants in the video conference had been created by artificial intelligence and were not real people. Deepfake scams combine generated video and audio, which makes it possible to have practically anyone appear to say anything on video.

According to the employee who made the transfers, he had found the requests suspicious because he had been told to carry them out in secret. However, the video call was convincing, and he did not doubt the other party's authenticity.

Baron Chan Shun-ching, Inspector General of the Hong Kong Police, says he has never come across such a sophisticated deepfake scam before.

Here’s how to spot a deepfake

After the incident, the police have published guidance aimed at businesses for verifying that the participants in a remote meeting are real people.

Conversation partners can, for example, be asked to move in a certain way or to answer a question that a scammer would not be able to answer.

The police have also recommended that companies introduce personal identification keys that employees can use to verify their identity in video conferences.
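The article does not specify how such identification keys would work; one common way to implement the idea is a challenge-response check with a pre-shared secret, where a participant proves they hold the secret without ever revealing it on the call. The sketch below is purely illustrative (the function names and the key are invented for this example, not taken from any police guidance):

```python
import hashlib
import hmac
import secrets

# Illustrative sketch only: each employee and the company share a secret key,
# exchanged in advance through a separate trusted channel. During a video call,
# the verifier issues a random challenge; the other party returns an HMAC of it
# computed with the shared key, proving identity without exposing the key.

def issue_challenge() -> str:
    """Generate a fresh random challenge for this call."""
    return secrets.token_hex(16)

def respond(shared_key: bytes, challenge: str) -> str:
    """Prove knowledge of the shared key by keying an HMAC over the challenge."""
    return hmac.new(shared_key, challenge.encode(), hashlib.sha256).hexdigest()

def verify(shared_key: bytes, challenge: str, response: str) -> bool:
    """Check the response with a constant-time comparison."""
    expected = respond(shared_key, challenge)
    return hmac.compare_digest(expected, response)

# Example exchange (key shared out of band beforehand):
key = b"pre-shared-employee-secret"          # hypothetical example key
challenge = issue_challenge()                 # verifier sends this in the call
response = respond(key, challenge)            # genuine employee answers
print(verify(key, challenge, response))       # True only for the real key holder
```

Because the challenge is random each time, a deepfake operator who has merely watched previous calls cannot replay an old answer; only someone holding the pre-shared key can produce a valid response.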

Source: Ars Technica
