Scammers pose as company director using deep voice tech to pull off heist in UAE
17 Oct, 2021 / 05:49 am / OMNES Media LLC

Source: https://me.mashable.com/

The robbers swindled $35 million by placing a call to the bank on behalf of the firm.

The surge in online activity, and the higher volumes of data changing hands that come with it, has created a cybersecurity nightmare for authorities tackling the evolving tactics of cybercrooks. Phishing attacks are often carried out via calls, texts and emails, in which cybercriminals pose as bank representatives to trick people into revealing personal information that can later be used to rob them.

Deep-fake tech is another addition to the list of digital tools menacing online safety mechanisms, since it is being used to create morphed videos and images. But an investigation into a major heist that took place in the UAE last year has revealed that deep voice is a tool with far more damaging potential, while being easier to use than deep-fake video.

According to details from authorities, the manager of a bank in the Emirates received a call, supposedly from a firm's director, requesting a transfer of $35 million for an acquisition. The manager quickly went ahead with the transaction, since the voice on the other end was clearly that of a client he recognised.

But it was later found that the company director never made that call; instead, a gang of cybercriminals had cloned his speech using deep voice tech. About $400,000 of the stolen money was traced to accounts based in the US, and investigators say that 17 people involved in the robbery distributed the amount across the globe.

This is only the second known instance of a heist being pulled off via deep voice, and it was far more lucrative than the first: a UK attempt involving $240,000, in which the robbers posed as the CEO of a British energy firm.

Such attacks are also a bigger threat because modifying speech is easier than creating deep-fake videos, and awareness of the technique is lacking. The situation is grave enough that earlier this year, Emirati authorities had to issue a notification warning people about the use of these tools in fraudulent schemes.