Fraudsters cloned the voice of a company director in the UAE and used it to rob a bank, stealing $35 million.
In early 2020, a bank manager received a call from a man he believed to be the company's director. The caller said he was about to complete a major transaction and needed the bank to transfer $35 million.
A lawyer named Martin Zellner had supposedly been hired to coordinate the deal, and the bank manager began receiving emails in Zellner's name and in the director's name confirming that the money was needed for the transaction. The manager transferred the funds.
As it turned out later, the scammers had used deepfake technology to clone the director's voice. UAE authorities asked American investigators for help tracking $400,000 of the stolen money that had been transferred to US accounts held at Centennial Bank. Law enforcement officials believe it was an elaborate scheme involving at least 17 people, who dispersed the stolen money to bank accounts around the world.
Before this, the most high-profile case of voice-deepfake fraud was the theft of $240,000 from a British energy company in 2019, when fraudsters impersonated the company's German director.
“Deepfakes are an exciting development in 21st century technology, but they are also potentially incredibly dangerous, posing a huge threat to data, money and businesses,” says Jake Moore, a former officer with Dorset Police in the UK and a cybersecurity expert at ESET. “Easier to orchestrate than deepfake videos, audio manipulation is only set to increase in volume, and without awareness of this new type of attack vector, more businesses are likely to fall victim.”
Some cybersecurity companies, such as Pindrop, already offer a service to detect voice deepfakes.
Source