Beware of deepfakes: how to spot a scammer on video

Cybercriminals are constantly inventing new ways to deceive citizens. For example, the number of cases of fraud using deepfake technology is growing. Deepfakes are artificial intelligence-generated voice and video calls from supposed colleagues, friends, and relatives. These calls typically ask for an urgent transfer of money to a specified bank card, account, or phone number.
Until recently, cybercriminals made phone calls exclusively. Using psychological manipulation, they persuaded citizens to take actions that would result in their money ending up in the scammers' accounts.
Banks have identified a list of threats and developed methodological recommendations for combating fraud. Starting in June 2024, payment system operators and banks are required to report stolen funds, which should help combat transfers made without clients' consent.
Mobile operators and banks joined forces to monitor and block such phone calls, and after that, criminals began calling via messaging apps. The conversation pattern remains virtually unchanged: scammers pose as bank or law enforcement employees and continue to attempt to manipulate people. However, now potential victims see the name and logo of the bank or the police on their screens, and even receive fake copies of documents.
In one such case, the fraudsters withdrew money from a victim's accounts over the course of a month; the total losses exceeded $500,000.
Furthermore, criminals have learned to send voice messages impersonating relatives and friends, and even to make video calls posing as loved ones, bank staff, or police officers. This is made possible by artificial intelligence and the deepfake technology built on it, which can generate a video of someone you know speaking in a familiar voice, or a voice message imitating a loved one.

How Deepfakes Can Be Used to Deceive​

Until recently, using deepfakes was too costly for scammers, and such deceptions were rare. The first significant incident occurred in 2019.
The director of a British energy company transferred $243,000 to a fraudulent account after receiving a call from "his boss." The scammers imitated the boss's voice so accurately that the victim believed they were carrying out his instructions.
Today, it's easy to find software online that can generate deepfake video or audio files. And the number of fraud cases using this technology is growing.
In early 2024, a video surfaced on the internet showing a woman answering a call on a messenger. She hears her son's voice, who sends her a phone number and asks her to transfer 1,000 rubles to it for a gift for a friend. Meanwhile, her real son is sitting next to her.

The US Federal Trade Commission (FTC) reported that in 2022, fraudsters used voice-imitation technology to steal approximately $11 million from US citizens.

In May 2023, a Chinese businessman received a video call from a man who looked and sounded like a close friend. He asked for $610,000 to pay for tender guarantees. The deepfake image was so convincing that the businessman complied.
The bank's cybersecurity team has noted an increase in video calls made via messaging apps by people impersonating bank employees. The scammers do not reveal their faces, however, using deepfake technology to disguise themselves. For example, after analyzing a video shared by a bank client, cybersecurity specialists discovered that the scammers had superimposed an image of the actor Keanu Reeves.
A high-profile incident occurred in Hong Kong in 2024. An employee of a multinational corporation, after a video call with the CFO and other employees generated using a deepfake, transferred $25 million to the scammers. The fraud was only discovered a week later, when the employee called head office.
Angara Security experts reported that scammers are increasingly posting ads on Telegram offering paid voiceovers for commercials and films. They ask users to call or send a recording via private message or bot. Participants are offered up to $100 for participating. After obtaining voice samples, the scammers generate an audio message, hack the victim's account, and send requests for money transfers to their friends and family.

Other deepfake scams​

Deepfake technology is evolving and improving, meaning scammers have more and more opportunities to exploit it. Besides messaging app calls, other types of crimes are also gaining popularity.
For example, even in the early days of this technology, it was used to create pornographic videos featuring famous people. And in June 2023, the US Federal Bureau of Investigation's Internet Crime Complaint Center (IC3) warned that cybercriminals were increasingly using deepfakes to generate sexually explicit content for blackmail purposes. They typically demand ransom, threatening to post the content online or send it to the victim's relatives and colleagues.
Fraudsters use deepfake technology to create advertisements enticing people to participate in prize draws or make profitable investments. For example, in 2021, a video appeared online featuring Oleg Tinkov promising eye-popping bonuses to those who opened an investment account with the bank. The video turned out to be a poorly crafted deepfake.

Another example: scammers cloned the voice of a famous rapper, created several new songs with it, and put them up for sale as demo recordings for a new album. The result: defrauded fans transferred $13,000.

How to spot a deepfake​

Today, numerous open-source programs are available for creating simple deepfake audio or video, along with more advanced applications that, once trained, can generate highly realistic images.
At the same time, deepfake detection tools are also being developed. However, these tools need a recorded audio or video file to analyze.

But what should you do if you're on a video call? How can you tell if you're seeing a real person or a deepfake? First, ask the person you're talking to to sit facing the light so you can clearly see their facial expressions.

Cybersecurity experts recommend paying attention to the following details:​

  • Blurred or unfocused facial image in video
  • Lack of facial expressions during conversation (free deepfake creation tools often incorrectly display the facial expressions of the person speaking)
  • Unnatural movements (blinking, or eyebrow and lip motion that doesn't match the speech)
  • Most often, only the face is replaced in deepfakes, so you can see the boundaries of the superimposed image (differences in shadows, lighting, and skin tone)
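The first cue on this list, a blurred or unfocused face, can be roughly quantified in software. A common heuristic is the variance of the Laplacian: a sharp frame has lots of high-frequency edge detail and a high variance, while a blurry or smoothed-over face produces a low one. Below is a minimal, numpy-only sketch of that heuristic; the function names and the threshold value are illustrative assumptions, not part of any real detection product, and a serious detector would combine many such signals.

```python
import numpy as np

# 3x3 Laplacian kernel: responds strongly to edges and fine detail
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def laplacian_variance(gray):
    """Variance of the Laplacian response of a 2-D grayscale frame.

    Low values suggest a blurry or overly smooth image.
    """
    h, w = gray.shape
    out = np.zeros((h - 2, w - 2))
    # Manual 3x3 convolution (valid region only), to stay dependency-free
    for dy in range(3):
        for dx in range(3):
            out += LAPLACIAN[dy, dx] * gray[dy:dy + h - 2, dx:dx + w - 2]
    return out.var()

def looks_blurry(gray, threshold=100.0):
    """Illustrative threshold; real systems would calibrate this per camera."""
    return laplacian_variance(gray) < threshold

# Synthetic demo: a high-detail checkerboard vs. a flat, featureless frame
sharp = (np.indices((64, 64)).sum(axis=0) % 2) * 255.0
flat = np.full((64, 64), 128.0)
```

On the synthetic frames above, the checkerboard yields a large variance and the flat frame yields zero, so only the flat frame is flagged as blurry. In practice you would first crop the face region from a video frame and convert it to grayscale before applying such a check.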

Cybersecurity expert commentary
If you're receiving a video call from a friend asking for a large loan, or if your boss is urgently requesting a transfer to a new client, ask the person to turn their head 90 degrees. If it's a deepfake, their face will "float."

You can also ask the person to take a sip from their cup, fix their hair, remove their glasses, or simply wave their hand in front of their face. Typically, in such situations, the deepfake image will begin to malfunction, and you'll notice.
Recognizing a deepfake over an audio call is more difficult. Experts recommend paying attention to intonation and the use of distinctive words. A monotone, emotionless voice should alert you. However, these signs are only noticeable in longer messages — it's much harder to spot inconsistencies in a short phrase.
But the surest way to avoid falling into the hands of deepfake scammers is vigilance.
Please remember that bank and law enforcement officials never make video calls via messengers.
Before you start communicating with someone remotely, let alone taking any action, verify that they are who they say they are. Call them back on the phone and verify that it is really them communicating with you via messenger.

(c) Source
 