Deepfake attack: scammers clone faces and empty Russians' wallets

The Central Bank has issued recommendations on how to protect yourself from digital twins.

The Central Bank of Russia has warned about increased activity by fraudsters using deepfake technology. Attackers use neural networks to generate fake videos that convincingly imitate real people, drawing on photos, videos, and voice recordings obtained by hacking accounts on social networks and in messengers.

In such schemes, the cloned video is sent to the victim's friends and relatives with a request for financial help. The videos often contain fabricated stories about an illness, an accident, or a job loss, urging recipients to transfer money to a specified account. In some cases, scammers create deepfakes of employers, government officials, or celebrities to further confuse their victims.

To protect yourself from such scams, the Central Bank recommends following a few simple rules. If you receive a message asking for a money transfer, do not rush to comply. First, call the person on whose behalf the request was supposedly sent to confirm that it is genuine. If you cannot reach them by phone, ask a personal question that only the real person would know the answer to. Also pay attention to the quality of the video: monotonous speech, unnatural facial expressions, and odd sounds can be signs of a deepfake.

At the end of May 2024, a bill was introduced in the State Duma that would establish criminal liability for deepfakes that use a person's voice or image.

Depending on the article, offenders would face a fine of up to 1.5 million rubles or the equivalent of their income over a period of up to two years, or imprisonment for up to seven years.

Yaroslav Nilov, head of the Committee on Labor, Social Policy and Veterans Affairs, explained that the initiative is aimed at establishing tougher liability for the use of deepfake technologies, which are already actively exploited for fraud, as was demonstrated during the president's Direct Line last year.

Source