Cybercriminals have begun harvesting voice samples from Russians by any means available. They then use the recordings for illegal activities, which can cause trouble both for the owner of the voice and for his friends, acquaintances, and relatives.
Silence means security
Russians have been hit by a wave of offers to earn money with their voices – for example, the Web is full of job ads for voicing commercials. As representatives of the information security company Angara Security told CNews, these ads are often posted by dishonest actors: they collect samples of Russians' voices, which are then "fed" to neural networks as training data. Not infrequently, these samples then become the basis for generating so-called "deepfakes".
A deepfake is a highly convincing fake copy of something. The term spread with the development of neural networks, but at first it applied only to video: artificial intelligence made it possible to create visual digital copies of people, which came to be used in advertising, fraud, and pornography. There are many cases in which scammers blackmailed victims with pornographic deepfakes generated from their likenesses.
With voice, the scheme is roughly the same, except that here the victims are most often not the owners of the voices themselves but their friends, acquaintances, and relatives. A voice deepfake can be used, for example, to extort money.
It's just getting started
Experts at the information security company Angara Security have drawn attention to the problem of protecting Russians' voices. They told CNews that ads recruiting voice actors have begun appearing on the largest job platforms, as well as on Telegram. Some "old-school" scammers still rely on phone calls: they call complete strangers and offer them a job. However, there is no guarantee that even that phone conversation is not being recorded, later to be turned into a virtual copy of the voice.
The Angara Security report says that scammers have been interested in Russians' voice data for several years, and over time that interest has only grown. In 2021, analysts identified about 1,200 messages offering to "work with your voice", and that does not even count spam calls. A year later there were 4,800 of them, a fourfold increase. The figure for 2023 is 7,000 ads.
Almost free voice
Fraudsters, trying to automate the collection of the data they need as much as possible, have decided to shift the work onto the Russians themselves. In the text of their ads, they ask potential victims to send an audio recording of their voice, whether a recording of a conversation or of a phone call.
These ads promise no fabulous fees for the work – most often they offer from 300 to 5,000 rubles. But, as Angara Security notes, those who decided to earn some extra money and shared voice samples with strangers often do still receive the promised fee.
Banks are wary
After obtaining a sample of a Russian citizen's voice, fraudsters can use it to call his friends, colleagues, and relatives. They can also contact, on his behalf, the banks where he holds an account.
Note that the job-ad method is a rather elegant way to obtain voice data, but even more often scammers take a cruder approach: they simply hack Russians' profiles on social networks and messengers, fish out voice messages, and build deepfakes from them. Sberbank's information security experts spoke about this method of voice theft back in early 2024.
Sberbank does not publish an estimate of the potential or actual damage done to Russians through the theft and use of their voices. Experts say that so far such cases are not very numerous, but, apparently, the key words here are "so far".
Looking at fraud against Russians overall, by the end of 2023 fraudsters had stolen 19 billion rubles from them, according to statistics cited by Stanislav Kuznetsov, Deputy Chairman of the Board of Sberbank.
YouTube as the main assistant of scammers
Experts interviewed by Vedomosti point to another potential source of Russians' voice samples: scammers can extract them from videos posted on social networks.
In other words, video messages posted on users' personal pages, as well as videos uploaded to their YouTube channels and its Russian counterparts – all of this can be very useful to scammers. Once they have access to such videos, they do not even have to spend time posting job ads and waiting for replies. They probably will not need to hack any accounts either: as a rule, videos and video messages sit in the public domain.
++++
Using a neural network, fraudsters faked the voice of the general director of a Moscow fitness club and asked the cashier to hand over all the day's proceeds to a courier. She emptied the till and gave everything away.
According to SHOT, this next-level theft occurred at one of the capital's clubs in the Territory of Fitness chain. A man called the 20-year-old administrator-cashier, introduced himself as General Director Alexander Kalmykov, and asked her to put 155,000 rubles in an envelope and hand it to a courier who would arrive shortly. The girl passed the phone to a colleague, who confirmed that it was Kalmykov's voice. Half an hour later, an Asian-looking courier came to the reception desk, took the money, and left.
The administrator found all this suspicious and wrote to the work chat, where she was told that the director was on vacation and that no one had ordered any money handed over. Realizing she had fallen victim to cyber fraudsters who had faked Kalmykov's voice with a neural network, she went to the police. An investigation is underway.
