Calls from the bank are a thing of the past: fraudsters have moved on to bolder schemes powered by artificial intelligence. Let's look at how they use neural networks and how to recognize their tricks.
Deepfake blackmail
Deepfake is fake content created with AI. For video, criminals can copy a person's face from the photos on their social networks, train a neural network on them, and then generate footage of the victim performing obscene or extremist acts, or produce inappropriate photo content.

Usually, scammers threaten to send these materials to relatives, friends, or the police and demand a large sum of money in return. To protect yourself, warn your loved ones that this technology is widespread: if they receive compromising material involving you, they should contact you directly and check whether it is real.
We provided even more examples in one of our articles about deepfakes.

There are already apps that allow the user to undress anyone in a photo
Copying voice
Previously, a criminal would call the victim pretending to be a relative, claim they had been in a traffic accident, and ask for an urgent money transfer, supposedly for treatment or to avoid criminal liability. The unfamiliar voice made the fraud easy to spot.

Now criminals can clone the voice of a loved one and call you on their behalf to extort money.
For example, in one widely shared story, a man received a message from a friend asking to borrow a large sum: his salary had supposedly been delayed and he couldn't cover the monthly payment on his car loan.

The man suspected something was wrong, but the scammer sent a voice message that genuinely sounded like the friend, and the doubts vanished. The money was transferred.

As it turned out later, the real friend had been on a flight from another city the whole time, and that window was all the scammer needed. Agree on a code word with your relatives in case someone calls them using your voice.
Non-existent products on marketplaces
With the help of neural networks, sellers produce entire photo shoots for their products, turn them into product cards, and upload them to marketplaces. Some got so carried away that they generated photos, names, and descriptions of products that don't exist.

Sometimes obviously fake goods even become fashionable, such as seeds that supposedly grow into a flower with a cat's face printed on it, or a brightly colored flower shaped like a bird.


An example of a fake product on the marketplace
Attackers on Amazon often pull this off, but some do it so carelessly that the listings give themselves away: the product card titles still contain ChatGPT's refusal boilerplate, such as "Sorry, but I can't fulfill your request. It violates OpenAI policy."
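The telltale refusal phrase can even be checked for automatically. Below is a minimal sketch of that heuristic; the marker list and function name are my own illustration, not something from any marketplace's tooling:

```python
# Flag marketplace listing titles that contain an AI assistant's refusal
# boilerplate - a sign the card was auto-generated and never reviewed.
# The marker list is illustrative, not exhaustive.
REFUSAL_MARKERS = [
    "sorry, but i can't fulfill your request",
    "as an ai language model",
    "i cannot assist with that",
]

def looks_auto_generated(title: str) -> bool:
    """Return True if the title contains a known AI refusal phrase."""
    lowered = title.lower()
    return any(marker in lowered for marker in REFUSAL_MARKERS)
```

A real filter would need a broader phrase list and fuzzier matching, but even this simple substring check catches the examples that made the news.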
Amazon itself claims that it vets sellers and listings, but that is no guarantee. A seller can accept your order, move the conversation to a messenger, take the money, and then cancel the shipment from their account. The same scheme is common on Wildberries.
Recently, a Russian marketplace reported that scammers are increasingly creating fake seller pages to deceive buyers: after receiving an order, they ask the buyer to switch to a messenger and send a link for payment on a third-party platform. Once the money arrives, the scammers disappear without a trace.
Fake girl
Fraudsters can pretend to be men or women on dating sites. This method of deception has long been known, but its popularity is growing again thanks to the LoveGPT neural network.

The fraudster feeds the victim's messages to the neural network and receives flirtatious replies from the AI. Using these, he gains the person's trust and extracts sensitive information or money.
Another variant: the fraudster generates a photo of a non-existent person, meets the victim online, and sets up a date. He then asks the victim to buy movie tickets, sends a link to a phishing site, and disappears as soon as the money is transferred.
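One simple defense against the payment-link trick is to check that a link's host exactly matches the marketplace's official domain before paying. A minimal sketch, assuming an illustrative allowlist (the domain set and function name are my own, not from the article):

```python
from urllib.parse import urlparse

# Illustrative allowlist of official marketplace hosts. A look-alike
# domain such as "wildberries-secure-pay.example.com" will not match.
TRUSTED_HOSTS = {"www.wildberries.ru", "www.amazon.com"}

def is_trusted_payment_link(url: str) -> bool:
    """Return True only if the URL's host is exactly a known official domain."""
    host = (urlparse(url).hostname or "").lower()
    return host in TRUSTED_HOSTS
```

Exact-match checking matters here: scammers register domains that merely *contain* the brand name, so a substring test like `"wildberries" in url` would wave the phishing link through.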
Fake documents
This is probably the most brazen method of all. On the OnlyFake website, you can create photos of documents, such as passports or driver's licenses, from 26 countries. All you need to do is fill out a template with any person's photo and data, and the neural network processes everything itself. The results look convincing enough to pass verification on crypto exchanges, and the service costs only $15 (about 1,300 ₽).

The danger is that fraudsters can steal other people's data and photos from social networks to register with online casinos and car rental services or to apply for microloans, leaving you struggling to prove your innocence.

An example of a fake document. The fraudster adds personal information and a photo to the template
Now you know the latest fraud schemes and are ready to stay alert. Share the article with your friends and family to warn them about the danger. See you soon!
Source