The growing popularity of fake nude image apps raises serious concerns.
Recently, more and more users have turned to apps and websites that use artificial intelligence to create fake nude images of people. According to the data, these services let a user upload a photo of a clothed person, after which the AI generates a fake nude image of them.
According to Graphika, a social analytics company, a group of 34 such websites attracted more than 24 million unique visitors in September alone. The study, published in the December report "A Revealing Picture", also claims that the number of spam links to these sites and apps has increased by more than 2,000% since the beginning of the year on platforms including X and Reddit.
In addition, the report found that more than 53 Telegram groups providing access to such services have at least 1 million users in total.
The researchers note that the growing popularity and availability of these "undressing" services is likely to lead to an increase in online harm, including the creation and distribution of non-consensual nude images, targeted harassment campaigns, sextortion, and the production of child sexual abuse material.
Deepfake applications, which digitally alter images to make a person appear naked, have been around for several years. The images are usually created without the consent of the people depicted and are often used against celebrities and Internet personalities.
For example, in February, Twitch streamer QTCinderella became the victim of fake pornographic videos featuring her likeness that spread across the Internet. After a screenshot from one of these videos was posted, she stated: "The amount of dysmorphia I experienced after seeing these photos destroyed me. This is much more than just a violation of personal boundaries."
Fake nude images of more than 20 girls were distributed in several schools in Spain in September. The photos were processed using an AI-based app, El País reports.