ClothOff: how erotic deepfakes break the lives of ordinary teenagers

AI is erasing the boundaries of what is permissible. Is it even possible to protect your child?

In the small Spanish town of Almendralejo, a scandal broke out that forced the public to pay attention to one of the serious problems of the modern digital world.

Miriam al-Adib, a local gynecologist and mother of four daughters, made a disturbing discovery when her daughter showed her an image of her own naked body generated using artificial intelligence.

The images were circulating among students at her daughter's school, and Miriam's daughter was far from the only victim: she was one of dozens of schoolgirls in the town whose photos were being shared in a WhatsApp group created by other students.

The situation was made worse by the severe harm inflicted on many of the girls whose fake images were distributed: they suffered panic attacks and refused to attend school after becoming targets of blackmail and bullying. Another major danger, as Miriam herself pointed out, was the possibility of these images ending up on pornographic sites.

The incident drew a wide public response and highlighted the problem of distributing such material online, a problem that is becoming increasingly difficult for the police to tackle.

There are many neural networks capable of generating nude images, but at the center of this international scandal was the ClothOff application, which, for a fee, allows anyone to "undress" a person using artificial intelligence.

A journalistic investigation conducted by The Guardian revealed that ClothOff developers go to considerable lengths to maintain their anonymity, including using artificial intelligence to create fictitious public identities.

According to The Guardian, a brother and sister from Belarus, Daria and Alexander, were involved in creating and distributing the ClothOff app. Although both denied any connection to the app, and Alexander even claimed he had no sister, the investigation showed otherwise.

The investigation also revealed that payments to ClothOff were processed through Texture Oasis, a London-registered company that appears to be fictitious and designed to disguise financial transactions. All the text on the company's website was copied from the website of another, legitimate organization, which only reinforces the suspicion of fraud.

The difficulty of combating the spread of such material underlines the need for international legislation and global cooperation to prevent a phenomenon capable of permanently scarring many lives.

One recommendation for avoiding becoming a victim of such AI technologies, or for protecting your loved ones, is either to stop posting personal photos and videos on social networks altogether, or to restrict access to your pages to family and close friends.

Nevertheless, nothing prevents a potential offender from approaching his victim in person and taking the photo he needs, especially when we are talking about schoolchildren who spend many hours together every day.

Perhaps the best answer to this problem is still to educate the younger generation about the legal consequences of such digital crimes and to emphasize that even a seemingly innocent joke can genuinely break another person's life.

As for adult AI voyeurs, the conversation with them should be as short and tough as possible: a police report and a court verdict.
 