Weapons, drugs and Mickey Mouse in Gaza: Copilot Designer "went off the rails" and will generate just about anything

A Microsoft security engineer has gone directly to the FTC to get the problem addressed.

A current Microsoft engineer has filed a complaint with the US Federal Trade Commission (FTC), warning of the potential dangers posed by the Copilot Designer image generator.

Shane Jones, who has worked at the company for six years, writes in his letter that the tool can generate harmful images, yet Microsoft, despite numerous warnings, refuses to take the steps needed to disable it.

During a security review of Copilot Designer, Jones found that the tool can create scenes of violence, sexualized images of women, minors holding weapons, and imagery promoting alcohol and drug use by teenagers. It also generated images of Disney characters in the context of the Gaza conflict.

Jones has been trying to draw attention to problems with the DALL-E 3 model behind Copilot Designer since December. For a long time he kept the issue inside the company, trying to resolve it quietly, then turned directly to OpenAI, which develops DALL-E 3, but that produced no results either. Jones later published an open letter on LinkedIn to attract public attention, but Microsoft's legal department demanded that he take it down.

In his appeal to the FTC, the engineer called on the commission to suspend Copilot Designer until additional safety mechanisms are in place, since the company continues to offer the product to a broad audience despite his direct appeals to Microsoft.

In response to the concerns raised by the employee, Microsoft spokesman Frank Shaw said the company is taking steps to address any issues raised in accordance with Microsoft's policies, and that meetings had been arranged with product leadership and the Office of Responsible AI to look into the matter.

In addition, Jones approached a group of US senators after Copilot Designer generated explicit images of Taylor Swift, which quickly spread on X. Microsoft CEO Satya Nadella called the incident "disturbing and terrible" and promised to strengthen safety measures.

Recall that last month Google faced a similar problem, temporarily disabling its own AI image generator after it produced historically inaccurate illustrations.

I wonder whether Shane Jones will lose his job over his civic activism and his desire to protect users: the noise he raised could only damage Microsoft's reputation, and it might also scare off potential employers who would otherwise want to poach the engineer.
 