The police will have access to complaints about unwanted messages.
Apple has introduced a new feature in iMessage for Australia that will allow children to report unwanted images and videos containing nudity. Upon receiving a complaint, the company will be able to pass the messages on to the police.
The feature is part of the beta versions of Apple's new operating systems for Australian users and expands on the safety measures that have been enabled by default for children under 13 since iOS 17. Previously, the iPhone automatically detected nude content that children received or tried to send in iMessage, AirDrop, FaceTime, and Photos, with detection performed on-device to protect privacy. When such content was detected, the child was shown informational resources or given the option to contact a parent.
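Developers can tap the same on-device classifier that powers this detection through Apple's SensitiveContentAnalysis framework, which shipped alongside iOS 17. Below is a minimal sketch, assuming an app that holds the sensitive-content-analysis entitlement and a user who has enabled the relevant system setting:

```swift
import Foundation
import SensitiveContentAnalysis

// Sketch: check a received image with Apple's on-device classifier.
// Assumes iOS 17+, the com.apple.developer.sensitivecontentanalysis.client
// entitlement, and that the user has turned on Sensitive Content Warnings
// (or Communication Safety); otherwise analysisPolicy is .disabled.
func shouldFlagImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    // No active system policy means the classifier is unavailable.
    guard analyzer.analysisPolicy != .disabled else { return false }
    do {
        // Analysis runs entirely on the device; the image never leaves it.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        return false
    }
}
```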
With the new update, users will be able to report inappropriate content to Apple. When a report is filed, the device automatically generates a package containing the images or videos along with the messages sent immediately before and after the content was received. The report also includes the contact details of both parties, and the user can add a description of the situation. Apple will review the complaint, may restrict the offending user's ability to send messages, and can pass the information to law enforcement agencies.
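The article does not publish the report's actual schema, so the sketch below is only a hypothetical illustration of the fields it describes (the flagged media, the surrounding messages, both parties' contact details, and an optional user note); every type and field name here is invented:

```swift
import Foundation

// Hypothetical shape of the report the article describes; none of these
// names come from Apple, they only mirror the contents listed above.
struct UnwantedContentReport: Codable {
    let media: [Data]                 // the reported images or videos
    let precedingMessages: [String]   // messages sent before the content
    let followingMessages: [String]   // messages sent after the content
    let reporterContact: String       // contact details of the reporting user
    let senderContact: String         // contact details of the alleged sender
    let userNote: String?             // the optional free-text description
    let submittedAt: Date             // when the report was generated
}
```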
Initially, the feature will be available only in Australia, with a global rollout planned for the future. The move comes in response to changes in Australian law, which from the end of 2024 requires companies to police child abuse and terrorist content on cloud and messaging services.
Apple had previously warned that the proposed measures could break end-to-end encryption, putting users' privacy at risk. In response, the Australian authorities relaxed the requirements, allowing companies whose encryption was threatened by the new rules to offer alternative ways of combating inappropriate content.
Apple has long been criticized by regulators for its reluctance to weaken encryption in iMessage at law enforcement's request. In 2022, the company scrapped plans to scan photos and videos in iCloud for child sexual abuse material (CSAM), prompting another wave of criticism. Apple and other advocates of end-to-end encryption argue that weakening security poses a global threat to user privacy.
The UK child protection charity NSPCC has accused Apple of underreporting CSAM found in its products. According to the National Center for Missing & Exploited Children (NCMEC), Apple filed only 267 reports of suspected material in 2023, while Google and Meta each reported millions of cases.
Source