Apple introduces new feature allowing Australian children to directly report inappropriate image content
As part of the iOS 18.2 beta, Apple has introduced a new feature that allows children in Australia to report inappropriate content directly to Apple. This feature builds on the safety measures included in iOS 17, which can automatically detect images and videos containing nudity in iMessage, AirDrop, FaceTime, and Photos.
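For context, Apple exposes the same kind of on-device nudity detection to third-party developers through the public SensitiveContentAnalysis framework (iOS 17 and later). The sketch below is illustrative only, not Apple's internal Communication Safety code, and it assumes the app carries the com.apple.developer.sensitivecontentanalysis.client entitlement.

```swift
import Foundation
import SensitiveContentAnalysis

// Illustrative sketch: checks a single image file for nudity on-device.
// Detection only runs when the user (or a parent, via Screen Time) has
// enabled Sensitive Content Warnings / Communication Safety on the device.
func checkImageForSensitiveContent(at fileURL: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    guard analyzer.analysisPolicy != .disabled else {
        print("Sensitive content analysis is turned off on this device")
        return
    }

    do {
        let analysis = try await analyzer.analyzeImage(at: fileURL)
        if analysis.isSensitive {
            print("Image flagged as sensitive; an intervention screen would be shown")
        } else {
            print("No sensitive content detected")
        }
    } catch {
        print("Analysis failed: \(error)")
    }
}
```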
Previously, when the system triggered a warning, users would see two pop-up windows offering intervention measures: these explained how to contact the authorities and advised children to tell their parents or guardians. Now, when inappropriate content such as nudity is detected, a new pop-up window appears that lets users report the images and videos directly to Apple, which will then forward the information to the relevant authorities.
When an alert appears, the user's device automatically prepares a report, including any offensive material, messages sent before and after the material, and contact information from both accounts. Users can choose to fill out a form describing how the event occurred. After receiving the report, Apple will review the content and take action, such as disabling the user's ability to send messages through iMessage and reporting the issue to law enforcement.
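Apple has not published the format of these reports, so the following Swift sketch is purely a hypothetical model of the pieces the article says the device gathers automatically: the flagged media, the messages sent around it, contact details for both accounts, and an optional description from the user.

```swift
import Foundation

// Hypothetical data model only: every field name here is an assumption
// based on the article's description, not a documented Apple schema.
struct SensitiveContentReport: Codable {
    let flaggedMedia: [Data]          // the offending image(s) or video(s)
    let surroundingMessages: [String] // messages sent just before and after
    let senderContact: String         // contact info for the sending account
    let recipientContact: String      // contact info for the receiving account
    var userDescription: String?      // optional form text describing the event
    let preparedAt: Date              // when the device assembled the report
}

// Example of assembling such a report before submission:
let report = SensitiveContentReport(
    flaggedMedia: [],
    surroundingMessages: ["message before", "message after"],
    senderContact: "sender@example.com",
    recipientContact: "recipient@example.com",
    userDescription: nil,
    preparedAt: Date()
)
```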
This feature is currently being rolled out in Australia as part of the iOS 18.2 beta and is expected to be expanded globally in the future. As per The Guardian's report, Apple may have elected to launch the feature in Australia first as the country is set to require companies to regulate child abuse and terrorist content in cloud-based messaging services by the end of 2024. Apple has expressed concerns about this law, arguing that it would weaken end-to-end encryption, making user communications more susceptible to mass surveillance. Apple has publicly opposed such regulatory measures since the end of 2018.
Apple's approach to handling child sexual abuse material (CSAM) on its platform has been controversial. Initially, the company was accused of not taking CSAM protection seriously, drawing criticism from numerous watchdog groups. In 2021, Apple planned to introduce CSAM protection measures that would involve scanning users' iCloud photos for known CSAM images. If CSAM images were found, Apple would review them and send reports to the National Center for Missing and Exploited Children (NCMEC).
However, many users voiced strong opposition to Apple scanning their private images and videos, expressing concerns about being wrongly flagged. Ultimately, Apple dropped the plan, citing concerns that scanning data could create new avenues for exploitation by data thieves.
In 2024, the UK's National Society for the Prevention of Cruelty to Children (NSPCC) said that more cases of abuse images had been found on Apple platforms in the UK than Apple had reported globally. This indicates that Apple still faces a significant challenge in combating child sexual abuse material and needs to continuously improve its platform's safety measures to protect children from harm.