Apple is set to introduce a new iMessage feature aimed at enhancing child safety in Australia. The feature will let children report nude images or videos sent to them directly to Apple, which can then forward those reports to law enforcement when necessary.
The feature is part of the latest beta release of Apple's operating systems for Australian users, and it expands on the safety protections in place since iOS 17, which are enabled by default for users under 13. The current system automatically detects sexually explicit images and videos that children may encounter via iMessage, AirDrop, FaceTime, and Photos, and it is designed with privacy in mind: all detection happens on the device itself.
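Apple has not published the internals of this detection, but the on-device approach resembles what third-party developers can do with the SensitiveContentAnalysis framework Apple introduced alongside iOS 17. The sketch below is illustrative only, not Apple's built-in Communication Safety implementation; it assumes an app with the SensitiveContentAnalysis entitlement and a local image URL to check.

```swift
import Foundation
import SensitiveContentAnalysis

// Illustrative sketch of on-device nudity detection with Apple's
// SensitiveContentAnalysis framework (iOS 17+ / macOS 14+). This is NOT
// how iMessage's built-in check is implemented; it only shows that such
// analysis can run entirely on the device, without uploading the image.
// Assumes the com.apple.developer.sensitivecontentanalysis.client
// entitlement and an analysis policy enabled on the device.
func isImageSensitive(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If no analysis policy is active (e.g. the safety setting is off),
    // the analyzer reports .disabled and nothing is checked.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // The image never leaves the device; analysis happens locally.
        let result = try await analyzer.analyzeImage(at: url)
        return result.isSensitive
    } catch {
        // In this sketch, treat analysis failures as "not flagged".
        return false
    }
}
```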
When such content is detected, the young user is shown two warning screens before they can proceed, along with links to support resources and the option to contact a parent or guardian. With the new update, users will also be able to report the images or videos directly to Apple.
The device will compile a report that includes the sensitive images or videos along with the messages sent immediately before and after them. It will also contain contact information for both accounts involved, and users can fill out a form describing what happened. Apple will review these reports and may take action, such as disabling the sender's ability to send messages through iMessage, and may notify law enforcement as needed.
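Apple has not published the report format, but based on the description above it bundles the flagged media, the adjacent messages, both parties' contact details, and the user's own account of events. A purely hypothetical sketch of such a structure, with every type and field name invented for illustration:

```swift
import Foundation

// Hypothetical model of the report described above. None of these names
// are Apple API; they simply mirror the contents the article lists.
struct SafetyReport {
    let flaggedMedia: [Data]            // the reported images or videos
    let surroundingMessages: [String]   // messages sent just before and after
    let reporterContact: String         // contact info of the reporting user
    let senderContact: String           // contact info of the sending account
    let userDescription: String?        // optional form text describing the incident
    let createdAt: Date
}
```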
Apple plans to roll out the feature in Australia first, with a global release to follow. The timing aligns with new regulations requiring tech companies in Australia to police child abuse and terrorism content on their platforms by the end of 2024.
However, Apple has previously warned that some of the proposed rules could compromise end-to-end encryption, which is vital for user privacy. Australia's eSafety Commissioner has since adjusted the rules to let companies seek alternative measures for handling child abuse and terror content if they believe compliance would undermine encryption.
Despite these efforts, Apple has faced criticism over its approach to child safety. The UK's NSPCC, for example, has accused the company of significantly underreporting child sexual abuse material (CSAM) on its platforms: in 2023, Apple reported only 267 suspected CSAM cases, a stark contrast to competitors such as Google and Meta, which reported millions.
As these developments unfold, Apple is clearly taking steps to strengthen child safety in digital communication while navigating the competing demands of privacy and regulatory compliance.