AppleInsider is supported by its audience and may earn commission as an Amazon Associate and affiliate partner on qualifying purchases. These affiliate partnerships do not influence our editorial content.
Apple is expanding its Communication Safety feature, which scans Messages for nudity on devices owned by children, to the UK and Canada months after its launch in the US.
The feature, which is distinct from the controversial on-device Photos scanning system, automatically blurs potentially harmful images in messages received or sent on child-owned devices.
The Guardian first reported that Apple is expanding the feature to the UK, after it debuted in the US with iOS 15.2. AppleInsider has been informed that the feature is also coming to Canada.
How the feature works depends on whether a child receives or sends an image containing nudity. Received images are blurred, and the child is offered safety resources from child safety groups. Nudity detected in a photo a child is about to send triggers a warning advising them not to send the image.
The feature is designed with privacy in mind and is strictly opt-in, requiring parents to enable it. All nudity detection is performed on the device itself, meaning potentially sensitive images never leave the iPhone.
Apple first announced the Communication Safety feature alongside a suite of features intended to provide better safety mechanisms for children. That suite contained a system that scanned photos for child sexual abuse material (CSAM).
The CSAM detection system included several privacy safeguards and did not examine the contents of a user's images directly. Instead, it matched image hashes against databases of known CSAM hashes provided by child safety organizations. Despite this, Apple faced a backlash and postponed the feature indefinitely.
Apple's Communication Safety feature is entirely separate from the CSAM detection mechanism. It first debuted in the US as part of iOS 15.2, and its expansion to the UK and Canada marks the start of a wider international rollout.