Apple Introduces New Child Safety Feature: Reporting Nudity in Photos
Apple has announced a new child safety feature designed to strengthen protection against inappropriate content. As reported by The Guardian, the functionality allows children to report photos or videos containing nudity that they receive directly to Apple.
Expanding Communication Safety Features
This feature expands upon Apple's existing Communication Safety tool, which already employs on-device scanning to detect nudity in images or videos received through services like Messages, AirDrop, or Contact Posters. When such content is detected, the image or video is blurred out, and users are presented with additional options, including:
- Messaging an adult
- Accessing resources for help
- Blocking the contact
Reporting Mechanism in Development
The reporting feature is currently being tested in Australia as part of the upcoming iOS 18.2 release. It allows users to prepare a report containing:
- The images or videos in question
- Messages exchanged immediately before and after the image was received
- Contact information for both accounts involved
- A form for users to describe the incident
Once a report is submitted, Apple will review the content and may take actions such as disabling iMessage for the offending user or notifying law enforcement if necessary.
Google's Similar Initiatives
In a parallel move, Google announced an expansion of on-device scanning in its Android applications this week. The update includes a Sensitive Content Warning feature that blurs nudity in images and offers resources for help; it will soon be enabled by default for all users under 18.
Future Availability of the New Apple Feature
While Apple has indicated plans to make the feature available globally, it has not announced a specific launch timeline, and the company has declined to comment further on the progress of testing.
A Look Back at Apple’s Child Safety Features
This initiative follows a set of child safety features announced in 2021, which aimed to scan users' iCloud Photos libraries for child sexual abuse material and to notify parents when their children exchanged sexually explicit images. After pushback from privacy advocates, Apple postponed the project and, in December 2022, ultimately decided against scanning iCloud Photos for such content.
With these advancements, Apple and Google are taking proactive steps to ensure the digital safety of children, making it easier for them to navigate the challenges posed by inappropriate online content.