Parents may soon start receiving warnings that their children are ‘sexting’ or sending explicit photos through the iPhone Messages app.
Apple has announced plans to introduce a number of safety and security measures in hopes of stopping the spread of Child Sexual Abuse Material (CSAM) across its services.
The new feature is set to be introduced in an update later this year and will initially be available to users in the US.
How will it work?
Machine learning built into the device will identify images that may be sexually explicit. The phone will then block the image from being shown and bring up a series of warnings and prompts, including one informing the child that if they choose to view the image, their parent will be notified.
Apple says the feature will not require or allow the company to read users' messages or see the images, as the analysis is built into the Messages app and carried out entirely on the device.
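The on-device flow described above can be sketched roughly as follows. This is an illustrative assumption only: the classifier stub, the threshold, and all function names are invented for the example and are not Apple's actual implementation.

```python
# Hypothetical sketch of the on-device Messages flow: classify the photo
# locally, blur and warn if it looks explicit, and notify a parent only
# if the child still chooses to view it. Nothing leaves the device.

def classify_image(image_bytes: bytes) -> float:
    """Stand-in for the on-device ML model; returns a score in [0, 1].
    A real model would run locally on the image; here we fake a score."""
    return 0.9 if b"explicit" in image_bytes else 0.1

EXPLICIT_THRESHOLD = 0.8  # assumed cutoff, chosen for illustration

def handle_incoming_photo(image_bytes: bytes, child_chose_to_view: bool) -> dict:
    """Decide, entirely on-device, how to present an incoming photo."""
    if classify_image(image_bytes) < EXPLICIT_THRESHOLD:
        # Not flagged: show the photo normally.
        return {"blurred": False, "warned": False, "parent_notified": False}
    # Flagged: blur the photo and warn the child; the parent is told
    # only if the child opts to view it anyway.
    return {
        "blurred": True,
        "warned": True,
        "parent_notified": child_chose_to_view,
    }
```

Note that in this sketch the parent notification is conditional on the child's choice, matching Apple's description that the child is warned first and the parent is messaged only if they proceed.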
In a statement, Apple said: “At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).
“Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts.
“The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.
“When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.
“Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.”
Some cybersecurity experts have criticised the plan, saying it could lead to major privacy breaches.
Speaking to The Sun, Matthew Green, a top cryptography researcher at Johns Hopkins University, said: "This is a really bad idea. These tools will allow Apple to scan your iPhone for photos that match a specific perceptual hash and report them to Apple servers if too many appear.
"The ability to add scanning systems like this to end-to-end encryption messaging systems has been a major ask by law enforcement the world over.
"This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government."