Apple Plans to Scan iPhones for Child Abuse Imagery
6 Aug 2021
Apple is planning to install software on iPhones that will scan for child abuse imagery, according to reports.
Called “neuralMatch,” the software will automatically scan devices to identify whether they contain any media featuring child sexual abuse. If a match is found, Apple says it will report the incident to the US National Center for Missing and Exploited Children (NCMEC).
The tool only looks for images that are already in NCMEC’s database, so if a parent took photos of their child in the bath, those photos shouldn’t be flagged.
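The reports do not detail the matching algorithm, but systems of this kind typically compare a perceptual hash of each photo on the device against a list of hashes of known abuse images. The sketch below illustrates that general idea only: it uses a toy “average hash” in Python as a stand-in for whatever hash Apple actually uses, and every name and threshold in it is an assumption for illustration, not Apple’s implementation.

```python
# Illustrative sketch only: a toy "average hash" standing in for whatever
# perceptual hash Apple's system actually uses. Names and thresholds here
# are assumptions, not Apple's implementation.
import numpy as np

HASH_SIDE = 8  # 8x8 grid of blocks -> 64-bit hash


def average_hash(image: np.ndarray) -> int:
    """Downsample a grayscale image to 8x8 block means, threshold at the mean."""
    h, w = image.shape
    trimmed = image[: h - h % HASH_SIDE, : w - w % HASH_SIDE]
    blocks = trimmed.reshape(
        HASH_SIDE, trimmed.shape[0] // HASH_SIDE,
        HASH_SIDE, trimmed.shape[1] // HASH_SIDE,
    ).mean(axis=(1, 3))
    bits = (blocks > blocks.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_database(image: np.ndarray, known_hashes: set, max_distance: int = 4) -> bool:
    """Flag an image only if its hash nearly matches one already in the database."""
    h = average_hash(image)
    return any(hamming(h, known) <= max_distance for known in known_hashes)


# Because matching is done against hashes of *known* images, a fresh photo of a
# child in the bath has no counterpart in the database and is not flagged.
rng = np.random.default_rng(0)
database = {average_hash(rng.random((64, 64)))}   # stand-in for NCMEC's hash list
family_photo = rng.random((64, 64))               # unrelated image
print(matches_database(family_photo, database))   # almost certainly False
```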
While the plan may sound like a reasonable step for child protection, researchers worry the matching tool could be put to other purposes.
Professor Ross Anderson of the University of Cambridge said it “is going to lead to distributed bulk surveillance of… our phones and laptops”, describing the move as a whole as “absolutely appalling”.
Matthew Green, a cryptography researcher at Johns Hopkins University, also highlighted the potential for abuse that the system could face.
He warned that innocent people could be framed by sending them seemingly innocuous images engineered to trigger matches for child abuse material. “Researchers have been able to do this pretty easily,” he said.
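His point can be illustrated against the toy hash sketched above: because such a hash only summarises coarse image structure, an attacker can construct an image that looks like harmless noise, yet hashes to the same value as a database entry. The sketch below forges such a collision for the illustrative average hash defined earlier, not for Apple’s actual system, where the published attacks Green alludes to are more involved.

```python
# Continues the toy sketch above (reuses average_hash, hamming,
# matches_database and database). This forges a collision against that toy
# hash only; it is not an attack on Apple's actual system.
import numpy as np


def forge_image_with_hash(target_hash: int, side: int = 64, hash_side: int = 8) -> np.ndarray:
    """Build an image whose toy average hash reproduces target_hash (assuming a mixed bit pattern)."""
    bits = [(target_hash >> (hash_side * hash_side - 1 - i)) & 1
            for i in range(hash_side * hash_side)]
    block = side // hash_side
    image = np.empty((side, side))
    for i, bit in enumerate(bits):
        r, c = divmod(i, hash_side)
        # Blocks brighter than the image-wide mean encode 1-bits, darker ones 0-bits.
        image[r * block:(r + 1) * block, c * block:(c + 1) * block] = 0.75 if bit else 0.25
    # Mild noise so the forgery does not look like an obvious checkerboard.
    image += np.random.default_rng(1).normal(0.0, 0.05, image.shape)
    return image


# The forged image shares no content with anything in the database, yet it is
# flagged as a match: exactly the framing risk described above.
target = next(iter(database))
forged = forge_image_with_hash(target)
print(hamming(average_hash(forged), target))   # 0, or very close
print(matches_database(forged, database))      # True
```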
He also warned of the geopolitical implications, with the potential for governments to use such a system to surveil protesters.
For now the feature will be limited to the United States, with Apple not revealing which other countries or national authorities it plans to make it available to.

Apple also announced that it will scan end-to-end encrypted messages for the first time. This will be done on behalf of parents to identify when a child receives or sends a sexually explicit photo, offering them “helpful resources” and reassuring them that “it is okay if they do not want to view this photo”.