Apple Will Scan iPhones in U.S. for Images of Child Sex Abuse (UPDATE)

The tech giant will reportedly begin using a tool known as “NeuralHash,” which can detect known child sexual abuse material in photos uploaded to iCloud.

Image via Getty/Christoph Dernbach/picture alliance

UPDATED 8/9, 12:50 p.m. ET: Apple released an FAQ with over four pages of information addressing reports and reactions surrounding its Expanded Protections for Children system. The biggest takeaway is that the new system will not actually scan users’ phones.

As Forbes writes, “[Apple] is, however, going to scan all photos users upload to the iCloud, using code that compares the ‘hash’ of the image to known hashes of child sexual abuse photos, stored in databases from the likes of the National Center for Missing & Exploited Children.” Apple said it “never gains access to communications as a result of this feature in Messages.”
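
In essence, the check Forbes describes is a set-membership lookup: the device derives a fingerprint from each photo headed to iCloud and compares it against fingerprints of already-identified abuse images. The Swift sketch below is a simplified, hypothetical illustration, not Apple’s code; NeuralHash is a perceptual hash designed to match visually similar images, whereas the SHA-256 stand-in here only matches byte-identical files, and the function and hash set are invented for the example.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of hash matching, not Apple's implementation.
// SHA-256 stands in for NeuralHash, which is a perceptual hash rather than
// a cryptographic one, so this version only catches exact copies of a file.
func matchesKnownHashes(imageData: Data, knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```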

Alongside its child sexual abuse material scanning tech, Apple shared that an upcoming iOS update will add a feature that can tell when a nude image is being sent to a child’s device. The company explained such an image “will be blurred and the child will be warned, presented with helpful resources and reassured it is okay if they do not want to view or send the photo.” This detection takes place on the user’s phone, and the photos aren’t screened by Apple. “The feature will only work on phones that have a child account set up in Family Sharing,” Forbes notes.
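
As described, the Messages feature amounts to a simple on-device decision: if the account is a child account in Family Sharing and the on-device check flags nudity, blur the image and show a warning; otherwise display it normally. The Swift sketch below is purely illustrative; the inputs are hypothetical stand-ins, not Apple’s API.

```swift
// Hypothetical decision flow for the Messages feature described above;
// the inputs are stand-ins, not Apple's actual API.
enum IncomingImageAction {
    case showNormally
    case blurAndWarn   // blur the image, warn the child, and offer resources
}

func handleIncomingImage(nudityDetectedOnDevice: Bool,
                         childAccountInFamilySharing: Bool) -> IncomingImageAction {
    guard childAccountInFamilySharing, nudityDetectedOnDevice else {
        return .showNormally
    }
    return .blurAndWarn
}
```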

See original story below.

Apple has announced its latest plan to protect young children from sexual predators.

On Thursday, the tech giant confirmed it would begin using new software that will detect and report child sexual abuse material (CSAM) on U.S. iPhones. Apple will utilize a tool known as “NeuralHash,” which can help determine whether a user is trying to store known CSAM on iCloud.

“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result,” Apple wrote in the announcement. “The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
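Apple’s actual matching relies on private set intersection, a cryptographic protocol that determines whether there is a match without revealing the result, so it can’t be captured in a few lines of code. Still, the shape of the on-device output can be pictured as a small record paired with each upload. The Swift sketch below is illustrative and hypothetical, not Apple’s data format.

```swift
import Foundation

// Illustrative stand-in for the "safety voucher" described above, not
// Apple's format. In the real protocol the match result is encoded without
// being revealed to the device; here it is just an opaque encrypted blob.
struct SafetyVoucher {
    let encryptedMatchResult: Data   // encodes whether the photo matched a known hash
    let encryptedImageData: Data     // additional encrypted data about the image
}

struct PhotoUpload {
    let photo: Data
    let voucher: SafetyVoucher       // uploaded to iCloud Photos alongside the image
}
```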

Once the automated system finds a match, a human will review the image in question and assess whether it is illegal. If the reviewer concludes the content qualifies as child pornography, the user’s account will be deactivated and the material will be reported to the National Center for Missing and Exploited Children (NCMEC).

While some have applauded Apple’s efforts to beef up its child protection policies, some technology experts have raised concerns that the tool could lead to privacy abuses.

“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, a security researcher at Johns Hopkins University, said to the Associated Press. “Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.”

NeuralHash will be introduced as part of the iOS 15 software update, which is expected to roll out within the next month or two.
