'Setback for privacy': WhatsApp on Apple's plan to scan iPhones for images of child abuse
- Apple has revealed its plan to identify and report iPhone users who store known images of child sexual abuse in their iCloud Photos accounts.
The head of WhatsApp has criticised Apple’s plan to scan iPhones for images of child sexual abuse, saying the approach introduces “something very concerning into the world.” On Thursday, the tech giant revealed its plan to identify and report iPhone users who store known images of child sexual abuse in their iCloud Photos accounts. While Apple insisted the process is secure and designed to preserve user privacy, not everyone is convinced.
“I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world,” tweeted WhatsApp head Will Cathcart.
Apple said it will use a perceptual hashing tool called NeuralHash, which analyses an image and converts it into a unique string of numbers. The system then performs on-device matching against a database of known hashes of child sexual abuse imagery provided by the National Center for Missing and Exploited Children (NCMEC) and other child-safety organisations. If a match is found, the image is manually reviewed; upon confirmation, the user’s account is disabled and NCMEC is notified.
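NeuralHash itself is proprietary, but the general idea of perceptual-hash matching can be sketched with a much simpler "average hash": similar-looking images produce similar hashes, so a match is declared when two hashes differ in only a few bits. The function and variable names below (`average_hash`, `hamming_distance`, `matches_known`) are hypothetical illustrations, not Apple's actual implementation.

```python
# Illustrative sketch only: NeuralHash is not public, so this uses a
# basic "average hash" to show the principle of perceptual matching.

def average_hash(pixels):
    """Hash an 8x8 greyscale image (list of lists of 0-255 ints):
    each bit records whether a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count the bits in which two hashes differ."""
    return bin(a ^ b).count("1")

def matches_known(image_hash, known_hashes, max_distance=4):
    """On-device check: near-identical images yield nearby hashes,
    so a small Hamming distance signals a probable match."""
    return any(hamming_distance(image_hash, h) <= max_distance
               for h in known_hashes)
```

Because the hash reflects visual structure rather than raw bytes, small edits such as a slight brightness change leave the hash nearby, while an unrelated image lands far away; in Apple's system, the database of known hashes ships on-device and only confirmed matches trigger human review.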
The WhatsApp chief blasted Apple for building software that can scan all the private photos on a phone, instead of focusing on making it easy for users to report such content when it is shared with them. He also raised concerns about the use of the scanning tool in countries like China, where the laws defining illegal content are broad and vague.
“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable,” Cathcart added.