
'Setback for privacy': WhatsApp on Apple's plan to scan iPhones for images of child abuse

Apple has revealed its plan to identify and report iPhone users who store known images of child sexual abuse in their iCloud Photos accounts.
Cathcart assured that WhatsApp will not adopt Apple’s approach to combat child sexual abuse. (AFP)
Published on Aug 07, 2021 04:42 PM IST
By hindustantimes.com | Written by Kunal Gaurav, Hindustan Times, New Delhi

The head of WhatsApp has criticised Apple’s plan to scan iPhones for images of child sexual abuse, saying the approach introduces “something very concerning into the world.” On Thursday, the tech giant revealed its plan to identify and report iPhone users who store known images of child sexual abuse in their iCloud Photos accounts. While Apple insisted the process is secure and designed to preserve user privacy, not everyone is convinced.

“I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world,” tweeted WhatsApp head Will Cathcart.

Apple said it will use a perceptual hashing tool, called NeuralHash, that analyses an image and maps it to a unique numerical fingerprint. The system then performs on-device matching against a database of known image hashes of child sexual abuse provided by the National Center for Missing and Exploited Children (NCMEC) and other child-safety organisations. If a match is found, the image will be manually reviewed and, upon confirmation, the user’s account will be disabled and NCMEC will be notified.
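To illustrate the matching flow described above, here is a minimal sketch in Python. It is not Apple’s NeuralHash: a toy “average hash” stands in for the perceptual hashing step, the known-hash database and function names are hypothetical, and Apple’s real system adds further safeguards and thresholds before anything reaches a human reviewer.

```python
# Minimal sketch of on-device hash matching as described in the article.
# NOT Apple's NeuralHash: a toy "average hash" stands in for the perceptual
# hashing step, and the database/function names here are hypothetical.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Map an image to a 64-bit perceptual hash: downscale, grayscale,
    then set one bit per pixel brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the differing bits between two hashes; small distances
    mean visually similar images."""
    return bin(a ^ b).count("1")

# Hypothetical set of known hashes (NCMEC would supply the real database).
KNOWN_HASHES: set[int] = set()

def matches_known_database(path: str, threshold: int = 4) -> bool:
    """On-device check: is this image's hash close to any known hash?
    In the described flow, a positive result would only queue the image
    for human review, not automatically disable the account."""
    h = average_hash(path)
    return any(hamming(h, known) <= threshold for known in KNOWN_HASHES)
```

The design point critics focus on is exactly what this sketch makes visible: the comparison runs against the user’s own photos on their own device, so whoever controls the hash database controls what gets flagged.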



Cathcart assured that WhatsApp, which has repeatedly been prompting Indian users to adopt its controversial privacy policy, will not adopt Apple’s approach to combat child sexual abuse. In a series of tweets, he said that WhatsApp reported more than 400,000 cases of child sexual abuse material (CSAM) to NCMEC in 2020 without breaking encryption.

The WhatsApp chief blasted Apple for building software that can scan all the private photos on a phone, instead of focusing on making it easy for users to report such content when it is shared with them. He also raised concerns about how the scanning tool could be used in countries like China, where laws defining illegal content are broad and vague.

“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable,” Cathcart added.
