
Apple halts programme where human contractors regularly listened to your Siri conversations

Apple said it will roll out a software update in the future to give users the option of participating in its grading practice.

tech Updated: Aug 02, 2019 15:10 IST
Kul Bhushan
Hindustan Times
Apple suspends its Siri grading programme globally (REUTERS)

Apple found itself in a rare privacy-related controversy after it was discovered that third-party human contractors were listening to users’ Siri voice recordings. After wide criticism from users and privacy advocates, Apple on Friday said it had temporarily suspended the programme with the third-party contractors.

“We are committed to delivering a great Siri experience while protecting user privacy,” an Apple spokesperson told The Verge. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

Apple hasn’t yet revealed whether these third-party contractors will be allowed to retain the earlier recordings or whether the company will delete them from its servers as well.

Earlier, The Guardian reported that Apple had signed up third-party contractors to listen to confidential Siri conversations. The company said the move was aimed at improving the quality of its voice assistant. Apple, however, was found to be storing sensitive personal recordings, including drug deals, medical information and even couples having sex.

“A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple earlier said. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

Apple and other tech firms have come under wider scrutiny over how they handle users’ data. Just this week, Google had to stop listening to and transcribing Google Assistant recordings in Europe following a probe into claims that the tech firm was paying third-party contractors to listen to users’ private conversations.

Amazon, which sells Echo smart home speakers, was also said to be listening to voice recordings of users in homes and offices. According to a Bloomberg report, thousands of Amazon employees were involved in transcribing and annotating these recordings, which were then used to train the software. Interestingly enough, Amazon’s defence was similar to Apple’s: voice samples helped improve its speech recognition technology.

“We take the security and privacy of our customers’ personal information seriously,” an Amazon spokesman had said. “We only annotate an extremely small sample of Alexa voice recordings in order to improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.”