August 6, 2021 09:43 pm
Original Link: https://www.theverge.com/2021/8/6/22613365/apple-icloud-csam-scanning-whatsapp-surveillance-reactions
WhatsApp lead and other tech experts fire back at Apple's Child Safety plan
WhatsApp won’t be adopting Apple’s new Child Safety measures, meant to stop the spread of child abuse imagery, according to WhatsApp head Will Cathcart. In a Twitter thread, he explained his belief that Apple “has built software that can scan all the private photos on your phone,” and said that Apple has taken the wrong path in trying to improve its response to child sexual abuse material, or CSAM.
Apple’s plan, which it announced on Thursday, involves taking hashes of images uploaded to iCloud and comparing them to a database that contains hashes of known CSAM images. According to Apple, this allows it to keep user data encrypted and run the analysis on-device while still allowing it to report users to the authorities if they’re found...
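The matching step described above can be sketched in a few lines. This is a deliberately simplified illustration: Apple's actual system uses a perceptual hash ("NeuralHash") and a cryptographic private set intersection protocol, not plain SHA-256 lookups, and the `known_hashes` database here is hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known flagged images.
# In Apple's real design this would hold perceptual (NeuralHash)
# values supplied in blinded form, not plain SHA-256 digests.
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_database(b"known-image-bytes"))  # True: hash is in the database
print(matches_database(b"some-other-photo"))   # False: no match
```

The key property the article alludes to is that only hashes are compared, so the check can run on-device without the server seeing photo contents; a cryptographic hash like the one above, however, only matches byte-identical files, which is why Apple's design relies on a perceptual hash instead.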
The Verge
The Verge is an ambitious multimedia effort founded in 2011.