August 13, 2021 08:25 pm

Apple says its iCloud scanning will rely on multiple child safety groups to address privacy fears





Illustration by Alex Castro / The Verge



Apple has filled in more details about its upcoming plans to scan iCloud Photos for child sexual abuse material (CSAM) via users’ iPhones and iPads. The company released a new paper delving into the safeguards it hopes will increase user trust in the initiative. That includes a rule to flag only images found in multiple child safety databases with different government affiliations, theoretically stopping any one country from adding non-CSAM content to the system.
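To make that cross-database rule concrete, here is a minimal Python sketch under stated assumptions: the database names, the set-of-hashes representation, and the build_blocklist function are illustrative inventions, not Apple's actual pipeline, which relies on cryptographic machinery omitted here.

# Minimal sketch of the cross-jurisdiction safeguard described above: a hash
# enters the usable blocklist only if databases operated under at least two
# different governments both contain it. All names and data structures here
# are assumptions for illustration.

def build_blocklist(databases: dict[str, set[bytes]],
                    jurisdiction: dict[str, str]) -> set[bytes]:
    """Return hashes vouched for by databases in >= 2 distinct jurisdictions."""
    blocklist: set[bytes] = set()
    for h in set().union(*databases.values()):
        # Which governments' databases include this hash?
        sources = {jurisdiction[name]
                   for name, hashes in databases.items() if h in hashes}
        if len(sources) >= 2:
            blocklist.add(h)
    return blocklist

# Example: a hash present in only one country's database never reaches devices.
dbs = {"OrgA": {b"\x01", b"\x02"}, "OrgB": {b"\x02", b"\x03"}}
juris = {"OrgA": "US", "OrgB": "EU"}
assert build_blocklist(dbs, juris) == {b"\x02"}

In this toy example, only the hash vouched for by organizations under two different governments survives the intersection, which is the property the safeguard is meant to guarantee.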


Apple’s upcoming iOS and iPadOS releases will automatically match US-based iCloud Photos accounts against known CSAM from a list of image hashes compiled by child safety groups. While many companies scan cloud storage services remotely, Apple’s device-based strategy has drawn...
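As a rough illustration of that device-based strategy, the following hedged Python sketch swaps Apple's NeuralHash perceptual hash and its cryptographic protocols (private set intersection, threshold secret sharing) for a plain hash lookup; toy_hash and account_exceeds_threshold are hypothetical names, and the threshold of 30 matches is the initial value Apple's safeguards paper describes.

# Toy illustration of device-side matching against a known-CSAM hash list,
# with a reporting threshold. toy_hash() is a stand-in: SHA-256, unlike a
# perceptual hash such as NeuralHash, is not robust to resizing or
# recompression, and all cryptographic protections are omitted.
import hashlib
from typing import Iterable

MATCH_THRESHOLD = 30  # initial threshold Apple has cited in its safeguards paper

def toy_hash(photo: bytes) -> bytes:
    return hashlib.sha256(photo).digest()[:16]

def account_exceeds_threshold(photos: Iterable[bytes],
                              blocklist: set[bytes]) -> bool:
    # Count photos whose hash appears in the blocklist; only accounts past
    # the threshold would be surfaced for human review.
    matches = sum(1 for p in photos if toy_hash(p) in blocklist)
    return matches >= MATCH_THRESHOLD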







Original Link: https://www.theverge.com/2021/8/13/22623859/apple-icloud-photos-csam-scanning-security-multiple-jurisdictions-safeguard


The Verge

The Verge is an ambitious multimedia effort founded in 2011.
