August 22, 2022 12:17 am

Google AI flagged parents' accounts for potential abuse over nude photos of their sick kids

Illustration by Alex Castro / The Verge



A concerned father says that after he used his Android smartphone to take photos of an infection on his toddler's groin, Google flagged the images as child sexual abuse material (CSAM), according to a report from The New York Times. The company closed his accounts and filed a report with the National Center for Missing and Exploited Children (NCMEC), spurring a police investigation. The case highlights how difficult it is to distinguish potential abuse from an innocent photo once it becomes part of a user's digital library, whether on a personal device or in cloud storage.


Concerns about the consequences of blurring the lines for what should be considered private were aired last year when Apple announced its Child...






Original Link: https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation


The Verge

The Verge is an ambitious multimedia effort founded in 2011.