September 3, 2021 01:25 pm GMT

Apple is delaying its child safety features

Apple says it's delaying the rollout of Child Sexual Abuse Material (CSAM) detection tools "to make improvements" following pushback from critics. The features include one that analyzes iCloud Photos for known CSAM, which has caused concern among privacy advocates.

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," Apple told 9to5Mac in a statement. "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

Apple planned to roll out the CSAM detection systems as part of upcoming OS updates, namely iOS 15, iPadOS 15, watchOS 8 and macOS Monterey. The company is expected to release those in the coming weeks. Apple didn't go into detail about the improvements it might make. Engadget has contacted the company for comment.

The planned features included one for Messages, which would use on-device machine learning to notify children and their parents when sexually explicit photos were being shared in the app. Such images sent to children would be blurred and accompanied by warnings. Siri and the built-in search functions on iOS and macOS would also point users to appropriate resources when someone asks how to report CSAM or tries to carry out CSAM-related searches.

The iCloud Photos tool is perhaps the most controversial of the CSAM detection features Apple has announced. It plans to use an on-device system to match photos against a database of known CSAM image hashes (a kind of digital fingerprint for such images). This analysis is supposed to take place before an image is uploaded to iCloud Photos. If the system detected CSAM and Apple manually confirmed a match, Apple would disable that person's iCloud account and send a report to the National Center for Missing and Exploited Children.
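The core idea is a lookup of each photo's fingerprint in a set of known hashes before the upload happens. Below is a deliberately simplified Swift sketch of that pre-upload check; Apple's actual system uses a perceptual hash (NeuralHash) and a private set intersection protocol rather than a plain cryptographic hash lookup, and the names here (UploadScreener, knownHashes, shouldUpload) are hypothetical.

import Foundation
import CryptoKit

// Illustrative sketch only: a naive client-side check of an image's hash
// against a set of known fingerprints before upload. The real system relies
// on a perceptual hash and cryptographic matching, not a SHA-256 lookup.
struct UploadScreener {
    // Hypothetical database of known image fingerprints (hex strings).
    let knownHashes: Set<String>

    // Returns true if the image may be uploaded (no match found).
    func shouldUpload(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
        return !knownHashes.contains(fingerprint)
    }
}

// Usage: screen an image before handing it to the uploader.
let screener = UploadScreener(knownHashes: []) // would be populated from a local database
let photo = Data() // image bytes would come from the photo library in a real app
if screener.shouldUpload(photo) {
    // proceed with the iCloud Photos upload
} else {
    // hold the upload and surface the match for human review instead
}

A perceptual hash matters in practice because it tolerates resizing and recompression of the same image, which a byte-level hash such as SHA-256 does not.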

Developing...


Original Link: https://www.engadget.com/apple-child-safety-features-csam-delay-132534290.html?src=rss

