August 28, 2019 06:42 pm GMT

Apple apologizes for Siri privacy fail, makes audio recording for quality control opt-in

Apple's apologizing...again.

This time, the company is sorry not because a highly anticipated product was canceled (RIP AirPower), but because it failed to properly disclose to customers that it used contractors to listen to a small portion of their Siri audio recordings — including ones triggered by accidental activations — to help improve the accuracy and quality of its digital assistant.

Uncovered by The Guardian in July, these contractors "regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or 'grading'."

The revelation raised new privacy concerns. Could these contractors identify you from these audio snippets? (Answer: of course not). Why didn't Apple make clear that it sent some Siri audio recordings to be reviewed by humans? And why couldn't customers choose whether or not they wanted their Siri requests to be used to help improve the assistant?

More about Apple, iPhone, Privacy, Siri, and Digital Assistants

Original Link: http://feeds.mashable.com/~r/mashable/tech/~3/3mLwESrIJJo/


Mashable

Mashable is the top source for news in social and digital media, technology and web culture.
