January 28, 2019 06:28 pm PST

The Safe Face Pledge: an ethical code of conduct for facial recognition scientists and developers

The Safe Face Pledge launched last month as a "pledge to mitigate abuse of facial analysis technology," with four themes: "Show Value for Human Life, Dignity, and Rights"; "Address Harmful Bias"; "Facilitate Transparency"; and "Embed Commitments into Business Practices" (SAFE).

The full pledge is inspirational and comprehensive, covering bias, secret and discriminatory state surveillance, risking human life, law enforcement abuse, auditing customer compliance, communicating the systems' workings, and making your legal documents (from vendor contracts to terms of service) reflective of your values.

The pledge's announcement describes how the UK's notoriously inaccurate police facial recognition systems are more likely to falsely flag black people as matches for criminal suspects than people of other ethnic or racial backgrounds.

That reminded me of something EFF executive director Cindy Cohn described on a panel last month. Cindy pointed out that there's a danger in centering the critique of facial recognition on racial bias, because that bias is the result of the systems not being trained on enough images of racialized people. When a Chinese state facial recognition system ran into this problem, the Chinese government simply bought the driver's license database from an African client state and used it as training data. That eliminated the bias in the algorithm's false positive rate by massively invading the privacy of millions of African people, and now the system is even better at tracking black people.

Commitment One: Show Value for Human Life, Dignity, and Rights

Signatories of the Safe Face Pledge agree to:

1. Do not contribute to applications that risk human life

By acknowledging that decisions that foreseeably increase the risk to human life are too dangerous for artificial intelligence, and by refraining from selling or providing facial analysis technologies to locate or identify targets in operations where lethal force may be used or is contemplated.



Original Link: http://feeds.boingboing.net/~r/boingboing/iBag/~3/5g_IQPg7upA/ibm-at-auschwitz.html
