May 26, 2020 11:13 pm GMT

Facebook studied how it polarizes users, then ignored the research

Our algorithms exploit the human brain's attraction to divisiveness.

64% of all extremist group joins are due to our recommendation tools

GOP operative turned Facebook policy VP Joel Kaplan, who threw a party for Brett Kavanaugh upon his Supreme Court confirmation, killed any action on Facebook's internal findings, reports WSJ

Mark Zuckerberg and other top executives at Facebook shelved damning internal research into the social media platform's polarizing effects, hampering efforts to apply its conclusions to products and minimize harm, the Wall Street Journal reported on Tuesday.

Facebook's own internal research found that most people who joined extremist groups did so as a result of Facebook's recommendation algorithms.

The company shelved the research and pressed on, making money and radicalizing Americans.

The Wall Street Journal report by Deepa Seetharaman and Jeff Horwitz is based on company sources and internal documents. One internal Facebook presentation slide from 2018 laid out the issue like this: "Our algorithms exploit the human brain's attraction to divisiveness."

"If left unchecked," it warned, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."

Excerpt:

That presentation went to the heart of a question dogging Facebook almost since its founding: Does its platform aggravate polarization and tribal behavior?

The answer it found, in some cases, was yes.

Facebook had kicked off an internal effort to understand how its platform shaped user behavior and how the company might address potential harms.

Read the rest

Original Link: https://boingboing.net/2020/05/26/facebook-studied-how-it-polari.html
