Facebook studied how it polarizes users, then ignored the research
Our algorithms exploit the human brain's attraction to divisiveness.
64% of all extremist group joins are due to our recommendation tools
GOP operative turned Facebook policy VP Joel Kaplan, who threw a party for Brett Kavanaugh upon his Supreme Court confirmation, killed any action on Facebook's internal findings, reports WSJ
Mark Zuckerberg and other top executives at Facebook shelved damning internal research into the social media platform's polarizing effects, hampering efforts to apply its conclusions to products and minimize harm, the Wall Street Journal reported on Tuesday.
Facebook's own internal research found that most people who joined extremist groups did so as a result of Facebook's recommendation algorithms.
The company shelved the research and pressed on, making money and radicalizing Americans.
The Wall Street Journal report by Deepa Seetharaman and Jeff Horwitz is based on company sources and internal documents. One internal Facebook presentation slide from 2018 laid out the issue like this: "Our algorithms exploit the human brain's attraction to divisiveness."
If left unchecked, it warned, Facebook would feed users more and more divisive content in an effort to gain user attention & increase time on the platform.
That presentation went to the heart of a question that has dogged Facebook almost since its founding: Does its platform aggravate polarization and tribal behavior?
The answer it found, in some cases, was yes.
Facebook had kicked off an internal effort to understand how its platform shaped user behavior and how the company might address potential harms.
Original Link: https://boingboing.net/2020/05/26/facebook-studied-how-it-polari.html