March 11, 2018 12:24 pm PDT

Why YouTube's algorithms push extreme content on every possible subject

Zeynep Tufekci was researching Trump videos on YouTube back in 2016 when she noticed something funny: YouTube began recommending and autoplaying increasingly extreme right-wing stuff -- like white-supremacist Holocaust-denial videos.

So she did an interesting experiment: She set up another YouTube account and began watching videos for the main Democratic presidential contenders, Hillary Clinton and Bernie Sanders. The result? As Tufekci writes in the New York Times:

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that was more and more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.

It seems as if you are never hard core enough for YouTube's recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

This is an incredibly interesting and subtle point: the problem with YouTube's recommender algorithms may be that they overdistill your preferences. Since they're aiming for "engagement" -- a word I am beginning to loathe with an unsettling level of emotion -- these algorithms are constantly trying to create an epic sense of drama and newness. At the tail ends of this bimodal attentional landscape, only the Xtreme can survive. And of course, this precisely leverages our novelty-seeking psychology, which really does snap to attention when we're presented with intense stuff.
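You can see the ratchet effect in a toy simulation. This is a hypothetical sketch, not YouTube's actual system: it assumes each video has a scalar "intensity," and that predicted engagement peaks for items a notch *above* the viewer's current average -- the novelty-seeking bias described above. Under just those two assumptions, a greedy engagement-maximizer drifts steadily toward more extreme content:

```python
# Toy model (hypothetical, not YouTube's real algorithm): a recommender
# that greedily maximizes predicted "engagement" ratchets a feed toward
# ever more intense content when engagement peaks just above the
# viewer's current baseline.

def recommend(history, catalog):
    """Return the catalog item with the highest predicted engagement.

    Modeling assumption: engagement peaks for items slightly *above*
    the user's average intensity so far (novelty-seeking).
    """
    target = sum(history) // len(history) + 10  # a notch above baseline
    def predicted_engagement(item):
        return item - abs(item - target)  # peaks at item == target
    return max(catalog, key=predicted_engagement)

catalog = list(range(0, 101, 5))  # content "intensity" from 0 to 100
history = [20]                    # the user starts on mild content
for _ in range(10):
    history.append(recommend(history, catalog))

print(history)
# [20, 30, 35, 40, 45, 45, 45, 50, 50, 50, 55] -- intensity only goes up
```

Note that nothing in the sketch mentions politics or any particular topic; the upward drift falls out of the objective alone, which is exactly Tufekci's point.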

So it's not that YouTube radicalizes politics specifically. It radicalizes everything, and politics just gets swept along in the slurry of zomg.

Read the rest of Tufekci's piece -- she's one of the best critics of our algorithmicized world, and this one really nails it.

(CC-licensed image above via Pixabay)


Original Link: http://feeds.boingboing.net/~r/boingboing/iBag/~3/Zaql9fDv_YU/why-youtubes-algorithms-push.html
