The Radical Rabbit Hole


It’s a well-known fact that the algorithms behind YouTube lead you ever deeper into whatever rabbit hole you like. There are 1.5 billion YouTube users in the world; that’s more than the number of households that own televisions. What you watch is shaped by this algorithm, which ranks billions of videos to identify the twenty ‘up next’ clips related to the previous video, and which is designed to keep you glued to your screen.
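As a rough mental model (not YouTube’s actual system, whose internals are proprietary), you can think of the ‘up next’ list as the top-scoring handful of related candidates under some predicted-engagement metric. The video titles and watch-time scores below are invented for illustration:

```python
# Toy illustration of an "up next" ranker. This is NOT YouTube's real
# algorithm; it just sketches the idea of ranking related candidates
# by a predicted-engagement score and keeping the top n.

def rank_up_next(candidates, n=20):
    """Return the n candidates with the highest predicted engagement,
    mirroring the 'keep you watching' objective."""
    return sorted(
        candidates,
        key=lambda v: v["predicted_watch_minutes"],
        reverse=True,
    )[:n]

# Hypothetical candidates related to the video just watched.
candidates = [
    {"title": "Jogging basics",        "predicted_watch_minutes": 4.0},
    {"title": "Marathon training",     "predicted_watch_minutes": 6.5},
    {"title": "Ultramarathon stories", "predicted_watch_minutes": 9.2},
]

up_next = rank_up_next(candidates, n=2)
print([v["title"] for v in up_next])
# → ['Ultramarathon stories', 'Marathon training']
```

Notice that under an objective like this, whichever video is predicted to hold attention longest rises to the top, regardless of how extreme its content is.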

All the while, advertising, some of it subtle and some not so subtle, blinks away at you or precedes each video.

Advertising aside, however, it seems that viewers are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the ante. With 1.5 billion users, YouTube may be one of the most powerful radicalizing mechanisms of the 21st century.

YouTube’s algorithm seems to be based upon the premise that people are drawn to content that is more extreme than what they started with — or to incendiary content no matter the topic.

One of the first people to notice this phenomenon was Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina. While she was researching Trump videos on YouTube in 2016, she noticed something unusual: YouTube began recommending and auto-playing increasingly extreme right-wing content, such as white-supremacist and Holocaust-denial videos.

Intrigued, she experimented with non-political topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.
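The pattern Tufekci observed can be caricatured as a feedback loop: if a recommender consistently favors related content that is a notch more intense than what you just watched, following the top pick walks you step by step toward the extreme end of the catalog. The tiny simulation below is purely illustrative; the topics and “intensity” scores are invented, not measured:

```python
# Toy feedback-loop simulation, purely illustrative. Catalog maps a
# hypothetical video topic to an invented "intensity" score.
videos = {
    "vegetarian recipes": 1,
    "vegan recipes": 2,
    "raw veganism": 3,
    "extreme fasting": 4,
}

def next_pick(current_intensity, catalog):
    """Recommend the mildest video that is still a step up in intensity;
    if nothing hotter remains, stay at the most intense video."""
    hotter = [(i, t) for t, i in catalog.items() if i > current_intensity]
    if hotter:
        return min(hotter)[1]
    return max((i, t) for t, i in catalog.items())[1]

# Follow the top recommendation three times, starting from a mild video.
path, current = [], "vegetarian recipes"
for _ in range(3):
    current = next_pick(videos[current], videos)
    path.append(current)

print(path)
# → ['vegan recipes', 'raw veganism', 'extreme fasting']
```

Each hop looks like a small, reasonable step, yet after only a few clicks the viewer has drifted from recipes to the fringe, which is exactly the jogging-to-ultramarathon dynamic described above.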

The Wall Street Journal has also investigated YouTube content. It too found that YouTube often fed far-right or far-left videos to users who watched relatively mainstream news sources, and that this escalating tendency was evident across a wide variety of material. A search for information on the flu vaccine, it reported, led to recommendations for anti-vaccination conspiracy videos.

So before any of us become radicals, we need to take a step back and consider that an innocent YouTube visit can sometimes not be so innocent. I never thought I would ever say this, but perhaps ‘LOL cats’ have a place in the world after all, if only to dilute some of the inflammatory and extreme content out there.

Viewer beware.