How YouTube’s A.I. boosts alternative facts

YouTube’s recommendation A.I. is designed to maximize the time users spend online. Fiction often outperforms reality.

Guillaume Chaslot
Mar 31, 2017

We have all heard about conspiracy theories, alternative facts and fake news circulating on the internet. How do they become so popular? What impact do state-of-the-art algorithms have on their success?
Having worked on YouTube’s recommendation algorithm, I started investigating, and came to the conclusion that the powerful algorithm I helped build plays an active role in the propagation of false information.

To see what YouTube is currently promoting the most, I wrote an open-source recommendation explorer that extracts the most frequently recommended videos for a query. I compared them to the first 20 results returned by the same Google and YouTube Search queries.
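For readers curious about the mechanics, here is a minimal sketch of how such an explorer can work. The two YouTube-facing helpers are placeholders for whatever API or scraping access you use, and the depth and branching parameters are illustrative; only the crawl-and-tally logic is shown.

```python
from collections import Counter

# Placeholder helpers: implement with the YouTube Data API or a scraper of your choice.
def get_search_results(query, n=20):
    """Return the video ids of the first n search results for a query."""
    raise NotImplementedError

def get_recommendations(video_id, n=20):
    """Return the video ids recommended alongside a given video."""
    raise NotImplementedError

def explore(query, depth=2, branching=5):
    """Follow recommendations from the top search results and count
    how often each video is recommended along the way."""
    counts = Counter()
    frontier = get_search_results(query, n=branching)
    seen = set(frontier)
    for _ in range(depth):
        next_frontier = []
        for video_id in frontier:
            for rec in get_recommendations(video_id, n=branching):
                counts[rec] += 1          # a video recommended from many places scores high
                if rec not in seen:
                    seen.add(rec)
                    next_frontier.append(rec)
        frontier = next_frontier
    return counts.most_common(20)         # the 20 most frequently recommended videos
```

Comparing the output of explore(query) with the plain top-20 search results for the same query is what produces the comparisons below.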

The results for these 5 queries speak for themselves:

1 — Basic facts: “Is the earth flat or round?”

2 — Religion: “Who is the Pope?”

Pope Francis

3 — Science: “Is global warming real?”

NASA’s weather data

4 — Conspiracies: “Is Pizzagate real?”

Pizzagate is a conspiracy theory according to which the Clintons ran a pedophile ring out of a pizzeria in Washington, D.C. Videos promoting this theory were recommended millions of times by YouTube in the months preceding the 2016 US presidential election.

5 — Celebrities: “Who is Michelle Obama?”

Michelle Obama

Why do recommendations differ from search?

YouTube’s search and recommendation algorithms yield surprisingly different results in these examples, even though both draw on the same data. This shows that small differences between the algorithms can produce large differences in the results. Search is probably optimized more towards relevance, whereas recommendations seem to take watch time more into account.
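As a toy illustration (these are not YouTube’s actual formulas, and the candidates and numbers are invented), ranking the same videos by a relevance score versus by expected watch time can produce very different orderings:

```python
# Invented candidates: (title, relevance to the query, expected minutes watched).
candidates = [
    ("NASA: the Earth seen from orbit",      0.95, 4.0),
    ("THE TRUTH they hide: Earth is FLAT",   0.60, 22.0),
    ("Physics lecture: measuring curvature", 0.90, 6.0),
]

by_relevance  = sorted(candidates, key=lambda c: c[1], reverse=True)
by_watch_time = sorted(candidates, key=lambda c: c[2], reverse=True)

print([title for title, _, _ in by_relevance])   # factual videos rank first
print([title for title, _, _ in by_watch_time])  # the sensational video jumps to the top
```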

YouTube doesn’t recommend what people “like”

Surprisingly, “likes” and “dislikes” on a video appear to have little impact on recommendations. For instance, many videos claiming Michelle Obama was “born a man” have more dislikes than likes, but are still heavily recommended by YouTube. YouTube seems to put more weight on maximizing watch time than on likes.

Hence, if “the earth is flat” keeps users online longer than “the earth is round”, this theory will be favored by the recommendation algorithm.
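To make that concrete, here is a deliberately simplified scoring sketch; the weights and numbers are my own invention, not YouTube’s, but they show how a watch-time-dominated score can bury the like/dislike signal:

```python
def toy_score(expected_minutes, likes, dislikes, w_watch=1.0, w_likes=0.05):
    """A made-up ranking score in which watch time carries most of the weight."""
    like_ratio = likes / max(likes + dislikes, 1)
    return w_watch * expected_minutes + w_likes * like_ratio

# A heavily disliked but long-watched video...
print(toy_score(expected_minutes=25, likes=200, dislikes=900))   # ~25.0
# ...outscores a well-liked but short one.
print(toy_score(expected_minutes=4, likes=900, dislikes=50))     # ~4.0
```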

The snowball effect that boosts conspiracies

Once a conspiracy video is favored by the A.I., it gives content creators an incentive to upload additional videos corroborating the conspiracy. Those additional videos in turn increase the retention statistics of the conspiracy, which gets it recommended even more.
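A toy simulation of this loop (all growth rates invented; only the shape of the dynamic matters) shows how exposure and uploads can feed each other:

```python
# Toy feedback loop: recommendations -> new creator uploads -> more watch time -> more recommendations.
videos, daily_recommendations = 10, 10_000

for month in range(1, 7):
    new_videos = int(daily_recommendations * 0.001)   # a small fraction of exposure converts into new uploads
    videos += new_videos
    daily_recommendations *= 1 + 0.02 * new_videos    # more corroborating videos -> longer sessions -> more reach
    print(f"month {month}: {videos} videos, {daily_recommendations:,.0f} recommendations/day")
```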

Eventually, the sheer number of videos favoring a conspiracy makes it appear more credible. For instance, in one of the “flat earth” videos, the author commented:

“There are 2 millions flat earth videos on YouTube, it cannot be B.S. !”

What we can do

The point here is not to pass judgment on YouTube. They are not doing this on purpose; it is an unintended consequence of the algorithm. But every single day, people watch more than one billion hours of YouTube content.

And because YouTube has such a large impact on what people watch, it could also have a lot of power to curb the spread of alternative news. The first step toward a solution is to measure the problem.

Experiment with the recommendation explorer if you want to find out what YouTube recommends the most about subjects you care about.
