Hundreds of videos were removed from YouTube after the US presidential election

More than a quarter of the videos recommended on election eve have since been removed. Why?

Guillaume Chaslot
2 min read · Jan 16, 2017
Visualization of some YouTube recommendations on Election Eve. Full visualization here.

On the eve of the US presidential election, we used this recommendation explorer to extract a sample of 800 unique videos recommended by YouTube in response to searches about the presidential candidates.

More than 200 of these videos have since been removed, some of them by YouTube, and others by the content creators themselves.
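The measurement itself is simple set arithmetic: save the IDs of the sampled videos, re-check later which IDs still resolve, and compute the share that disappeared. A minimal sketch, using entirely hypothetical IDs and counts rather than the study's actual data:

```python
# Hypothetical data: IDs of the 800 videos saved on election eve.
sampled_ids = {f"vid{i:03d}" for i in range(800)}

# Hypothetical re-crawl result: suppose 590 of them still resolve later.
still_available = {f"vid{i:03d}" for i in range(590)}

# Videos in the saved sample that no longer exist.
removed = sampled_ids - still_available
fraction_removed = len(removed) / len(sampled_ids)

print(f"{len(removed)} of {len(sampled_ids)} removed "
      f"({fraction_removed:.0%})")
```

With real data, `still_available` would come from re-querying each saved ID (for example, against YouTube's public API) and keeping the ones that still return a result.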

The most viewed videos that disappeared were:

“This video will get Trump Elected” (10 million views)

That video was not only widely viewed; it was in the top 1% of videos most recommended by YouTube’s A.I. in our sample. Surprisingly, its creator removed it after the election.

“Must Watch!! Hillary Clinton tried to ban this video” (5 million views)

The account associated with that video was terminated by YouTube’s abuse team. Notably, that video was also in the top 1% of the most recommended videos.

Here are some examples of the 200+ videos that YouTube recommended on election day and that were removed afterwards:

Why it matters

The removal of these videos is a symptom of large-scale misinformation around the US presidential election: the removed videos among the 800 we analyzed account for tens of millions of views. Extrapolating from this sample, we estimate that hundreds of millions of views of political YouTube videos went to videos removed in the aftermath of the election.
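The extrapolation above is a back-of-envelope calculation. One way to reach a number of that order, assuming the sample is representative, is to scale the views of the removed sampled videos by the sample's estimated coverage of recommended political videos. All figures below are hypothetical, chosen only to illustrate the shape of the reasoning:

```python
# Hypothetical figures, NOT the study's data.
sample_size = 800
removed_in_sample = 200              # 200+ videos later removed
removed_views_in_sample = 40_000_000 # "tens of millions" of views

# If the 800-video sample covered only a small slice of all recommended
# political videos, total affected views scale with the coverage factor.
coverage_factor = 10                 # hypothetical: sample is ~1/10 of them

estimated_total_views = removed_views_in_sample * coverage_factor

print(f"{removed_in_sample / sample_size:.0%} of the sample removed")
print(f"~{estimated_total_views:,} views tied to removed videos")
```

Under these assumptions the estimate lands in the hundreds of millions; the real figure depends on how representative the sample is, which is exactly why the claim is stated as an order of magnitude rather than a precise count.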

YouTube’s abuse team takes a long time to examine and ban videos. This delay allows accounts and videos that violate the terms of service to accumulate large view counts, especially if they strongly appeal to a particular constituency. This could also affect upcoming elections in the rest of the world.

Finally, by uploading defamatory content to YouTube before a critical period and removing the videos afterwards, content creators can effectively promote defamatory videos without facing liability.
