Here’s what Google looked like for months at the end of 2020, and on January 6th before the invasion of the Capitol (all searches were done in incognito mode from NYC):
The first thought that came to my mind was: maybe users are looking for these terms, and the algorithm is just a “mirror of society”? But Google’s own data tells another story. According to Google Trends, here is how often “civil war is what” was searched in the month before the Capitol invasion:
After France’s soccer World Cup victory, millions took to the streets to celebrate. Many people came from the suburbs, often proudly wearing French flags to party in the streets. Later in the evening, a fraction of this celebration turned violent. At least 3 stores were looted, 24 cars were destroyed, and 292 people were arrested.
Online, disinformation campaigns started.
Fox News had a very ambiguous headline:
There were 2 deaths, and the celebrations did take a violent turn in some places. But the 2 deaths had nothing to do with the violence, contrary to what this headline suggests. As Fox News itself reports:
During a long bus ride across France, my neighbor spent hours on YouTube, letting the auto-play recommendations run from one video to the next. Since I worked on the algorithm that computes these recommendations, I was curious what they were about. One of these videos was about a plan to exterminate a quarter of the world’s population. I joked: “So who wants us dead?” He explained: “There is a secret plan from the government. Hundreds of videos say so! The media is hiding it from you. Go to YouTube, and you’ll discover the truth!”. …
We have all heard about conspiracy theories, alternative facts, and fake news circulating on the internet. How do they become so popular? What is the impact of state-of-the-art algorithms on their success?
Having worked on YouTube’s recommendation algorithm, I started investigating, and came to the conclusion that the powerful algorithm I helped build plays an active role in the propagation of false information.
In order to see what YouTube is currently promoting the most, I wrote an open-source recommendation explorer which extracts the videos most frequently recommended for a given query. …
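The core idea behind such an explorer can be sketched as a crawl of the recommendation graph: start from the videos returned for a query, follow their recommendations a few levels deep, and count how often each video appears as a recommendation. This is a minimal sketch, not the actual tool; `get_recommendations` is a placeholder for whatever scraping or API call fetches a video’s recommended list.

```python
from collections import Counter, deque

def most_recommended(seed_videos, get_recommendations, depth=2):
    """Breadth-first crawl of the recommendation graph.

    Counts how often each video ID appears as a recommendation,
    following recommendations up to `depth` levels from the seeds.
    `get_recommendations(video_id)` is assumed to return a list of
    recommended video IDs (e.g. via a scraper or API client).
    """
    counts = Counter()
    seen = set(seed_videos)
    queue = deque((v, 0) for v in seed_videos)
    while queue:
        video, level = queue.popleft()
        if level >= depth:
            continue
        for rec in get_recommendations(video):
            counts[rec] += 1  # one more appearance as a recommendation
            if rec not in seen:
                seen.add(rec)
                queue.append((rec, level + 1))
    # Most frequently recommended videos first
    return counts.most_common()
```

Videos near the top of this ranking are the ones the algorithm pushes hardest for that query, regardless of whether the user explicitly searched for them.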
On the eve of the US presidential election, we used this recommendation explorer to extract a sample of 800 unique videos recommended by YouTube in response to searches about the presidential candidates.
More than 200 of these videos have since been removed, some of them by YouTube, and others by the content creators themselves.
The most viewed videos that disappeared were:
That video was in the top 1% of the videos most recommended by YouTube’s A.I. out of our sample. Surprisingly, the creator of the video removed it after the election.
The account associated with that video was…
YouTube’s Artificial Intelligence (A.I.) recommends tens of billions of videos every single day, yielding billions of views. On the eve of the US Presidential Election, we gathered recommendation data on the two main candidates, and found that more than 80% of recommended videos were favorable to Trump, whether the initial query was “Trump” or “Clinton”. A large proportion of these recommendations were divisive and fake news.
We propose two transparency metrics to elucidate the impact of A.I. on the propagation of political opinions and fake news.
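To make the 80% figure above concrete, one simple metric of this kind is the share of recommendations favoring each side. This is an illustrative sketch under the assumption that each recommended video has been hand-labeled with the candidate it favors; it is not the exact metric proposed in the article.

```python
from collections import Counter

def recommendation_share(recommendations):
    """Given (video_id, label) pairs — where the label marks which
    candidate a recommended video favors — return each label's share
    of the total recommendations."""
    counts = Counter(label for _, label in recommendations)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}
```

Applied to a labeled sample of recommendations, a heavily skewed share (such as the 80% observed here) signals that the recommendation engine is amplifying one side far beyond what the initial query would suggest.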