The Winners and Losers of YouTube’s Conspiracy Crackdown

Mark Ledwich
8 min read · Sep 21, 2019


[Image: painting of an algorithmic god passing judgment on channels]

In January 2019, YouTube announced that it would crack down on conspiracy theories by changing search results and suggested videos to reduce the spread of “borderline content”: content that “comes close to — but doesn’t quite cross the line of — violating our Community Guidelines”. This covers suggestions of videos “promoting a phony miracle cure for a serious illness” or “making blatantly false claims about historic events like 9/11” that could “misinform users in harmful ways”. The announcement came after repeated criticism from the mainstream media that YouTube’s recommendation algorithm is an engine for far-right radicalization and fake news.

It is a widespread view that people are pulled into a rabbit hole of disinformation by YouTube’s recommendation algorithm, which serves them one extreme video after another. I have previously tried to show that this picture is inaccurate, but matters of accuracy haven’t prevented YouTube from taking measures to respond to the media backlash.

Wanting to limit the influence of these videos seems reasonable at first, but what about that ambiguous phrase, “borderline content”? It seems designed to be meaningless. Given how the media has mischaracterized non-woke channels, I suspected that, in an effort to appear tough on these channels, YouTube would show a strong left-wing bias and use this ambiguity as cover for arbitrary, ad-hoc censorship.

This is important, because the algorithm that determines suggested videos has become an influential but underappreciated part of politics. A Pew survey found that 18% of US adults consume news through YouTube, and — according to YouTube Chief Product Officer Neal Mohan — 70% of watch time comes from YouTube’s suggested videos.

So when YouTube makes changes to its algorithm, it acts like a gentle breeze, influencing what everyone watches and ultimately which ideas get expressed. I don’t think Google went out seeking this kind of soft power, but unfortunately, an important part of the new media landscape is now shaped by people with hazy incentives working behind closed doors.

The effect of YouTube’s new algorithm is hard to determine, and that is partly deliberate. The company is caught in a bind: how can it give the impression that YouTube is a free and open platform, able to be used by anyone, while at the same time appeasing the censorious media, which has always viewed the platform as a boon for right-wing radicalization and conspiracy theories? This bind prevents YouTube from drawing any concrete lines at all, and it leaves creators in the dark as to whether they can make an edgy joke in their videos without being labeled “borderline”.

The only way to find out what YouTube’s conspiracy crackdown really means is to go straight to the data. By closely monitoring YouTube’s recommendations, I hope to piece together patterns in how they are influencing politics. Who are the real winners and losers of YouTube’s conspiracy crackdown?
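For readers who want to follow along, here is a minimal sketch of how recommendation data like this could be gathered. It uses the YouTube Data API’s related-videos lookup (available at the time of writing), which only approximates the personalized “Up next” list users actually see; the API key and helper name are placeholders, and this is not my exact collection pipeline.

```python
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # placeholder; a real key comes from the Google Cloud console

# Build a client for the YouTube Data API v3.
youtube = build("youtube", "v3", developerKey=API_KEY)

def related_videos(video_id, max_results=25):
    """Return (video_id, channel_title) pairs for videos YouTube relates to video_id."""
    response = youtube.search().list(
        part="snippet",
        relatedToVideoId=video_id,  # related-videos lookup, as offered by the API in 2019
        type="video",               # required when relatedToVideoId is set
        maxResults=max_results,
    ).execute()
    return [
        (item["id"]["videoId"], item["snippet"]["channelTitle"])
        for item in response.get("items", [])
    ]
```

Run over a fixed set of seed videos, day after day, pairs like these accumulate into the kind of dataset the rest of this article draws on.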

Loser: Deep-State Conspiracy Channels

According to my research, if YouTube’s intention was to limit the exposure of conspiracy channels and promote more authoritative news sources, it has somewhat succeeded. Recommendations to deep-state conspiracy channels like Next News Network have been drastically reduced. Next News Network is a right-wing channel run by prominent conspiracy theorist Gary Franchi and currently has 1.2 million subscribers.

Jesselyn Cook and K. Sophie Will have previously reported on the drastic reduction in recommendations to conspiracy channels. They provided me with their list of these channels, and I provided them with the recommendation data. I think the list is legitimate: they cluster channels together by recommendations, and after watching a random selection of their videos, I believe the channels do fall in line with what any sensible person would regard as fake news. Check them out for yourself and make up your own mind.

Here is a detailed look at the portion of impressions these channels received during the year. “Impressions” is an estimate of the number of times a suggestion has been seen (not necessarily clicked).
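To make the “portion of impressions” measure concrete, here is a minimal sketch in pandas. The table, column names, and numbers are all illustrative stand-ins, not my real dataset: one row per observed suggestion, weighted by an estimated impression count.

```python
import pandas as pd

# Hypothetical schema: one row per observed suggestion, with an estimated
# count of how many times that suggestion was shown ("impressions").
recs = pd.DataFrame({
    "month":       ["2019-03", "2019-03", "2019-05", "2019-05"],
    "to_channel":  ["Next News Network", "Fox News", "Fox News", "MSNBC"],
    "impressions": [120_000, 80_000, 150_000, 90_000],  # made-up numbers
})

# Each channel's share of all impressions, per month.
totals = recs.groupby("month")["impressions"].sum()
by_channel = recs.groupby(["month", "to_channel"])["impressions"].sum()
share = by_channel.div(totals, level="month")
print(share)
```

Tracking that share month by month is what the chart above does for the conspiracy channel list.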

Despite the announcement being made in January, it appears that the recommendation algorithm wasn’t significantly changed until April. After April, these channels received close to nothing.

A noteworthy change is the portion of recommendations that now go to Fox News rather than to similar deep-state conspiracy channels — a mild improvement, as Fox News only sometimes peddles hyper-partisan conspiracy theories.

According to Jesselyn Cook and K. Sophie Will, since January, Fox News’ main YouTube channel has seen its total monthly views more than triple. This is a startling result, perhaps one contrary to the desires of the liberal media.

No Change: Left/Center/Right

It turns out that my worst fears were unfounded. The portion of recommendations across the left/right political divide is very stable. I know most people reading this will think the following graph shows left-wing bias. But if you take into account the view counts of the destination videos, there isn’t much in it.

So far it seems that there is little to complain about. Conspiracy channels have been specifically targeted and there appears to be no political bias. But the changes have also had effects outside the immediate intentions of YouTube. These — according to my own data collection — are some of the unintentional outcomes of the changes to the algorithm:

  • Recommendations to cable news channels, like Fox and MSNBC, have increased.
  • Suggestions to a seemingly random selection of independent YouTubers have been reduced.

Winner: Corporate Media

The changes have been a big gift to corporate media — which has a large presence on YouTube — at the expense of many independent YouTube creators. Channels that re-publish their videos from cable or TV went from an estimated 50% share of impressions to 73%.

This is bad news for many YouTube creators like David Pakman, a politically progressive talk show host who is careful about the quality of information in his show. The changes have increased recommendations towards hyper-partisan channels like Fox and MSNBC, which regularly misrepresent current events and promote conspiracy theories.

It isn’t obvious why YouTube creators who invest in the platform, sometimes depending on it for their livelihood, and who aren’t peddlers of conspiracy theories or right-wing extremism, should have their influence reduced in favor of corporations, which now take up nearly three-quarters of recommendations. It shows how far YouTube’s vision has strayed from the community of creators that gives it life. The company is more concerned with turning YouTube into a bland receptacle for advertisers than with keeping it a fun platform for ideas and entertainment.

It also isn’t obvious how independent YouTubers can avoid this fate. The nebulousness of “borderline” gives creators nothing concrete to point to when deciding how to moderate their content to avoid punishment (whether or not they are willing to compromise the integrity of their channel). This lack of transparency creates a dynamic where creators are like ignorant farmers praying to the algorithm gods for rain. The only way YouTube can be fair to creators is if the rules are fixed, unambiguous, and announced beforehand, but that is never likely to be the case.

[Image: YouTube creators praying for suggestions and monetization]

I like David Pakman’s response to algorithmic adversity: he is diversifying his distribution to Twitch and obtaining a stream of income directly from supporters through his website. This makes him less exposed to the algorithmic gods. As YouTube has decided to make its platform more pleasing to corporate media and advertisers, creators need to move between platforms, or diversify across them, to create leverage.

Loser: The Right of the Alternative Influence Network

When CNET reported that channels in the Alternative Influence Network (which includes channels like Joe Rogan, Sam Harris, Tim Pool, and Destiny) had their recommendations drop from 7.8% to 0.4%, I was worried that YouTube had overreacted and punished alternative channels that make a valuable contribution to public debate. Nicolas Suzor, the author of this new analysis, and Rebecca Lewis, the author of the AIN report, celebrated this achievement while also mischaracterizing many of the listed YouTubers as alt-right.

But from my own research, I have concluded that something is completely wrong with Suzor’s analysis, which suggested that Joe Rogan was receiving 7.5% of ALL YouTube recommendations — absurd on the face of it.

The new algorithm has reduced recommendations to the AIN, but YouTube has distinguished between centrists like Joe Rogan and white supremacists like Stefan Molyneux. The right-wing element of the AIN has dropped from 7% to 3%, while the rest has stayed the same.

The graph below shows individual channels and ranks the biggest winners and losers in terms of the change in their impressions as a percentage of all channel impressions. This list has been filtered to channels exclusively focused on news, politics or culture-war topics.
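As a rough illustration of the ranking behind a chart like this, here is a self-contained sketch; the channel names and share numbers below are invented for the example, not drawn from my data.

```python
import pandas as pd

# Illustrative impression shares (fraction of all impressions) per channel,
# in the periods before and after the algorithm change. Numbers are made up.
before = pd.Series({"Channel A": 0.031, "Channel B": 0.012, "Channel C": 0.004})
after = pd.Series({"Channel A": 0.009, "Channel B": 0.013, "Channel D": 0.006})

# Change in share; a channel absent from one period counts as zero there.
delta = after.sub(before, fill_value=0).sort_values()

print(delta.head(2))  # biggest losers (most negative change)
print(delta.tail(2))  # biggest winners (most positive change)
```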

Recommendations are affected by so many factors (e.g. the popularity and volume of content) that it is difficult to tease apart algorithmic changes from natural ones. There is a large amount of variation over time that hides underlying systematic changes, so please interpret the above as you would the change in value of volatile shares.

I believe YouTube is not so much politically biased as it is overly sensitive to media criticism. They don’t want to outright ban big channels such as Steven Crowder’s, because intervening would create an industry of people lobbying against them and remove any illusion that they are concerned about free speech. Demonetization and algorithmic changes allow YouTube to give a nod to free speech while also giving a nod to the media, who call for censorship. But many channels lose out, not because of strong motives one way or the other, but as a product of the algorithm technology itself, which is inevitably loose with its definitions and catches many respectable channels in its “conspiracy theory” net.

Overall, YouTube needs to be more precise in its policies and more transparent about its recommendations if it is going to win back the trust of political YouTubers. As I write this, YouTube is threatening to cut my API access and asking me to delete all recommendation data older than 30 days. That is a sure sign that they will need to be compelled to provide transparency into their recommendations, and I intend to keep you informed, one way or another.
