Ask Americans about the war on drugs and most will tell you illicit drug use is a health problem, not a crime. They’ll tell you “getting tough” failed and the best approach to problem usage is treatment and compassion, not censure. Why, then, do diseases of the mind not invite the same response? There is a powerful push by elites to purify social media platforms of misinformation, with a lack of interest in underlying causes, a lack of concern for unintended consequences, and a lack of humility about their ability to determine Truth. I’m worried that the war on misinformation will backfire, accelerating both disenfranchisement and misinformation itself. That it will be a massive failure of dominant culture and institutions, just like the war on drugs.
In the past few months, Facebook, Twitter, and YouTube have been waging a war against QAnon, a leviathan of a conspiracy theory which claims, among other things, that there is a deep-state cabal of Jewish, Satanist, globalist, Democratic elites in government, business, and the media, secretly controlling the world and running a global child sex-trafficking ring. If that list of adjectives seems a little long, it’s because there is no anti-elite conspiracy that doesn’t eventually become entangled in QAnon’s web. More recently, QAnon fans have claimed that COVID-19 is a hoax and that the US election was rigged against Donald Trump, and these wisps of paranoia are constantly re-forming themselves to bend around the reality of events.
Social media giants, recognizing that these conspiracies were gobbling up a lot of engagement on their sites, took action by censoring them. In August, Facebook expanded their ‘Dangerous Individuals and Organizations’ policy to address ‘militarized social movements and violence-inducing conspiracy networks.’ According to their most recent update, they have removed around 1,700 pages, 5,600 groups, and about 18,700 Instagram accounts specifically representing QAnon. Twitter similarly removed many QAnon-linked accounts in July, citing their potential to inflict ‘offline harm.’
YouTube “terminated hundreds of QAnon channels” in October by stretching the definition of existing policies. According to Transparency.tube data, about 70% of QAnon’s reach was removed in the ban.
They are right to be worried. The ease with which a substantial portion of Americans can be made to believe the widespread-election-fraud narrative is alarming, and several QAnon adherents have been arrested after taking offline action, including Matthew Wright, who blocked a bridge near the Hoover Dam while heavily armed, demanding the release of a report that supposedly contained damaging revelations about Democrats. In the lead-up to the 2020 election, a Yahoo News/YouGov poll found that half of Trump supporters, while mostly not on board with QAnon as a whole, believe the claim that ‘top Democrats’ are involved in child sex-trafficking.
So if the falsehoods are so grievous and the stakes for our social fabric are so high, why do I oppose the QAnon crackdown and liken it to the war on drugs? Because both are marked by a preoccupation with tackling supply while ignoring the deep psychological needs that drive demand. What we see on platforms like Facebook is a supply of information for an increasingly uncertain and paranoid subsection of society, and Facebook imagines that by expunging this information it has eliminated the problem. In the same way, governments profess to solve the drug problem by criminalizing addicts while ignoring the root cause of their use.
While the platforms may be able to control, to a limited extent, what content appears on their own turf, they exaggerate the effectiveness of their interventions. They are powerless to prevent QAnon supporters from being funneled into alternative, even more epistemically challenged silos. The draconian censorship of recent months has encouraged QAnon supporters to migrate to more conspiracy-friendly platforms such as Parler, Gab, Telegram, BitChute, and Rumble, a migration those alternative platforms have sometimes encouraged themselves as a way to recruit new users.
The homepage of Parler tells its visitors to ‘Speak freely and express yourself openly, without fear of being “deplatformed” for your views.’ And Andrew Torba, the founder of Gab, wrote on the company blog that he ‘welcomes QAnon across its platforms.’ In fact, the heavy moderation efforts of companies like Facebook have become one of the few wedges that competitors can use against them. We need competition, and light-touch moderation is a good thing, but the dynamic of the current social media landscape is this: platforms pitching themselves as free speech zones in 2020 attract mostly conspiracists and the edgy right.
As Renée DiResta writes in a piece for the Atlantic, censorship also ultimately creates a ‘feedback loop’ that lends credence to QAnon’s the-liberal-elites-are-trying-to-silence-us narrative:
Mainstream platforms have come to the conclusion that certain content or behavior has serious downstream implications, so they moderate it with a heavier hand. That moderation, particularly when sloppily executed, is perceived as censorship by those affected, and the content or accounts taken down are recast as forbidden knowledge.
According to COO Jeffrey Wernick, Parler gained 4.5 million user accounts in the week following the US election, bringing its total to 9 million. Rumble’s unique visitors, meanwhile, grew from 45 million in September to 60 million in October, and its trending list contains many videos from X22 Report, one of the conspiracist channels recently removed from YouTube. This great migration from mainstream to ‘free speech’ platforms will inevitably foster hyper-partisan, right-wing echo chambers: tailored realities that barely offer even a glimpse of an alternative opinion.
This is not to imply that companies such as Facebook and YouTube have been completely successful at policing their own turf. It has been reported that at least some prominent QAnon influencers and channels have learned to camouflage their content by using coded language or tweaking their Facebook bios and hashtags to avoid automated efforts to shut them down. QAnon believers have already demonstrated their ability to do this very well when they hijacked the hashtags #SavetheChildren and #SaveOurChildren — which started as a fundraiser for an anti-trafficking charity — to reach new audiences. Indeed, for people who believe a group of liberal elites secretly controls the world and wants to suppress them, the necessity of shapeshifting is built into their very worldview. QAnon continues to have a large presence on YouTube after the ban. Transparency.tube data shows that overall views for QAnon only halved between September and November. A majority of QAnon channels were removed and recommendations curtailed, but the remaining channels grew in popularity due to high demand for this content.
So, if censorship doesn’t actually reduce the demand for QAnon-related content, what is it for? What does it solve? Rather than helping society at large, the only real beneficiaries are YouTube, Facebook, and Twitter, who get to wash their hands of our toxic political climate. And while they do this, they tend to drastically overestimate QAnon’s danger to justify their increasingly censorious policies. The Guardian’s ‘timeline of violence,’ for example, amounts to a short list of uncoordinated, standalone acts of violence. Just like the war on drugs, ‘getting tough’ is often premised on an exaggeration of harm, and may in the end have unintended consequences worse than the problem itself.
This has been a very common feature of the war on drugs, where politicians and the media would often feed a moral panic over the harmful effects of drugs, including relatively harmless ones like marijuana. Harry Anslinger, whom some call the father of drug prohibition in America, said the following about the drug: ‘Marijuana is a shortcut to the insane asylum. Smoke marijuana cigarettes for a month and what was once your brain will be nothing but a storehouse of horrid spectres.’ The spectre of this kind of hysteria perhaps still lingers in these comments by Tristan Harris on Joe Rogan’s podcast, speaking about social media: “If you start with a World War Two video, YouTube recommends a bunch of holocaust denial videos”, and in other speeches: “With technology, you don’t have to overwhelm people’s strengths. You just have to overwhelm their weaknesses. This is overpowering human nature. And this is checkmate on humanity.” Harris’ assertion about Holocaust denial videos is untrue: such videos are systematically removed from YouTube, and even less harmful conspiracy theories are hardly recommended at all. I don’t think he even believes what he says literally; given how important he considers it to raise awareness of social media’s effects on human nature, he is perhaps exaggerating to make a point. Even so, this kind of rhetorical extravagance adds fuel to a moral panic that is forcing social media companies into a censorious and unhelpful approach to tackling conspiracy theories.
The reality is that only a small number of people act on these theories. For the rest, they are simply a form of community, an emotional investment, a way to beat the chaos of the world into a more convenient shape for processing. It is no accident that QAnon is thriving during a pandemic, in an era of economic insecurity and divisive politics.
So, what is the solution, if not censorship? I’m much less confident about solutions than I am about diagnosing the problem, but I do like many of the options being tried and proposed. I like the idea of platform nudges such as adjusted recommendations and information banners on misinformation. However, we should remain skeptical that any particular intervention will work, and evaluate each one independently of big tech. Grassroots efforts to cool down partisanship, like Braver Angels, sound promising. I’m also with David Brooks when he says that making pan-tribal friendships reduces ‘the social chasm between the members of the epistemic regime and those who feel so alienated from it.’
These are just a few hand-wavy examples that will appear weak and complacent to many of you, but they are more likely to work than excessive moderation. We need interventions focused on harm-reduction and the demand side of misinformation, not another war.