In the past week, tech giants including Facebook, Apple, YouTube, and Spotify banned notorious conspiracy theorist Alex Jones from their platforms. Jones, perhaps most famous for promoting the idea that the Sandy Hook Elementary School shooting was a hoax, was banned from these platforms for allegedly violating their terms of service in all sorts of ways.
But there was one Silicon Valley corporation that opted to allow Jones to stay: Twitter. You can still go to President Donald Trump’s favorite social media outlet and scan the @RealAlexJones feed, where you will learn that the bans are a plot by “deep state actors” to prevent the American public from learning the real truth about our government.
Tuesday night, Twitter CEO Jack Dorsey wrote a lengthy statement — published as a series of tweets, naturally — defending his company’s decision. My colleague Aja Romano has a sweeping takedown of Dorsey’s full argument; I encourage you to read it.
But I want to focus on one of Dorsey’s specific tweets — his claim that it’s journalists’ job to “document, validate, and refute” accounts like Jones’s — because it reveals, to my mind, a deep issue at work here.
The tweet displays a profound misunderstanding of the way conspiracy theories and “fake news” work. The problem isn’t that there aren’t enough journalists correcting misinformation and myths; there’s tons of evidence out there that what Jones says is patently false.
Rather, it’s that conspiracy theories, once they spread, create hermetically sealed communities that are impervious to correction. The only way to stop this process is to stop them from spreading on platforms like social media, which is exactly what Twitter decided not to do.
It’s not surprising that Jack Dorsey doesn’t understand this: He doesn’t really have time to read the latest social science on conspiracy theories. And that’s the real problem: Tech giants are increasingly being asked to handle social problems, ones their leaders don’t seem equipped to address.
What Jack Dorsey gets wrong about conspiracy theories
In 2008, Harvard Law professors Cass Sunstein and Adrian Vermeule penned an article on conspiracy theories and how they work. They argued that conspiracy theories — which they define as “an effort to explain some event or practice by reference to the machinations of powerful people, who have also managed to conceal their role” — are, in their own way, quite rational.
“Most people are not able to know, on the basis of personal or direct knowledge, why an airplane crashed, or why a leader was assassinated, or why a terrorist attack succeeded,” they wrote. As a result, they search for information that fits what they already believe about the world and is confirmed by people they trust.
Conspiracy theories, Sunstein and Vermeule argued, spread in a variety of ways. One of these pathways, called an “availability cascade,” happens when a group of people accept a conspiracy theory because their preexisting beliefs about the world make them likely to believe it.
This is what happens with Alex Jones and people on the American right. Theories like “Sandy Hook was faked so Obama could take your guns” and “the ‘deep state’ is conspiring against Trump to destroy democracy” appeal to their basic, gut-level political orientation, which is that Democrats are nefarious and Trump is a hero.
Not all conservatives accepted these ideas when presented with them, of course, but they were appealing enough that Jones managed to build up a significant social media presence and a shockingly large amount of influence. In December 2015, then-candidate Trump went on Jones’s show, telling the host that his “reputation is amazing” and vowing that “I will not let you down.”
Jones has created a thorny problem for society. Once people start believing in his conspiracy theories, and trusting him as a source, it becomes extremely difficult to change their minds.
“Conspiracy theorists are not likely to be persuaded by an attempt to dispel their theories; they may even characterize that very attempt as further proof of the conspiracy,” Sunstein and Vermeule wrote. Because conspiracy theorists “become increasingly distrustful and suspicious of the motives of others or of the larger society,” efforts to debunk their myths often “serve to fortify rather than undermine the original belief.”
This isn’t just Sunstein and Vermeule’s theory: A significant body of empirical research on conspiracy theories finds that it’s extremely hard to change believers’ minds. One 2017 study, by two UK-based psychologists, presented people with anti-vaccine conspiracy theories and evidence debunking them — but randomly varied whether participants saw the anti-vax arguments or the actual facts first. The researchers then measured how the order affected participants’ willingness to vaccinate a child. The results were sobering.
“Anti-conspiracy arguments increased intentions to vaccinate a fictional child but only when presented prior to conspiracy theories,” the authors explained. “These findings suggest that people can be inoculated against the potentially harmful effects of anti-vaccine conspiracy theories, but that once they are established, the conspiracy theories may be difficult to correct.”
This is the problem with Dorsey’s logic. Now that Jones has an audience on Twitter, journalists’ attempts to “refute” him will fail. His fans will mostly disregard the debunkings, and his audience will continue to grow. This is what was happening on every other platform prior to the bans. The other companies recognized that Jones was spreading dangerous lies, and that journalists simply couldn’t debunk them. The only way to stop these ideas was to deprive them of oxygen — to prevent people from being exposed to them in the first place.
Twitter’s CEO just doesn’t get that.
The problem with tech making social decisions
As frustrating as Dorsey’s statement is, there’s a part of me that doesn’t blame him. It really is not his fault that he hasn’t read the academic literature on conspiracy theories. His job is running a massive technology company.
While Twitter was alone on the Alex Jones issue, Dorsey is hardly the only tech CEO to make glaringly ignorant comments about social issues that affect his platform. Just last month, for example, Facebook CEO Mark Zuckerberg offered his own nugget of anti-wisdom in an interview with Recode’s Kara Swisher, defending Facebook’s decision to allow Holocaust denial on its platform.
Zuckerberg’s argument is that Holocaust deniers are merely deluded people (he later clarified that he “didn’t intend to defend the intent of people who deny the Holocaust”). But the purpose of Holocaust denial is not to have a good-faith argument about history — it’s to advance an anti-Semitic political agenda. Letting deniers spread poison on Facebook doesn’t serve the purpose of illuminating debate. Rather, all it does is allow yet another vile conspiracy theory to spread.
How to approach Holocaust denial has been, historically, a hard problem for liberal societies. The United States, with its expansive free speech tradition, permits Holocaust deniers to publish freely on the grounds that it would be dangerous to let the government regulate speech in this fashion. Germany and France have both decided to criminalize denial, on the grounds that it’s a form of incitement to racial hatred rather than legitimate political speech. Both approaches have benefits and flaws; brilliant scholars have written tomes making the case for one or the other.
But today, the spread of Holocaust denial, Sandy Hook trutherism, and other vile conspiracy theories isn’t just a problem for governments. It’s a problem for technology corporations, which regulate the primary means through which information is disseminated today.
Those companies — none of which have the legitimacy or public accountability that government officials do — have no choice but to engage with all sorts of extremely hard social problems surrounding free speech and bigotry. People like Jack Dorsey and Mark Zuckerberg are not the ones who ought to be making these decisions for a democratic polity, but the decisions fall to them anyway. Sometimes they’ll get those decisions right, as most of these companies eventually did with Alex Jones. But often they’ll get them wrong — and the public will have no real way to hold them accountable.
This is your politics on big tech.