I know that it's tempting to blame what happened on Tuesday night — the re-election of a former game-show host and inveterate liar with 34 felony counts and two impeachments as president of the United States — on social media in one form or another. Maybe you think that Musk used Twitter to platform white supremacists and swing voters to Trump, or that Facebook promoted Russian troll accounts posting AI-generated deepfakes of Kamala Harris eating cats and dogs, or that TikTok polarized voters using a combination of soft-core porn and Chinese-style indoctrination videos to change minds — and so on.
In the end, that is too simple an explanation, just as blaming the New York Times' coverage of the race is too simple, or accusing more than half of the American electorate of being too stupid to see Trump for what he really is. They saw it, and they voted for him anyway. That's the reality.
It's become accepted wisdom that platforms like Twitter and Facebook and TikTok spread misinformation far and wide, which convinces people that the world is flat or that birds aren't real or that people are selling babies and shipping them inside pieces of Wayfair furniture. And it's taken as fact that these tools increase the polarization of society, turning people against each other in a number of ways, including by inflating social-media "filter bubbles." We all know this. And particularly when there is an event like a federal election, concern about both of these factors tends to increase. That's why we see articles like this one from Wired, which talks about how social platforms have "given up" on things like fact-checking misinformation on their networks.
But is there any proof that social media either convinces people to believe things that aren't true, or that it increases the levels of polarization around political or social issues? I don't want to give away the ending of this newsletter, but the short answer to both of those questions is no. While social media may make it easier to spread misinformation farther and faster, it hasn't really changed human nature itself all that much. In other words, social media is more of a symptom than it is a cause.
In case you are a first-time reader, or you forgot that you signed up for this newsletter, this is The Torment Nexus (you can find out more about me and this newsletter — and why I chose to call it that — in this post.)
The Russians are coming!
Anyone who followed the 2016 election in the US probably remembers how the topic of misinformation — and its cousin disinformation, which is misinformation that has been created deliberately to mislead — became a kind of frenzy, with everyone looking over their shoulders to see whether Russian disinfo was distorting reality for the purposes of electing Trump as president. The poster child for this phenomenon was the Internet Research Agency, an innocuously named entity that was created and run by a close friend of Russian dictator Vladimir Putin and employed dozens of agents whose sole job was to create disinformation aimed at American social-media users.
One of the first things I did after I joined the Columbia Journalism Review as its chief digital writer in 2017 was to fly to Washington to interview senators and sit in on Congressional hearings into Russian disinformation. As with so many of these government hearings, however, very little of any consequence actually happened; most of the time was taken up by members of Congress showing mockups of Facebook disinformation on pieces of giant posterboard so that they could grandstand for the TV cameras.
Despite a number of articles drawing a direct link between social-media disinformation and the 2016 election, and suggestions from US intelligence sources that the IRA helped get Trump elected, this was a lot of sound and fury, signifying very little (Russian hacking and release of documents and emails is a somewhat different story). A study published in Nature last year looked at data from 1,400 respondents and found that only one percent of Twitter users accounted for 70 percent of the cases of exposure to Russian disinformation. In the end, the study said that it found "no evidence of a meaningful relationship between exposure to the Russian foreign influence campaign and changes in attitudes, polarization, or voting behavior."
A study in 2017 tested people's recall of "fake news" on Facebook, but in addition to using fake stories that had actually circulated, the researchers also invented their own fakes and asked people whether they remembered seeing them. The result? Just as many people said they had seen the invented fakes as the real ones. As the New York Times put it, this suggests that it's not so much that misinformation is reshaping people's views of the world, but rather that some proportion of social-media users "are willing to believe anything that sounds plausible and fits their preconceptions about the heroes and villains in politics."
Misinformation doesn't work the way you think it does
Jennifer Allen, a post-doctoral researcher at the University of Pennsylvania and an expert in digital persuasion and misinformation, told the Reuters Institute for the Study of Journalism recently that we often believe others are far more susceptible to false content than we are, despite evidence showing this isn't the case — a phenomenon known as the third-person effect. In reality, content designed to influence or persuade has very small effects on people's political attitudes, voting choices, or behavior. Alex Stamos, former director of the Stanford Internet Observatory, says there has been a "massive overestimation of the capability of mis- and disinformation to change people’s minds."
Carl Miller, research director at the Centre for the Analysis of Social Media at Demos in the UK, told me earlier this year that when it comes to misinformation, most people have “fairly naive ideas" about how it works. People imagine that bad actors spread convincing yet untrue images about the world in order to get people to change their minds on important political or social topics, Miller said, but in reality, most such influence operations are designed not to spread misinformation but rather to “agree with people’s worldviews, flatter them, confirm them, and then try to harness that." In other words, misinformation only works if there is an existing belief or tendency to play off, which means that it doesn't create beliefs so much as confirm them.
Last year, four scientific studies looked at how — or whether — Facebook's news feed influences the political beliefs or behavior of users, and found little evidence of any impact. One study involved more than twenty thousand users of Facebook and Instagram, and replaced the normal recommendation algorithms used by both services with a reverse chronological feed, or one in which the most recent posts appear first (this was one of the reforms suggested by Frances Haugen, who leaked hundreds of internal documents that she said showed Facebook was hiding how unhealthy its apps were for users' mental health). Other papers tried limiting whether certain types of content could go viral or not, and one looked at the news stories that made it into a user's feed and correlated that with how liberal or conservative the user was.
Meta, Facebook's parent company, crowed about these results, not surprisingly, although some critics — including Haugen — pointed out that all four research papers were written after the social network had implemented a number of news feed changes aimed at quelling disinformation in the runup to the 2020 election. David Garcia, a professor at the University of Konstanz in Germany, wrote in Nature that, as significant and broad-reaching as the studies may have been, they didn't eliminate the possibility that Facebook's algorithms contribute to political polarization because the research was done at the individual level and polarization is "a collective phenomenon."
This is the part I think a lot of people miss. As Casey Newton noted in his Platformer newsletter, the studies were consistent with the idea that Facebook is "only one facet of the broader media ecosystem," one that includes networks like Fox News and Newsmax and dozens of other outlets. Yochai Benkler — co-director of the Berkman Klein Center for Internet and Society — has argued that the distribution of misinformation and partisan arguments by Fox News and other networks, part of an evolution of conservative media that began with Rush Limbaugh in the early 1990s, played a far bigger role in what happened in 2016 than anything that Twitter or Facebook did. In effect, they are echo chambers that reflect something that has emerged elsewhere.
So why does this myth of fake news on social media swinging elections persist, despite an almost complete lack of evidence to support it? Political scientist Brendan Nyhan’s theory is that it’s a little like the myth that Orson Welles’s radio program “War of the Worlds” caused widespread panic in 1938. The program was likely heard by only a tiny number of people, and there’s no actual evidence that it caused any kind of panic, and yet the myth persists. If you are blaming social media or "disinformation" for what happened in the election, I think you are barking up the wrong tree. At best, social media reflected or amplified what was going on in the "real world." It didn't create it.
Got any thoughts or comments? Feel free to either leave them here, or post them on Ghost or on my website, or you can also reach me on Twitter, Threads, BlueSky or Mastodon. And thanks for being a reader.
I research developmental psychology (aka leadership development, ego development, vertical development) and your point about “misinformation only works if there is an existing belief or tendency to play off, which means that it doesn't create beliefs so much as confirm them” is correct from my perspective. So it’s not about changing people’s minds but about reinforcing their existing perspectives, beliefs, and worldview.
But here’s the core thing that I think needs to be asked. If a political candidate in one of their rally speeches spreads misinformation and disinformation, and that then gets shared on social media, is that fake news?
Of course not, they actually said it in their speech.
But here’s the thing. What they said is still misinformation and disinformation. And it will still sway voters based upon their existing beliefs as you said.
This to me is the greatest issue of our day. We have leaders who both completely misunderstand the complex world we are moving into (often due to technology becoming so complex that it acts unpredictably like ecosystems in nature) and, at the same time, leaders who completely understand how to psychologically manipulate people because of people’s limiting beliefs and base psychological needs.
In fact, what’s even more alarming is how psychological manipulation of people has become so rampant in our society today that it’s effectively become normal (i.e., politics, business, etc.), with many people completely unaware they are under the influence of it because it manipulates their beliefs and addictively makes them feel popular and strong. Gambling, video games, the fashion and cosmetics industries, and the advertising industry in general are good examples of this.
But I do think social media platforms reinforce these bubbles of belief and these psychological addictions — depending upon how you use them, though. Dave Gray has a website called Liminal Thinking with a video of him explaining how these bubbles of belief reinforce our existing beliefs, so much so that they limit our understanding of reality and lead us to misinterpret and misunderstand it.
Yet in our rapidly changing, complex world today, we need people who are capable of stepping beyond their limiting beliefs and worldviews, so they can actually understand the present reality clearly, which then allows them to tackle these complex problems with a true understanding of them, rather than with a delusional belief that they have a “simple fix”, as some politicians would have you believe.
Great article Matthew. I too once thought people were just dumb and/or naive in believing the things the internet serves up to them on that silver algorithmic platter. One thing not addressed here is this: perhaps the larger contributor to Trump winning this election is because his voters "are" him. It's become clear to me that we are no longer a kind and decent people. I started to have these thoughts after Sandy Hook. I thought for sure that tragedy would bring out our greater good and finally make headway on gun control, but nope — after everyone gave their meaningless statements about "thoughts and prayers", the next words uttered were "stay the hell away from my guns". My point is this: America is no longer a "shining city on a hill" as Ronald Reagan so eloquently said so many years ago. I used to believe that. Not anymore. I understand the pessimism of this view and I apologize for piling on to this already bleak scenario. It's just hard to see an alternative explanation for how this man received 72 million votes.