How Misinformation Spreads Online

Understanding the ways that misinformation campaigns are seeded and unfold online can help us determine how to stop them.

A scene unfolds across an illustrated flow chart that maps the spread of misinformation from one person or group of people on various devices to another. Illustration by Sophia Foster-Dimino.

When a right-wing mob assaulted the United States Capitol on January 6, 2021, the dangers of misinformation and disinformation could no longer be denied. Here was a furious crowd acting on a genuine belief that the election had been stolen, based on false claims that had been championed and disseminated through right-wing outlets across a variety of social media platforms, particularly President Donald Trump’s Twitter account.

Misinformation and its effects have dealt a cruel blow to any lingering techno-utopian fantasies from the early days of the public internet. The “information superhighway” seems a quaint concept in the age of QAnon, but it’s worth remembering that naive optimism about the internet’s liberating impact on society was common throughout the 1980s and 1990s. Framing social problems as stemming from a lack of knowledge, prominent figures such as J.C.R. Licklider, Marvin Minsky, Edward Feigenbaum, and Nicholas Negroponte presented the computer as a sort of panacea that would allow everyone to access all of the information they could possibly need.

As we now know, the internet has also delivered a deluge of garbage that’s both unintentionally inaccurate (misinformation) and deliberately false or misleading (disinformation). Access to information is well and good; access to information without having the skills or inclination to distinguish truth from fiction can easily result in a situation where whatever appears at the top of Google’s search results or the Facebook news feed is regarded as gospel. But recognizing that misinformation is a grave problem isn’t enough to combat its spread — you have to understand how false and deceptive campaigns take root.


This effort is a core focus of the Media Manipulation Casebook, an initiative led by Dr. Joan Donovan at the Technology and Social Change Project at Harvard’s Shorenstein Center on Media, Politics, and Public Policy. The Media Manipulation Casebook is a resource that provides researchers, journalists, policymakers, and other members of the public with a common framework and vocabulary for examining misinformation. The Casebook offers a useful list of definitions ranging from “4chan” and “astroturfing” to “L33T speak” and “typosquatting,” but the project’s most valuable material is a series of case studies that break down actual misinformation campaigns into their constituent elements.

The Casebook uses a process that it calls “the life cycle method” to analyze and trace the sequence of media manipulation. “If you can reveal that lifecycle, it would give everyone a sense of empowerment to make sense of what they’re encountering in the world,” said Emily Dreyfuss, the project’s senior editor.

The research team tracks a campaign cycle as it progresses through five distinct stages. The first stage, “manipulation campaign planning and origins,” identifies the particular groups and individuals responsible for sparking the misinformation campaign. The second stage, “seeding campaign across social platforms and web,” is where the plan is put into action and dissemination begins. This in turn generates a third stage, “responses by industry, activists, politicians, and journalists.” These responses represent a turning point that demonstrates whether or not the campaign has successfully seeded its message. After a campaign takes off, the fourth stage, “mitigation,” occurs as a range of actors work to blunt and respond to the campaign. The final stage captures “adjustments by manipulators to new environment,” as those who were active in stage one begin planning how their next campaign can adapt to mitigation efforts.

“Media manipulation campaigns are concentric circles — campaigns inside campaigns inside campaigns,” Dreyfuss explained. “What we’ve found is that in order for the case study to really work, we want examples that are incredibly discrete.”

One recent study, “Distributed Amplification: The Plandemic Documentary,” is a case in point. It examines Plandemic, a conspiratorial video about the coronavirus pandemic that went viral in May 2020. While misquoting medical professionals and elevating the voice of a discredited scientist, the video claimed that the coronavirus was planned, that wearing masks would actually activate the virus, and that Covid-19 vaccines would harm those who received them. Exploiting the public’s concerns and frustrations surrounding the virus and the restrictive measures enacted to combat it, the video’s creator initially uploaded it to a range of social media platforms (YouTube, Facebook, and Vimeo).

Anticipating that the video would be taken down, the creator included a message instructing viewers to download and reupload it themselves. This “distributed amplification” strategy, the Casebook notes, “coaches participants to re-upload banned content in an effort to circumvent platform mitigation efforts.” In stage two, this exhortation worked, as viewers seeded the video across social media. QAnon and anti-vaxxer groups were encouraged to reupload and promote it, and the video ultimately attracted millions of views by steadily accumulating a few thousand here and several thousand there, which brings us to stage three.

“Stage three is what makes or breaks a media manipulation campaign. There are many campaigns that don’t make it through the full cycle because they fizzle out in stage three,” Dreyfuss noted. “This is where we see campaigns hop from smaller platforms to much larger platforms… the moment when the society at large becomes vulnerable to manipulation.”

Rather than fizzle out, the conspiratorial video kept spreading. The threat of impending removal heightened viewers’ sense of urgency even as it enlisted them to disseminate the message. When athletes and influencers with millions of followers began sharing the video, it migrated from the fringes to mainstream media. Stage four unfolded as social platforms worked to remove the video while media outlets and fact-checking organizations debunked its claims.

“If you’re going to report on a media manipulation campaign, you have to report on it critically,” Dreyfuss said, “otherwise you’re a manipulator yourself.” But despite attempts to take down the video and debunk its claims, by the time this campaign reached stage five, its message had deeply permeated QAnon and anti-vaxxer groups while confusing much of the public. The warning at the outset that the video would be taken down had primed the audience to believe that deplatforming was just another attempt to keep the truth from getting out.

“Because something feels true, the guard is way down,” explained Dr. Jenny Rice, associate professor of writing, rhetoric, and digital studies at the University of Kentucky and the author of Awful Archives: Conspiracy Theory, Rhetoric, and Acts of Evidence. “The nature of conspiracy thinking is in some ways self-sealing, it’s generative… the proof against is proof of it.”      

While those who originally celebrated the informative potential of the internet generally placed the burden of sorting good information from bad on the individual user, the question has recently shifted to how much responsibility social platforms bear for the spread of misinformation.

“It’s unfortunate, but the only people who can really stop the campaign from being an ouroboros are the platform companies,” Dreyfuss remarked, referring to the ancient symbol of a serpent eating its own tail. “Platforms aren’t doing this on their own. They need a legal framework that requires them to act.”

It’s tempting to blame social media companies for letting misinformation metastasize. The impulse to mislead and manipulate is nothing new, but platforms like Facebook and Twitter allow deception and false beliefs to be disseminated more swiftly than ever before. There is a real difference between a mimeographed newsletter sent out to a couple dozen subscribers and a Facebook post, just as there is a difference between a VHS tape that you have to mail order and YouTube recommending that you watch conspiratorial content. Our susceptibility to misinformation isn’t entirely based on technology. Nevertheless, in a society that’s already inflamed, social media can seem like a hose spraying gasoline all over the place.

“People are trying to put something out there, and it isn’t even necessarily about that thing,” said Rice. “Correcting information around QAnon believers is not going to change the fundamental reason behind why they believe in it… We need to clear up bad information, but even more importantly is figuring out what is driving a whole bunch of people to put this out there.”

The challenge is determining how to stem the tide of misinformation while developing a better understanding of the underlying social realities that give rise to it. Resources like the Casebook are doing their part by documenting these campaigns and developing curricular tools to help the public grasp the dynamics at play.

“These expressions aren’t innocuous and they did not spontaneously occur, and that’s what’s so hard to get people to realize,” Dreyfuss emphasized. “Your friend is parroting, unwittingly in many cases, a campaign that was designed specifically to get them to write that Facebook post.”  
