Debunking versus prebunking

Authors: Jon Roozenbeek & Sander van der Linden
Published: 2021
The spread of misinformation has reached epidemic proportions. The World Health Organization (WHO) has even referred to it as a global “infodemic.” As governments and social media companies scramble for solutions, there has been a revival of a classic social-psychological approach known as “inoculation theory.”

A Psychological Vaccine Against Misinformation

In the 1960s, concerns grew among US government officials that captured American soldiers might be brainwashed by enemy troops. This prompted social psychologist William McGuire to think afresh about how people could learn to resist unwanted persuasion. One suggestion was a “vaccine for brainwash,” in other words, a psychological inoculation. Psychological inoculation follows the immunization analogy: just as a weakened dose of a virus triggers antibodies that help the body ward off future infection, a persuasion “vaccine” triggers a thought process akin to “mental antibodies,” conferring attitudinal resistance against future persuasion attempts.

The persuasion vaccine involves two steps. First, a person is warned that a persuasion attempt is imminent. This “forewarning” creates a sense of threat that activates our internal defenses. Second, there is a refutational pre-emption, or “prebunk,” in which the person is exposed to a severely weakened dose of the persuasive challenge along with a clear guide on how to counter-argue and resist it.

Our research team has revived inoculation theory in the context of online misinformation. First, we built on the idea that revealing manipulative intent is a powerful way to induce resistance to persuasion. Second, we revisited an early hypothesis that “active” inoculations, in which people generate their own counter-arguments against an impending persuasion attempt, are more effective than “passive” inoculations, in which counter-arguments are simply provided. Together with other colleagues, we used these insights to develop the idea of prebunking: forewarning people that they may be exposed to manipulative misinformation, and training them in advance to counter-argue if they do encounter it, should reduce their susceptibility to misinformation.

Game On: Active Inoculation via Playing a Game

What would our online inoculation task look like? Our answer was an online game we created called Bad News, in which players take on the role of a fake news creator. The game simulates a social media feed to give people the experience of navigating news media online. In the game, players create their own content and learn about six key misinformation tactics (“the six degrees of manipulation”) that are commonly used in our current media culture:
  • Fearmongering
  • Conspiracy theories
  • Impersonating experts
  • Polarization
  • Trolling
  • Discrediting
The game thus provides a forewarning about what manipulative tactics look like, and players then interact with weakened doses of those tactics, for example by creating their own conspiracy theory or impersonating an expert, thereby offering an (inter)active inoculation. We found that playing Bad News significantly reduces the perceived credibility of social media content containing misinformation, improves people’s confidence in their ability to spot misinformation online, and reduces self-reported willingness to share misinformation with others in their network.

Having found online inoculation to be effective, we have worked with governments and social media companies to develop other games that target specific types of misinformation. For instance, Go Viral! was created with support from the UK Government, the WHO, and the United Nations to combat COVID-19 misinformation specifically. The approach has also inspired other popular games, such as Cranky Uncle, which inoculates people against misinformation about climate change.

These games highlight how inoculation theory has once again become a major player. Although its originators may not have foreseen how relevant the inoculation analogy would become, its application in the context of misinformation is booming. Twitter, for example, ran a prebunking campaign ahead of the 2020 US presidential election.

There are many exciting questions left. For example, as with some medical vaccines, the effectiveness of psychological vaccines wanes over time, raising the question of how often “boosters” are needed. We are also working on computer simulations to answer perhaps the most important question of all: What percentage of a population needs to be vaccinated to achieve “herd immunity,” so that misinformation no longer has a chance to spread? Of course, not everyone plays games, so we recently teamed up with Google to produce animated inoculation videos. As the misinformation virus continues to evolve, we remind the reader of a powerful quote from Defense Against the Dark Arts professor Severus Snape: “Our defenses must be as flexible and inventive as the arts we seek to undo.”
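
To make the “herd immunity” question concrete, here is a minimal toy simulation in Python. It is purely illustrative and is not the model our team uses: it assumes a well-mixed population with made-up parameters for contact rate and sharing probability, treats inoculated individuals as fully immune, and the simulate_spread function and its defaults are hypothetical.

import random

def simulate_spread(n_people=10_000, n_contacts=8, p_transmit=0.25,
                    inoculated_fraction=0.0, n_seeds=10, seed=42):
    """Toy, well-mixed SIR-style model: returns the share of non-inoculated
    people who end up sharing a piece of misinformation."""
    rng = random.Random(seed)
    n_immune = int(inoculated_fraction * n_people)
    immune = set(rng.sample(range(n_people), n_immune))   # "prebunked" individuals
    non_immune = [i for i in range(n_people) if i not in immune]
    sharing = set(rng.sample(non_immune, n_seeds))        # initial sharers
    already_shared = set()
    while sharing:
        newly_sharing = set()
        for _person in sharing:
            # Each sharer exposes a handful of random contacts exactly once.
            for contact in rng.sample(range(n_people), n_contacts):
                if (contact in immune or contact in sharing
                        or contact in already_shared or contact in newly_sharing):
                    continue
                if rng.random() < p_transmit:
                    newly_sharing.add(contact)
        already_shared |= sharing
        sharing = newly_sharing
    return len(already_shared) / len(non_immune)

# Sweep the inoculated fraction to look for a threshold effect.
for frac in (0.0, 0.25, 0.5, 0.75):
    reached = simulate_spread(inoculated_fraction=frac)
    print(f"{frac:.0%} inoculated -> {reached:.1%} of the rest reached")

With these assumed numbers the effective reproduction number is roughly n_contacts * p_transmit = 2, so the sweep shows the spread collapsing once about half the population is inoculated, in line with the classic herd-immunity threshold of 1 - 1/R0.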