When Jigsaw researchers met Jennifer in a Montana cafe, she explained how she came to believe that the Earth was flat. Over the previous few years, Jennifer had become immersed in conspiracy theories that eventually became a significant part of her identity and life. She renounced her relationship with her parents, who were regular NPR listeners. She relocated to another state to be closer to other people who believed the same conspiracy theories. She even met her boyfriend on a dating site marketed to “truthers.”
Jennifer’s devotion to conspiracy theories was mirrored in other conspiracy adherents we met, which highlights a challenge for those of us studying ways to make people more resilient to misinformation: by the time we met Jennifer, her beliefs were so deeply held that she wasn’t open to any information that challenged them.
One of the reasons misinformation is so pernicious is its ability to continue to influence thinking long after someone initially sees it. In fact, misinformation often persists even after someone has been shown a factual correction of the false claim. This is because misinformation can be “sticky,” meaning it can have what experts call a “continued influence effect” on someone’s memory and reasoning long after seeing it.
Debunking is especially difficult with conspiracy theories, which are often believed at an emotional, rather than rational, level. When Jigsaw interviewed dozens of conspiracy theory propagators in the US and UK, we found that their deeply-held beliefs — like Jennifer’s — were resistant to rational or factual counter-arguments, whether from fact-checking sites or mainstream media, as well as from their own family and friends.
Given the difficulty of dislodging beliefs based on misinformation, there is a growing field of research into helping people resist persuasion by misinformation in the first place. One approach borrows from biomedical science. Inoculation protects people against disinformation by teaching them to spot and refute a misleading claim. Inoculation messages can build up people’s resistance or “mental antibodies” to encountering misinformation in the future, the way vaccines create antibodies that fight against future infection.
For example, an inoculation message could warn people that they are likely to see anti-vaccine messages from “fake experts” because this is a common tactic used to mislead people about the legitimacy of their claims. People would subsequently become more resistant to misleading anti-vaccine messages that feature fake experts.
An inoculation message typically has three components that work in conjunction: a forewarning, a refutational preemption, and a microdose of the misleading message (akin to introducing a small dose of the virus that is weakened, in this case, by being thoroughly refuted).
Forewarning - Users are alerted that there are impending "attacks" intended to manipulate them.
Refutational preemption - Users are equipped to spot and refute a manipulative message.
Microdose - Users see weakened example(s) of the manipulative message so they can identify it in the future.
Studies over the past 60 years have shown inoculation to be effective across cultures and on a wide range of subjects including the environment, public health, crisis management, and animal rights, among others. More recently, academics have demonstrated how inoculation messages can reduce the influence of misinformation and extremist propaganda online.
The speed and sheer variety of misinformation claims pose a daunting challenge for efforts that seek to fight false claims one at a time, but inoculation offers an inventive alternative. It is possible to inoculate against a common misinformation technique like using fake experts, or a “meta-narrative” such as scapegoating, rather than a specific misinformation claim. Recent research shows that inoculating against techniques rather than claims creates broader, more transferable immunity. This has the added advantage of allowing inoculation messages to be apolitical as they do not have to take positions or contradict specific assertions. Jigsaw has teamed up with scholars at the Universities of Cambridge and Bristol to develop short videos that inoculate against five of the most common misinformation techniques that apply in a wide variety of contexts online (scapegoating, fearmongering, ad hominem attacks, incoherent logic, false dichotomies). Findings are forthcoming and will be linked here along with the videos once published as an academic paper.
In one foundational study of inoculation against extremism, researcher Kurt Braddock tested whether participants could be inoculated against far-right and far-left extremist propaganda. Participants who read a text-based inoculation message prior to seeing an extremist propaganda post were less willing to support the extremist group and had lower perceptions of the extremist group’s credibility relative to a control group that hadn’t been inoculated. This suggests that, at least in a controlled study setting, an inoculation message can successfully confer a degree of psychological resilience to extremist messages. In early 2020, Jigsaw partnered with Braddock and the Polarization and Extremism Research Innovation Lab at American University on follow-up research to explore the effects of video-based inoculation against common extremist narratives online. Findings from this study will be published in 2021. In an effort to promote transparency and contribute to the field, we agreed ahead of time that all data derived from this experiment would be owned by our partners, and no prohibitions would be placed on academic publications.
There is plenty to learn about the circumstances in which inoculation is effective. To date, academic tests of inoculation have primarily taken place in western countries, so we intend to conduct research across cultural contexts to better understand how inoculation messages can be appropriately contextualized. We are also developing research to better understand the limitations of inoculation, such as how long its effects last, in the hope of informing ecologically valid tests beyond a lab setting in the future.
Inoculation messages are one of a number of approaches we are studying as part of our ongoing effort to help empower individuals with durable resilience to misinformation. We are partnering with some leading academics in the field, including Sander van der Linden, Jon Roozenbeek, Stephan Lewandowsky, Kurt Braddock, Brian Hughes, and Cynthia Miller-Idriss, to explore some key questions on the application of inoculation to misinformation and extremism. Which media and messengers are most effective for conveying inoculation messages? How long do the effects of inoculation last? Can inoculation still help mitigate misinformation after someone already believes conspiracy theories? We’ll continue to share our progress and research insights on this promising approach.