Social connectedness is allowing misinformation to spread further and faster than ever before. Can the spread of misinformation ever be contained, or better yet, prevented?

We live in times dominated by uncertainty. Headline-news events are swiftly followed by a myriad of online responses, proposed causes and remedies, cries of ‘conspiracy’, and demands for transparency. The emergence of the COVID-19 pandemic has worsened the situation. False information is used to construct attention-grabbing stories that propagate mistruths. This is not new, but in our increasingly connected world it is becoming a more severe problem. Tedros Adhanom Ghebreyesus, the director-general of the World Health Organization, put it perfectly when he said “we’re not just fighting an epidemic; we’re fighting an infodemic”. Perhaps our best hope of a solution lies in a vaccine: a psychological vaccine.

False information comes in two forms: misinformation (unknowingly incorrect) and disinformation (knowingly incorrect). Like a biological virus, both can spread through social networks, and the ongoing pandemic shows that they can be just as dangerous. A widely discredited belief that injecting or consuming bleach can cure COVID-19, which began with an offhand remark by Donald Trump, led to 100 people calling an emergency hotline in Maryland alone to report bleach intoxication, with a wider spike in cases across America. However poorly worded the President’s statement, it would likely never have had such an impact without the claim’s rapid proliferation across social media, notwithstanding its refutation within 24 hours by Dr Anthony Fauci, one of the key members of the White House Coronavirus Task Force. Information, no matter how incorrect, can spread rapidly regardless of expert opinion. Today, when every individual can go viral, the barriers to misinformation are low.

The new ease of spreading information via social media demands innovative countermeasures to contain the spread of misinformation. Social media giants such as Facebook have pushed to promote content from reputable sources and limit the reach of posts containing misinformation. Google alone has committed £5 million to fund fact-checkers fighting misinformation related to the pandemic. Independent organisations have employed advanced artificial intelligence (AI) to identify misinformation outbreaks in their infancy and channel debunking efforts into the appropriate networks. But AI is prone to error, and these methods only aim to control the spread once it has begun; once mistruths have taken hold, debunking loses much of its efficacy. As such, preventative measures could be more effective.

One preventative approach comes from the world of psychology: inoculation theory.

To understand the theory, it helps to first understand how a biological vaccine works. Vaccines typically act by showing our bodies a version of a pathogen that is incapable of causing illness, whether because it has been weakened in a lab or broken into fragments. In response, our bodies build defence mechanisms so that, in the case of a real infection, they can quickly eliminate the agent before any significant damage is done. Inoculation theory works in an analogous way: it prepares the brain by presenting a weakened form of misinformation and allowing the subject to form their own counterarguments. This process strengthens the mind’s ability to counter newer and more complex arguments in the future.

Implementing the theory is something of a balancing act: an argument that is too convincing (although still wrong), if not countered effectively, can actually cement the very idea one is trying to defend against. The argument needs to be “strong enough to trigger the antibodies but not so strong that you convince people of something that isn’t true”, says Dr Sander van der Linden, Professor of Social Psychology at Cambridge University and Director of the Cambridge Social Decision-Making Lab.

Dr van der Linden is one of the leading experts in tackling fake news. Much of his research concerns controlling its spread on social media, where most resources are invested in time-consuming fact-checking and debunking. But with the recent explosion in fake news surrounding the pandemic, social media companies have been struggling to keep up. “Everyone is running behind the curve”, says Dr van der Linden. Feeling overwhelmed, these companies are turning to ideas like inoculation theory: pre-emptive strategies that stop the problem before it becomes one. His team has already entered into partnerships with Google and WhatsApp, both of which are interested in employing tactics based on their research.

One tactic the researchers have investigated is an online browser game, Bad News. Players are guided through a comical exercise in which they impersonate ‘fake news’ experts promoting conspiracies, learning the tools of the trade in a fun way. Large-scale studies of the game have shown that, by exploiting techniques such as impersonation, polarisation, and emotional appeal in play, players become more sensitive to their use in real life and thereby develop resistance to those methods. Teaching these skills on a wide scale would be the equivalent of providing ‘herd immunity’ for the population, another idea borrowed from biology. One appeal of this tactic is that its entertainment factor can help it reach a wider audience; in fact, it has already gone viral on Reddit. This is one place where the biological analogy fails: biological vaccines do not spread, but psychological ones can.

The key to inoculation theory’s success may lie in its wide-scale adoption by the biggest social media sites. However, Dr van der Linden believes it can also be used on smaller scales, even in everyday conversations, provided both sides are knowledgeable enough on the subject to express and counter the basic fallacies of the argument.

One such approach is the Fact-Myth-Fallacy framework developed by Dr John Cook. This strategy involves presenting a fact and a conflicting myth, followed by an explanation of how the myth distorts the fact. For example, the fact could be: ‘Human CO2 emissions are a main cause of climate change’. Next, the conflicting myth is presented: ‘Human CO2 emissions are tiny compared to most other sources; the effect is minimal’. The psychological ‘virus’ has now been administered. In the final step, the fallacy within the myth is revealed: ‘Human CO2 emissions are indeed small relative to other sources, but small changes in a balanced system can create big effects, and climate models show that the imbalance induced by human emissions is the cause of recent climate change’.

While our connected world has allowed misinformation to spread faster and further than ever before, each of us has a voice within this network. Although misinformation is a problem that is here to stay, we can fight it using innovative strategies. Against the ills of misinformation, inoculation theory is a promising defence.

Joanna Lada studied Mathematics (Class of 2020) at St John's College; Jake Rose is an MSci student in Astrophysics at Magdalene College. Artwork by Marie Cournut.