By Fiona Crichton

What has become starkly apparent in the age of SARS-CoV-2, the novel coronavirus that causes COVID-19, is that the effective communication of science-based health messages is no easy task (1). We see this exemplified in a resistance in certain quarters to accepting that COVID-19 is more serious than seasonal influenza (2), and reflected in a reluctance by some to comply with safety directives, such as wearing masks on public transport (3).

More than a pure rejection of the state of the science, there has been a rash of dangerous “alternative beliefs” adopted in some sectors of the community concerning the origin, symptoms, development, prevention, and treatment of COVID-19 (4). These misbeliefs include the falsehoods that COVID-19 was bioengineered in a laboratory in China; that COVID-19 is a “phony pandemic”; that 5G mobile phone signals cause or transmit the virus, or weaken the ability to fight it; and that injecting bleach is an effective COVID-19 treatment. Most concerning of all is a persistent misperception that any vaccine will be unsafe and pose a greater risk to health than contracting the virus (5).

How did we get here?

Science denialism occurs when facts and empirical evidence are rejected and an effort is made to convey the appearance of serious scientific debate where there is in reality scientific consensus on a subject. There is nothing new about science denialism: in the 17th century Galileo, the father of modern science, was tried as a heretic for championing heliocentrism – the understanding that the Earth and planets revolve around the Sun (6). In the 20th century science denial compromised public health through a refusal to accept the link between smoking and lung cancer, and between HIV and AIDS (7). And in the 21st century science denial has led to vaccine hesitancy and a rejection of the evidence for anthropogenic climate change (8).

What lies at the heart of science denial?

There is considerable evidence that at this point in history science denial can be explained by a convergence of factors: the way we understand science, the way we are wired, and the way we now access and consume information.

The way we understand science

Understanding consensus 

There seem to be pervasive misperceptions about science, particularly about the scientific method and what consensus means in the scientific literature. In science, consensus is the collective judgment of scientists who are subject matter experts in a particular field of study. Such judgment is based on research findings that have held up in the face of rigorous and repeated testing and challenge. That is not to say that the consensus position is set in stone. Because science is about testing assumptions, challenging theories, and gathering evidence, the scientific consensus will evolve in the face of new data and research findings.

It’s not a negotiation

It is unfortunate that in common parlance consensus is also used to refer to political agreement, compromise or an appeal to popular opinion, which may have led to confusion about the rigour underpinning scientific consensus (9). Scientists are not negotiating a position or compromising to reach agreement. The oft-quoted figure that 97% of climate scientists agree humans are causing recent global warming was arrived at through analyses of research papers and surveys of climate scientists (e.g. 10). Studies proposing a less emphatic consensus have been shown to include non-expert opinions; unsurprisingly, the evidence shows that the level of consensus correlates with expertise in climate science (10). So when we think about consensus, it is important to consider the position of the relevant experts, rather than what the general public thinks or even what every scientist believes; it is about the conclusions reached by scientists in the relevant field.

Science is not about proof

There is also the disconcerting truth that science is not about proof – proof applies only to mathematics (11). Science is about collecting evidence to test a hypothesis. If the evidence collected is consistent with the hypothesis, the hypothesis is supported, but it is not proven. As further evidence is gathered, the hypothesis may be rejected, changed or refined and tested again. The process is often slow, so that knowledge is built incrementally and gradually over time. This has not been the case as COVID-19 rapidly spread across the world. The global pandemic sparked an international race to understand, contain and combat the virus. Without the luxury of time, researchers released findings rapidly – within days or weeks – rather than the customary months or years (3). The shifting state of the scientific consensus as new data surfaced has posed challenges for those crafting public health messages and guidelines.

The shifting state of scientific consensus 

For instance, an initial hypothesis – based on evidence about how most respiratory viruses are transmitted – was that airborne transmission of SARS-CoV-2 was unlikely (12). However, mounting evidence that infective microdroplets are tiny enough to stay suspended in the air and expose someone at a distance of two metres from an infected person has meant that advice provided by the Centers for Disease Control and Prevention has recently changed to reflect this new understanding. Updated safety advice during a pandemic can be unsettling and upsetting to a public craving certainty, and can serve to confirm suspicions that scientists are making it up as they go along.

The way we are wired

Fight or flight

For some, science denial can be explained by the way the brain processes threatening information. News about the climate crisis or coronavirus is scary. The information becomes the figurative sabre-toothed tiger and the fight-or-flight reflex comes into play. The impulse is to be attracted to “safe” information and repelled by information that feels dangerous (13). Fear, then, is a strong motivator of denial. If we find the concept of climate change or the current global pandemic terrifying, we may find ourselves drawn to narratives that downplay risk or refute the science. In this way the fight-or-flight response is calmed and we can get on with our normal lives because everything is “okay”.

Cognitive biases

The way in which human beings process thoughts and ideas is also implicated in science denial. Cognitive biases are thinking or reasoning errors that occur as a result of the brain attempting to make sense of the world and reach decisions and judgments efficiently. These biases can affect what we remember and what we attend to. For instance, we tend to pay attention to information that aligns with and upholds our pre-existing attitudes and beliefs, particularly if those beliefs are important to our sense of identity (14). In the interests of efficiency, we may also learn a little about a subject and then assume that we have in-depth knowledge of that topic. That leaves us feeling confident in our own assessment of a situation, and so able to reject a contrary view, even one backed by true subject matter experts. Such cognitive shortcuts and thinking errors can massively undermine our understanding and acceptance of science.

Belief perseverance and cognitive dissonance 

There are a number of related cognitive processes that play into science denial. As human beings, we are much less likely to be persuaded by, and to accept, information that challenges or undermines the things we believe in – a phenomenon known as belief perseverance (15). In short, we are not inclined to change our minds easily, and we resist being wrong.

We also tend to pay attention to information that bolsters our beliefs and to discount information that does not align with those beliefs. This is known as confirmation bias. By actively looking for data that support our viewpoint and screening out contrary evidence, we can maintain a belief system that is not tested by the current state of science or knowledge (16). And if what we know (e.g. that fossil fuel use is contributing to climate change) conflicts with what we do (e.g. fly), that conflict can lead to doubt, or to minimising what we know so we can justify how we live. In this case science denial can be explained by the experience of cognitive dissonance – the inconsistency between what the science tells us and how we are living our lives means we are more inclined to shrug off the science to reduce the discomfort of living in a way that conflicts with the state of the evidence.

Identity and motivated reasoning 

Importantly, science denial is strongly tied to the issue of identity and belonging. At this point in history, belief in anthropogenic climate change is often linked to how we identify politically. In the United States (17), Australia (18) and New Zealand (19), right-leaning voters are more likely to be climate change sceptics. It also seems that when we identify with a group whose beliefs do not align with the science, group belonging can be more powerful than education in informing our point of view. In fact, research indicates that when it comes to climate change, conservative voters in the United States are more likely to reject the state of the science if they are college educated (17). Surprisingly, science education does not help: amongst Republican voters, higher scientific knowledge does not strengthen belief in anthropogenic climate change (20). This seems to be explained by a susceptibility to motivated reasoning, sometimes described as the tendency to decide which evidence should be accepted and which rejected based on the conclusion we favour (21).

Motivated reasoning occurs when the preferred outcome acts as a filter that influences how we gauge the weight and truth of the scientific evidence before us. This process can help us protect beliefs that are important to maintain our worldview, which may be influenced by the way we identify politically, spiritually, or intellectually. But evidence does indicate that people are motivated by accuracy as much as they are by a preferred outcome. We are not doing mental gymnastics to uphold a view that we know in our heart of hearts is wrong. We are motivated by accuracy, but we vary on what we consider to be credible evidence (22).

When it comes to COVID-19, identifying as a conservative in the United States has been shown to be associated with perceiving less personal vulnerability to the virus, rating the virus as less severe, and believing both that COVID-19 was the result of a conspiracy and that the media had exaggerated its risks (23).

The way we access and consume information

A deluge of misinformation and disinformation

Beyond misunderstandings of the scientific method and the cognitive biases implicated in science denial, we are living at a time of unprecedented dissemination of science-related misinformation and disinformation. Digital media has become fertile ground for the spread of science misinformation, feeding into areas such as climate-change denial (24), anti-vaccination sentiment (25), opposition to wind farms (26), and Flat Earth doctrines (5). For instance, a recent exploratory study of climate-related videos on YouTube found that the majority reflected worldviews opposing the scientific consensus (27).

An infodemic

In early September 2020 the New Zealand Government warned that disinformation and conspiracy theories were threatening to derail the country’s COVID-19 response. The problem has its genesis in what the World Health Organization (WHO) has called an infodemic: a global epidemic of fake reports and misinformation about COVID-19 spreading rapidly through social media platforms and other outlets (28). Such misinformation and disinformation – the latter being false information disseminated with the intention to mislead – has ranged from pushing fake cures to propagating baseless conspiracy theories, such as the claim that wearing masks activates the virus (29). Despite WHO-led initiatives, such as a WhatsApp service designed to rebut such false claims, the viral spread of misinformation has fostered attitudes and beliefs that are adversely affecting containment strategies (30). For instance, a YouGov survey conducted in May 2020 found that approximately 44% of Republicans believe the false claim that Bill Gates wants to use a mass COVID-19 vaccination campaign to implant microchips that would track people via a digital ID (31).

Misinformation and compliance with public health guidance

In a study involving five countries, increased susceptibility to misinformation was shown to negatively affect people’s self-reported compliance with public health guidance about COVID-19, as well as their willingness to get vaccinated against the virus and to recommend the vaccine to vulnerable friends and family (29). Across all countries surveyed, lower trust in scientists and lower numeracy skills were associated with higher susceptibility to coronavirus-related misinformation. The study’s authors highlight the potential importance of building trust in scientists, and of fostering numeracy and critical thinking skills, as ways to reduce susceptibility to misinformation.


However, addressing the adverse effects of exposure to misinformation is complicated, particularly given recent research mapping the way misinformation spreads online, which indicates that the increasing use of social media to access news and information can create “echo chambers” in which people are exposed largely to information that aligns with their pre-existing beliefs and worldview (32). Misbeliefs can therefore be reinforced and perpetuated without challenge or correction.

Psychological inoculation

One way to deal with this issue may be to act pre-emptively, by helping people build resistance to future exposure to misinformation. A recent online game, Go Viral, is designed to help players learn to resist three manipulation strategies employed to disseminate misinformation about COVID-19: fearmongering, fake experts, and conspiracy theories (29). The game, based on psychological inoculation theory or “prebunking”, builds on research from Cambridge psychologists showing that giving people a taste of the techniques used to spread fake news on social media enhances their capacity to identify and discount misinformation in the future (33). Psychological inoculation has been shown to be effective in reducing the influence of science misinformation (34) and in increasing resistance to conspiracy theory propaganda (35).

A way forward

It is clear that if we are to overcome the problem of science denialism, we need to do more than shout the facts loudly and resolutely. If we hope to move more people to act in accordance with science, it is imperative that we build trust in scientists and cultivate a better understanding of the scientific method. The most effective way to do this is to start in the education system and ensure all students learn the fundamentals of science.

Of utmost importance is the fostering of critical thinking skills at every level of the education system. This should include familiarising students with cognitive biases and shortcuts that lead to faulty thinking. While we may be hardwired to maintain beliefs that protect our worldview, instilling critical thought at an early age should help us examine those beliefs when evidence suggests we are mistaken.

Finally, we must prepare people for the world in which we now live – a world where we are all exposed to misinformation and disinformation posing as fact. That means helping people develop the skills to check and verify claims against reputable sources. It also means teaching people to recognise the manipulation techniques used to propagate misinformation, so they can identify and resist it in the future. Fortunately, science is paving the way here, with games based on psychological inoculation theory providing promising indications that we can indeed inoculate against fake news and misinformation.


  1. Bavel, J. J. V., Baicker, K., Boggio, P. S., Capraro, V., Cichocka, A., … Willer, R. (2020). Using social and behavioural science to support COVID-19 pandemic response. Nature Human Behaviour, 4, 460–471.
  2. Wu, K. (2020, October 6). No, the coronavirus is not like the flu. Tracking Viral Misinformation. The New York Times.
  3. Kreps, S. E. & Kriner, D. L. (2020). Model uncertainty, political contestation and public trust in science: Evidence from the COVID-19 pandemic. Science Advances, 6. DOI: 10.1126/sciadv.abd4563
  4. Nguyen, A. & Catalan-Matamoros. D. (2020). Digital mis/disinformation and public engagement with Health and Science controversies: Fresh perspectives from COVID-19. Media and Communication, 8, 323-328. DOI: 10.17645/mac.v8i2.3352
  5. Lewis, T. (2020). Nine COVID-19 myths that just won’t go away. Scientific American.
  6. Debunking science denialism (2019). Editorial. Nature Human Behaviour, 3, 887.
  7. Smith, T. C., & Novella, S. P. (2007). HIV Denial in the Internet Era. PLoS Medicine 4(8): e256.
  8. McLintic, A. (2019). The motivation behind science denial. New Zealand Medical Journal
  9. Torcello, L. (2020). “Science Denial, Pseudoskepticism, and Philosophical Deficits Undermining Public Understanding of Science: A Response to Sharon E. Mason.” Social Epistemology Review and Reply Collective 9 (9): 1-9.
  10. Cook, J., Oreskes, N., Doran, P. T., Anderegg, W. R. L., Verheggen, B., Maibach, E. W., … Green, S. A. (2016). Consensus on consensus: A synthesis of consensus estimates on human-caused global warming. Environmental Research Letters, 11(4), 048002. doi: 10.1088/1748-9326/11/4/048002
  11. Biercuk, M. J. (2011). Forget what you’ve read, science can’t prove a thing.
  12. COVID-19 transmission – up in the air (2020). Editorial. The Lancet Respiratory Medicine.
  13. Mooney, C. (2011, May/June). The science of why we don’t believe science. Mother Jones.
  14. Pulido, C. M., Villarejo-Carballido, B., Redondo-Sama, G., & Gomez, A. (2020). COVID-19 infodemic: More retweets for science-based information on coronavirus than for false information. International Sociology.
  15. Prot, S., & Anderson, C. A. (2019). Science denial: Psychological processes underlying denial of science-based medical practices. In A. Lavorgna & A. Di Ronco (Eds.), Harm in Non-Science Based Health Practices. Routledge.
  16. McGrath, A. (2017). Dealing with dissonance: A review of cognitive dissonance reduction. Social and Personality Psychology Compass, 11(12), 1-17.
  17. Hamilton, L. C. (2011). Education, politics and opinions about climate change: Evidence for interaction effects. Climatic Change, 104, 231– 242. doi:10.1007/s10584-010-9957
  18. Tranter, B.  (2017). It’s only natural: conservatives and climate change in Australia. Environmental Sociology, 3:3, 274-285, DOI: 10.1080/23251042.2017.1310966
  19. Milfont, T. L., Abrahamse, W., & MacDonald, E. A. (2021). Scepticism of anthropogenic climate change: Additional evidence for the role of system-justifying ideologies. Personality and Individual Differences, 168.
  21. Bardon, A. (2020). Humans are hardwired to dismiss facts that don’t fit their world view.
  22. Druckman, J.N., & McGrath, M.C. (2019). The evidence for motivated reasoning in climate change preference formation. Nature Climate Change 9, 111–119.
  23. Calvillo, D. P., Ross, B. J., Garcia, R. J., Smelter, T. J., & Rutchick, A. M. (2020). Political ideology predicts perceptions of the threat of COVID-19 (and susceptibility to fake news about it). Social Psychological and Personality Science.
  24. Treen, K. M. I., Williams, H. T. P., O’Neill, S. J. (2020). Online misinformation about climate change. WIREs Climate Change 11:e665.
  25. Burki, T. (2019). Vaccine misinformation and social media. Lancet Digital Health, 1(6), e258–e259. DOI: 10.1016/S2589-7500(19)30136-0
  26. Crichton, F., & Petrie, K. J. (2015). Accentuate the positive: Counteracting psychogenic responses to media health messages in the age of the internet. Journal of Psychosomatic Research, 79, 185-189. doi:10.1016/j.jpsychores.2015.04.014
  27. Allgaier, J. (2019). Science and environmental communication on YouTube: strategically distorted communications in online videos on climate change and climate engineering. Frontiers in Communication; 4:36. doi: 10.3389/fcomm.2019.00036
  28. Zarocostas, J. (2020). How to fight an infodemic. Lancet 395:676. doi: 10.1016/S0140-6736(20)30461-X
  29. Van der Linden, S., Roozenbeek, J., & Compton, J. (2020). Inoculating against fake news about COVID-19. Frontiers in Psychology, 11, 2928.
  30. Tagliabue, F., Galassi, L., & Mariani, P. (2020). The “Pandemic” of Disinformation in COVID-19. SN comprehensive clinical medicine, 1–3. Advance online publication.
  32. Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., … Quattrociocchi, W. (2016). The spread of misinformation online. PNAS, 113, 554–559.
  34. Chapman, S., & Crichton, F. (2017). Wind Turbine Syndrome: A communicated disease. Sydney, Australia: Sydney University Press
  35. Banas J. A., & Miller, G. (2013) Inducing resistance to conspiracy theory propaganda: Testing inoculation and metainoculation strategies. Human Communication Research; 39(2), 184–207.

Dr. Fiona Crichton is an expert in science denialism. She has a PhD in Health Psychology. 

Disclaimer: The views expressed in this article reflect the opinions of the author and not necessarily the views of The Big Q. 
