By Jess Berentson-Shaw

In an extract from her new book “A Matter of Fact: Talking Truth in a Post-Truth World,” Jess Berentson-Shaw discusses whether we can teach critical thinking in what many deem to be a post-truth world.

‘You don’t need to be a scientist to think critically and ask good questions,’ said Iain Chalmers in relation to an innovative trial on critical thinking that was run with Ugandan children in 2016. The aim of the trial was to teach generalised critical thinking skills in the context of misinformation. Andy Oxman, who co-led the trial, said, ‘Working with policymakers made it clear most adults don’t have time to learn, and they have to unlearn a lot of stuff.’ So, putting in place the building blocks of critical thinking when people are young is key. Allen Nsangi, the Uganda-based co-investigator, noted that misinformation is a particular problem in Uganda: ‘Some of the immunization campaigns have been sabotaged because of claims to do with infertility for the future.’1

The researchers developed a series of podcasts and comics to help both parents and children develop critical thinking skills and, specifically, recognise false health information. The results were very positive. The children who received the critical thinking training materials showed a significant improvement in their ability to recognise false health claims compared to the children who did not. ‘The results show that it is possible to teach primary school children to think critically in schools with large student to teacher ratios and few resources.’2

Fostering scepticism, one aspect of critical thinking, also fits with what we know of trust and distrust with regard to misinformation. Earlier, I touched on why distrust in particular led to problems in assimilating accurate information – for example, when people have a distrust of the pharmaceutical industry. However, distrust can also have a positive function. When combined with scepticism, it may help people avoid misinformation traps. One example of fostering distrust is when climate communicators make people aware of industries (such as coal mining) that have a vested interest in claiming that disinvestment in fossil fuels will not help reduce carbon emissions. It is thought that distrust triggers non-routine information processing – in other words, people don’t take their normal cognitive short cuts and instead will more closely examine and question new information.3 The caveat is that distrust and scepticism work best before, or at the time, misinformation is received.

The art of conversation and learning is not dead. Careful and prolonged dissection of information with an argumentation-and-refutation (or serve-and-return) approach in classroom environments has been shown to improve conceptual learning.4 Directly addressing misinformation that may be encountered in educational settings is also effective.5 There is also data to suggest the approach may work in the political arena: an analysis of more than forty opinion polls showed that an argumentation-and-refutation approach helped politicians win policy debates.6 The intense discussion of inaccurate information may help people to work through inconsistencies in their understanding and promote the acceptance of corrections.

Who will benefit most from skill improvement?

When considering critical thinking upskilling, it is vital to keep in mind imbalances between the more and less advantaged, so as not to reinforce existing differences in participation. New information is processed more efficiently by those with higher education. In addition, the greater mental energy, space and processing power (known as cognitive bandwidth) of those who are not financially or otherwise stressed means that existing skills and knowledge can be marshalled to make the most of new information, further entrenching inequality (a dynamic known as the inverse care law).7 Ironically, social media may be one way to overcome imbalances in access to information, as internet sources may be more accessible to diverse groups.8 However, manipulation of social media by powerful groups is also a danger to equity building. It is important for communicators, therefore, to be clear on the strategy for skill acquisition, and to constrain and shape the approach appropriately, so as not to further advantage the already advantaged.

Working with people’s skill sets

A key part of appraising evidence-based claims using critical analysis skills is understanding risk and benefit, which are often presented numerically (for example, the risk of side effects from vaccination and the benefits if exposed to a disease). Numeracy is, however, low even among well-educated groups, which suggests communicators need to get better at conveying risk. Research suggests that, instead of avoiding numbers, communicators should use them under the following conditions (a short illustrative sketch follows the list):

  1. Reduce the cognitive load required to interpret numerical and statistical information, so fewer of the mental short cuts need to be taken;
  2. Explain what the numbers mean; and
  3. Highlight the critical information.9
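To make these conditions concrete, here is a minimal sketch of how a risk comparison might be worded. The figures and the helper function are hypothetical illustrations, not drawn from the research cited in this chapter; the point is simply to use one frame (cases per 1,000 people), explain in plain language what the numbers mean, and explicitly highlight the critical difference.

```python
# A hypothetical sketch of conditions 1-3: take two absolute rates and
# re-express them as natural frequencies, then state the key comparison plainly.
# The numbers used below are invented for illustration only.

def describe_risk(baseline_per_1000: float, treated_per_1000: float, label: str) -> str:
    """Turn two absolute rates into a plain-language comparison."""
    difference = baseline_per_1000 - treated_per_1000
    return (
        f"Out of 1,000 people, about {baseline_per_1000:.0f} would get {label} "
        f"without the vaccine and about {treated_per_1000:.0f} with it, "
        f"roughly {difference:.0f} fewer cases per 1,000 people."  # the critical information
    )

if __name__ == "__main__":
    # Condition 1: one simple frame (per 1,000 people), no percentages to convert.
    # Condition 2: the sentence spells out what the numbers mean.
    # Condition 3: the final clause highlights the critical difference.
    print(describe_risk(baseline_per_1000=40, treated_per_1000=4, label="the disease"))
```

The design choice here is to keep a single denominator throughout, so readers are never asked to convert between percentages, odds and frequencies – which is where much of the cognitive load sits.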

Evidence from the climate change domain provides some pointers on how to convey risk effectively. Experiments found that an ‘estimate and reveal’ approach to helping people understand the level of scientific agreement on climate change was more effective than simply revealing the information.10 In estimate-and-reveal approaches, people are asked to select from multiple-choice options, or to freely guess, a percentage relating to the issue at hand – for example, ‘Based on the evidence, X per cent of climate scientists have concluded that human-caused climate change is happening’ – and they are then shown the actual percentage.

In this particular study, people who gave an estimate before being shown the true level of agreement later gave more accurate estimates of the scientific consensus on climate change than those who did not estimate the percentage in advance. It is thought that this technique creates an ‘information gap’. In noticing such a gap in their own knowledge, as opposed to being told about it, participants then seek to fill the gap with the credible information they have been provided. It is possible this approach avoids directly challenging people’s beliefs, with all the negative feelings such a challenge brings.
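A minimal sketch of how such a prompt might be scripted is below. The script and its function name are hypothetical; the 97 per cent figure is the widely cited consensus estimate, used here only as a stand-in for whatever evidence-based figure a communicator would reveal.

```python
# A hypothetical sketch of an estimate-and-reveal prompt, as described above:
# ask for a guess first, then reveal the evidence-based figure so the reader
# notices the gap in their own knowledge. Illustrative only, not from the study.

ACTUAL_CONSENSUS = 97  # per cent of climate scientists, as widely reported

def estimate_and_reveal() -> None:
    guess = float(input(
        "What percentage of climate scientists have concluded that "
        "human-caused climate change is happening? "
    ))
    gap = ACTUAL_CONSENSUS - guess
    print(f"You estimated {guess:.0f} per cent.")
    print(f"Based on the evidence, about {ACTUAL_CONSENSUS} per cent have concluded it is happening.")
    if abs(gap) >= 5:
        # Drawing attention to the gap is the point of the technique.
        print(f"That is {abs(gap):.0f} percentage points "
              f"{'higher' if gap > 0 else 'lower'} than your estimate.")

if __name__ == "__main__":
    estimate_and_reveal()
```

The interaction order is the design point: the guess comes first, so the reveal lands on a gap the reader has just noticed for themselves rather than on a claim delivered from outside.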

This method could be used to improve public understanding in any area, such as inequality, where misperception is common. Research has shown, for example, that misperceptions of inequality drive different preferences for the redistribution of income, while those who are most privileged misunderstand how outcomes differ according to relative privilege in society.11

The New York Times used this type of approach in an interactive visualisation called ‘You Draw It’. The paper asked online readers to draw, on a blank online graph, the statistical relationship between parents’ income and the percentage of children who attended college in the US. It then revealed the actual relationship between these two factors, alongside the results of other people’s estimates (adding a social influence dimension to the task).12 Researchers found that when people saw there was social consensus about the data (i.e. most others drew the true relationship), they could recall the correct information more easily than those who were only presented with the data. The researchers also found that when subjects saw that most others drew the same incorrect relationship as they did, they were less likely to recall the correct data. This finding has interesting implications not just for estimate-and-reveal techniques, but also for how social consensus affects beliefs about, and recall of, evidence and data.13
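As a rough illustration of what such a reveal can look like, the sketch below plots an invented reader’s guess against invented ‘other readers’ and an invented ‘actual’ line. None of the data come from the Times piece or the studies cited; the variable names and figures are placeholders.

```python
# A hypothetical "You Draw It"-style reveal: show the reader's drawn guess
# alongside other readers' guesses and the actual relationship.
# All data here are invented for illustration.
import numpy as np
import matplotlib.pyplot as plt

income_percentile = np.linspace(0, 100, 11)
actual = 0.3 + 0.006 * income_percentile        # invented "true" relationship
reader_guess = 0.2 + 0.004 * income_percentile  # what this reader drew
others = [actual + np.random.normal(0, 0.05, len(income_percentile)) for _ in range(20)]

for other in others:
    plt.plot(income_percentile, other, color="lightgrey", linewidth=1)  # social context
plt.plot(income_percentile, reader_guess, "b--", label="Your guess")
plt.plot(income_percentile, actual, "k-", linewidth=2, label="Actual relationship")
plt.xlabel("Parents' income percentile")
plt.ylabel("Share of children attending college")
plt.legend()
plt.show()
```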

References: 

1   Julia Belluz, ‘Our World Is Awash in Bullshit Health Claims. These Scientists Want to Train Kids to Spot Them.’, Vox, 15 November 2016, www.vox.com/2016/10/6/13079754/teaching-critical-thinking-schools-health-claims (accessed May 2018).

2   Allen Nsangi, Daniel Semakula, Andrew David Oxman et al., ‘Effects of the Informed Health Choices Primary School Intervention on the Ability of Children in Uganda to Assess the Reliability of Claims about Treatment Effects: A Cluster-Randomised Controlled Trial’, The Lancet 390(10092) (2017): 374–388.

3   Jennifer Mayer and Thomas Mussweiler, ‘Suspicious Spirits, Flexible Minds: When Distrust Enhances Creativity’, Journal of Personality and Social Psychology 101(6) (2011): 1262–1277.

4   Jonathan Osborne, ‘Arguing to Learn in Science: The Role of Collaborative, Critical Discourse’, Science 328(5977) (2010): 463–466.

5   John Cook, Daniel Bedford and Scott Mandia, ‘Raising Climate Literacy through Addressing Misinformation: Case Studies in Agnotology-Based Learning’, Journal of Geoscience Education 62(3) (2014): 296–306.

6   Jennifer Jerit, ‘Issue Framing and Engagement: Rhetorical Strategy in Public Policy Debates’, Political Behavior 30(1) (2008): 1–24.

7   Kerris Cooper, ‘A Reappraisal of the Inverse Care Law’, Socialist Health Association, 29 May 2010, www.sochealth.co.uk/national-health-service/public-health-and-wellbeing/poverty-and-inequality/the-inverse-care-law/a-reappraisal-of-the-inverse-care-law/ (accessed February 2018).

8   Michael A. Cacciatore, Dietram A. Scheufele and Elizabeth A. Corley, ‘Another (methodological) Look at Knowledge Gaps and the Internet’s Potential for Closing Them’, Public Understanding of Science 23(4) (2014): 376–394.

9   National Academies of Sciences, Engineering, and Medicine, Communicating Science Effectively: A Research Agenda, National Academies Press, Washington DC, 2017.

10  Teresa A. Myers, Edward Maibach, Ellen Peters and Anthony Leiserowitz, ‘Simple Messages Help Set the Record Straight about Scientific Agreement on Human-Caused Climate Change: The Results of Two Experiments’, PLoS ONE 10(7) (2015): e0133103.

11  Oliver P. Hauser and Michael I. Norton, ‘(Mis)perceptions of Inequality’, Current Opinion in Psychology 18 (2017): 21–25; Belinda A.E. Borell, Amanda S. Gregory, Tim N. McCreanor, Victoria G.L. Jensen and Helen E. Moewaka Barnes, ‘“It’s Hard at the Top but It’s a Whole Lot Easier than Being at the Bottom”: The Role of Privilege in Understanding Disparities in Aotearoa/New Zealand’, Race/Ethnicity: Multidisciplinary Global Contexts 3(1) (2009): 29–50.

12  Gregor Aisch, Amanda Cox and Kevin Quealy, ‘You Draw It: How Family Income Predicts Children’s College Chances’, The New York Times, 28 May 2015, www.nytimes.com/interactive/2015/05/28/upshot/you-draw-it-how-family-income-affects-childrens-college-chances.html (accessed May 2018).

13  Yea-Seul Kim, Katharina Reinecke and Jessica Hullman, ‘Data Through Others’ Eyes: The Impact of Visualizing Others’ Expectations on Visualization Interpretation’, IEEE Transactions on Visualization and Computer Graphics 24(1) (2018): 760–769.


Extract from Berentson-Shaw, Jess, A Matter of Fact: Talking Truth in a Post-Truth World, Bridget Williams Books, Wellington, pp. 71-77. 

Jess Berentson-Shaw is a Researcher in the Institute for Governance and Policy Studies at Victoria University, Wellington. 

Link to book: https://www.bwb.co.nz/books/matter-fact

Disclaimer: The ideas expressed in this article reflect the author’s views and not necessarily the views of The Big Q. 
