While many argue that we are in a post-truth era, fuelled by US President Donald Trump and the phenomenon of fake news, some scholars say that deception has always been ubiquitous. What is the truth about lying? Maria Armoudian is joined by Timothy Levine, David Livingstone-Smith, and Briony Swire-Thompson.

Timothy R. Levine is a Distinguished Professor and Chair of Communication Studies at the University of Alabama at Birmingham. He is the editor of the Encyclopedia of Deception, Vol. 1 & 2, and is an expert in interpersonal communication.

David Livingstone-Smith is a Professor of Philosophy at the University of New England. He is the author of Why We Lie: The Evolutionary Roots of Deception and the Unconscious Mind and is an expert on the philosophy of psychology.

Briony Swire-Thompson is a Postdoctoral Researcher at Northeastern University. She is the co-author of ‘Processing Political Misinformation: Comprehending the Trump Phenomena’, and ‘Correcting False Information in Memory: Manipulating the strength of misinformation encoding and its retraction’.

Podcast:

Interview Transcript:

Maria Armoudian: I thought we would start with this idea that in our media we are now in a post-truth era. This has largely been attributed either to the man who has been elected president of the United States, Donald Trump, or to the accusations of false news. In the past, David Livingstone-Smith, we have talked about how lying and deception are relatively ubiquitous. I wonder if we are in a different era. Is it more of the same? Is it less of the same?

David Livingstone-Smith: I don’t think history is divided into eras really. I mean this is an imposition of categories that we place on history to try and make sense of things. Lying and deception, because I use the term very broadly, is indeed a ubiquitous feature of human life, and so it’s not like suddenly we’re launched into a time when truth doesn’t matter anymore. However, there are historical circumstances and political circumstances that I think, at least in the United States, make lying with impunity a lot more tempting and a lot easier.

MA: Easy and tempting because it’s more permitted when the leader is also lying?

DLS: I think any politician worth their salt is going to be a very skilled image manipulator, and lying is a form of image manipulation. A skillful politician deceives people in the way they want to be deceived. Let’s put it that way. I think Trump has the salesman’s instinct for doing this. And that’s what I think is different, not that political lying is something new. That’s ridiculous. We’ve known since Plato’s Republic that lying is a crucial part of politics. “Politicians,” he said, “serve up sweet nothings, sweet pastries that are not nourishing, but taste good.” What’s different is that Donald Trump has really established for himself and for others that he can push the envelope very far indeed if he tells people the sorts of things that motivate them in the way he wants to motivate them.

MA: Timothy Levine, do you agree?

Timothy Levine: I agree with everything about Trump. I would like to push back a little bit on the idea that lying is frequent in our communication. I think the data shows pretty clearly that most people are honest most of the time. Most people lie less than the average person because lying is not normally distributed: there are a few people that lie a lot, but most of us are pretty honest, even politicians. Most politicians, when they get fact-checked, don’t always come out true, but they come out true more often than not. Donald Trump is the clear exception to this. I don’t know that there has been a politician in recent memory that’s close to his difficulty with the truth. However, even then, remember those are only the statements that get fact-checked, and there are a lot of statements that don’t get fact-checked, primarily because there’s no reason to check them.

MA: One of the things in your research, Timothy, is that people lie when they have a reason to lie, and if that reason has been taken away, then they don’t usually lie. How do we make sense of that?

TL: People lie or deceive when the truth is a problem. And for probably most everyday adults, the truth isn’t especially problematic for us. For example, if I’m on the radio talking about my research findings, the reason I’m here is to explain that to you. Hopefully the truth’s not a problem. Hopefully I’m not fudging my data or making up evidence, and if I were, it probably wouldn’t make sense to have a show like this.

MA: I wonder also since you’re in the communication field and one of the issues in communication deals with the idea of contagion. Is contagion a problem when it comes to deception?

TL: I don’t know. Good question. However, I do think the really insidious thing about the current political environment is not only the giving of false information, but attacking the credibility of those who fact-check. So now the fact-checkers are the “fake media,” and this, I find, really troubling because how are we to know what’s true or not? And it seems to me that communication can only work when we trust what others are saying. We move towards anarchy if just anything goes.

MA: That’s something else you’ve written about, which is how much we actually need to be able to trust in order for our general society to function.

TL: Yeah, it’s really hard to be close to people or have good cooperative relationships if you can’t trust them. I’m a professor. If students can’t trust me, there’s no reason to pay the high tuition dollars. It would be silly. Things unravel pretty quickly if we can’t believe what other people say. Communication just loses all its utility.

MA: Briony Swire-Thompson, this might be the place to bring you in. You’ve been studying when people accept or don’t accept information that’s wrong. What can you tell us about this process?

Briony Swire-Thompson: What we find is that even if people are presented with a really good, valid, evidence-based correction, they might update their beliefs for a short period of time. But if we measure it, say, a week later or three weeks later, they start to re-believe the original misconception. And I guess that’s one of the phenomena that’s pretty scary. We call it the continued influence of misinformation, because misinformation, or these lies that people are presenting, really does have an impact on our reasoning and memory.

MA: Walk us through this. First of all, they hear the deception or the lie and if they accept it, then what happens?

BST: The way that we study it in the lab is we often present people with multiple pieces of information, and we ask people to rate on a zero-to-ten scale: How much do you believe in this piece of information? And once they’ve done their ratings, we give them a little blurb to say, “Well this is actually true because of these reasons,” or, “This is false.” And then we measure participants over a period of time. What we do find is they’re quite happy to update their beliefs in the short-term, but these corrections don’t seem to last.

MA: There was another study, which suggested that the effect of belief depends on who is attached to the information.

BST: This is actually a study that we did using Trump because he was an incredibly polarising figure. We set up these studies in 2015 and 2016, when he was obviously already highly divisive. One of the items we used quoted Trump saying that the MMR vaccine causes autism; the other presented just the information that the MMR vaccine causes autism. Then we asked, “How much do you believe in this statement?” We found initially, straight off the bat, without correcting people’s misconceptions, that if people were Trump supporters, they were far more likely to believe the misconception. So what we think is happening here is that people are using political figures, people who they trust, as what we call a heuristic or mental shortcut for judging the veracity of information. So they are like, “Well, I trust Trump and therefore I think what he says is true.” And I don’t think this is necessarily only for political figures, or only for Republicans or Trump supporters. I think trust and source credibility is a big thing when judging how true or false information is.

MA: It sort of goes hand in hand with the work that some psychologists have done with brain scans: once people have become wedded to a particular position, or a particular candidate, they actually measured the cognitive dissonance those people experienced when somebody gave them counter-information. David, what do you make of this?

DLS: Let’s actually start with something a little bit more basic. You’ve sometimes spoken of lies and sometimes spoken of deception, and I think one of the differences between Tim and me is that Tim has a pretty fine-grained notion of lying, while I see lying as pretty much synonymous with deception. And I do agree with him, by the way, that most of the time we don’t lie. If that were not the case, then lies would have no utility, as the story of the boy who cried wolf tells us. In terms of contagion, I think it’s maybe a little bit of a misleading metaphor. What happens, I think, is that prominent people, people who are looked up to, and in the case of Trump, looked up to as a sort of saviour, set norms. They set norms of permissibility by their behaviour. If someone like Trump suggests that one can play fast and loose with the truth, I doubt very much that this does not impact those who look up to him.

MA: In that case, would it just be those who look up to him that would be affected? Would it create a culture of deception?

DLS: You know these things sort of create a cultural atmosphere, and presumably there are a lot of psychological processes at work here, and I wouldn’t presume to specify them.

BST: I was just going to mention that that’s actually one of the things we measured in our study on Donald Trump. Even when people who supported Trump updated their beliefs and acknowledged that what he was saying wasn’t true, like, “Yes, we completely acknowledge that what he’s saying is false,” it didn’t actually have any impact on their voting preferences. So people were just as likely to vote for him even if they acknowledged that they knew he was spreading inaccurate information. What we assumed this meant was that why people like political figures is made up of a whole range of factors, but I guess being truthful wasn’t one of them.

DLS: That’s pretty cool research. I think that what goes on with authoritarian leaders is that the faithful followers see them as being in touch with a deeper truth. So although the details might not correspond to reality, they’re seen as articulating something which is true in a deeper, more profound sense. That’s one way that this problem gets negotiated.

TL: My theoretical perspective is called truth-default theory. And the basic idea is that we tend to believe other people unless we have reason not to. Our default state is belief, and then at times we’re motivated to think carefully about whether something is true or not. For example, the followers of a particular leader have a stake in believing that person, so they’re decidedly not motivated to question them. The other side of the political spectrum is scrutinising, looking at whether the statements are true, so they’re more likely to try to fact-check. It’s a much more general principle. It’s like when we’re driving: we’re kind of on autopilot, we’re not really consciously thinking about things, and then something comes and grabs our attention. Some driver is behaving unusually or something, and we start thinking about it. Communication is a lot like this too. We tend to do it very automatically, and it’s only when we really have to start thinking about it that we ask, is this deceptive? It has to be prompted in some way, triggered. One of the things that can trigger it would be a politician from an opposite party. Another would be a salesperson who is giving you a hard sales pitch. In certain kinds of online gaming or internet environments you might be very on guard for people who are saying things that are false. However, in most of our daily lives, this just doesn’t occur to us. So this is why I think that we are much more likely to be sceptical of people who have different belief systems than us, or are in different tribes or different groups.

MA: Is this a problem in an age where there are massive numbers of websites that might proliferate information that’s absolutely false?

TL: Right. Or with all the email phishing attacks. If you’re just going along on autopilot, it’s easy to click on those links. You have to be actively thinking about it, and actively thinking about it takes effort away from doing other things.

MA: When you studied why people lie—because they had to have a reason—what were those reasons?

TL: The exception is pathological lying, which I define as lying without a reason; occasionally you run into people who will tell a lie when the truth would work better. But for most of us, the reason we deceive is that, for some reason, the truth is a problem. Maybe we did something wrong. Transgressions are a big reason: we did something wrong and we’re trying to hide it. We want to make a positive impression on other people. There might be some kind of economic advantage to it, you know, “I want you to think my product is really better than it is,” or, “I want you to vote for me.” However, a surprisingly big one is a category my co-authors and I call “avoidance lies.” Basically, your friend, or somebody, wants you to do something. You don’t want to go out to dinner with them for whatever reason, but you don’t want to say, “No, I don’t want to go out with you.” So you make up some kind of reason like, “I have plans,” or, “I’m sorry, I have a prior engagement,” but the truth is you just don’t want to do it.

MA: So how ubiquitous did you find that this deception, this actual lying, is among people? Is it all the time every day, or is it really rare?

TL: I think most people don’t lie very often. We’ve done nationwide surveys in the UK and US where most people say they haven’t really lied in the last twenty-four hours. And we gave them a pretty broad definition of it, just misleading another person.

MA: Could they be lying about lying?

TL: They could be, but there have been some cool validation studies giving people the opportunity to cheat for a cash prize, and in those behavioural studies the people who say they lie tend to cheat, and the people who say they’re honest tend not to cheat.

DLS: I’m kind of sceptical. Every day that I go to the supermarket, I’m asked at the cash register, “How are you?” And I’m expected, like everyone else, to say, “I’m fine.” But I don’t. And they find it quite a relief; they are in a miserable job and have been on their feet all day. There’s a great deal of that that goes on. I think people are generally unaware of the degree to which they lie, and bear in mind that I don’t restrict lying to deliberate statements intended to cause others to form false beliefs. I include in lying any form of deception which has the function of causing others to form false beliefs. So I think lying is much more pervasive. I agree with Tim, but I would put it a little bit differently. The way he puts it is, “Truth is a problem.” I say people lie often to get things that they value, which they feel they can get more effectively by lying than by telling the truth. People want others to love them, to think well of them, and so on. But people also lie because they enjoy it. I hold as an example of that the stories that are told to children about Santa Claus and the Easter bunny and so on and so forth, which seem to me a matter of the pure exercise of power of a big, strong, knowledgeable person over a small, vulnerable one.

TL: I tend not to see “Hello, how are you, I’m fine” as deception, because when I ask David how he’s doing and he says he’s fine, I don’t really believe he is, because I know that we say this all the time, so there’s no deceptive function. If I said, “How are you doing?” and he said, “I’m having a really great day,” and he wasn’t, then I might be deceived. But the everyday little politeness rituals we have, I don’t think serve a deceptive function, because everyone knows perfectly well that this is just what we say and that we’re just following this mindless sequence.

DLS: Sure, but isn’t that refraining from telling the truth? So if you have someone who basically feels like they want to jump off a bridge, and I say to them, “How are you doing?” and they say, “Fine,” they are masking. They are engaging in a social ritual even though inside they are dying to let someone know what their life is like.

TL: We might be at risk of jumping into a real rabbit hole here. We want to be very careful, I think, in talking about deception, not to conflate truthfulness with deceptiveness, because I can be deceptive while being one hundred percent truthful: I can say something that is literally true in a way that misleads, and I can say something absolutely false, like with sarcasm, that’s not deceptive at all. I think David and I are on the same page on what deception is, and that is that deception is when communication or other signals function to mislead someone.

MA: I am going to bring David Livingstone-Smith in on another dimension that he’s also written about, which is self-deception. David, you have said in the past that we are frequently in a mode of self-deception and that perhaps this is to keep us from being sad or depressed or upset. What about that? Is there any harm done there?

DLS: There’s not necessarily any harm done there. Optimal self-deception is a requirement for keeping sane in this crazy world. Self-deception is a really complicated and difficult topic. There are a lot of problems understanding how it works and what it is and so on. I’m not going to get into that because that would take an hour of lectures to try to make sense of, but I think that the usefulness of self-deception is broader than what you specify. Truth can be really overrated. So let’s start with what you said. I mean, I’m wearing shorts that I got cheap some place. They are probably made by people working at ridiculous starvation wages under horrible circumstances, and I know that. But I don’t pay attention to it because if I paid attention to it, I would feel horrendously guilty, and guilty that I’m not sufficiently politically engaged to try to help do something about those sorts of things. I’m 64 years old. I don’t have that many years left, and I know I am going to die. That’s a bad thought. I can tolerate it occasionally. I know that my loved ones are going to die. I know that there’s tremendous suffering around me, and I’m a darker person than most, but even I ignore that stuff in order to keep going. Self-deception can have other functions as well. The assumption that the truth is always better, that to know the truth about yourself is always better, that to know the truth about others is always better, is, I think, an unwarranted assumption. Self-deception can be the balm that helps us get through the night, and it helps us get what we want as well, because of course if we believe our own bullshit, we’re presumably much more effective at getting others to believe it.

MA: Spoken like a true philosopher. What would the communication professor say?

TL: I think self-deception is somewhat outside the realm of communication. However, the work on self-deception I find really interesting is from an evolutionary biologist by the name of Robert Trivers. He makes the same point that David made at the end: the real advantage of self-deception is that it serves the deception of others, because if we believe our own bullshit, then we’re way more believable. I’m not sure how much I agree with Trivers’ exact argument, but I find it really interesting, and I think if he’s right, it turns at least a lot of the psychological research on deception on its head.

MA: What are the costs of this, both for a person and for society, in terms of deceptive communication? Tim, can you tell somebody is deceiving you not by what they say, but by what they do?

TL: It’s the exact opposite in my research. The principle that’s guided most research on deception detection is that you want to not pay attention to what people say; you want to pay attention to what people do and various nonverbal behaviours. That view was characterised very well in the TV show Lie to Me. What my research finds is the exact opposite: if you’re paying attention to how people are coming off, then you’re very likely to be misled. However, if you listen carefully to what they say, fact-check if possible, and understand what they say in the context in which it’s said, then we’re much better able to sort out truth from lies. The other thing my research finds is that most lies people detect are detected well after the fact. We asked people, “Think about a time you caught somebody lying to you, and how did you find out?” Something like ninety-eight percent of those lies are found out later.

MA: You looked at relationships as well in terms of how that occurs within relationships. What did you find there?

TL: The big finding in relationships is that it’s not so much the lying, but what the lies are about. When you lie about the wrong thing, that’s really what damages the relationship. If I’m lying to you about a surprise party, that’s probably not going to hurt a relationship that much. But a spouse lying about infidelity can really hurt the relationship once the lies are uncovered. However, it’s not just the lies; it’s the underlying problematic things that were being hidden.

MA: Briony, what are your findings with regards to this, with the deceiver, or the person receiving the deceptive information?

BST: First, I’d like to jump on the bandwagon of definitions. With my research, when I look into misinformation, it isn’t necessarily someone with the intent to deceive. That could come back to self-deception: sometimes I think people believe these lies so much that they come off as believing they are true, even when that’s not the case. Misinformation could be deception, but it could also be spreading inaccurate information that people really do have a strong disposition to believe.

MA: When it came to the people who returned to believing the false information, do they have cognitive dissonance? Do they feel betrayed? Do they have this psychological effect? Have you interviewed them?

BST: I would love to interview them. My research is quantitative, not qualitative, but there have been quite a few studies done online. When I was working in a lab, you actually got to speak with these people, and it was always very interesting. People always had these theories: “Well, am I remembering what you told me was true, and is that what I’m answering, or am I still answering what I truly believe?” And I do think that when we correct information, the disconnect between what people remember a source told them and what they still believe is definitely a very interesting phenomenon.

DLS: I think the whole idea of belief is a really interesting idea. I think a lot of times when people say they believe something, what they mean is they’re committed to it. There’s a kind of getting behind it, and evidential considerations don’t play much of a role in that. You see that in New England here with sports fans. They will say that they believe the Red Sox are the greatest team in the world, and that’s entirely unrelated to how well they’re doing in a particular season. It’s simply a gesture of commitment. It’s like saying, “Go Red Sox.” Similarly, with my students in Intro to Philosophy: when we look at philosophers’ criticisms of the purported proofs of God’s existence and see pretty plainly that those proofs just don’t hold up, students will absorb that. They’ll understand the logic, and they’ll say, “But I still believe.” And they believe because they were raised that way. That’s the claim.

MA: Sounds a lot like Briony’s research, like maybe for a moment they let go of that, but then they go back to it?

DLS: Yeah, because it’s the commitment that matters rather than the evidence that matters.

TL: Understanding belief, and particularly when we believe other people’s communication, is a really cool thing to think about too. In my research I’ve focused on what people do that makes them believable, and it seems that what makes people believable is completely unrelated to whether they actually should be believed or not. However, there is a whole combination of behaviours that make people believable, and as it turns out these things are all interrelated. So if you are friendly, and you’re confident, and you’re engaged, and you’re looking the person in the eye, then pretty much everybody will believe you. And if you’re not doing those things, if you’re coming off as nervous and sketchy and adding a lot of qualifications, then this can undercut your believability. I find it fascinating just how powerful these things are.

MA: It’s interesting, too, in the political environment. In a previous Scholars’ Circle, the guests talked about the appearance of authenticity and the appearance of sincerity, and how much of a role that plays. And when we’re talking about Donald Trump, who is, in the fact-checking world, what they would call “pants on fire” most of the time, he still comes across as sincere and authentic. How do you make sense of that?

TL: I call it the believability quotient, and I actually have a checklist of eleven things. I found, for example, that when liars are doing these things, even professional interrogators will get it right only about sixteen percent of the time. However, when people who are honest do the exact same things, people are one hundred percent right about them.

MA: I am not sure I followed that.

TL: So if I am coming off as really believable and I’m lying, even professional interrogators will be wrong about me eighty percent of the time. However, if I’m actually honest while doing these things, then they will be right about me. And the interesting thing is, these show up cross-culturally. So, for example, I’m in Korea right now, and when I show Koreans videotapes of Americans who are high on the believability quotient, they believe them, and when they’re low on it, they don’t believe them.

MA: Briony, what do you make of that?

BST: I think it sounds like great research. I think that believability really does go hand in hand with deception and why people lie. In a way, it’s the flip side: what we believe and why we believe it.

TL: I would look to see what’s going on in the brain scanner when people are watching these highly believable people, or watching these highly sketchy people.

MA: Now when you’re talking about the highly believable people who are actually lying, would that fall into the pathological side?

TL: Not necessarily, at least in the way I would use the term. What makes lying pathological is that they’re lying when the truth would work just as well. There’s no reason for lying; they’re just lying to lie. And that’s different, I think, from a really effective liar, a highly believable liar who’s doing it very strategically and very successfully.

BST: Just on that note, and bringing it full circle back to politics and the 2016 presidential campaign: when that is paired with our finding that even if people reduced their belief in the misinformation, they didn’t change their voting preferences, it was actually a very effective strategy by the Trump campaign. I mean, who is to say whether or not it was an explicit strategy. But if you are able to get all the benefits of this, like additional news coverage and rousing rhetoric, without the negative consequences of being caught out spreading misinformation, then perhaps it’s a good way to go.

DLS: Yeah, well with respect to believability, I fully agree. I mean the trick to being a good liar, a skilled liar, a competent liar is to be believable otherwise there’s no point. So, I can do no better than to quote one of my favourite philosophers—Marx, not Karl, but Groucho, who once said that sincerity is important and if you can fake that, you can fake anything.


This interview was originally aired on the Scholars’ Circle. To access our archive of episodes and download this interview click here.

For more of our audio and visual content check out our YouTube channel, or head to the University of Auckland’s manuscripts and archives collection.

Disclaimer: The ideas expressed in this discussion reflect the views of the guests and not necessarily the views of The Big Q. 

You might also like:

How Does Propaganda Work in Democratic Societies? 🔊

Are Hacking, Fake News, and Paid Trolls Destroying Democracy? 🔊