By David Salt
Why do simplistic three-word slogans have such cut-through? Why does incumbency give a political party such an advantage? Why does a simple lie so often trump an inconvenient and complex truth?
The answers to these questions (and so many other mysteries surrounding the way election campaigns are run) lie in the way we think. And one of the finest minds alive today, one who has devoted much of his life to trying to understand how we think, is a psychologist named Daniel Kahneman.
Kahneman, a Nobel Laureate in Economics, distilled the essence of his research on how we think in a book called ‘Thinking, fast and slow’. It’s around 500 pages long and quite dense in parts, as Kahneman explains how he and his colleagues** rigorously tested many assumptions about how humans think and make decisions. There’s a lot of detail presented, and I’m not saying it’s an easy book to take in; however, if you have any interest in how our inherent biases distort our decision-making processes, then this is a must-read.
In a nutshell, Kahneman describes ‘fast thinking’ as what we do intuitively, almost thinking without thinking. ‘Slow thinking’ is when we analyse the information we’re processing. It takes time (hence it’s ‘slow’) and, most importantly, it takes considerable mental effort. Slow thinking helps us correct the biases inherent in our fast thinking, but because slow thinking is hard, our brain often gives up on it. When this happens, we default back to fast thinking, usually without even being aware of it. That’s fine a lot of the time (like when you’re fending off a sabre-toothed tiger) but can often lead to sub-optimal (and sometimes awful) outcomes.
How does this relate to the way politicians prosecute their election campaigns? I’ll let Kahneman spell out some of the consequences.
On the ‘illusion of understanding’, Kahneman says (p201 in Thinking, fast and slow):
“It is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”
My take: Politicians capable of telling a ‘coherent’ narrative do better than scientists attempting to explain a complex story to you with all its details.
On the ‘illusion of validity’ (p209):
“The amount of evidence and its quality do not count for much, because poor evidence can make a very good story. For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs. Considering how little we know, the confidence we have in our beliefs is preposterous.”
My take: We make many of our most important decisions based on what other people believe, people we trust, not on what we know. Scientists always believe that more evidence, and better-quality evidence, will win the day (probably because the people they trust, other scientists, think the same way).
On ‘confidence’ (p212):
“Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”
My take: Don’t confuse confidence with validity. Don’t believe, as most scientists do, that information with high uncertainty is always discounted.
On ‘the engine of capitalism’ (p262):
“Optimism is highly valued, socially and in the market; people and firms reward the providers of dangerously misleading information more than they reward truth tellers. One of the lessons of the financial crisis that led to the Great Recession [GFC] is that there are periods in which competition, among experts and among organisations, creates powerful forces that favor a collective blindness to risk and uncertainty.”
My take: Some people (in some circumstances) can fool all of the people some of the time.
On being a successful scientist (p264):
“I have always believed that scientific research is another domain where a form of optimism is essential to success: I have yet to meet a successful scientist who lacks the ability to exaggerate the importance of what he or she is doing, and I believe that someone who lacks a delusional sense of significance will wilt in the face of repeated experiences of multiple small failures and rare successes, the fate of most researchers.”
My take: Scientists are human, too.
On not seeing flaws in the tools you use (p277):
“I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing. You give the theory the benefit of the doubt, trusting the community of experts who have accepted it.
…disbelieving is hard work, and System 2 [thinking slow] is easily tired.”
My take: When your only tool is a hammer, all you see are nails.
On ‘reform’ and attempting to change the status quo (p305):
“A biologist observed that “when a territory holder is challenged by a rival, the owner almost always wins the contest”…
…In human affairs, the same simple rule explains much of what happens when institutions attempt to reform themselves…
As initially conceived, plans for reform almost always produce many winners and some losers while achieving an overall improvement. If the affected parties have any political influence, however, potential losers will be more active and determined than potential winners; the outcome will be biased in their favour and inevitably more expensive and less effective than initially planned.
Loss aversion is a powerful conservative force that favors minimal changes from the status quo in the lives of both institutions and individuals. This conservatism helps keep us stable in our neighbourhood, our marriage, and our job; it is the gravitational force that holds our life together near the reference point.”
My take: Incumbent conservative governments have all the advantages when it comes to elections involving reform and complex policy positions. Reformers wanting to shift the status quo have a very hard task because of the power of ‘loss aversion’. Also, a concentrated force beats a dissipated force, even if the dissipated force is greater overall.
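As an aside for the quantitatively minded: the asymmetry behind loss aversion can be sketched with the standard prospect-theory value function. The snippet below is a minimal illustration, not something from the book itself; the parameter values are the median estimates Tversky and Kahneman reported in their 1992 follow-up work.

```python
# Prospect-theory value function (illustrative sketch).
# Parameters: alpha = beta = 0.88 (diminishing sensitivity),
# lam = 2.25 (loss-aversion coefficient): a loss is felt roughly
# 2.25 times as strongly as an equivalent gain.
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    if x >= 0:
        return x ** alpha          # subjective value of a gain
    return -lam * ((-x) ** beta)   # losses are amplified by lam

gain = value(100)    # how winning $100 feels
loss = value(-100)   # how losing $100 feels
print(gain, loss)    # the loss looms far larger than the gain
```

This is why a campaign message framed around what voters stand to lose carries more weight than one framed around an equivalent gain.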
On dealing with rare events (p333):
“When it comes to rare probabilities, our mind is not designed to get things quite right. For the residents of a planet that may be exposed to events no one has yet experienced, this is not good news.”
My take: Human thinking is not well adapted to deal with climate breakdown or biodiversity loss.
On good decision making (p418):
“They [decision makers] will make better choices when they trust their critics to be sophisticated and fair, and when they expect their decisions to be judged by how it was made, not only by how it turned out.”
My take: Good decisions are not just about good outcomes. Decisions should be judged as much by the process by which they are made, and that people take better decisions when they think they are accountable. (This quote, by the way, is the final line in the book.)
Kahneman’s quotes aren’t pithy generalised reflections that came to him while he was thinking about thinking. They are conclusions drawn directly from multiple rigorous experiments in which subjects chose between options that required them to assess risk and possible outcomes.
And the research isn’t new or unreviewed. Some of his findings on cognitive biases and decision heuristics (the mental rules-of-thumb that often guide our decision making) go back some 50 years. Kahneman is recognised as one of the world’s leading behavioural psychologists, was awarded a Nobel Prize in economics in 2002 for his work on prospect theory (pretty good for someone who had never studied economics), and his work has been a cornerstone of the developing field of behavioural economics.
Of course, all of this is also central to marketing and politics: how do you communicate (sell) information to score a sale or bag a vote? You don’t do it by providing every detail available, like many scientists try to do. This simply switches people off.
Rather, you build a simple coherent narrative that you can ‘sell’ with confidence. You scare people about their losses if the status quo is threatened (as will happen if you ‘vote for the opposition’), and you frame your arguments for maximum salience to your target group.
‘Good marketing’ is about exploiting people’s cognitive biases and not overloading them with detail they can’t absorb. ‘Good politics’ is about simplistic three-word slogans and scaring voters into believing that change means they will lose.
Elections are all about good marketing and good politics.
Good marketing and good politics often add up to poor policy, short-term thinking and vulnerability in a climate-ravaged world.
Fossil fuel corporations (and conservative politicians in their thrall) have been manipulating community sentiment for decades, stoking scepticism and denialism about complex science, and preventing the world from responding to an existential threat.
Kahneman didn’t give them the blueprint for how this is done, but his science has revealed just how easy it can be to steer and nudge a person’s behaviour and beliefs if you understand how inherently biased our thinking can be.
The solution? There is no pill (red or blue) that can help people do more slow thinking and better reflect on the biases inherent in their fast thinking. As Kahneman has demonstrated throughout his career, humans simply think the way that they think. However, society has created many institutions that provide checks and balances on the way marketeers sell products and politicians acquire and use power. The integrity of these institutions is the bridge between day-to-day politics and good policy outcomes.
Australia is currently in election mode with a federal election only days away. Political integrity and climate change are major concerns for most Australians. Despite this, the incumbent conservative government has long resisted the establishment of an independent integrity commission to test the many claims of corruption that have been levelled at it over the years. And this government has been seen as dragging the chain on climate action (and lying about what it is actually doing).
And yet, our Prime Minister, a man who has been described as lacking a moral compass and being a serial liar (by his own colleagues!), is a masterful marketeer. Nicknamed ‘Scotty from Marketing’, maybe he should be retitled Australia’s ‘Prime Marketeer’. He knows how to spin a simple and coherent story and stick to it. He knows how to scare people about the costs of change, and how to divide communities by playing on people’s prejudices and fears. Using these skills, he pulled off ‘a miracle’ victory at the last election.
Thinking fast has served him well. Now, for a meaningful response to multiple environmental emergencies, it’s time for a little reflection; a little more thinking slow is called for.
To be honest, I had never heard of Daniel Kahneman 15 years ago. But then I began working for a group of environmental decision scientists and his name constantly came up. Kahneman was the leading light who illuminated why our internal decision-making processes were so flawed, so biased. He was the ‘god’ who (along with his friend Amos Tversky**) had published the landmark paper ‘Judgment under uncertainty: Heuristics and biases’ in 1974 in the journal Science, one of the most widely read papers of all time, I was told. Well, I tried reading it and found it too technical and dense to take in.
Then, in 2011, Kahneman published Thinking, fast and slow. Someone described it as a 500-page version of his 1974 paper. Not a great sales pitch for me, I’m afraid.
However, just prior to the corona pandemic, I spied Thinking, fast and slow on a friend’s bookshelf and asked to borrow it. It took over a year before I found the courage to open it (it was my big pandemic read), six months to wade through it, and another three months before I attempted to write down why I found its wisdom so compelling.
So, for me, my journey with Kahneman has been a long one. And now that I have finished this blog, I can return Thinking, fast and slow to my friend Michael Vardon, who loaned it to me many moons ago. Thanks Michael, sorry about the delay.
** Amos Tversky
If I’ve interested you at all in Daniel Kahneman but possibly put you off reading Thinking, fast and slow (because who has time to read a 500-page horse pill of information on cognitive biases?), then I highly recommend another book that covers the same ground but from a more personal framing. This one is about Daniel Kahneman and his life-long colleague and closest friend, Amos Tversky. The book is called The Undoing Project and is written by Michael Lewis (who also wrote The Big Short and Moneyball, both about biases in the way we think and assess risk). It tells the story of Kahneman and Tversky, both Israeli psychologists, and how together they unpicked the many ways our thinking is biased without us even being aware of it. Not only does The Undoing Project give an excellent overview of the research described in greater detail by Kahneman in Thinking, fast and slow, it also paints a touching portrait of the friendship between two of the world’s finest minds. Tversky tragically died of cancer in 1996.
Banner image: ArtsyBee at Pixabay