By Nadia Chechlińska, Research Associate
Nadia explains how, almost a decade on, ‘Thinking, Fast and Slow’ is still an important text for higher education staff trying to influence student behaviour.
Taking decision-making theory out of academia and into everyday life
Daniel Kahneman’s Thinking, Fast and Slow [i] was revolutionary because it explained the processes underlying our decision-making using relatable examples. Academics around the world are investigating these processes, and their findings help us influence behaviour. However, the research is often complex and dominated by specialist terminology. As a result, it is difficult for practitioners to apply the results of these studies in real-world settings.
Daniel Kahneman cuts through complex theory, describing decision-making processes through simple real-life examples. This explains why his book gained so much popularity across a wide range of disciplines – from business and law to medicine. Thinking, Fast and Slow enabled practitioners to understand human nature better and to account for it when designing policies.
People make decisions in two ways – fast or slow
The principal message of Kahneman’s book is that there are two pathways, or systems, people follow when making decisions. Kahneman elaborates on these two systems:
- System 1 is responsible for fast, automatic, and intuitive thinking.
- System 2 engages in slow, deliberate, and effortful processing.
When performing most of our everyday actions, we rely on System 1. It is responsible for ideas that simply come to mind, enabling us to process the world around us automatically. This way we don’t need to decide what to perceive – our mind does it for us.
System 1 is responsible for judgements and decisions that are simple and intuitive. For example, when you add 2 + 2, you don’t need to engage in deep processing and do the maths – you know the solution, and the answer comes to mind automatically and quickly. System 1 is also in charge of skilful behaviour and expertise – with enough exposure to particular patterns, experts can make judgements passively, e.g. a pianist sight-reading music.
To put it simply, System 2 oversees everything that System 1 is not able to do alone. In fact, one of its main responsibilities is to control actions suggested by System 1. It is activated when you are aware of the processes you engage in when making a judgement. When you activate the slow thinking of System 2, a decision doesn’t just passively occur. Take, for example, multiplying 24 × 17. To reach the solution, you need to actively process the problem and engage your attention and focus – which is much more effortful.
Heuristics can easily become biases
Your System 1 tries to reduce the cognitive effort of making a decision. To do so, it uses processing shortcuts that rely on learnt patterns and facilitate thinking that is fast and usually effective. In his book, Kahneman provides a detailed description of the most common shortcuts – heuristics.
While System 1 usually applies heuristics accurately when making a judgement or decision, this isn’t always the case. Approaching simple problems with heuristics is usually an effective strategy, because it allows us to filter out information we don’t need. However, when these shortcuts are used to simplify a complex choice that should be carefully thought through, heuristics become biases. For example, the availability heuristic means that we tend to judge something as more likely to happen if we can easily recall examples of it. This turns into availability bias when it leads us to underestimate or overestimate how likely something actually is to occur.
For example, when outreach students envisage whether they will belong at university, they may employ the availability heuristic. Traditionally, university students have come from more advantaged backgrounds, and therefore examples of students unlike them come more readily to mind than those like them. The result is an underestimation of the number of students like them at university, and a feeling that they won’t belong. To combat this, our Widening Participation Department employs student ambassadors from backgrounds similar to those of our outreach students to help deliver activities and subtly convey the message that students like them belong in higher education.
Help others to activate their System 2
Kahneman explains that no choice is neutral and a range of factors influence decisions made by System 1. Once we acknowledge it, we can either use the automatic responses of System 1 to our advantage or try to engage our System 2.
In What Works, we often make use of students’ automatic responses to encourage behaviour, e.g. by sending text message prompts. However, when it matters, we sometimes try to encourage students to activate System 2. According to the peak-end rule, our memory of an experience does not reflect ‘average’ feelings across a time period; rather, it is biased towards the point of most extreme intensity and the end point. This is particularly pertinent for students who appraise and provide feedback on their experiences at the end of the academic year. Recently, we developed a self-reflection tool, AWARE, which enables students to reflect upon their experiences at King’s at several key points of the year. Not only does this tool encourage students to think carefully about their experiences, engaging the deep processing of System 2 to avoid the aforementioned biases – but at the end of the year they also have a detailed summary of their experiences to counter the error-prone recollections of System 1.
It is important to also think about our own decision-making processes
Kahneman’s work is important for understanding the actions of others, but it also helps us interrogate our own biases. When designing interventions, our automatic System 1 may call on heuristics and encourage us to use strategies we have already adopted successfully in the past. This may lead to biases in research design, or to using familiar interventions in the wrong context. We might also be inclined to jump to conclusions about causality, because our System 1 is prone to the ‘What You See Is All There Is’ bias. This means that we infer the state of the world (causality) based only on the information we can see (two events – our intervention and its outcomes) and overlook alternative explanations. System 1 is quick to attribute cause and effect to whatever is readily available and representative of a causal mechanism (our intervention). Regression to the mean, selection bias, natural changes over time, and so on are some of these less available – but often more likely – explanations. What Works implements research methodologies and analytical strategies to infer true causality – you could say that, in our work, ‘Statistics Trump Causes’.
Kahneman’s research emerged in a context of challenging the prevailing theories and assumptions about how humans form judgements and make decisions. At What Works, we constantly collect data to test our theories and assumptions. We acknowledge that contexts change, and therefore our methods should too. For each project, we create a Theory of Change, which enables us to form testable hypotheses about what we assume should happen based on our underlying theory. This theory is then iterated based on the results we observe – analogous to the development of Kahneman’s Heuristics and Biases Framework.
The core message of this book is often wrongly interpreted as ‘people are irrational’. Kahneman instead explains that we have developed cognitive tools to help us form judgements and make decisions in light of the vast amount of information that surrounds us. Most of the time, these cognitive tools are adequate – but occasionally they lead to biases. Once you are aware of the processes everyone uses when making decisions, you can try to avoid these cognitive pitfalls – both in everyday life and in research. This summary doesn’t really do the work justice, so I would still recommend giving it a read. A decade on, it is still essential reading.
[i] Kahneman, D. (2011). Thinking, fast and slow. Macmillan.