Dr. Wim De Neys is an experimental psychologist and research director at the Sorbonne University of Paris. He studies the processes underlying human rationality and is interested in the interplay between intuitive and reflective reasoning. In this Reasoning Report, he discusses the mystery of why we find reflective thinking hard, the limits of de-biasing and the importance of being able to justify beliefs.
Intuitions are unreliable
When asked what everyone should know about human reasoning, Wim De Neys does not need time for deliberation. "Everyone should realize intuitions are often unreliable. If you speak to people from politics or business, you will still hear them argue to the contrary, but there is all this research, prompted by the work of Daniel Kahneman, that shows we often make mistakes if we reason intuitively."
Kahneman, Nobel Prize winner and best-selling author, popularized the idea of two cognitive systems that complement each other during reasoning in his book Thinking, Fast and Slow. System 1 is fast and intuitive, while System 2 is slow and deliberative. This so-called dual-systems view is used across the cognitive sciences, including in the psychology of reasoning that is the main focus of De Neys' work.
"Kahneman and others, such as Jonathan Evans, developed their ideas to understand biases in reasoning," explains De Neys. "You can find the idea of intuitive, emotional tendencies that are kept in check by a rational system much earlier, in the work of Sigmund Freud or even Plato. In these views, rationality was needed to display proper behaviour. What psychology later added was the finding that intuitive, quick thinking leads to erroneous biases in reasoning even on very basic logical and mathematical tasks. These tasks are not necessarily hard, but people set aside their ability to deal with them on the basis of their intuitions or stereotypes.
"What is so great about the dual-systems model is that it explains the fact that we could design rockets that took us to the moon and simultaneously explains why we sometimes make reasoning errors. These errors are real, but it still makes sense that we have an intuitive System 1. System 1 is fast, does not use up attentional resources and does not require spare cognitive capacity. It often yields correct responses, even though it can be at odds with certain logical, mathematical or probabilistic principles. If you were to use the reflective System 2 for all thinking, that would be costly, difficult and time-consuming. People prefer to avoid that – they do not like to think hard about everything. And the price for that is that their reasoning becomes prone to biases."
That laziness is probably recognizable to everyone. Yet from a biological viewpoint it is not obvious why reflective thinking should be subjectively hard. Couldn't we have evolved in such a way that we would always prefer reflective thinking over intuitive responding? According to De Neys, this is still an open question. "One suggestion has been that reflective thinking leads to the accumulation of waste products in our brains, perhaps because reflective thinking takes longer or perhaps by its very nature. These waste products can lead to trouble, such as hallucinations, so limiting the time we think reflectively is a way to keep the concentration of waste products low.
"Another line of thought looks at the balance between System 1 and System 2 as an exploration-exploitation problem. If we always used System 2, we would spend much time considering our options instead of exploiting the options in front of us. Seen this way, there needs to be some barrier against always making use of System 2."
Reprogramming our intuitions
Still, it is not as if the intuitive System 1 is destined to reason poorly on logical or mathematical tasks. "The classical idea," De Neys says, "was that good reasoners are very good at activating their System 2 and overruling the conclusions from System 1. But it turned out that those good reasoners – people who do well on tasks featuring logical, mathematical or probabilistic reasoning – actually still respond correctly if they can only use their System 1 to respond. Somehow they have logical intuitions.
"In one experiment we took a group of people, had them perform reasoning tasks and in a brief intervention we explained to them the logic behind things like the ball-and-bat problem and why people often get it wrong. If you do that, people get better – this was already known from experiments elsewhere. Our interest was to see at which level the improvements took place. Does the explanation bring people to think more deeply and engage their System 2 or do they become more like the strong reasoners who can give correct answers using their System 1? It turns out the latter was the case – the improvement came from better intuitive responding.
A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?
The intuitive response of people is to say that the ball costs $0.10, as that's the difference between $1 and $1.10. But if the ball were to cost $0.10, then the bat would cost $1.10 and their total would be $1.20. Some reflective thinking reveals that the ball must cost $0.05.
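The reflective calculation can be written out explicitly. A minimal sketch of the algebra behind the correct answer: let the ball cost x dollars, so the bat costs x + 1.00, and the total gives 2x + 1.00 = 1.10.

```python
# Worked arithmetic for the ball-and-bat problem.
# Let the ball cost x dollars; the bat then costs x + 1.00.
# Total: x + (x + 1.00) = 1.10  =>  2x = 0.10  =>  x = 0.05.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}, total = ${ball + bat:.2f}")
# -> ball = $0.05, bat = $1.05, total = $1.10
```

The intuitive $0.10 answer ignores that the $1.00 difference must be measured against the ball's price, not against the total.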
"The way we study this is by introducing time pressure and adding tasks to the experiments. For example, participants may have to remember a pattern or a series of numbers during the experiment. This requires attention which is then not available to solve the reasoning problem. Such additional tasks, as well as the time pressure, make people report that they are basically gambling when giving the responses, because they hardly have the time to read the task before they need to answer. We take this to mean they had to use their intuitive system to solve the reasoning problem."
Under these conditions, the participants performed better after a short training, which is also called de-biasing. But while de-biasing works on problems like the ball-and-bat problem, more recent work by De Neys has not shown performance increases for syllogistic reasoning, in which System 1 can display a so-called belief bias.
All mammals can walk.
Dogs can walk.
Therefore, dogs are mammals.
This is not valid reasoning, because the proposition that all mammals can walk does not imply that there are no other classes that could walk. Logically speaking, dogs could still belong to a non-mammal class that can walk. However, because we have world knowledge about dogs being mammals, we are tempted to consider the reasoning to be valid. After all, the conclusion fits with our beliefs.
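The invalidity can be made concrete with a counterexample in the same argument form. A toy-world sketch (the example entities are illustrative assumptions): both premises hold, yet the conclusion fails.

```python
# Belief bias in miniature: the form "All M can walk; X can walk;
# therefore X is M" is invalid, as a counterexample shows.
can_walk = {"dog", "cat", "robot"}   # things that can walk in this toy world
mammals = {"dog", "cat"}             # mammals in this toy world

assert mammals <= can_walk           # Premise 1: all mammals can walk (true)
assert "robot" in can_walk           # Premise 2: robots can walk (true)
assert "robot" not in mammals        # Conclusion "robots are mammals" is false
print("Both premises hold, yet the conclusion fails: the form is invalid.")
```

The dog version feels valid only because the conclusion happens to match what we already believe about dogs.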
"What I think was the problem," explains De Neys, "was that even for simple syllogisms, participants had two separate rules to keep track of. They had to look for valid conclusions, even if these conflicted with background knowledge, and they had to look out for invalid conclusions that were actually aligned with background knowledge. Explaining this in a de-biasing training is complicated: you have to explain what valid reasoning is, you have to show that such reasoning can conflict with background knowledge (but not always), and you have to explain invalid reasoning and again show how it can conflict with existing knowledge. So the training intervention became way more complex and took more time than that for ball-and-bat problems.
"I think System 1 is very good at associating one thing with another, but has trouble with conditionals that say 'x leads to y, except if there's also z'. We have done our best to explain the logic in the most simple terms, but we did not succeed in de-biasing. In fact, we hardly saw effects on performance when System 2 could be used, which was in line with studies done before. For comparison, explaining ball-and-bat tasks can give 50% or 60% improvement, but on these syllogism tasks you get a modest 10% improvement."
Distinct neural systems
Since they differ in both processing speed and the types of tasks they can perform, it is tempting to think that System 1 and System 2 correspond to clearly distinguishable structures in the brain. Yet the evidence does not point in that direction, as De Neys explains.
"Neuroimaging studies show that if people respond correctly to classical reasoning tasks, you see the typical prefrontal cortex activation, which ties in with the idea that this is the 'seat of reason'. A lot of demanding tasks lead to this activation, but this of course does not mean prefrontal activation always indicates a demanding task, as specific brain areas can be used for a wide range of cognitive processes. One of our ideas is that the prefrontal cortex is in fact also involved in the intuitive processing of System 1, and we are running neuroimaging studies on people performing reasoning tasks while under time pressure and additional task load.
"Our first results indicate that under these circumstances, reasoning is still processed by prefrontal cortex. So we cannot use this activation to distinguish System 1 and System 2 processing. Perhaps at some deeper level there are differences. Perhaps the two types of reasoning are distinct at the molecular level or perhaps we can recognize reflective reasoning through these waste products mentioned before. But right now we cannot disentangle System 1 and System 2 processing by looking at neural activations."
Explainability as a feature
Despite this neural overlap and despite the fact that System 1 is malleable enough to take over some tasks from System 2, one clear functional difference remains. People who are able to use System 2 can also explain their reasoning, while those depending on System 1 cannot. De Neys: "Even if their intuitive responses are correct, System 1 thinkers have trouble offering a justification for their belief.
"To me this suggests that one of the functions of System 2 is to reflectively arrive at justifications, for use in communication. Reflective reasoning is useful even if problems can be solved solely by intuitive reasoning, because it can help you convince others of your viewpoint. What's interesting is that in some studies, we allowed participants to reflect on their intuitive responses if they wanted to. This did not affect their pay or anything, so in principle there was no incentive to do it, but still people preferred to spend an additional 15 or 20 seconds on a problem, to think it through. Something motivated them to reflect."
During such additional reflection, people's confidence in their answer also increases. One might therefore argue that System 2 is doing metacognitive work: monitoring the thought processes that led to a conclusion in order to determine the quality of reasoning. De Neys, however, does not agree with this view.
"I'd say that System 2 is about justifications and about sharing convictions and arguments, which are not necessarily metacognitive processes. I think metacognition is part of our intuitions, with System 1 monitoring whether activation of System 2 is needed, in order to think more deeply about a problem if fast, intuitive processing is not reliable. It would be a paradox if we needed reflective thinking to determine whether to activate the system of reflective thinking – a better solution would be for the efficient, less cognitively demanding system to determine whether the more demanding System 2 is needed.
"This can be accomplished by having parallel processes within the intuitive system. Some of these might be tilted towards using stereotypes, others might work on the basis of more logical reasoning, and all these different processes can work on the same problem and accumulate evidence for different conclusions. Now if the evidence for one response generated by the intuitive system is much stronger than that for the others, System 1 can determine the proper response. But if the evidence for different conclusions is of comparable strength, this can signal the activation of a new, reflective process that depends on System 2. In this way, the activation of the reflective system depends on the outcomes of all sorts of fine-tuned processes going on in the intuitive system."
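The conflict-monitoring idea can be caricatured in a few lines of code. A minimal sketch, not any published model: the process names, evidence values and threshold are illustrative assumptions, showing only the logic of engaging System 2 when no intuition clearly dominates.

```python
# Illustrative sketch of competing intuitive processes (System 1).
# Each process accumulates evidence for a candidate response; reflective
# System 2 is engaged only when the top two are too close to call.

def needs_system2(evidence: dict, margin: float = 0.3) -> bool:
    """Signal reflective processing when no intuition clearly dominates."""
    ranked = sorted(evidence.values(), reverse=True)
    return len(ranked) > 1 and (ranked[0] - ranked[1]) < margin

# Strong dominance: System 1 answers on its own.
print(needs_system2({"stereotype": 0.9, "logical": 0.2}))   # -> False
# Comparable strength: the conflict triggers the switch to System 2.
print(needs_system2({"stereotype": 0.6, "logical": 0.55}))  # -> True
```

Lowering the `margin` threshold here corresponds to the training idea discussed below: making the switch to reflective thinking more sensitive to intuitive conflict.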
The benefits of limited transfer
Educators are very interested in the development of metacognition. The capability to monitor how well you understand something or how certain you are of an answer can predict positive learning outcomes. Yet educational approaches are often geared towards reflective processes. Are they focusing on the wrong system?
"This is a factor that should receive more attention," says De Neys. "There is probably some metacognition happening in System 2, too – once you have activated your System 2 you can use it to monitor your own thought processes. But there must be monitoring processes in System 1 and these also deserve attention. It could very well be that we can enhance metacognition at the intuitive level. One obvious way would be to change the sensitivity of the switch that activates System 2. If two intuitions are competing, this switch is activated at some critical threshold, which we can perhaps lower by training.
"However, in some situations it might be beneficial if people used their reflective system more often, but the other side of the coin is that a lowered threshold might also prompt deep thinking when it is not useful. This is similar to the pre-bunking work done to combat misinformation. You can train people to be more skeptical of news sources and this makes them less likely to fall for fake news, but it also makes them reject trustworthy sources more often. If you offer heuristics that are supposed to lead to more thoughtful responding, you can also get misfiring.
"Interestingly enough, students prove quite resilient against such training. If heuristics are not helpful to them, they will abandon them after some time. Educators often consider it a problem that training in critical thinking in one context does not generalize well to others, but it might actually be a good thing. It means you can prepare people to think carefully in specific contexts without suffering negative effects elsewhere."
"What I hope to find out in the coming years is how the switch from intuitive to reflective reasoning works. It is a metacognitive mechanism and I think we understand it at a conceptual level, but we need specific computational models to describe how it works. Once we understand switching at a detailed level, this can inform all sorts of new applications, including ways to develop critical thinking skills."
If you'd like to receive interviews like these in your inbox, you can subscribe to The Reasoning Report for free! This also gives you the ability to comment on all posts on Connecting Cells, both intuitively and reflectively.
If you are interested in the scholarly work behind this interview, make sure to read the following recent publications (co-)authored by Wim De Neys:
- Boissin, E., Caparos, S., Voudouri, A., & De Neys, W. (2022). Debiasing system 1: Training favours logical over stereotypical intuiting. Judgment and Decision Making, 17, 646-690.
- Boissin, E., Caparos, S., & De Neys, W. (in press). No easy fix for belief bias during syllogistic reasoning? Journal of Cognitive Psychology.
- De Neys, W. (in press). Advancing theorizing about fast-and-slow thinking. Behavioral and Brain Sciences.
For those interested in the mechanisms underlying effortful thinking:
- Wiehler, A., Branzoli, F., Adanyeguh, I., Mochel, F., & Pessiglione, M. (2022). A neuro-metabolic account of why daylong cognitive work alters the control of economic decisions. Current Biology, 32(16), 3564-3575.