
De-biasing intuitions


Intuition has a bad reputation. It is prone to all sorts of biases and mistakes, and is said to be poor at handling novel problems. That is why critical thinking training often emphasizes deliberation and reflection: you can’t always trust your intuition (despite claims to the contrary), so when the stakes are high, you’d better take the time to step out of intuitive mode. However, some recent evidence suggests that sometimes a little bit of training can render intuition more trustworthy.

One popular model of human reasoning posits that we have two separate systems for thinking: a “System 1” that is intuitive, fast, and biased, and a “System 2” that is deliberative, slow, and rational. This so-called dual-systems theory was popularized by Daniel Kahneman in his best-seller Thinking, Fast and Slow (Kahneman, 2011), and while not without its critics (see, e.g., Grayot, 2020), it remains popular among behavioural economists and psychologists of reasoning.

For example, performance on the following reasoning problem (Frederick, 2005) is often explained via the fast System 1 and the slow System 2:

A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?

If you answer this quickly and if you are like most people, you’ll say that the ball costs $0.10. However, on reflection that answer doesn’t work: if the ball costs $0.10, the bat must cost $1.10 and their total cost would be $1.20, not $1.10. If you take some more time to work on the problem, you might correctly conclude that the ball must cost $0.05, so that the bat costs $1.05 and their total becomes $1.10.

I have to admit that personally, despite having seen the problem many times before, I still consciously deliberate to solve it. I turn the problem into a linear equation, with x being the cost of the ball, and then proceed to solve it. In dual-systems speak, that means I am always attacking this problem with System 2: deliberative, slow and — at least according to proponents of dual-systems theory — using extra cognitive resources. In contrast, quick responders are said to be using System 1, which is apparently biased towards a specific, erroneous answer.
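That deliberate route is easy to spell out. A minimal sketch of the algebra in Python (the variable names are mine, not from any of the cited studies):

```python
# Deliberate (System 2) route: let `ball` be the ball's price in dollars.
#   ball + bat = 1.10   and   bat = ball + 1.00
# Substituting the second equation into the first:
#   ball + (ball + 1.00) = 1.10  =>  2 * ball = 0.10  =>  ball = 0.05
total = 1.10
difference = 1.00
ball = (total - difference) / 2
bat = ball + difference
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}, total = ${ball + bat:.2f}")
```

The intuitive answer of $0.10 fails exactly this substitution check: it makes the total $1.20, not $1.10.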

Or is it? If reasoners are repeatedly tasked with ball-and-bat-like problems in such a way that they first have to give intuitive responses, and then deliberated responses, most participants continue to apply faulty reasoning. However, some of the participants show a learning effect and start applying sound reasoning after deliberation (Raoelison & De Neys, 2019). Moreover (and in apparent contrast to me), these participants quickly automatize this sound reasoning and start showing correct intuitive responses. In fact, some people already start out with correct intuitive responses to ball-and-bat-like problems (Bago & De Neys, 2019). So apparently, there’s more to reasoning on this task than just a slow, rational system that needs to catch up to a fast, error-prone system!

To explore this further, a recent experiment added a brief training session to an experiment in which participants gave intuitive responses, followed by deliberated responses to ball-and-bat-like problems. The idea was that this training session would increase learning of the task, so that the researchers could investigate whether this learning improved the quality of both intuitive and deliberative reasoning (Boissin et al., 2021).

It turned out that the training was effective. A majority of initially biased participants started giving correct responses, even during intuitive trials, and this effect lasted for up to two months. The effect was strongest for participants who had realized before the training that their intuitive responses were questionable, suggesting that people who tend to reflect more on their own thinking are also more likely to benefit from training. Unfortunately, the effect did not generalize to other tasks that are susceptible to erroneous intuitive reasoning: the training specifically boosted performance on ball-and-bat-like problems.

Does this mean that System 1 can be trained and de-biased? The authors of the study certainly argue so, but I can’t help but doubt whether their conclusion should be fit into dual-systems theory in the first place. A System 1 that is flexible and susceptible to logical arguments is not much of a System 1 to begin with, as its lack of rational potential has been a central feature ever since the early formulations of dual-systems or dual-process theory (Epstein, 1994).

In any case, Boissin et al. show that we do not need to be completely pessimistic about fast, intuitive judgments. Although their intervention had domain-specific effects, their work suggests that in principle, the heuristics employed in intuitive reasoning are susceptible to change. This means that we may be able to train people to avoid specific reasoning pitfalls without requiring them to do this in a deliberative, reflective and therefore time-consuming mode.


Bago, B., & De Neys, W. (2019). The smart System 1: Evidence for the intuitive nature of correct responding on the bat-and-ball problem. Thinking & Reasoning, 25(3), 257-299.

Boissin, E., Caparos, S., Raoelison, M., & De Neys, W. (2021). From bias to sound intuiting: Boosting correct intuitive reasoning. Cognition, 211, 104645.

Epstein, S. (1994). Integration of the cognitive and the psychodynamic unconscious. American Psychologist, 49(8), 709.

Frederick, S. (2005). Cognitive reflection and decision making. Journal of Economic Perspectives, 19(4), 25-42.

Grayot, J. D. (2020). Dual process theories in behavioral economics and neuroeconomics: a critical review. Review of Philosophy and Psychology, 11(1), 105-136.

Kahneman, D. (2011). Thinking, fast and slow. Macmillan.

Raoelison, M., & De Neys, W. (2019). Do we de-bias ourselves?: The impact of repeated presentation on the bat-and-ball problem. Judgment and Decision Making, 14(2), 170.