
Sacha Altay: "What we need is intellectual humility"

Trump supporters stand on a U.S. Capitol Police armored vehicle during the attack on the Capitol on Jan. 6, 2021. There are increased concerns that misinformation is destabilizing democracies. BILL CLARK/CQ-ROLL CALL, INC. VIA GETTY IMAGES.

In this Reasoning Report, Sacha Altay discusses the limits of fact-checking, the double-edged sword that is open-mindedness, and the importance of intellectual humility. Sacha completed his PhD at the École Normale Supérieure and now works as a postdoctoral research fellow at Oxford University, where he studies the spread of false beliefs.

The following is not a verbatim transcript, but rather a summary of a conversation held with Sacha Altay. It has been rendered as a dialogue for ease of reading. The summary broadly reflects his views and has been checked (by him) for accuracy. No misinformation here.

There are widespread concerns about misinformation, disinformation and conspiratorial ideation. One response has been to say that we need to teach people critical thinking skills. Yet in your work on misinformation you state that we do not, in fact, need critical people so much. What do you mean by that?

There's nothing wrong with critical thinking, but the teaching often focuses on fallacies and cognitive biases. I believe critical thinking should be more about helping people decide who to trust rather than what to believe. Conspiracy believers would laugh out loud at most introductory classes on argumentation – they know that stuff! It's just that they work from entirely different premises, and so they draw different conclusions.

Their starting point is that authoritative mainstream sources cannot be trusted. Conspiracy believers are not too credulous, but rather too distrustful. To limit the spread of misinformation, you would need to teach people which sources are reliable and why, instead of teaching them how to evaluate information. You can't expect everyone to fact-check each bit of information. In fact, such insistent fact-checking is precisely what conspiracy believers do.

There is overall too little appreciation of how well collective intelligence works. Wikipedia is a great source for most things. Science, although not perfect, has the right incentives in place to generate knowledge, because scientists can make a career out of disproving a misconception. Do alternatives look better? Hell no. It would be good to teach people how collective systems of generating knowledge work.

There is a paradox here, it seems. Conspiracy believers are reluctant to trust reputable sources, but they do not have the same scepticism when it comes to alternative media. Are they sceptical and gullible at the same time?

One hypothesis close to my heart is that conspiracy believers are just trying to distinguish themselves by not following the herd, and by relying on individual learning ("doing your own research") in opposition to social learning (deferring to science, etc.). They see expert consensus as a sign of groupthink and so they admire individuals who go against the grain. But the key factor predicting susceptibility to misinformation is probably low trust in institutions. Alternative media define themselves in opposition to the establishment, and attract people with low trust.

Thing is, there can be good reasons not to trust institutions. In France, in the early stages of the COVID-19 crisis, it was not clear whether there were enough face masks. So the government claimed that face masks do not matter anyway. If governments treat people as if they are stupid, if they do not tell the whole truth, fail to communicate about uncertainty or refuse to explain their decisions, they take a big risk. Dishonesty can work in the short term, but in the long run it erodes trust. And in the absence of trust, alternative sources step in.

Could education help here?

To contain the spread of misinformation, what we need is intellectual humility. Everyone should realize they hardly know anything about almost everything. I built a chatbot that delivered information on vaccination and I learnt a lot about the topic. Am I now an expert on vaccines? No, that would require much more work and experience. And so on vaccination, I defer to the experts.

Experts can be wrong, but I am more likely than they are to be wrong on their topic. And if experts as a group are wrong, at some point one of them will stand up to improve collective knowledge. On the topic of misinformation, I sometimes go against the herd, but only because my expertise allows me to do so. When I talk to conspiracy theorists, they are puzzled that I don't do this in other cases. 'Just look at the evidence,' they say, and I respond: 'No, I am looking at the experts interpreting the evidence.'

An interdisciplinarian would argue that you also need the outside view. Intellectual courage to go with intellectual humility.

Interdisciplinary work is extremely hard. It's not just that you don't speak the same language; you have different methodologies and standards, too. For instance, neuroscientists studying misinformation with fMRI often have very small sample sizes, because such studies are very costly to run. To me, their samples are then too small to draw any meaningful conclusions, but in their field they have no choice. Political scientists have concerns if your subject pool is not politically balanced, while psychologists care less about that. You can have the intellectual courage to comment on work from a discipline other than your own, but that is only valuable if you are also an expert on the topic in question.

In the Netherlands, there have been calls to open up science and collaborate with society at large. One argument is that this could restore trust in the scientific enterprise. You don’t believe that would work?

Participatory science is great, with people counting birds or gathering data in other ways. And I can see that for the social sciences, the views of non-scientists can be valuable. But what do you do if you plan to build a nuclear reactor? Sure, everyone should weigh in on the decision whether or not to build it, but do you want non-engineers to give their opinion on how to build it? Or do you say: probably the experts know best?

One idea that takes a central role in your work is reputation management. What people say or do not say affects their standing, and they take this into account. Can reputation be used to push people away from false beliefs?

The evidence suggests this won’t work directly. In the lab, you can indeed manipulate people’s online sharing behaviour by emphasizing that sharing false news hurts their reputation. They will share less fake news that way, but also less accurate news! This is despite findings that people are generally good at distinguishing fake news from real news.

In real settings, the difficulty with reputation-based interventions is that people are very good at perceiving the actual incentives around them, whether at school, in the workplace or elsewhere. You can nudge people all you want, but if the real incentives are not about accuracy, people will not adapt their behaviour.

Sometimes these reputational incentives to be accurate can have a real impact because they change the incentive structure. For instance, fact-checking politicians and exposing their inaccuracies will affect their reputation, and thus reduce the likelihood that they repeat inaccuracies. Prominent actors are big drivers of misinformation, so targeting their behaviour has real impact.

One might organize a debate with a big buzzer: if you suspect that one of your opponent's claims is false, you push the buzzer and have the claim fact-checked by an independent party. My prediction is that this would deter false claims.

Journalists try to be a buzzer like that. They write fact-checking and debunking sections. Do these work?

It turns out these sections are hardly read by the people who fall for misinformation, and those who do read them often have not seen the misinformation in the first place. So in that sense they probably don't help that much, beyond possibly deterring prominent people.

Still, I like it when fact-checking articles do not focus so much on the false stuff but more on explaining interesting true things about the world. Even if fact-checking only focused on the truth, I suspect it would still make people more resilient to the false stuff. For instance, if you understand how mRNA vaccines work, the false claim that they change our DNA will not appear plausible at all.

You have made the case that instead of false information, we should focus on misleading information. How would that work?

Misinformation often works by taking things out of context, or by subtly changing some details to make a claim misleading. These cases are harder to fact-check; it requires a lot of work to point out poor framing. Fact-checkers often focus on the things that are easy to fact-check, but these may not be the most impactful forms of misinformation.

Besides their focus on fallacies and biases, critical thinking educators also talk about boosting actively open-minded thinking. Seeing multiple perspectives would be key to good judgment. Do you have thoughts on this view?

What if you only trust scientists and reputable media, but then you're prompted to be more open-minded? That might turn out to be counterproductive. Perhaps reflective open-mindedness could help, which is a thoughtful way of considering the viewpoints of others. By seeing the values and convictions of others, and by realizing that opposing groups do not necessarily hate you, open-mindedness could reduce polarization and consequently misinformation.

Personally, I am not very interested in leaving my bubble of following the scientific consensus. Perhaps that is not open-minded, but if you follow the progress of science you will still be changing your beliefs all the time.


Want interviews like this in your inbox? The Reasoning Report is sent out to subscribers on a monthly basis. Subscribers can also comment on all website posts. And all this for the amazing price of nothing! Next month, we'll hear from Federica Russo, philosopher of science, technology and information at the University of Amsterdam.


Further reading

The following work by Sacha Altay is definitely worth a read if you are interested in the spread of misinformation and the attempts to counter it:

Altay, S. (2022). How effective are interventions against misinformation? PsyArXiv. https://doi.org/10.31234/osf.io/sm3vk

Altay, S., & Acerbi, A. (2023). People believe misinformation is a threat because they assume others are gullible. New Media & Society. https://doi.org/10.1177/14614448231153379

Altay, S., Berriche, M., & Acerbi, A. (2023). Misinformation on misinformation: Conceptual and methodological challenges. Social Media + Society, 9(1).

Altay, S., Hacquin, A.-S., & Mercier, H. (2022). Why do so few people share fake news? It hurts their reputation. New Media & Society, 24(6).