The confirmation bias is one of the most famous cognitive biases. Familiar to most people interested in psychology, and to large segments of the public beyond, the term describes our tendency to seek evidence that confirms our beliefs rather than evidence that refutes them.
If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. - Nickerson (1998)
At a time when political polarization is escalating in numerous Western countries, especially in the United States, the importance of the confirmation bias becomes all the more significant. It seems to be a leading cause of heightened political tensions, as individuals increasingly choose to isolate themselves in echo chambers, where they're exposed solely to opinions and information that reinforce their pre-existing views.
Awareness of the confirmation bias is by no means a recent revelation. Four centuries ago, in 1620, Francis Bacon was already describing it as the tendency to collect information that supports our existing opinions.
The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. - Francis Bacon (1620)
Considering its pervasive influence and effects on our judgments and decisions, it's hardly surprising that the confirmation bias is one of the most frequently cited biases requiring remedial actions. Stressing the importance of this bias, Lilienfeld et al. (2009) asserted, “research on combating extreme confirmation bias should be among psychological science’s most pressing priorities.”
Fortunately, thanks to behavioural sciences we know that instead of looking for confirmation, the way we should look for information is…
Wait… what is the right way of looking for information, by the way? Somewhat surprisingly, while looking for confirmatory information is described as a “bias”, behavioural textbooks typically do not describe what the “right” way of looking for information would actually be. Indeed, this is a non-trivial question.
The problem with “balance”
Criticism of the confirmation bias typically comes hand in hand with the idea that we should balance our search for information with contrarian sources to fight our preconceptions. But to what extent is this true?
Imagine reading a news report that claims there's proof of the existence of UFOs, supposedly found by the US Army. To verify this claim, which website would you check: NASA or a UFO enthusiasts' site? Like most people, you probably trust NASA more. So, you might rely on it to check such astonishing news about UFOs. But aren't you then falling precisely into the confirmation bias trap? By choosing a source likely to dismiss these news reports, you'll presumably confirm your belief that UFO stories are fabrications.
Shouldn't you seek balance by also examining the strongest arguments put forward by UFO enthusiasts? Well, perhaps not, and your initial instinct to only check the NASA website is probably correct. If you are confident that a contrarian information source is highly unlikely to provide accurate information, paying attention to it could simply waste your time. With limited time and resources to look for information, it's impractical to look at sources that are unlikely to offer anything of interest.
But if you think that the claim above is trivial because, obviously, UFO enthusiasts are not very credible, you are missing the point. The fact is that this reasoning applies symmetrically to UFO enthusiasts as well. If they believe that the government is likely engaged in covering up the presence of aliens on Earth, then it does not make sense for them to check the NASA website; they should instead check the UFO websites they trust to report any piece of evidence out there.
From this perspective, the tendency of most people, generally sceptical about UFOs, to verify such claims via the NASA website doesn't necessarily make them more rational than UFO enthusiasts who pursue information from websites dedicated to UFO phenomena. Undoubtedly, at least one of these groups is likely operating under misconceptions. However, considering the beliefs held by each group, their respective approaches to seeking information make sense.
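A back-of-the-envelope value-of-information calculation can make this concrete. The sketch below is my own illustration, not from the post: it models a source by the probabilities (by your lights) that it reports supporting evidence when the claim is true and when it is false, then computes your expected error after acting on the Bayesian posterior. All the numbers are hypothetical.

```python
def expected_error(p, s, f):
    """Expected probability of a wrong final judgment after consulting a
    source with sensitivity s (reports 'evidence' when the claim is true)
    and false-positive rate f (reports 'evidence' when it is false),
    assuming you then act on the Bayesian posterior."""
    p_pos = p * s + (1 - p) * f                   # chance the source reports evidence
    p_neg = 1 - p_pos
    q_pos = p * s / p_pos if p_pos > 0 else p     # posterior after a positive report
    q_neg = p * (1 - s) / p_neg if p_neg > 0 else p  # posterior after silence/denial
    # You believe whichever state is more likely, so your error is min(q, 1-q).
    return p_pos * min(q_pos, 1 - q_pos) + p_neg * min(q_neg, 1 - q_neg)

p = 0.02  # your prior that the UFO claim is true
print(expected_error(p, s=0.99, f=0.001))  # a source you consider highly reliable
print(expected_error(p, s=0.99, f=0.6))    # a source you consider prone to false alarms
```

With a 2% prior, a report from the noisy source never moves your posterior past 50%, so consulting it leaves your expected error exactly at your prior; a report from the source you consider reliable can flip your decision and cuts the expected error substantially. Given your beliefs, only one of the two sources is worth your attention.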
A key intuition here is that there is a limit to the degree of balance you would want to maintain in your information sources. This notion might initially seem unusual, but it's not particularly controversial. It's widely agreed upon that an excessive pursuit of balance can result in an undue focus on information sources that are generally perceived as unlikely to provide valuable insights.
The media sphere provides a useful example of this point. If there is one news source that has built its brand on balance, it is the BBC. The BBC's balanced approach to political reporting garners wide praise, perhaps contributing to its status as one of the most trusted sources of information, even in the USA. However, it has also faced criticism for dedicating excessive airtime to seemingly improbable perspectives. This aspiration for impartiality often results in accusations of promoting a “false balance”, where an overly strict commitment to neutrality risks lending disproportionate exposure to marginal opinions. The BBC's own guidelines recognise that balance has its boundaries:
Impartiality does not necessarily require the range of perspectives or opinions to be covered in equal proportions either across our output as a whole, or within a single programme, webpage or item. Instead, we should seek to achieve ‘due weight’. For example, minority views should not necessarily be given similar prominence or weight to those with more support or to the prevailing consensus. BBC Editorial guidelines (my emphasis)
Ultimately, this concept of 'due weight' must rely on the judgment of editors, who determine the credibility of various positions based on what they perceive to be reasonable beliefs. If a particular viewpoint seems improbable, it may not be worthwhile to broadcast it. In essence, it's inevitable that some form of selection will occur against potential information sources that conflict with our existing beliefs.
The same line of thinking explains why people within organisations can become frustrated with those who persistently question widely accepted ideas. Aside from the dynamics of internal politics, such opposing views might be seen as a waste of time in light of the beliefs held by the majority. So, the process of gathering information might have a bias towards supporting existing views and a tendency to dismiss challenging standpoints seen as not worthy of significant consideration. Suen has made this point in a formal model of a decision-maker rationally choosing which type of advisor to listen to:
A neutral information source is not the optimal choice in general. Individuals demand information not for its own sake but for improving their decisions, which are governed by preferences and beliefs… the optimal advisor is one who shares the same preferences and prior beliefs as the decision maker. - Suen (2004)
What is the best way to acquire information?
Looking for sources of information likely to confirm our beliefs is therefore not such a bad thing. This point was already made by Klayman and Ha many years ago when discussing the confirmation bias:
Many phenomena labeled "confirmation bias" are better understood in terms of a general positive test strategy. With this strategy, there is a tendency to test cases that are expected (or known) to have the property of interest rather than those expected (or known) to lack that property. This strategy… can be a very good heuristic for determining the truth or falsity of a hypothesis under realistic conditions. - Klayman and Ha (1987)
If we could go beyond heuristics (rules of thumb) what would be the optimal way of acquiring information? The economist Weijie Zhong recently provided an answer to this question in a fairly general setting.1 He envisages a situation where an individual needs to make a decision and seeks to gather some information beforehand. This individual can select any type of informative signal, which can be received continuously over time. However, there are two constraints: acquiring more informative signals incurs higher costs and the time spent collecting information also comes at a cost (the individual prefers to make a decision sooner rather than later).
Zhong shows that, in that setting, the optimal source of information is confirmatory! That is, the individual would opt for an information source that tends to send strong confirmatory signals from time to time (formally a Poisson process). These signals would be potent enough to instil confidence in the individual to make a decision aligned with his/her pre-existing beliefs. However, if the individual doesn't witness the occurrence of such a signal, his/her confidence in these prior beliefs would gradually wane.2
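To illustrate the intuition, here is a minimal simulation sketch of these belief dynamics. This is my own stylised toy model, not Zhong's actual (far more general) framework: it assumes a perfectly revealing confirmatory signal that arrives as a Poisson process only when the favoured state is true, so that silence gradually erodes confidence via Bayes' rule.

```python
import math
import random

def belief_no_signal(p0, lam, t):
    """Posterior belief in the favoured state after waiting time t with no
    signal, when confirmatory signals arrive at Poisson rate lam only if
    the favoured state is true (silence is mild evidence against it)."""
    num = p0 * math.exp(-lam * t)
    return num / (num + (1 - p0))

def simulate(p0=0.7, lam=0.5, threshold=0.95, dt=0.01, seed=1):
    """One run: watch a confirmatory source; decide for the favoured state
    the moment a signal arrives, and against it once prolonged silence has
    pushed the belief below 1 - threshold."""
    rng = random.Random(seed)
    favoured_true = rng.random() < p0   # draw the true state from the prior
    t = 0.0
    while True:
        if favoured_true and rng.random() < lam * dt:
            return "favoured", t        # strong confirmatory signal: decide at once
        t += dt
        if belief_no_signal(p0, lam, t) < 1 - threshold:
            return "other", t           # no signal for long enough: change your mind
```

Running this across many seeds shows the two paths described above: decisions in favour of the prior belief happen abruptly, triggered by a confirmatory signal, while decisions against it happen slowly, as confidence drains away in the absence of one.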
Here is a very simple example to illustrate the intuition behind this result. Let's assume a politician, X, is accused of fraudulent behaviour and a legal proceeding is underway. How should you decide on your sources of information to determine whether X is guilty or innocent? Zhong’s result indicates that if you initially believe that X is likely guilty, you'll want to follow news sources—be they TV channels or newspapers—that will immediately report any decisive evidence against X. Because of your preliminary belief that X is probably guilty, this strategy represents the most effective allocation of your attention. However, it's worth noting that if you don't see such negative news about X in these critical sources, your beliefs will gradually shift towards X's innocence.
This logic applies equally if you initially believe X is likely innocent. In this scenario, you would follow news sources likely to promptly report any information indicating that the legal case is unfounded. In essence, both proponents and opponents of X will gravitate towards their partisan information sources. But this is not indicative of bias. Rather, given their initial beliefs, it's the most efficient way for them to allocate their attention!
To sum up, the psychological and behavioural economics literature has labelled the search for confirmatory information as a “bias”. But this characterisation has lacked a foundation in a precise understanding of what the correct method of information acquisition would actually be.3 When information is costly, the optimal strategy involves seeking confirmation of what you already believe to be true. The confirmation "bias" is, in reality, the solution to the problem of how to allocate our attention between different news sources, given our beliefs.
Does this suggest that people are always correct, or that there are no biases in gathering and interpreting information? Not at all. However, it might be that the issues arise from the divergent opinions people hold initially. Factors such as wishful thinking, self-deception, and motivated reasoning could cause people to harbour views that stray too far from factual evidence.4 But these views may not be driven by a “confirmation bias” and, given these beliefs, seeking confirmatory evidence isn't then a bias in itself.
In the end, the history of the confirmation bias exemplifies how advancements in economic theory can lend clarity to certain "biases." Through a deeper understanding of the incentives and constraints we face in the real world, we can discern that behaviours, which at first appear perplexing, can indeed have sound explanations.
This post is the second of three instalments that show how iconic “biases” turned out not to be biases. The first post was on the hot hand fallacy. The next and final post will be on reference dependence, or the sensitivity to gains and losses, one of the most important results in behavioural economics. To get notifications of future posts you can subscribe. The content of this Substack is free.
References
Bacon, F., 1960. The new organon and related writings.
Che, Y.K. and Mierendorff, K., 2019. Optimal dynamic allocation of attention. American Economic Review, 109(8), pp.2993-3029.
Goette, L., Han, H.J. and Leung, B.T.K., 2020. Information overload and confirmation bias.
Griggs, R.A. and Cox, J.R., 1982. The elusive thematic-materials effect in Wason’s selection task. British Journal of Psychology, 73, pp.407-420.
Klayman, J. and Ha, Y.W., 1987. Confirmation, disconfirmation, and information in hypothesis testing. Psychological review, 94(2), p.211.
Klein, G. 2019. The Curious Case of Confirmation Bias, Psychology Today.
Lilienfeld, S.O., Ammirati, R. and Landfield, K., 2009. Giving debiasing away: Can psychological research on correcting cognitive errors promote human welfare?. Perspectives on psychological science, 4(4), pp.390-398.
Nickerson, R.S., 1998. Confirmation bias: A ubiquitous phenomenon in many guises. Review of general psychology, 2(2), pp.175-220.
Suen, W., 2004. The self‐perpetuation of biased beliefs. The Economic Journal, 114(495), pp.377-396.
Wason, P.C., 1968. Reasoning about a rule. Quarterly Journal of Experimental Psychology, 20(3), pp.273-281.
Zhong, W., 2022. Optimal dynamic information acquisition. Econometrica, 90(4), pp.1537-1582.
1. Zhong is not the only one to find that confirmatory acquisition of information emerges from the cost of acquiring information. The interested reader can also check Che and Mierendorff (2019) and Goette et al. (2020).
2. The figure below describes the evolution of the beliefs of a decision-maker in favour of the state of the world initially seen as more likely, for instance “X is guilty”, when the source of information is chosen optimally. The person waits for informative signals confirming his/her prior belief. These signals should be strong enough for the person’s belief to cross the threshold of confidence required to make a decision, for instance making up one’s mind about X. If no signal arrives, the decision-maker’s confidence in the initial belief progressively decreases.
3. There could well be a confirmation bias in the sense that people look for confirmatory information excessively relative to the optimal solution. But such a case would have to be made from observations comparing the model’s predictions with actual behaviour.
4. The reasons for these views will be discussed in another post.