Confirmation bias – ferreting out favourable findings while overlooking opposing observations
*[Cartoon: "I have evidence that confirmation bias is a red flag, so don't try to convince me otherwise."]*
How to recognise this tactic
This is a cognitive bias that we all suffer from. We go out of our way to look for evidence that confirms our ideas and avoid evidence that would contradict them.
The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects.
Francis Bacon, English philosopher and author, 1620
Why do people use this tactic?
Confirmation bias is part of human nature. We feel more comfortable when our opinions are reinforced. Contradictory evidence would upset our equilibrium, so we turn a blind eye.
What’s wrong with this tactic
Unfortunately, confirmation bias leads us into errors of judgement because we ignore real-world evidence.
What to do when confronted by this tactic
The only remedy is to develop the habit of identifying and resisting confirmation bias. People such as scientists, who deal with gathering and interpreting evidence in their work, need to guard against it constantly. Their training should have made them experts at this.
Beware of those who try to persuade you to accept their fringe ideas by pandering to your tendency for confirmation bias. They’ll feed you evidence that you feel comfortable with and withhold anything that contradicts their claims.
Variations and related tactics
Confirmation bias is closely related to experimenter’s bias (also known as expectation bias or research bias), congruence bias (in which scientists perform tests likely to support their model, but neglect to perform those likely to support alternative models) and attentional bias (in which preferred data are selected at the expense of other relevant data). In putting confirmation bias into operation, people use such specific tactics as wishful thinking, single-study syndrome and cherry-picking.
Examples
- Despite the fact that 97% of climate scientists accept the anthropogenic climate disruption model, climate deniers use a variety of tactics to feed their confirmation bias. John Cook describes how they use cherry-picking, conspiracy theories and logical fallacies to bolster their positions.
- Climate deniers regularly imply that climate scientists show confirmation bias by claiming they ignore “natural cycles” or disregard “basic science”. Some of these arguments (linked to their refutations at Skeptical Science) are: It’s the sun; It’s cosmic rays; It’s a natural cycle; Water vapour is a stronger greenhouse gas; Carbon dioxide lags warming; The greenhouse effect contradicts the second law of thermodynamics; Clouds provide negative feedback; The CO2 effect is saturated.
- Here’s a video sequence that explains why British journalist James Delingpole is so prone to confirmation bias in his denial of climate change. Notice how Delingpole embarrassingly slips out of answering the question that makes him feel uncomfortable. His infamous “I’m an interpreter of interpretations” line is obviously code for “I choose to ignore evidence I don’t like.”
- A recent report on the safety of GM crops (here) went out of its way to emphasise studies that claimed to show harmful effects, while de-emphasising others. For example, a 2011 systematic review of animal trials which found that “GM plants are nutritionally equivalent to their non-GM counterparts and can be safely used in food and feed” was dismissed on the basis that it had ignored statistically significant differences. This is despite the fact that the review pointed out “there were no statistically significant differences within parameters observed.”
- The founder of the science of genetics, Gregor Mendel, has been accused of confirmation bias in his seminal work on pea plants. In 1936, R. A. Fisher published a paper accusing Mendel of falsifying data so that they would agree with his expectations. More recent investigations (here and here) disagree with Fisher’s conclusion.
- All kinds of pseudo-scientists, from astrologers to psychics, harness confirmation bias to mislead. They emphasise anything that seems to fit their unfounded claims and conveniently forget to mention anything that disagrees. Here’s a quick video from the Con Academy that illustrates the techniques.
- Here’s an example of blatant confirmation bias from personal trainer Michael Jarosky in the Sydney Morning Herald. Jarosky disagrees with the weight-loss ideas of Michael Mosley. Nothing wrong with that if he’s got a strong argument. But what is his justification?
I disagree with science on this one, and I’ll disagree with anybody demonising any kind of exercise. When it comes to exercise, why do we always want to know about “less”?
I can’t compare my master’s degree in economics with his medical degree, yet some things just don’t require more science. Inherently, we all know what to do when it comes to weight loss and living a healthy life, and exercise is part of that equation.
I don’t have any science to back my claim, but I believe a man should be able to perform 20 push-ups. I believe women should also have a reasonable amount of strength and fitness obtained via exercise. Why? Because it makes life easier from a physical perspective, and vigorous exercise makes a person feel good. Also, there’s nothing wrong with a bit of vanity to tone up for your best pair of jeans and a t-shirt.
…
It’s a statistical fact that we’re getting fatter and fatter, so what is science really doing for us?
I now know why we shouldn’t lean so much on science.
Further reading
Trolling our confirmation bias: one bite and we’re easily sucked in by Will J Grant at The Conversation.
You can add to the list of examples by leaving a comment.
Francis Bacon’s quote is from Aphorism 46, Novum Organum.
The cartoon is from Saturday Morning Breakfast Cereal.
This is one of ScienceOrNot’s Science red flags. See them all here.
I spotted a cognitive bias in my own behaviour the other day and thought I’d tell you about it – regression to the mean fallacy ahoy! I have a severe chronic illness whose behaviour changes from day to day in an essentially random manner (there are longer term trends, but we can ignore them over the day-to-day). Think “bell curve distribution” for “How bad is Jeshyr today”, basically.
I have been attempting to reduce the doses of various medications I take. The dose reduction tends to cause 3-5 days of more severe symptoms just as a result of the change, so whether the new lower dose is effective can’t be judged until 5-7 days after the reduction takes place. Because I know that the change will cause more severe symptoms, I have been waiting until I have a few days of “above average” levels of health and feel like I can cope with the excessive symptoms.
So then a week after the dose change, I’m thinking “how were my symptoms a week ago, compared with now?” As you can see, I’m comparing the above-average health level immediately before the reduction with what’s probably (on average) an “average” level a week later.
Unsurprisingly, this tended to make me think that the dose reduction was having some longer-term bad effect when it actually wasn’t. Having realised what I was doing, I feel like a proper idiot …
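The trap described in this comment can be demonstrated with a quick simulation (a sketch of my own, not part of the original post): if dose changes are only ever started on above-average days, the “before” reading is biased high, so a week later the (purely random) health level looks worse even when the change has no real effect at all.

```python
import random

random.seed(42)

def health_today():
    # Daily health drawn from a bell curve: mean 0, higher = better.
    return random.gauss(0, 1)

trials = 10_000
before, after = [], []
for _ in range(trials):
    day = health_today()
    if day > 1.0:                      # only start a reduction on a good day
        before.append(day)
        after.append(health_today())   # the change itself has zero effect

avg_before = sum(before) / len(before)
avg_after = sum(after) / len(after)
print(f"avg health on start day: {avg_before:.2f}")   # well above the mean (~1.5)
print(f"avg health a week later: {avg_after:.2f}")    # back near the mean (~0)
```

Comparing the two averages makes the harmless dose change look as if it caused a lasting decline, which is exactly the regression-to-the-mean illusion Jeshyr caught.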
Ah yes, we all do it Jeshyr. But the important thing is that we are aware of it, and we compensate, as you did. Unfortunately, there are too many people who are totally unaware of their cognitive biases, and too many others who know how to exploit them.
So very true!
Preferential treatment of faith. Many people will pick out the positive elements of their own religion while focusing on the negative elements of other religions.