
Confirmation Bias: The Most Human of Tendencies

Confirmation bias is the tendency to notice evidence that supports our beliefs, preconceptions, and hypotheses, and to miss, ignore, or dismiss evidence that contradicts them. Instead of trying to falsify a hypothesis, we tend to try to confirm it. It’s a human thing to do.

The term confirmation bias was coined in the 1960s by the English psychologist Peter Wason as a result of an experiment designed to examine how people test hypotheses. The experiment was ingenious. (If you despise logic or number puzzles, skip the next paragraph.)

Wason gave participants a series of three numbers, such as 2-4-6, and asked them to identify the rule that described the sequence by offering other three-number sequences they believed followed the rule. In response, participants offered sequences such as 4-8-10, 6-8-12, and 20-22-24, all of which, they were told, followed the rule. Most participants concluded that the rule was an ascending sequence of even numbers. They were wrong.

Can you think of other sets of numbers that might tease out the rule? (Suggestions at the end.)

What participants failed to do was to offer sequences that might falsify their hypothesis because, according to Wason, people don’t want to break their own rules. Other possible explanations include wishful thinking and the limited human capacity to process information.

But does this bias have any practical relevance? Definitely, because confirmation bias can cause overconfidence in your beliefs, leading you to maintain them in the face of contrary evidence.

For example, journalists aiming to tell a certain story must avoid the tendency to ignore evidence to the contrary. The same applies to decision making in military, political, and organizational contexts, all of which are susceptible to the confirmation bias. In fact, this bias is a popular topic in executive education, since people making decisions often don’t fully analyze evidence that contradicts their preconceived notions of a situation.

Even scientists can fall victim. Scientists seeking to prove a hypothesis must take care to design experiments that allow for the possibility of alternate outcomes. Science is full of examples of great scientists, including Galileo and Einstein, who tenaciously clung to theories long after contradictory evidence had become overwhelming.

Interestingly, conspiracy theories are prime examples of the confirmation bias. If you’re convinced man didn’t really set foot on the moon, you’ll find plenty of evidence to support that belief, and you’ll ignore evidence that disproves it.

To avoid confirmation bias (no simple matter, by the way), allow for the possibility that what you believe might be wrong. Be willing to ask, “Am I wrong?” or, even better, “How am I wrong?” To get honest feedback in work settings, don’t ask, “How did I do?” but “What could I have done better?” Constantly seek evidence that counters your views. When you do research, don’t look just for information that supports your beliefs; make a point of looking for information that refutes them.

As for three-number sequences that might challenge the “rule” of ascending even numbers, did you consider sequences such as 3-2-1 or 5-7-9 or 2-8-11? Such sequences would have led to the actual rule in Wason’s experiment: the sequence was simply any three ascending numbers.
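If you’d like to play with the puzzle yourself, here is a minimal Python sketch of my own (an illustration, not part of Wason’s study) that assumes his actual rule, any three strictly ascending numbers, and compares it against a typical participant’s guess of ascending even numbers:

def follows_rule(triple):
    # The hidden rule assumed here: any three strictly ascending numbers.
    a, b, c = triple
    return a < b < c

def participant_hypothesis(triple):
    # A typical participant's guess: three ascending even numbers.
    a, b, c = triple
    return a < b < c and all(n % 2 == 0 for n in triple)

# Confirming tests fit the guess, so they can never refute it.
confirming = [(4, 8, 10), (6, 8, 12), (20, 22, 24)]

# Falsifying tests deliberately break the guess; only these can expose it.
falsifying = [(5, 7, 9), (2, 8, 11), (3, 2, 1)]

for triple in confirming + falsifying:
    guessed = participant_hypothesis(triple)
    actual = follows_rule(triple)
    note = "  <-- the guess is too narrow" if guessed != actual else ""
    print(f"{triple}: guess says {guessed}, rule says {actual}{note}")

Run it and the three confirming sequences agree with the guess, telling you nothing new, while 5-7-9 and 2-8-11 satisfy the real rule even though the guess rejects them. That mismatch is exactly the information confirming tests can never provide.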

