
Confirmation Bias: I See What I Believe


Picture this real psychology experiment: Participants were told they were testing whether drinking alcohol reduces social anxiety. They were randomly split into two groups. Those in the first group were told they were drinking a spiked beverage and then asked how they felt as they headed into a discussion with strangers. The second group served as a control; those participants were told they had been served a non-alcoholic drink. They, too, headed off to mingle with strangers.

Group A (we’ll call them “The Booze Team”) reported that the alcohol did make them relax. Group B (“The Abstainers”) said they felt their normal level of social anxiety.

Here is the kicker: The drinks were secretly swapped. In other words, The Booze Team drank fruit punch and reported feeling calm, while The Abstainers drank alcohol and did not feel calm.

I reported this experiment to a friend and asked what he thought. He said, “I signed up for the wrong studies in college.”

How can this be? Henry David Thoreau figured out this cognitive bias in the 1800s. He said, “We hear and apprehend only what we already half know.”

Confirmation bias is a human quirk that makes us listen to, remember, see, and find credible the information that agrees with our pre-existing beliefs. The tricky part is that this happens below conscious awareness and at a speed we never notice. It is particularly insidious in organizations whose cultures limit contrary opinions. Let’s face it: most leaders rarely get enough feedback anyway, and their tendency to side with confirming data makes honest input even harder to come by.

This leads to a second cognitive bias that Nobel laureate Daniel Kahneman calls “WYSIATI,” short for “what you see is all there is.” People and groups tend to base decisions on the confirming data in front of them without challenging themselves to ask whether the salient events or anecdotes are the whole story.

The workaround for our wiring is to assume the tendency will occur. Plan for the fact that a strongly held belief will color which data gets considered. Choose a team member to play devil’s advocate, and rotate the role for professional development. As leaders, we can also hold back from announcing our own view until later in a conversation. Confirmation bias undermines accurate appraisal and good decisions. We can work with this together.
