We feel like such conscientious people, but unseen biases derail decisions for leaders every day. Understanding them can help you avoid predictable errors. Here’s the funny part: while we are bad at seeing these biases in ourselves, we have x-ray vision for the same mistakes in colleagues. Leverage this! By training your team on common thinking traps, you can improve decisions immediately.
The Availability Error:
When we evaluate data, we draw on whatever comes to mind readily. The problem is that we don’t pause to ask what we aren’t evaluating. Was your company burned by outsourcing in 2002? You conclude that outsourcing never works.
Ask: On what data do we base this conclusion? Is this all the data? (Tip: anecdotal data doesn’t count.)
The Anchoring Error:
First impressions really do linger. Your first meeting with an employee colors how you see them from then on, even when that meeting was an anomaly. The first price in a negotiation anchors how high or low we think we should go.
Ask: What are all the data points we have about this product, person, or idea? Lop off the first impression; that one doesn’t count.
The Confirmation Error:
This ties to anchoring. We evaluate data based on what we already believe and weed out indicators that oppose our views. We notice data that confirms our assumptions more readily, and we retain it better than contrary information.
Ask: Assume our conclusion is wrong. What is all the evidence for the other side?
The Curse of Knowledge:
It is very difficult to unlearn what we know and accurately judge what makes sense to others. Leaders routinely assume that employees have the same understanding of and background in a topic that they do. This is a major cause of change management flops.
Ask: Gather trusted employees and have them highlight the places that lack enough background. It also helps to err on the side of repeating context.
Blind Spot Bias:
It is normal to believe that we have fewer mental distortions than other people do. This is disastrous when combined with how little accurate feedback leaders receive.
Ask: Request that your team anonymously rate each other on the biases they see in each person. Share the data (and accept that it is true).
These are not character weaknesses but bugs in our mental machinery. Assume you have them, and provide a safe way for team members to pause action when they see one at play.