
Most decision makers tend to think of themselves as well-informed and objective in their approach. They may be surprised to learn, then, that smart people aren’t immune to making bad decisions. In fact, their confidence in their intellect and experience may even leave them more vulnerable to the effects of one of the leading contributors to bad decisions: bias.

Simply put, a bias is a preference for one thing over another, often with little rational justification. Biases are often informed by experience and observation, in ways that can be both helpful and harmful. Cultural beliefs and social attitudes can be powerful sources of bias, but much biased behavior is driven by unconscious mental shortcuts, often without people even realizing what’s happening.

In terms of decision making, biases influence how people select and interpret information, which can have a dramatic effect on the choices they ultimately make. Here are a few strategies for learning to make unbiased decisions.

Slow Down

Biases have the most influence on decisions that are made quickly, without much consideration. As a matter of psychological economy, quick decisions rely on personal experience and preconceived notions about people or situations. The problem is that those shortcuts may not be accurate, and may not apply to the decision at hand. They make for efficient decisions, but there’s no reason to believe they’ll be good ones. Even when time is a factor, it’s better to slow down long enough to carefully consider the relevant information and decide on the basis of that data rather than on instinct.

This form of “snap judgment” bias is especially dangerous in selection or promotion decisions. Many managers confess to relying on their “gut instinct” when considering candidates, sometimes even making up their mind about whether or not to hire someone within the first five minutes of an interview. While initial impressions aren’t entirely meaningless, superficial qualities like “executive hair” should not be the basis for making personnel decisions. Fortunately, strategies like systematic interviews, AI-driven screening tools, and competency models can go a long way toward eliminating bias in the selection process.
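To make the structured approach concrete, here is a minimal sketch of a competency-based interview scorecard in Python. The competencies, weights, and ratings are purely hypothetical; a real rubric would be derived from a job analysis. The point is the mechanism: every interviewer rates every candidate against the same criteria, and comparisons rest on the scores rather than on a five-minute first impression.

```python
# Minimal sketch of a competency-based interview scorecard. The competencies,
# weights, and ratings below are hypothetical; a real rubric would come from
# a job analysis, and ratings would use behaviorally anchored scales.
COMPETENCIES = {"problem_solving": 0.4, "communication": 0.3, "collaboration": 0.3}

def score_candidate(ratings_by_interviewer):
    """Average each interviewer's weighted score; ratings run 1-5 per competency."""
    interviewer_scores = [
        sum(COMPETENCIES[c] * ratings[c] for c in COMPETENCIES)
        for ratings in ratings_by_interviewer
    ]
    return sum(interviewer_scores) / len(interviewer_scores)

# Example: two interviewers rate the same candidate against the same rubric.
candidate_ratings = [
    {"problem_solving": 4, "communication": 3, "collaboration": 5},
    {"problem_solving": 5, "communication": 4, "collaboration": 4},
]
print(round(score_candidate(candidate_ratings), 2))  # -> 4.2
```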

Break the Information Bubble

While information is generally seen as a good antidote to bias, sometimes decision makers don’t look at enough data or the “right” data. Cognitive biases play a huge role here. For example, confirmation bias filters information through pre-existing assumptions, discarding any data that challenges those notions. Another common example is anchoring bias, which gives disproportionate weight to the first piece of information considered and diminishes the relevance of subsequent data. Many organizations fail to take these biases into account and become caught up in an information bubble that only contains data that reinforces existing beliefs.

A recent McKinsey investigation into the film industry found that this form of bias can be extremely costly. While studio executives are relatively good at estimating what a film will cost to produce, they consistently miss the mark when predicting how much it can reasonably be expected to earn. Forecasting box office revenue is challenging under any circumstances, but bias makes it harder still. Executives routinely overestimated revenue because they based their forecasts on a very small sample: rather than developing a rigorous method for comparing similar films, they tended to pick one (and often only one) successful film and use it as a baseline. This caused them to misread the market and greenlight films that went on to lose studios money (and probably cost a few executives their jobs).

To solve this problem, decision makers need to take an outside view that breaks the information bubble. They need to consider all relevant data regardless of whether it conforms to pre-existing assumptions; in fact, contradictory information is often the most valuable, because it forces people to reassess their position. It can also help to actively try to knock down an idea or decision rather than prop it up. If the position holds up under scrutiny, people can have more confidence that it’s as strong as it appears.
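As a rough illustration of the outside view, the sketch below (with entirely made-up revenue figures) contrasts a forecast anchored on a single breakout hit with one drawn from the median of a broader reference class of comparable films. The gap between the two numbers is the cost of the information bubble: the anchor describes the best case, the median the typical one.

```python
# A sketch of the "outside view": forecast a new film's revenue from a broad
# reference class of comparable releases instead of anchoring on one hit.
# All figures are hypothetical, in millions of dollars.
from statistics import median

comparable_revenues = [48, 62, 55, 210, 39, 71, 58, 44]  # 210 is the lone breakout hit

anchored_forecast = max(comparable_revenues)         # "our film will do what the hit did"
outside_view_forecast = median(comparable_revenues)  # the typical outcome for the class

print(f"Anchored on the hit:   ${anchored_forecast}M")
print(f"Outside view (median): ${outside_view_forecast}M")
```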

Don’t Be Influenced by Past Decisions

Many people make decisions on the basis of what’s known as the “gambler’s fallacy”: the mistaken belief that the outcomes of past events change the odds of unrelated events in the future. In a decision-making context, this means people often let their previous decisions influence the current one rather than evaluating each decision on its own merits.

Researchers have found, for instance, that immigration judges are more likely to reject an asylum seeker if they approved the previous case (all other factors being equal), and the effect grows more pronounced with each subsequent approval. Similar patterns have been found among loan officers and baseball umpires. In every case, decisions were based not only on the situation at hand but also on a deeply ingrained sense that something was “due” to happen.
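A quick simulation shows what truly independent decisions look like, using an arbitrary 30% approval rate. When each case really is unrelated to the last, the approval rate immediately after a streak of approvals matches the overall rate; nothing ever becomes “due.” The biased pattern researchers observed is precisely the departure from this baseline.

```python
# Simulation of truly independent decisions with a hypothetical 30% approval
# rate: the chance of approval right after a streak of approvals is the same
# as the overall chance. A streak never makes a rejection "due".
import random

random.seed(0)
APPROVAL_RATE = 0.3  # arbitrary figure, for illustration only
decisions = [random.random() < APPROVAL_RATE for _ in range(1_000_000)]

# Keep only the decisions that immediately follow three approvals in a row.
after_streak = [decisions[i] for i in range(3, len(decisions))
                if decisions[i - 1] and decisions[i - 2] and decisions[i - 3]]

print(f"Overall approval rate:           {sum(decisions) / len(decisions):.3f}")
print(f"Approval rate after 3 approvals: {sum(after_streak) / len(after_streak):.3f}")
# Both come out near 0.300.
```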

As any investor can point out, this is not a sound basis for decision making, especially where financial decisions are concerned. While trends and history are useful for modeling possible outcomes, it’s important to approach each decision as an independent event. Just because an organization has decided against doing something several times in the past doesn’t mean that it’s “ready” to do so now, or that it should “stay the course.” Each decision should be made on the basis of the best available information, independent of the decisions that came before.

Whatever decision is ultimately made, the work isn’t done until it has been implemented and evaluated. Many biases are rooted in perceptions of past events, so it’s important to understand precisely why an initiative succeeded or failed in order to keep new biases from taking hold. Just because a deliberative decision-making process led to a decision that didn’t pan out, for instance, doesn’t mean the process itself was flawed.

Bias is a powerful force in decision-making. Left unaccounted for, it can lead people to ignore vital information and contribute to costly miscalculations. By acknowledging the challenge bias poses to making well-informed decisions, leaders and organizations can take steps to account for its powerful influence.