Why Talk About Confirmation Bias?
For the past several months, Brand Buddha’s strategy team has been collaborating with a healthcare company to solve a long-standing patient acquisition and adherence problem. Working with EMR data, one expects to sift through mountains of information in search of patterns and, ultimately, answers.
With five strategists on the Brand Buddha team and an equal number on the client side, there was, at the outset, no shortage of opinions about what the data would ultimately tell us as to why the company struggled with deeper engagement, conversion, and adherence.
Yet, once the analysis was underway, patterns inevitably began to emerge.
It is interesting how quickly the data began to reveal the reasons for the company’s troubles. What is more interesting is how the company’s executives continued to cling to their previous explanations with a kung-fu death grip long after the data had shown them to be implausible.
This is, of course, the problem. Confirmation bias. It’s our tendency to bury our heads in the sand and selectively interpret information that confirms our prior beliefs.
Where do your beliefs and opinions come from? If you’re like most people, you honestly believe that your convictions are rational, logical, and impartial, the result of years of experience and objective analysis of the information available to you. In reality, all of us are susceptible to a tricky problem known as confirmation bias: we pay attention to the information that upholds our beliefs while tending to ignore the information that challenges them.
Here is a quick example: in Brand Buddha’s offices, we have an entire floor of creative professionals—designers, artists, writers, etc. There are a lot of left-handed people in our group. Imagine that a person holds a belief that left-handed people are more creative than right-handed people. If a person came to Brand Buddha and walked through our creative departments, they would find a population that is both left-handed and creative, thus reinforcing their belief and assigning greater importance to this “evidence” that supports what they already believe.
I hope you can see where I am going with this.
It is easy to understand how a CEO or marketing officer might stick to strategic plans despite data that overwhelmingly shows those strategies to be suboptimal. The tendency to notice only what fits our beliefs, combined with the propensity to scrutinize ideas more harshly when they run contrary to those beliefs, is a deadly poison for optimal decision making.
Of course, even people trained to identify and mitigate bias are hard-pressed to escape their own predilections. Stand around a coffee machine on a Monday morning with data scientists (don’t kid yourselves, these guys can party), and you’re bound to hear one arguing with a marketing or sales executive saying, “big data is not about the data! It’s about me!” Indeed. Any data operation is only as good as the individual who synthesizes and interprets the data into something useful, something that can drive decisions. This would naturally include the biases of the scientist—those party animals.
So, why are we all so bad at letting go of previously held beliefs? Well, this one is easy.
There are generally two explanations for why we cling to our beliefs:
Challenge avoidance: Discovering you are wrong isn’t fun, so we tend to ignore information that contradicts our beliefs.
Reinforcement seeking: On the other hand, it’s downright lovely to find out you are right. We tend to focus on only one hypothesis, our own, and disregard any alternative ideas.
So, what’s the solution? Start with a blank slate, challenge your data, and try to disprove your own ideas…or, if that seems hard or unrealistic, pay someone else to do it for you—wink, wink.
No hypotheses yet. Treat the initial data-gathering stage as a fact-finding mission without trying to explain any fluctuations you identify. That is, resist the temptation to generate hypotheses immediately, and instead wait until you have reviewed a more complete set of information before considering why the data may differ from expectations.
Time for ideas: The rule of three. If possible, identify three potential causes for each unexpected data fluctuation. Why is three the magic number? Research has shown that data scientists who develop three hypotheses are more likely to correctly identify misstatements than those who develop just one. From a probabilistic standpoint, the more plausible explanations brainstormed, the higher the likelihood that the underlying cause of the data aberration will be unmasked.
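The probabilistic intuition behind the rule of three can be sketched with a toy model (an illustration of the point, not the research cited above): assume the true cause is equally likely to be any one of N plausible causes, and that an analyst’s brainstormed hypotheses are k distinct draws from that set. The chance that the true cause is among them is k/N, so three hypotheses cover three times as much ground as one.

```python
import random

def chance_true_cause_covered(num_causes, num_hypotheses,
                              trials=100_000, seed=42):
    """Estimate the probability that the true cause appears among the
    hypotheses an analyst brainstorms, under the toy assumption that
    hypotheses are distinct, equally likely draws from num_causes
    plausible causes."""
    rng = random.Random(seed)
    causes = list(range(num_causes))
    hits = 0
    for _ in range(trials):
        true_cause = rng.choice(causes)               # nature picks the real cause
        guesses = rng.sample(causes, num_hypotheses)  # analyst brainstorms k distinct ideas
        if true_cause in guesses:
            hits += 1
    return hits / trials

# With 10 equally plausible causes, one hypothesis covers the truth
# about 10% of the time; three hypotheses cover it about 30% (k/N).
print(chance_true_cause_covered(10, 1))
print(chance_true_cause_covered(10, 3))
```

Real hypotheses are of course not equally likely or independent of the data, so treat this only as a back-of-the-envelope argument for casting a wider net.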
Highlight it. When identifying potential causes of an anomaly, take note of the information that triggered the creation of the idea. Present those data to a colleague to see whether he or she formulates a similar hypothesis. If the explanations are different, the exercise has aided you in expanding your hypothesis set. If the explanations are similar, the colleague has strengthened and validated your original ideas.
Take the opposite side of your argument. Once our ideas are formulated, we are predisposed to look for confirmation of them, which imperils the objectivity of the exercise. It is similarly easy to subconsciously ignore contradictory evidence. Both habits lead to faulty judgments. So instead of searching for confirmatory evidence, try to disconfirm your initial suspicions. Such an approach is likely to lead to stronger and more definitive conclusions.
In the end, there isn’t much you can do to fully escape bias in data analysis. It’s how our brain beats back ideas that cause us confusion, conflict, and anxiety. Charles Darwin, aware that bias clouded his judgment, wrote down every contradiction to his thinking that he encountered, expecting that if he documented it, he could confront it. While this is unrealistic in practice for most of us, a similar system for strategic decision-making should be considered and is, at the very least, a sound starting point before diving into a data-driven operation.