Cognitive Biases
Our ability to think broadly and make abstract connections in the world around us allows us to interpret nuances in information that are far beyond the capabilities of machines.
However, we are fallible, and it turns out there has been a great deal of research identifying very specific ways in which we make mistakes, and there are a lot of them.
Take a look at this diagram, categorized by Buster Benson and arranged by JM3. Each one of these items is a specific way we alter information to deviate from reality. We call these phenomena cognitive biases, and most of us are susceptible to most, if not all, of them.

Cognitive biases can impact the way we look at data and influence our interpretations of analysis.
We'll look at some of the most common and potentially damaging biases that play into our analytical process. Specifically, we'll look at confirmation bias, the framing effect, availability (or vividness) bias, anchoring, and the fundamental attribution error.
Confirmation Bias
Seeking information that matches your belief
There are really two ways that we can exhibit this bias. The first is by selectively gathering information; that is, we only seek out data that would serve to support a hypothesis and fail to seek out data that might disprove it. The second is by selectively interpreting information. This happens when we focus only on data that supports our hypothesis, even when we have data that refutes it.
For example: suppose we feel like our customer care centers are not doing a good job. We decide to send a survey to a sample of customers who have called in, but exclude customers who have received credits on their bills, because these might be customers whose satisfaction has been bought. This would be an example of selectively seeking out information.
In business there is often pressure to show good results, and this can subconsciously play into how we gather and interpret information. People can also have strong beliefs about how things should be done, so they tend to go after information that supports their agenda. As data analysts, we should try our best to remain objective and avoid both of these traps.
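To see how selectively gathering information can skew an estimate, here's a minimal simulation. It assumes, purely hypothetically, that customers who received bill credits tend to report higher satisfaction; excluding them drags the average down and conveniently "confirms" the belief that the care centers are doing poorly. All names and numbers are illustrative.

```python
import random

random.seed(42)

# Hypothetical caller population: satisfaction on a 1-5 scale, plus a flag
# for whether the customer received a bill credit. Illustrative data only.
callers = []
for _ in range(1000):
    got_credit = random.random() < 0.3
    # Assumption for illustration: credited customers report higher satisfaction.
    satisfaction = random.randint(4, 5) if got_credit else random.randint(1, 5)
    callers.append({"satisfaction": satisfaction, "got_credit": got_credit})

def mean_satisfaction(sample):
    return sum(c["satisfaction"] for c in sample) / len(sample)

all_callers = mean_satisfaction(callers)
# Selectively gathered sample: credited customers excluded from the survey.
no_credits = mean_satisfaction([c for c in callers if not c["got_credit"]])

print(f"All callers:      {all_callers:.2f}")
print(f"Credits excluded: {no_credits:.2f}")
```

The filtered sample produces a lower average than the full population, even though nothing about the care centers changed; only our sampling rule did.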
Framing Effect
The framing effect is the tendency to draw different conclusions from information based on how it's presented.
Let's say we're launching a new yogurt and considering two labels for it: "1% Fat" and "99% Fat-Free." Which do you think would be the healthier yogurt?
If you take a moment to think about it, both statements say the same thing: both yogurts contain 1% fat, and so both are 99% fat-free. However, when asked which yogurt seems like the healthier option, people are much more likely to show a clear preference for the "99% Fat-Free" option.
Rationally, decision makers should make the same choice in both scenarios, but they don't. It turns out that framing things in a positive way can elicit different results than framing them in a more negative way.
As data analysts, we need to keep this in mind both as we look at options and draw our own conclusions, and as we present options and recommend results to our decision makers. It turns out there are other ways in which how we are exposed to information influences our thinking.
Vividness Bias
Availability (or vividness) bias is the tendency to believe recent or vivid events are more likely to occur.
There are all sorts of common examples of this, like how people perceive the risk of a plane crash or getting attacked by a shark. Both events are exceedingly rare, but are often perceived to happen more frequently than they actually do.
In a business context, we often see this bias impact analysis when small samples of input highlight touchy topics.
For example, let's say we have a focus group of seven people, and two of them say they have a problem with our product.
This input is vivid: not only is there a problem, but we're hearing about it through a personal and possibly passionate interaction with a customer. It's also two out of seven people, so we might jump to the conclusion that nearly 30% of our customers have a problem with our product. This may or may not be the case, but cognitively we tend to overvalue these vivid examples.
What's really interesting about this example is not that it happened once; these types of analyses are pretty common.
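One guard against overvaluing a vivid small sample is to quantify its uncertainty. Here's a minimal sketch that applies the standard Wilson score interval, computed by hand from its textbook formula, to the two-out-of-seven focus group:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return center - margin, center + margin

# Two of seven focus-group participants reported a problem.
low, high = wilson_interval(2, 7)
print(f"Observed rate: {2 / 7:.1%}")        # about 28.6%
print(f"95% CI: {low:.1%} to {high:.1%}")   # roughly 8% to 64%
```

With only seven participants, the plausible range runs from under 10% to over 60%, so the vivid "nearly 30%" figure deserves far less weight than the same rate observed in a larger sample would.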
Anchoring
Anchoring is our tendency to focus or rely too heavily on the first piece of information that's available to us.
For example, shoppers may respond more favorably to a product that was priced at $1,000 and marked down to $600 than to one that was simply listed at $600. This is because the shopper initially anchors at $1,000, and relative to $1,000, $600 seems like a good deal.
The way we look at data can also be subject to anchoring. Let's say that we're responsible for doing the monthly 12-month sales forecast. We make some assumptions and run our first forecast, which predicts average sales growth of 8%. The next month we run the same forecast and come up with an average sales growth of 14%. Even though we think our methodology is sound, because the estimate is so much higher than the first forecast, we back off some of our assumptions and take the forecast down to a more reasonable 12%. In this case, how we interpreted the second forecast was anchored by our initial forecast.
As analysts, we need to question our analyses. But it's also important to question why we're questioning our analysis, and make sure we're asking ourselves the right questions.

Fundamental Attribution Error

The last cognitive bias we'll discuss is the fundamental attribution error. It impacts how we interpret the things we observe people doing.
Specifically, we have a tendency to focus on attributes or intentions of the person themselves versus the situation or the environment when explaining a person's behavior.
Where this comes into play in analytics is often in the process of translating analytical observations into strategies and tactics for action. If we misinterpret why and how customers behave, we may end up taking action that is ineffective or counterproductive.
These are only a few of the many ways we as humans are prone to misinterpret information and situations.
If you find this interesting I encourage you to explore some of the other biases we saw in the larger framework.
By being aware of these biases and how they affect us, we can train ourselves to avoid them and minimize their influence on our work.