Fast as lightning, when we consume new information we run it through the existing structures and connections in our minds, so that it can be linked to what we already know. We are, as it were, trying to fit this new information into our brain, which happens either comparatively easily or with considerable effort.
During this internalization process, people should ideally be aware of which fixed beliefs and assumptions they are using, knowingly or unknowingly. Unfortunately, various studies show that we tend to see what we want to see, and that our brain is remarkably lazy and prejudiced. “We like to think that our judgements, beliefs, and opinions are based on solid reasoning. But we may have to think again.”
1. We perceive things as organized patterns and exaggerate what we see
By nature, the human brain looks for patterns, symmetry, unity and consistency. People tend to see things as complete objects even when they are not. The same goes for perceiving things as following a pattern when they are, in fact, random. Consequently, a false overall picture is created.
Research related to Gestalt theory has shown that human beings tend to see a shape (such as a circle) as completely enclosed even when it is not; apparently, we simply ignore the gaps in its border. In such cases, we risk overestimating our perceptions and constructing a seemingly logical whole even though a piece of information is missing.
2. Jumping to conclusions and “wishful thinking”
Instead of meticulously and gradually ruling things out, people tend to jump to conclusions, and in doing so arrive at an incomplete assessment of a situation. People often believe things because believing them is beneficial to themselves, even when the facts point to the contrary.
3. The hammer and the nail mentality / “Every man to his own trade”
We see the world according to our own abilities and tools; if all we have is a hammer, every problem looks like a nail. Or: a Dutch tourist abroad will most probably notice Dutch license plates. In short, we only see what we want to see or what we expect to see. This principle is known as selective perception (Dearborn and Simon, 1958).
4. Persistent fixation on a hypothesis
People sometimes ignore or reject information because it does not fit the ‘consistent’ image they have developed. This is a form of so-called cognitive dissonance, in which a person ignores information (or does not take it seriously) because otherwise they might have to change their behavior or opinion (Festinger, 1957).
5. Confirmation bias
This error occurs when someone searches only for examples that fit their hypothesis (Vandenbosch, 1997). For example, if someone assumes that a particular product performs poorly and expects this product to be responsible for the overall loss suffered by the company, they will examine only that product and leave out other factors. If the product indeed turns out to perform poorly, the initial hypothesis is considered correct, even though other products might be just as much (or even more) to blame.
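As a minimal sketch of this trap (all product names and profit figures below are invented for illustration), examining only the suspected product will always “confirm” the hypothesis, while a comparison across all products can point somewhere else entirely:

```python
# Hypothetical per-product profit figures (names and numbers invented).
profits = {"A": -120, "B": -400, "C": 250, "D": 90}

suspect = "A"  # the product we already assume is causing the loss

# Biased check: examine only the suspected product.
# It is indeed losing money, so the hypothesis feels confirmed.
hypothesis_confirmed = profits[suspect] < 0

# Unbiased check: compare all products before assigning blame.
worst = min(profits, key=profits.get)

print(hypothesis_confirmed)  # True — but misleading
print(worst)                 # "B" actually loses far more than "A"
```

The biased analysis is not wrong about product A; it is wrong about what the loss on product A implies, because the alternatives were never examined.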
6. The perceived confirmation bias
Whereas confirmation bias looks only at instances expected to confirm the hypothesis, the perceived confirmation bias focuses only on the features relevant to the initial hypothesis, not on features relevant to the alternatives. Someone will look exclusively at those features (of a product, a customer, etc.) that could confirm the hypothesis. This bias is especially important in data mining and data discovery: which data elements are selected for the analysis or mining process, and which are not?
7. Concrete versus abstract
This bias occurs when concrete information overshadows abstract information (e.g. summaries). A customer service manager who receives phone calls from two customers angry about delayed deliveries will not readily trust statistics stating that the total number of delayed deliveries is currently at an ‘all-time low’. As a study of 20 municipalities concerning performance management put it: “The idea of managing with a focus on ‘the big picture’ does not appeal to council members, who prefer to focus on the detail which they find more interesting.”
8. Relying solely on positive ‘hits’
This is the case when people unjustly think they see a causal relationship between two events, and underpin this belief merely with the observation that both events (and only those) occurred together.
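A minimal sketch of why counting only the positive hits misleads (the event log below is invented): a fair check needs the full two-by-two picture, comparing how often the second event occurs with and without the first.

```python
# Hypothetical daily log: (promotion ran?, sales spiked?) — data invented.
observations = [
    (True, True), (True, True), (True, False),
    (False, True), (False, True), (False, True), (False, False),
]

# "Positive hits" only: days where both events occurred.
positive_hits = sum(1 for a, b in observations if a and b)
print(positive_hits)  # 2 — looks like promotions cause spikes

# Fair check: compare P(spike | promotion) with P(spike | no promotion).
with_a = [b for a, b in observations if a]
without_a = [b for a, b in observations if not a]
p_spike_given_promo = sum(with_a) / len(with_a)          # 2/3
p_spike_given_no_promo = sum(without_a) / len(without_a)  # 3/4

# In this invented data, spikes are actually slightly MORE likely
# on days without a promotion — the "hits" alone proved nothing.
print(p_spike_given_promo < p_spike_given_no_promo)  # True
```

Only by also counting the days when one or both events did not occur can we tell a real relationship from a coincidence of positive hits.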
9. Group polarization
Group polarization occurs when people enter a group discussion and develop a more extreme opinion or position than their initial standpoint. When jury members in a trial, for example, do not quite trust a suspect from the start, the chances are that their distrust increases during the jury’s deliberations. In short: people amplify each other’s arguments and in doing so reinforce their initial opinion.
10. The majority opinion
One third of people simply side with the (erroneous) view of the majority, even if they have seen with their own eyes that this view is not correct.
Senior management most susceptible to these errors
Senior managers operating at the top of the organization are probably most susceptible to these types of errors because the information intended for the top level of the organization is often – due to its diversity – much more difficult to interpret. In addition, managers are usually busy, often interrupted and their work is highly fragmented. See also our post: The 5 Biggest Benefits of Analytics for Managers and Knowledge Workers.
It is in our (human) nature to be resistant
What we must not forget with respect to the above-mentioned prejudices, errors, and fallacies is that it is in our (human) nature to resist giving up existing ideas and changing our minds. This is not necessarily a bad habit, though: it prevents us, for instance, from suddenly and radically revising our existing knowledge or opinions every time a single new piece of information comes along.
The idea that our perceptions are completely objective and that we are able to completely let go of our frame of reference is an illusion. “Thoughts without content are empty, intuitions without concepts are blind.” (Immanuel Kant, 1724-1804). However, by trying to be aware of our truisms and fixed truths, we can strive to think newly and freely (Kessels, 1997), which promotes more accurate interpretations and conclusions.
A good night’s sleep
Finally: when we have analyzed something exhaustively, “sleeping on it” can help. While we are asleep, our subconscious thought processes help put the information in its appropriate place, and the solution then seems to present itself spontaneously. That is perhaps why both our perceptiveness and ‘intelligence’ increase after a good night’s sleep.
More to discover
If you want more insight into how to prevent these biases using Data Visualization techniques and decision management principles, attend our BI training course. And discover how bad our brain is at making fact-based decisions in the BBC documentary ‘How to make better decisions’: