Common Cognitive Biases — Explained!

The Thinking Lane
6 min read · Nov 12, 2021

How sound is your reasoning, REALLY?

A cognitive bias is a systematic error in human reasoning: a predictable way in which our thinking departs from rationality. Such errors can lead to misleading conclusions and poor judgements, which is why it is worth knowing the common biases; that knowledge is the first step towards avoiding them.

We discussed the meaning, causes and impacts of cognitive biases in the previous blog. Now, we will look at some of their most common types.

Common Types of Cognitive Biases

■ Confirmation bias: A tendency to attach more weight to considerations that support our views.

Example — When we choose only those news sources whose stories support our existing views, we are exhibiting confirmation bias.

This can lead to poor decisions and an unwillingness to hear views other than our own. It happens for at least two reasons:

  1. It conserves mental resources: decisions are easier when we only weigh evidence that already fits our view.
  2. It protects our self-esteem: confirming evidence reassures us that our beliefs are accurate.

■ Belief bias: Evaluating reasoning by how believable its conclusion is. This bias is closely related to the confirmation bias.

Under this bias, one is more likely to accept a claim if it matches one's belief system. Similarly, one is likely to accept a conclusion as true (because it aligns with one's beliefs) even when the argument behind it makes no logical sense.

For example, look at the following argument —

Premise 1 — All birds lay eggs.

Premise 2 — The sparrow lays eggs.

Conclusion — The sparrow is a bird.

This argument might look sound at first, but on closer inspection it is invalid: the premises do not guarantee the conclusion, because plenty of non-bird creatures (like reptiles) also lay eggs, so laying eggs is no proof of being a bird. The conclusion happens to be true anyway, and that believability is precisely what hides the flaw in the reasoning from many readers.
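To make the flaw explicit, here is the argument's shape in first-order notation (a sketch; the symbols B, E and s are my shorthand for this post, not part of the original argument):

Invalid form (the fallacy of the undistributed middle):

∀x (B(x) → E(x)), E(s) ⊭ B(s)

Valid form, for contrast:

∀x (B(x) → E(x)), B(s) ⊨ E(s)

Reading B as 'is a bird', E as 'lays eggs' and s as 'the sparrow': knowing that all birds lay eggs and that the sparrow lays eggs cannot establish that the sparrow is a bird, because the egg-layers form a larger class than the birds. Only the second form, where we start from B(s), licenses a conclusion.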

■ Availability heuristic: Assigning a probability to an event based on how easily or frequently it is thought of.

Heuristics are general rules of thumb that we follow, often unconsciously, while solving problems and estimating probabilities. Under the availability heuristic, we rate the probability of an event by how readily it comes to mind.

For example, if we come across multiple cases of burglary in newspapers and on TV news, burglaries sit at the front of our minds, and we overestimate their probability.

It is true that if the frequency of burglaries were to increase, one would think about them more often; but merely thinking about them more often does not, in itself, make them more probable.
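A quick back-of-the-envelope calculation shows how far felt frequency can drift from actual frequency (the numbers here are hypothetical, chosen purely for illustration). A city of 1,000,000 households with 500 burglaries a year has an annual burglary rate of 500 / 1,000,000 = 0.05%, i.e. about 1 household in 2,000. Yet if the local news reports one burglary every evening, burglary comes to mind daily, and our intuitive estimate of the risk inflates far beyond 0.05%.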

■ False consensus effect: Assuming our opinions and those held by people around us are shared by society at large.

Example — When we assume that the majority shares our opinion on controversial topics, we are exhibiting the false consensus effect.

This has many implications. We not only incorrectly believe that everyone agrees with us, but we also start over-valuing our own opinion. This often leads to a disregard for others’ opinions.

This effect is the product of several factors:

  1. More often than not, our circle of friends and family holds the same opinions as we do, which feeds the false belief that everyone else does, too.
  2. Believing that one's views match those of society at large makes one feel 'normal' and good about oneself.

■ Bandwagon effect: The unconscious tendency to align our beliefs with those of other people.

This is a very powerful source of cognitive distortion. In a classic series of experiments, psychologist Solomon Asch showed that what we think we see can be altered by what we hear other people say they see: many of his subjects gave obviously wrong answers to a simple line-matching task once a unanimous group had given the wrong answer first.

For example — Suppose there is a question and one is pretty sure of its answer. Still, it is likely that one’s confidence in the answer will waver if others express that they are positive that another answer is the right one.

It is often observed that when opinions begin to favor one answer, more and more people switch to it (making it a popular opinion, hence continuing the vicious cycle of the bandwagon bias). Needless to say, just because an opinion is popular doesn’t automatically mean it is the correct opinion to hold.

■ Negativity bias: Attaching more weight to negative information than to positive information.

To understand the negativity bias, consider its mirror image, the positivity bias, first. As the bandwagon effect shows, people have an unconscious habit of aligning their views with others: if there is something that 'everyone likes', one is prone to acquire a liking for it too. That is the positivity bias. The negativity bias is more powerful still, because people weigh negative information more heavily than positive information.

For example — One is more likely to 'like' chocolate on hearing that "nobody does NOT like chocolate" than on hearing that "everybody likes chocolate": the negatively framed statement carries more weight. Another example is the general habit of caring more about avoiding losses than about acquiring gains.

Interestingly, the human brain displays more neural activity in response to negative information than to positive, suggesting that the negativity bias is hard-wired into us.

■ Loss aversion: Being more strongly motivated to avoid a loss than to accrue a gain. This cognitive bias follows from the negativity bias as discussed above.

For example — In a deal, one is more strongly motivated to avoid a $100 loss than to secure a $100 gain.
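Loss aversion even has a standard quantitative model: the value function from Kahneman and Tversky's prospect theory (their formalization, mentioned here only as a pointer; the parameter values below are rough published estimates, not exact constants):

$$v(x) = \begin{cases} x^{\alpha} & \text{if } x \ge 0 \\ -\lambda(-x)^{\beta} & \text{if } x < 0 \end{cases}$$

with typical estimates α ≈ β ≈ 0.88 and λ ≈ 2.25 (Tversky and Kahneman, 1992). Since λ > 1, a loss is weighted more heavily than an equal-sized gain: v(100) ≈ 57.5 while v(−100) ≈ −129.5, so losing $100 feels roughly 2.25 times as bad as gaining $100 feels good.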

■ In-group bias: A set of cognitive biases that make us view people who belong to our group differently from people who don’t.

In general, one finds it easier to form negative opinions about people outside one's society, nationality, religion (or any other group) than about people inside those groups. In the same way, one tends to view members of one's own group in a more positive light than 'outsiders'.

For example — one may believe that members of one's own group are more skilled, hard-working or kind, while (succumbing to stereotypes and prejudices) believing that members of other groups are not.

■ Fundamental attribution error: Explaining others' behavior by their character or disposition while explaining our own by our situation. Applied along group lines, this follows from the in-group bias.

It stems from our tendency to extend less charity to others in a given circumstance than we would extend to ourselves in the same situation.

For example — One may attribute one’s success to hard work and failures to bad luck while attributing others’ success to good luck and failure to personal shortcomings.

■ Obedience to authority: A tendency to comply with instructions from an authority.

People tend to obey authority more or less blindly and unquestioningly. The power of authority has repeatedly been seen to override good judgment and rational thought, as corroborated by several studies, most famously those conducted by Stanley Milgram.

For example — Milgram observed that most subjects in his experiments were willing to administer what they believed were dangerous, potentially lethal electric shocks to an innocent person when ordered to do so by an experimenter in a lab coat.

■ Overconfidence effect: A cognitive bias that leads one to overestimate the level of one’s skill, talent or self-belief.

An example of this bias is the desirability effect (assuming something will happen because one wants it to happen).

Interestingly, Daniel Kahneman, in his book Thinking, Fast and Slow, has called overconfidence "the most significant of cognitive biases".

This bias is also notorious for laying the groundwork for all the other biases: it strips away one's humility about one's psychological vulnerabilities, paving the way for a disregard of the rules of careful reasoning.

■ Better-than-average illusion: A self-deception cognitive bias that leads us to overestimate our own abilities relative to those of others. This follows from the overconfidence effect.

Most people rate their own abilities as "better than average", even though, by definition, no more than half of any group can be above its median. This is why it is also known as the self-enhancement bias.
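To see the impossibility concretely (with hypothetical numbers chosen purely for illustration): rank 100 people by any skill, and exactly 50 of them occupy the bottom half. If 80 of the 100 rate themselves above the median, then at least 30 of those self-assessments must be wrong.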

For example — Most people believe that their critical thinking skills are better than average.

Interestingly, even after being informed of this particular bias, most people do not budge from their belief that they are better than average.

Conclusion

Since these biases are hardwired into our brains, they are tough to spot and even tougher to get rid of. The first step towards better reasoning is to equip ourselves with the knowledge to spot them. Once we know we have succumbed to a bias (or that someone we are talking to has), it becomes easier to correct it (or to point it out politely). At the end of the day, the power of reasoning might be the biggest superpower we are capable of having.

Also read: Your brain's way of deceiving you — Cognitive Biases


Hi! I am Kritika Parakh. I am a philosophy grad trying to make sense of philosophical topics. Any criticism/corrections/comments are welcome.