3.4 Cognitive Biases

Fallacies, as we have been seeing, are common mistakes that we make in reasoning, especially when we are trying to support a conclusion for which we have insufficient evidence; they are ways in which we claim more than we really know. In recent years cognitive psychologists have also explored the ways in which we not only get our arguments wrong, but also tend to get things wrong in our own thinking, or, as Thomas Gilovich puts it in the title of his book, How We Know What Isn’t So.5 That is, we have a tendency to fall prey to biases and mistakes in our own reasoning, whatever it is that we may end up defending later in our arguments. In this section we’ll look at some of the most important and relevant of these cognitive biases.

In general we might classify these biases into two types: “hot biases,” or motivated irrationality, where our interests, emotional responses, or visceral reactions to things influence our thinking process; and “cold biases,” or unmotivated irrationality, which result from certain mental shortcuts and routines that we rely on even in situations where they do not really apply. Let’s look at some examples.

Hot biases involve bending our reasoning to fit our wants and desires. Because of this, I believe that hot biases can be fairly easy to spot in our daily lives and interactions with others, at least if we take a step back and evaluate our reasoning and emotions within the situation. Cold biases, by contrast, are not results of our desires; they are more like “bugs” in our mental operating systems. Cold biases, I believe, are where things can get tricky.

The reason I believe that cold biases can be very tricky to spot is that they are simply caused by the way our minds are wired as humans. This means that when dealing with these cold biases in any situation, the odds are already stacked against us. Even when attempting to avoid the use of bias in any situation, debate, or reasoning process, cold biases may rear their ugly heads without us even fully understanding that they are present.

One cold bias strikes me in particular: the fundamental attribution error. This refers to our tendency to be more generous with ourselves when trying to excuse our actions or get ourselves off the hook than we tend to be with others. The reason this bias struck me is that, without even noticing it, I recently fell victim to it in my own life.

To make a long story somewhat short, my girlfriend was in a fender-bender about two months back. She was at fault, as she rear-ended the person in front of her. Everybody was okay, and there was minimal damage, but there was still enough damage for it to be noticeable. Well, when I got home from work, she told me the story of what had happened and showed me the damage. In this situation, after she showed me the damage and explained it, I gave her grief about being more careful when driving, probably for the rest of the night. At that point in time, I didn’t want to hear any excuses. I was just annoyed that the car was going to need some work. However, not even two weeks after that incident, guess what? I rear-ended someone on my way into work. Honestly though, what are the actual chances? Again, luckily nobody was hurt (besides my ego), and there was very minimal damage. All I had was a cracked headlight, and the other car had a small scuff. Well, later that night when I got home from work, I explained to my girlfriend what had happened. However, this time, I was ALL for using every excuse in the book about how it wasn’t my fault (when in reality, if you rear-end someone, it’s almost always your fault). Looking back on this situation, I fell victim to this cold bias. While my girlfriend and I basically committed the same fault, I was much more generous with myself when I was the one who messed up.

Overall, it is my opinion that cold biases can be much more difficult to spot and stop than hot biases.

– Ryan Moore

Hot biases

Hot biases are also known as cases of “motivated irrationality” because they involve bending our reasoning to suit our wants and desires. Our motivations may not always be clear even to us, so we may not realize that we are caught up in such biases. Luckily, however, there are steps we can take to avoid falling prey to them.

Confirmation bias

Confirmation bias is our often unconscious tendency to give more weight to evidence supporting our pre-existing beliefs or hypotheses and to downplay the significance of evidence against them. The result is similar to cherry picking, but rather than being a deliberate attempt to mislead, it is more often the product of unconscious tendencies.

I keep seeing more and more evidence in favor of my hypothesis! What about the evidence against it? Well, that must be based on faulty data collection.

Given that we have a strong tendency to fall prey to this bias, what steps might we take to avoid it? For one, we can use various “blind” methods of data collection and analysis to protect our reasoning from the errors involved in confirmation bias. “Double blind” medical trials follow these precautions: to avoid confirmation bias, the patients involved in the trial of a new drug, for example, don’t know whether they are getting the drug or a placebo, and the researchers also don’t know whether each particular patient they examine afterwards is in the test group that received the drug or in the control group that didn’t.
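To make the structure of such a trial concrete, here is a minimal sketch in Python of how blinded assignment might work. It is only an illustration, not a real trial protocol; the function name, the coded labels, and the simple coin-flip assignment are all hypothetical choices made for the example.

    import random

    def blind_assignment(patient_ids, seed=42):
        """Toy sketch: randomly assign each patient to 'drug' or 'placebo',
        returning coded labels (all anyone sees during the trial) and a
        key that stays sealed until every outcome has been recorded."""
        rng = random.Random(seed)
        sealed_key = {}    # patient -> actual group; hidden from everyone
        coded_labels = {}  # patient -> opaque code; reveals nothing about group
        for i, pid in enumerate(patient_ids):
            sealed_key[pid] = rng.choice(["drug", "placebo"])
            coded_labels[pid] = f"ARM-{i:04d}"
        return coded_labels, sealed_key

    codes, key = blind_assignment(["p1", "p2", "p3", "p4"])
    print(codes)  # {'p1': 'ARM-0000', ...} -- no hint of group membership
    # Only after all outcomes have been recorded is the sealed key opened:
    print(key)    # e.g. {'p1': 'drug', 'p2': 'placebo', ...}

The point is structural: because the mapping from patients to groups stays sealed while the data are being collected, neither the patients’ hopes nor the researchers’ hypotheses can influence what gets recorded.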

Groupthink

Just as our pre-existing beliefs can influence what we take the evidence to support, so can our connections to other people. When someone in a group we identify with comes up with an idea, we have a tendency to give it more weight than it in fact deserves. A great example of this occurred during the Kennedy administration with the Bay of Pigs incident. The plan was to send a small group of armed anti-Castro Cuban soldiers to the Bay of Pigs in Cuba with the thought that this would be enough to incite a full-scale revolt against the Communist government by the rest of the Cuban population. Enough members of Kennedy’s National Security Council, which planned and approved the incursion, were strongly in favor of it that they collectively ignored their own military intelligence, which indicated that there was little popular support for revolting against the Cuban government. So when the invasion happened in April 1961, the initiative was quickly defeated and the Cuban exiles who landed on the beach were quickly arrested and imprisoned, much to the embarrassment of the Kennedy Administration.6

Wishful thinking

These last two biases could be considered specific examples of a more general tendency we have towards wishful thinking, which is the tendency to project our own desires onto reality and fool ourselves into thinking that reality conforms to how we would like it to be.

I just know that the Yankees will win the World Series; they just can’t let me down again!

All of these biases involve a failure to distinguish between what we would like to be the case and what really is the case. We may get lucky and find that reality conforms to how we would like it to be, but then again we may not. We can protect ourselves against these kinds of biases first by being aware of our own tendency to fall prey to them, and then by using particular strategies to separate our analysis of the evidence from our wishes.

Cold biases

Cold biases differ from hot biases in that they are not so much results of our desires as of the ways in which our cognitive systems work. They are more like “bugs” in our mental operating systems than ways in which we twist our thoughts to conform to our desires.

Anchoring/framing effects

How much are you willing to pay for a given product or service? Well, it turns out to depend not just on the features of that product or service itself but also on what is next to it on the shelf, what other options you are presented with, and even the initial asking price. These are examples of the ways in which the context of our choices influences the content of our choices. Many examples of this can be found in the marketplace. The notorious “bait and switch” tactic relies on this bias.

Today only, you can get a new Toyota for 20% off the sticker price! So act now and drive your new car away today!

There are numerous ways in which advertisers and salespeople try to influence our choices. First, the “sticker price” may not reflect anything about the reality of the product in question but may be intentionally inflated to anchor our minds to a price higher than we would actually pay. It turns out that doing so makes us more willing to pay a higher price than we would be if the original, non-discounted price were set closer to what we might actually pay, or what the manufacturer expects us to pay. Likewise, adding extra options that we may not be interested in on their own can influence our willingness to buy a product, especially when it is compared to an otherwise equivalent product without those options. In both cases our ability to compare what is offered with what we want is corrupted by the context within which the product is presented.7
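As a toy illustration of how an anchor can pull a judgment, here is a small sketch of the classic anchor-and-adjust idea. The model, the adjustment factor, and all the dollar figures are hypothetical numbers invented for the example, not empirical data.

    def anchored_estimate(independent_guess, anchor, adjustment=0.4):
        """Toy anchor-and-adjust model: the judgment starts at the anchor and
        only partially adjusts toward one's own independent valuation."""
        return anchor + adjustment * (independent_guess - anchor)

    # A shopper who, left alone, would value the car at $20,000:
    my_guess = 20_000

    # Anchored by an inflated $30,000 sticker price ("20% off!"):
    print(anchored_estimate(my_guess, anchor=30_000))  # 26000.0

    # Anchored by a realistic $21,000 sticker price:
    print(anchored_estimate(my_guess, anchor=21_000))  # 20600.0

Even though the shopper’s own valuation never changes, the inflated sticker price pulls the final willingness to pay thousands of dollars higher, which is exactly what the salesperson is counting on.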

The fundamental attribution error

When it comes to explaining human behavior, and trying to figure out the relative weights of internal factors, such as the needs, desires, and personality of the actor, on the one hand, and external, situational factors on the other, it turns out that we come up with different weights when thinking about our own behavior than we do when thinking about the behavior of other people. We tend, for example, to put more weight on things outside of our own control when accounting for our own bad behavior than we do when accounting for the behavior of others.

I tripped when walking down the street because the sidewalk was uneven. When you trip, on the other hand, it’s evidence that you are a klutz.

In other words, we are more generous with ourselves in terms of getting ourselves off the hook (“it’s not my fault, it’s the situation I was in”) than we are with others (“clearly it’s their fault!”). In fact, in many, many cases, if we looked at things in more neutral terms and asked what factors most influence all of our behavior, we would find that situational or external factors often play a much greater role than we think. So we are often more accurate when we reflect on our own behavior than when we look at the behavior of other people: our attribution of causal influences tends to be skewed in the wrong direction when we explain why others do what they do, especially when we compare it with our understanding of what influences us ourselves.

The availability heuristic

Finally in this brief survey of some major cognitive biases, we have the confusion we tend to make between how readily something comes to mind (its “salience”) and the more objective probability of its occurrence (its “frequency”). Thus, because terrorist attacks are so dramatic and stand out in our minds much more than the common and mundane threats we are exposed to, we tend to be more afraid of terrorism than of, for example, driving to the airport. In the aftermath of the 9/11 attacks I remember listening to an interview with an expert on national security threats. The interviewer asked him what we could all do to be safer in international air travel, and his response was, “Drive very carefully to the airport.” This effect, once you start to notice it, is everywhere. We focus on dramatic but highly unlikely threats and events and ignore those that are far more likely and hence, paradoxically, not so obvious to us.

In this and the last chapter we have taken a closer look at what is involved in the justification of any claim at all. We have seen that the best arguments are both valid and sound – they work logically, in that their premises really provide adequate support for their conclusions, and their premises are actually true. We have also examined some of the many mistakes in reasoning to which we are prone. These mistakes are not always deliberate attempts to mislead, since all of us have a tendency to make decisions and judgments first and then come up with reasons in support of them later. Thus we often rely on fallacious reasoning to rationalize our own beliefs, or we tend to read the evidence in biased ways. As we will see, as we turn now to examining some of the many different theoretical approaches to ethics, the tools of logic and critical thinking will prove very useful in trying to come up with a reasonable answer to the general question, “What is the right thing to do?”


  5. Thomas Gilovich, How We Know What Isn’t So (New York: The Free Press, 1991).

  6. “Bay of Pigs - Groupthink,” accessed December 2, 2019, https://www.globalsecurity.org/intell/ops/bay-of-pigs-groupthink.htm

  7. For more on anchoring effects see Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011).