Compared to other animals, humans possess remarkable cognitive capabilities. A great deal of our intelligence as a species comes from our use of tools of critical thinking that allow for more advanced and complex thought. These cognitive advancements have not made us invulnerable to error, however: biases and fallacies still creep into our reasoning. Some of the most salient examples are the availability bias, the assimilation bias, and the naturalistic fallacy.
You can learn more about these biases as well as other mental miscalculations in decision-making app LIFE Intelligence, available on both iOS and Android. LIFE Intelligence is designed to optimize your functioning in many areas of your life, including your relationships, career, and self. LIFE comprises a 9 Mission program to help you manage stress and anxiety, build positive relationships, and increase productivity at work. For more information on catching cognitive biases, download LIFE and complete Mission 5.1.
One of the cognitive tools we use most often is the heuristic, a mental shortcut that allows us to reach conclusions and make decisions more quickly and efficiently (Levy, 2010; Lieder et al., 2018). Most of the time, heuristics work to our advantage and help us make rational decisions; sometimes, however, cutting corners creates cognitive confusion.
One way our mental shortcuts can mislead us is when we judge how often or how likely something is to occur based on what we can remember from the past. For example, would you say you are more likely to sustain a fatal injury from a car crash or from a commercial airplane crash? Remembering tragic and sensational news stories of downed planes, you may be inclined to say that plane crashes are much deadlier. In reality, automobile travel is hundreds of times more likely to prove fatal than commercial air travel, and car crashes are far more common (Locsin, n.d.).
Our tendency to forget mundane or commonplace events and instead focus on rare but sensationalized occurrences underlies one common bias that blights our reasoning capabilities. The availability bias occurs when we misjudge exactly how often a particular event occurs based on examples that readily come to mind (Levy, 2010). More specifically, we might misrepresent the frequency (“How many are there?”), the incidence (“How often does this occur?”), or the likelihood (“How likely will this be to happen?”) of a particular event.
Aside from causing us to overestimate the likelihood of accidents or plane crashes, the availability bias can also fool us into unfairly stereotyping people or discounting statistical data in favor of memorable testimonials (Levy, 2010). In fact, the availability bias can even color how we think about ourselves and the struggles we have faced (Davidai & Gilovich, 2016).
In one study on the availability bias, Davidai and Gilovich (2016) found that both Democrats and Republicans claim that the electoral map works against them, football fans focus more on the challenges their teams face, and people tend to believe that their parents were harder on them than on their siblings. Countless other examples exist in which individuals find it harder to remember their blessings, good luck, or ways in which they have been helped, instead recalling the annoyances, distractions, or setbacks in their path. This “negativity bias” seems to be partially a product of the availability bias—when negative events emerge, they need dedicated focus and effort to overcome, and they often need to be resolved quickly; by contrast, positive events do not require much attention and can be forgotten more easily (Davidai & Gilovich, 2016).
For example, if you are driving along a busy street with many traffic lights, you hardly notice or remember how many green lights you passed because you just drove right through them. However, you have to stop and wait at every red light, and so it seems like most of the traffic lights on the road were red. Much like red lights, problems we face in our life force us to stop and work on them, while our blessings pass by us because they do not require immediate attention.
Because of this effect, negative events seem to be more vivid and memorable, making them more available in one’s mind. This can ultimately reduce one’s gratitude and increase feelings of resentment, which can have negative impacts on mental and physical health (Davidai & Gilovich, 2016). In order to prevent this from happening, we need to be more aware of how often good or bad things happen to us; one of the greatest tools to accomplish this is gratitude journaling—the practice of recording three or more events that went well each day. Gratitude journaling can help to place more focus on one’s blessings, counteracting the natural bias toward more negative events.
While the availability bias concerns our perceptions of the likelihood or frequency of certain events, the assimilation bias relates to our tendency to reject evidence that contradicts our beliefs. The assimilation bias occurs when we interpret new information in a way that supports our pre-existing beliefs, especially when this new information is ambiguous (Levy, 2010). We may ignore, overlook, or forget information that does not confirm our beliefs, instead focusing on whatever may validate them.
(Note that this bias is similar to confirmation bias, but while confirmation bias focuses on our tendency to seek out information that validates our beliefs, assimilation bias relates more to how we think about information in a way that validates our beliefs.)
One of the most striking demonstrations of the assimilation bias comes from Rosenhan’s (1973) classic study, in which he and seven colleagues feigned mental illness in order to be admitted to psychiatric hospitals. The researchers lied about their names and occupations and claimed to hear voices saying “empty,” “hollow,” and “thud” (Levy, 2010). All eight were diagnosed as psychotic. Once admitted, the researchers immediately stopped simulating symptoms and behaved as they normally would, with the goal of “proving” their sanity and being released from the hospital.
Despite the fact that Rosenhan and his colleagues were psychologically healthy, much of their behavior inside the psychiatric hospital was seen as “abnormal” or “psychotic.” For example, hospital staff interpreted ordinary behaviors, such as taking notes or sitting outside before lunch, as typical manifestations of their psychosis. Even though they displayed no further pathological symptoms, none of the researchers was released from the hospital in less than a week. Furthermore, when the patients were released, they were mostly discharged with the diagnosis of “schizophrenia, in remission” (Levy, 2010). The hospital workers saw Rosenhan and his colleagues not as people but as diagnoses, so everything the patients did was twisted to fit a pathological interpretation.
Rosenhan’s (1973) study exposes an extreme case of the assimilation bias, but this error creeps into our critical thinking on a daily basis. For instance, the assimilation bias is often seen in debates about socio-political issues (Greitemeyer et al., 2009). Whatever your position on capital punishment, abortion, or similar legal and ethical questions, you are likely to see evidence that confirms your belief as conclusive and compelling, and evidence that disputes your belief as inconsequential or untrustworthy (Cohen et al., 2000).
One reason we may fall prey to this bias, some researchers believe, is to protect our sense of identity (Cohen et al., 2000). Our beliefs and values are central to how we perceive ourselves, so when our beliefs are challenged, we feel personally threatened. To avoid the dissonance of feeling we may be wrong about something important to us, we subconsciously filter out information that challenges what we think while accepting information that affirms it. In fact, Cohen et al. (2000) found that when people feel more self-affirmed and confident in their identity, they are more open to evidence that disputes their beliefs.
In order to protect ourselves from falling victim to the assimilation bias, we would do best to remember our general tendency to favor information that aligns with our beliefs, especially when we encounter evidence that seems to contradict them (Levy, 2010). Although it can be difficult to keep an open mind and remain flexible to change, this is one of the best ways we can ensure that we do not ignore real evidence in favor of sustaining baseless beliefs.
The naturalistic fallacy is one of the most ubiquitous errors in thinking and has haunted philosophers and scientists for centuries. This fallacy occurs when we confuse what ought to be with what is, defining what is good based on what is observable (Levy, 2010). One common variation is the bandwagon fallacy, which occurs when someone believes that just because “everyone else is doing it,” it must be correct. We often make a similar error about nature itself, assuming that something is good simply because it is “natural”—an error that salespeople and marketing professionals frequently exploit.
Another example of the naturalistic fallacy that can open the door for abusive uses of scientific research relates to findings of evolutionary psychology which demonstrate that, in early human civilizations, women often were gatherers and caregivers. Many have used this information to uphold a patriarchal system and conclude that women have no place in areas of politics or war as they are “naturally” caregivers (Levy, 2010). Even assuming that this research is entirely valid and truthful, however, these arguments are not logically sound. This is because they equate what is and has been with what ought to be. Women are no more destined to be caregivers in modern society than men are destined to be hunters.
Along the same lines, Ismail and colleagues (2011) found that participants who read evolutionary explanations for bug killing or for male promiscuity were more likely to be accepting of these behaviors. In this case, it seems that participants mistakenly believed that to explain is to excuse; this subtle but important distinction can lead many into erroneously using the naturalistic fallacy.
Guarding against the naturalistic fallacy can be difficult. In one study, even after participants learned about and were warned against the fallacy, they still misinterpreted scientific data, thinking that scientists were advocating a moral position when they were not (Friedrich, 2005). For example, after reading a study that found children exposed to snack commercials were less likely to select healthy foods, participants believed the study had concluded that snack companies ought to show greater concern for children’s vulnerability to these commercials.
One of the ways in which we can work against this fallacy, however, is to pay greater attention to information that could be misinterpreted. We should pay close attention whenever reading an article, listening to a news report, or hearing a secondhand account, and we should be careful to separate the facts of the matter from the potential moral implications (Levy, 2010). We may be particularly likely to use the naturalistic fallacy when distracted (Friedrich, 2005), which is why it is important to focus on the new information in these instances.
The availability bias, the assimilation bias, and the naturalistic fallacy are common cognitive errors that are easy to slip into without knowing it. They operate largely by exploiting the mental shortcuts we use to save decision-making time, leaving us vulnerable to errors in judgment. As with many biases, one of the best defenses is simply to slow down when taking in new information and consider the ways it could be misinterpreted. These errors are remarkably commonplace and can be quite resistant to our attempts to stop them, which is why it is important to stay on the lookout for them.
LIFE Intelligence is one app for every aspect of your LIFE. Our 9-topic self-development journey provides science-backed content, exercises, and reflections to help you better understand and manage yourself and others. Our mood tracker and emotional management toolkit helps you deal with difficult situations on the fly. These combined help you comprehensively manage stress and anxiety, improve work productivity and career fulfillment, and build lasting relationships.
Cohen, G. L., Aronson, J., & Steele, C. M. (2000). When beliefs yield to evidence: Reducing biased evaluation by affirming the self. Personality & Social Psychology Bulletin, 26(9), 1151-1164. https://journals.sagepub.com/doi/abs/10.1177/01461672002611011?journalCode=pspc
Davidai, S., & Gilovich, T. (2016). The headwinds/tailwinds asymmetry: An availability bias in assessments of barriers and blessings. Journal of Personality and Social Psychology, 111(6), 835-851. https://pubmed.ncbi.nlm.nih.gov/27869473/
Friedrich, J. (2005). Naturalistic fallacy errors in lay interpretations of psychological science: Data and reflections on the Rind, Tromovitch, and Bauserman (1998) controversy. Basic and Applied Social Psychology, 27(1), 59-70. https://www.tandfonline.com/doi/abs/10.1207/s15324834basp2701_6
Greitemeyer, T., Fischer, P., Frey, D., & Schulz-Hardt, S. (2009). Biased assimilation: The role of source position. European Journal of Social Psychology, 39(1), 22-39. https://onlinelibrary.wiley.com/doi/abs/10.1002/ejsp.497
Ismail, I., Martens, A., Landau, M. J., Greenberg, J., & Weise, D. R. (2011). Exploring the effects of the naturalistic fallacy: Evidence that genetic explanations increase the acceptability of killing and male promiscuity. Journal of Applied Social Psychology, 42(3), 735-750. https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1559-1816.2011.00815.x
Levy, D. A. (2010). Tools of critical thinking (2nd ed.). Waveland Press, Inc.
Lieder, F., Griffiths, T. L., & Hsu, M. (2018). Overrepresentation of extreme events in decision making reflects rational use of cognitive resources. Psychological Review, 125(1), 1-32. https://pubmed.ncbi.nlm.nih.gov/29035078/
Locsin, A. (n.d.). Is air travel safer than car travel? USA Today. https://traveltips.usatoday.com/air-travel-safer-car-travel-1581.html
Rosenhan, D. L. (1973). On being sane in insane places. Science, 179(4070), 250-258. https://pubmed.ncbi.nlm.nih.gov/4683124/