Anchoring Bias occurs when a person's expectation about one thing is affected by something mostly or entirely irrelevant they saw, heard, or thought before, such as an irrelevant number. In other words, it occurs when a person's beliefs or behaviors are influenced by a specific piece of information far more than they should be given how much evidence that information actually provides.
Attention Bias occurs when some information or evidence holds a disproportionate amount of a person's attention because of that person's environment or history, or because of people's natural instincts.
The Availability Bias occurs when someone's prediction about an event's frequency or probability is unduly influenced by how easily they can recall examples of that event. We have a whole mini-course about combating availability bias.
A Bias Blind Spot is a tendency to see oneself as being less biased or less susceptible to biases (such as those listed in this article) than others in the population.
: "In fact, viewing yourself as rational can backfire. The more objective you think you are, the more you trust your own intuitions and opinions as accurate representations of reality, and the less inclined you are to question them. 'I'm an objective person, so my views on gun control must be correct, unlike the views of all those irrational people who disagree with me,' we think." - |
Choice-Supportive Bias is a cognitive bias whereby someone who has chosen between different options later remembers the option that they chose as having more positive attributes than it did at the time (while they remember options they did not choose as having more negative attributes than they'd had at the time).
Confirmation Bias refers to a tendency for people to seek out, favor, or give more weight to information that confirms their preconceptions or hypotheses (even if the information isn't true) than information that contradicts their prior beliefs.
The Denomination Effect is a cognitive bias whereby people tend to be more likely to spend a given amount of money if it is composed of smaller individual sums than if it is composed of larger individual sums.
Hindsight Bias refers to a tendency to perceive past events as being more predictable than they were before they took place.
Optimism Bias is the tendency to be unduly optimistic about the probability of future good and bad events, overestimating the probability of positive ones while underestimating the probability of negative ones.
Motivated reasoning occurs when you are disposed to interpret new evidence in ways that support your existing beliefs, or that lead to the outcome you wish was true, even when that evidence doesn't truly support your beliefs.
What are the types of bias?
There are three main types of bias.
1. Explicit biases are prejudiced beliefs regarding a group of people or ways of living. Racism, sexism, religious intolerance, and LGBTQ-phobias are examples of explicit biases. If you think that all people of group X are inferior, then you have an explicit bias against people of group X.
2. Implicit biases are unconscious beliefs that lead people to form opinions or judgments, often without being fully aware they hold the unconscious beliefs. If you subtly distrust people of group X without even realizing you're doing it, then you have an implicit bias against people of group X.
3. Cognitive biases differ from explicit and implicit biases: they are a group of systematic patterns in how our beliefs, judgments, and actions differ from what they would be if we were completely rational. If most people systematically misjudge certain types of information in such a way that they come to false conclusions, then people have a cognitive bias related to that type of information.
There is no consensus among academics regarding how many cognitive biases exist. Some have found ~40, others find >100, and Wikipedia lists over 180.
As we’ve seen above, cognitive biases often appear when one is faced with a decision and has limited resources (such as time, understanding, and cognitive capacity).
For instance, when buying a banana, you can't consider every single possible other use of that money to determine whether a banana is truly the single best use. You are limited in both how much time you have to think and how much total cognitive capacity you have.
Using fast heuristics or relying on our intuition is often an effective way of coming to conclusions in these situations because such approaches require fewer resources than careful thinking. While our intuition is often reliable, there are certain cases where our intuitions systematically produce inaccurate beliefs and unhelpful behaviors - these are what we refer to as "cognitive biases".
Even when we have plenty of time to think and aren't hitting a limit on our cognitive resources, we can still be prone to cognitive biases. For instance, there are certain automatic rules of thumb that our minds evolved to use since they worked quite well for the survival of our ancestors. Unfortunately, these rules of thumb can sometimes lead us to false conclusions and unhelpful behaviors in the modern world.
Cognitive biases are not good or bad in themselves. They are an unavoidable effect of not having infinite intelligence and infinite time to think, and hence the need to rely on heuristics and intuition. We call a tendency a cognitive bias when it leads to systematic inaccuracies in our beliefs or unhelpful behaviors. In that sense, by definition, cognitive biases cause systematic problems.
However, cognitive biases do not always lead to negative outcomes in every instance. For instance, overconfidence may cause a person to try something very difficult that they ultimately succeed at. On the other hand, for every one person who succeeds due to overconfidence, there may be multiple other people who try something unrealistic due to overconfidence and end up failing.
Getting to know the most common cognitive biases (such as the ones presented above) is a great first step to identifying them in yourself, but knowledge alone is often not sufficient: once you know the biases, you still need to actively look out for them in your own thinking.
Yes and no. It is possible to reduce the influence of cognitive biases on your thinking (and this can be very beneficial!). So you may be able to avoid a cognitive bias in many particular instances. But it's not possible to completely remove all of your cognitive biases.
Unfortunately, it’s impossible to overcome all of your cognitive biases completely. However, that doesn’t mean you can’t do anything. A good first step on the path to getting your cognitive biases under control is familiarizing yourself with them.
Here are a few of our interactive tools that might help:
The Planning Fallacy
The Sunk Cost Fallacy
Improve Your Frequency Predictions
Political Bias Test
Rhetorical Fallacies
Are You Overconfident?
Calibrate Your Judgement
How Rational Are You, Really?
Mental Traps
However, just knowing about your cognitive biases isn’t enough. You need to take action! Here are some practical steps we recommend:
Biases such as overconfidence, confirmation bias, and the illusion of control can be reduced or avoided by having multiple points of view. Surrounding yourself with, and listening to, people with diverse experiences, systems of belief, and expertise reduces the chances of falling into one of these biases. The same goes for your sources of information: you are less likely to fall into a cognitive bias if you seek out multiple, even conflicting, data sources.
Actively seeking evidence against your current point of view (on important decisions) can be a helpful way to combat biases like overconfidence, confirmation bias, and motivated reasoning.
Another strategy, recommended by researchers who studied cognitive biases in physicians, is to consciously consider the options you dismissed at first, so you can reach a more considered answer.
Emotional biases can be considered a subcategory of cognitive biases. What separates them from other cognitive biases is that they are based on emotions such as anger, disgust, fear, happiness, sadness, and surprise. When we're experiencing an emotion, we may act in a biased way that is concordant with that emotion. For instance, anxiety may cause us to overestimate the chance of something being dangerous.
Emotional biases are linked to emotional dispositions (commonly known as ‘temperament’). Different emotional dispositions may even lead to different emotional reactions to the same events.
Emotional biases may help us explain optimism and pessimism biases.
Cognitive biases interfere with impartiality, and they can negatively impact critical thinking in a myriad of different ways. Here are several:
Motivated reasoning leads us to underestimate the arguments for conclusions we don’t believe in and overestimate the arguments for conclusions we want to believe;
Availability bias messes with our critical thinking because it leads us to assess risk by how readily examples come to mind, rather than by considering all of the relevant examples;
We are also prone to blind spot bias, meaning that we are less likely to identify biases in our own judgment than in other people's.
Cognitive biases affect decision-making in at least two ways: they help decision-making by speeding it up and cutting corners when we have limited time or cognitive power, but they also hinder decision-making by causing us to come to false conclusions or take unhelpful actions in certain cases.
Research has shown some correlation between gender or sex and specific biases. For instance, researchers found that male investors tend to show greater overconfidence and optimism biases, while female investors tend to exhibit more anchoring and hindsight biases. The research makes no claims about what causes such gendered differences - e.g., socialization or biology or a mix of both.
Gender stereotypes are explicit biases, which means they are not cognitive biases. However, there are many cognitive biases that involve gender stereotypes. For example, masculine bias is the tendency to assume a person is male based on stereotypes after hearing gender-neutral information about them, and the tendency to use gender as a descriptor only when describing women.
Gender stereotypes are also a sign of binary thinking.
Research has shown some cognitive biases are correlated with depression. This has been found to be the case for negative interpretation bias (the tendency to interpret ambiguous scenarios as negative) and pessimistic biases, which lead people to predict future situations as unrealistically negative.
Cognitive behavioral therapy is based on the assumption that individuals with depression have distorted negative beliefs about themselves or the world (known in CBT as "cognitive distortions").
Yes. They have been studied since the early 1970s by cognitive psychologists, sociologists, and behavioral economists.
Just like every other human being, scientists can exhibit cognitive biases. They may exhibit overconfidence bias or fall prey to selection biases, for example. This has been researched as it relates to the replication crisis social psychology faces today.
There is even research on the presence of cognitive biases in scientific contexts and occurring within academic publications. Nobody, not even scientists, is immune to cognitive biases!
Both. We are born with a tendency for some cognitive biases, but we can also learn specific aspects of these biases. Our brains have evolved to be prone to all sorts of cognitive biases because those biases have been helpful in the survival of our ancestors in the environment (and under the constraints) in which they lived.
But the details of some specific cognitive biases are learned as we move through the world. For example, humans have evolved a tendency to engage in motivated reasoning, but which conclusions motivate your reasoning is not something you are born with; it is shaped by your experiences and learning.
Want to understand cognitive biases on a deeper level? Learn about a few of the mind's mistakes with our interactive introduction to cognitive biases!
Have you ever been so busy talking on the phone that you don’t notice the light has turned green and it is your turn to cross the street?
Have you ever shouted, “I knew that was going to happen!” after your favorite baseball team gave up a huge lead in the ninth inning and lost?
Or have you ever found yourself only reading news stories that further support your opinion?
These are just a few of the many instances of cognitive bias that we experience every day of our lives. But before we dive into these different biases, let’s backtrack first and define what bias is.
Cognitive bias is a systematic error in thinking, affecting how we process information, perceive others, and make decisions. It can lead to irrational thoughts or judgments and is often based on our perceptions, memories, or individual and societal beliefs.
Biases are unconscious and automatic processes designed to make decision-making quicker and more efficient. Cognitive biases can be caused by many things, such as heuristics (mental shortcuts) , social pressures, and emotions.
Broadly speaking, bias is a tendency to lean in favor of or against a person, group, idea, or thing, usually in an unfair way. Biases are natural — they are a product of human nature — and they don’t simply exist in a vacuum or in our minds — they affect the way we make decisions and act.
In psychology, there are two main branches of biases: conscious and unconscious. Conscious or explicit bias is intentional — you are aware of your attitudes and the behaviors resulting from them (Lang, 2019).
Explicit bias can be good because it helps provide you with a sense of identity and can lead you to make good decisions (for example, being biased towards healthy foods).
However, these biases can often be dangerous when they take the form of conscious stereotyping.
On the other hand, unconscious bias , or cognitive bias, represents a set of unintentional biases — you are unaware of your attitudes and behaviors resulting from them (Lang, 2019).
Cognitive bias is often a result of your brain’s attempt to simplify information processing: we receive roughly 11 million bits of information per second but can only process about 40 bits per second (Orzan et al., 2012).
Therefore, we often rely on mental shortcuts (called heuristics) to help make sense of the world with relative speed. As such, these errors tend to arise from problems related to thinking: memory, attention, and other mental mistakes.
Cognitive biases can be beneficial because they do not require much mental effort and can allow you to make decisions relatively quickly, but like conscious biases, unconscious biases can also take the form of harmful prejudice that serves to hurt an individual or a group.
Although it may feel like there has been a recent rise of unconscious bias, especially in the context of police brutality and the Black Lives Matter movement, this is not a new phenomenon.
Thanks to Tversky and Kahneman (and several other psychologists who have paved the way), we now have an existing dictionary of our cognitive biases.
Again, these biases occur as an attempt to simplify the complex world and make information processing faster and easier. This section will dive into some of the most common forms of cognitive bias.
Confirmation bias is the tendency to interpret new information as confirmation of your preexisting beliefs and opinions while giving disproportionately less consideration to alternative possibilities.
Since Wason’s 1960 experiment, real-world examples of confirmation bias have gained attention.
This bias often seeps into the research world when psychologists selectively interpret data or ignore unfavorable data to produce results that support their initial hypothesis.
Confirmation bias is also incredibly pervasive on the internet, particularly with social media. We tend to read online news articles that support our beliefs and fail to seek out sources that challenge them.
Various social media platforms, such as Facebook, help reinforce our confirmation bias by feeding us stories that we are likely to agree with – further pushing us down these echo chambers of political polarization.
Some examples of confirmation bias are especially harmful, specifically in the context of the law. For example, a detective may identify a suspect early in an investigation, seek out confirming evidence, and downplay falsifying evidence.
Empirical study of confirmation bias dates back to 1960, when Peter Wason challenged participants to identify a rule applying to triples of numbers.
People were first told that the sequence 2, 4, 6 fit the rule, and they then had to generate triples of their own and were told whether each sequence fit the rule. The rule was simple: any ascending sequence.
Not only did participants have an unusually difficult time discovering this rule, instead devising overly complicated hypotheses, but they also generated only triples that confirmed their preexisting hypothesis (Wason, 1960).
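To make the structure of the task concrete, here is a minimal Python sketch (an illustration of the task's logic, not Wason's actual materials):

```python
def fits_rule(a: int, b: int, c: int) -> bool:
    """Wason's actual rule: any ascending sequence."""
    return a < b < c

# A participant who hypothesizes "numbers increasing by 2" and tests
# only confirming triples hears "yes" every time and never learns that
# the hypothesis is too narrow:
for triple in [(2, 4, 6), (10, 12, 14), (100, 102, 104)]:
    print(triple, fits_rule(*triple))  # all True -- no falsification

# Only a triple that violates the hypothesis is informative: (1, 2, 3)
# also fits the real rule, which falsifies "increasing by 2".
print((1, 2, 3), fits_rule(1, 2, 3))  # True
```

The point the sketch makes is the one Wason made: testing only cases that agree with your hypothesis can never reveal that the hypothesis is wrong.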
But why does confirmation bias occur? It’s partially due to the effect of desire on our beliefs. In other words, certain desired conclusions (ones that support our beliefs) are more likely to be processed by the brain and labeled as true (Nickerson, 1998).
This motivational explanation is often coupled with a more cognitive theory.
The cognitive explanation argues that because our minds can only focus on one thing at a time, it is hard to parallel process (see information processing for more information) alternate hypotheses, so, as a result, we only process the information that aligns with our beliefs (Nickerson, 1998).
Another theory explains confirmation bias as a way of enhancing and protecting our self-esteem.
As with the self-serving bias (see more below), our minds choose to reinforce our preexisting ideas because being right helps preserve our sense of self-esteem, which is important for feeling secure in the world and maintaining positive relationships (Casad, 2019).
Although confirmation bias has obvious consequences, you can still work towards overcoming it by being open-minded and willing to look at situations from a different perspective than you might be used to (Luippold et al., 2015).
Even though this bias is unconscious, training your mind to become more flexible in its thought patterns will help mitigate the effects of this bias.
Hindsight bias refers to the tendency to perceive past events as more predictable than they actually were (Roese & Vohs, 2012). There are cognitive and motivational explanations for why we ascribe so much certainty to knowing the outcome of an event only once the event is completed.
When sports fans know the outcome of a game, they often question certain decisions coaches make that they otherwise would not have questioned or second-guessed.
And fans are also quick to remark that they knew their team was going to win or lose, but, of course, they only make this statement after their team actually did win or lose.
Although research studies have demonstrated that the hindsight bias isn’t necessarily mitigated by pure recognition of the bias (Pohl & Hell, 1996), you can still make a conscious effort to remind yourself that you can’t predict the future and to consider alternate explanations.
It’s important to do all we can to reduce this bias because when we are overly confident about our ability to predict outcomes, we might make future risky decisions that could have potentially dangerous outcomes.
Building on Tversky and Kahneman’s growing list of heuristics, researchers Baruch Fischhoff and Ruth Beyth-Marom (1975) were the first to directly investigate the hindsight bias in the empirical setting.
The team asked participants to judge the likelihood of several different outcomes of former U.S. president Richard Nixon’s visit to Beijing and Moscow.
After Nixon returned to the States, participants were asked to recall the likelihood of each outcome they had initially assigned.
Fischhoff and Beyth found that for events that actually occurred, participants greatly overestimated the initial likelihood they assigned to those events.
That same year, Fischhoff (1975) introduced a new method for testing the hindsight bias – one that researchers still use today.
Participants are given a short story with four possible outcomes, and they are told that one is true. When they are then asked to assign the likelihood of each specific outcome, they regularly assign a higher likelihood to whichever outcome they have been told is true, regardless of how likely it actually is.
But hindsight bias does not only exist in artificial settings. In 1993, Dorothee Dietrich and Matthew Olson asked college students to predict how the U.S. Senate would vote on the confirmation of Supreme Court nominee Clarence Thomas.
Before the vote, 58% of participants predicted that he would be confirmed, but after his actual confirmation, 78% of students said that they thought he would be approved – a prime example of the hindsight bias. And this form of bias extends beyond the research world.
From the cognitive perspective, hindsight bias may result from distortions of memories of what we knew or believed we knew before an event occurred (Inman, 2016).
It is easier to recall information that is consistent with our current knowledge, so our memories become warped in a way that agrees with what actually did happen.
Motivational explanations of the hindsight bias point to the fact that we are motivated to live in a predictable world (Inman, 2016).
When surprising outcomes arise, our expectations are violated, and we may experience negative reactions as a result. Thus, we rely on the hindsight bias to avoid these adverse responses to certain unanticipated events and reassure ourselves that we actually did know what was going to happen.
Self-serving bias is the tendency to take personal responsibility for positive outcomes and blame external factors for negative outcomes.
You would be right to ask how this is similar to the fundamental attribution error (Ross, 1977), which identifies our tendency to overemphasize internal factors for other people’s behavior while attributing external factors to our own.
The distinction is that the self-serving bias is concerned with valence (that is, how good or bad an event or situation is), and it applies only to events in which you are the actor.
In other words, if a driver cuts in front of you as the light turns green, the fundamental attribution error might cause you to think that they are a bad person and not consider the possibility that they were late for work.
On the other hand, the self-serving bias is exercised when you are the actor. In this example, you would be the driver cutting in front of the other car, which you would tell yourself is because you are late (an external attribution to a negative event) as opposed to it being because you are a bad person.
From sports to the workplace, self-serving bias is incredibly common. For example, athletes are quick to take responsibility for personal wins, attributing their successes to their hard work and mental toughness, but point to external factors, such as unfair calls or bad weather, when they lose (Allen et al., 2020).
In the workplace, people attribute internal factors when they are hired for a job but external factors when they are fired (Furnham, 1982). And in the office itself, workplace conflicts are given external attributions, while successes, whether a persuasive presentation or a promotion, are awarded internal explanations (Walther & Bazarova, 2007).
Additionally, self-serving bias is more prevalent in individualistic cultures, which place emphasis on self-esteem levels and individual goals, and it is less prevalent among individuals with depression (Mezulis et al., 2004), who are more likely to take responsibility for negative outcomes.
Overcoming this bias can be difficult because it is at the expense of our self-esteem. Nevertheless, practicing self-compassion – treating yourself with kindness even when you fall short or fail – can help reduce the self-serving bias (Neff, 2003).
The leading explanation for the self-serving bias is that it is a way of protecting our self-esteem (similar to one of the explanations for the confirmation bias).
We are quick to take credit for positive outcomes and divert the blame for negative ones to boost and preserve our individual ego, which is necessary for confidence and healthy relationships with others (Heider, 1982).
Another theory argues that self-serving bias occurs when surprising events arise. When certain outcomes run counter to our expectations, we ascribe external factors, but when outcomes are in line with our expectations, we attribute internal factors (Miller & Ross, 1975).
An extension of this theory asserts that we are naturally optimistic, so negative outcomes come as a surprise and receive external attributions as a result.
Anchoring bias is closely related to the decision-making process. It occurs when we rely too heavily on either pre-existing information or the first piece of information (the anchor) when making a decision.
For example, if you first see a T-shirt that costs $1,000 and then see a second one that costs $100, you’re more likely to see the second shirt as cheap than you would if the first shirt you saw cost $120. Here, the price of the first shirt influences how you view the second.
Sarah is looking to buy a used car. The first dealership she visits has a used sedan listed for $19,000. Sarah takes this initial listing price as an anchor and uses it to evaluate prices at other dealerships.
When she sees another similar used sedan priced at $18,000, that price seems like a good bargain compared to the $19,000 anchor price she saw first, even though the actual market value is closer to $16,000.
When Sarah finds a comparable used sedan priced at $15,500, she continues perceiving that price as cheap compared to her anchored reference price.
Ultimately, Sarah purchases the $18,000 sedan, overlooking that all of the prices seemed like bargains only in relation to the initial high anchor price.
The key element of anchoring bias here is that the first price Sarah saw served as her anchor: she evaluated every subsequent price relative to that initial $19,000 figure rather than relative to the car’s actual market value.
Multiple theories seek to explain the existence of this bias.
One theory, known as anchoring and adjustment, argues that once an anchor is established, people insufficiently adjust away from it to arrive at their final answer, and so their final guess or decision is closer to the anchor than it otherwise would have been (Tversky & Kahneman, 1992).
And when people experience a greater cognitive load (the amount of information the working memory can hold at any given time; for example, a difficult decision as opposed to an easy one), they are more susceptible to the effects of anchoring.
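To see what "insufficient adjustment" means concretely, here is a minimal Python sketch of the anchoring-and-adjustment idea. The adjustment factor of 0.6 is purely illustrative, not an empirical estimate, and the dollar figures reuse the hypothetical car-shopping example above:

```python
def anchored_estimate(anchor: float, evidence_value: float,
                      adjustment: float = 0.6) -> float:
    """Insufficient adjustment: move only part of the way from the
    anchor toward the value the evidence actually supports."""
    return anchor + adjustment * (evidence_value - anchor)

# Sarah's car search: a $19,000 anchor pulls her sense of a "fair price"
# above the ~$16,000 market value.
print(anchored_estimate(19_000, 16_000))  # 17200.0 -- closer to the anchor
```

The final estimate lands between the anchor and the evidence, biased toward the anchor, which is exactly the pattern the theory describes.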
Another theory, selective accessibility, holds that even though we assume the anchor is not a suitable answer (not a suitable price, going back to the shirt example), evaluating it makes anchor-consistent information more accessible, so when we assess the second stimulus (the second shirt), we look for ways in which it is similar to or different from the anchor, resulting in the anchoring effect (Mussweiler & Strack, 1999).
A final theory posits that providing an anchor changes someone’s attitudes to be more favorable to the anchor, which then biases future answers to have similar characteristics as the initial anchor.
Although there are many different theories for why we experience anchoring bias, they all agree that it affects our decisions in real ways (Wegener et al., 2001).
The first study that brought this bias to light was during one of Tversky and Kahneman’s (1974) initial experiments. They asked participants to compute the product of numbers 1-8 in five seconds, either as 1x2x3… or 8x7x6…
Participants did not have enough time to calculate the answer, so they had to estimate based on their first few calculations.
They found that those who computed the small multiplications first (i.e., 1x2x3…) gave a median estimate of 512, but those who computed the larger multiplications first gave a median estimate of 2,250 (although the actual answer is 40,320).
This demonstrates how the initial few calculations influenced the participants’ final answers.
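For concreteness, a few lines of Python reproduce the arithmetic behind the experiment (a sketch of the numbers involved, not of the study itself):

```python
from math import prod

# The true answer participants were asked to estimate in five seconds.
print(prod(range(1, 9)))  # 40320

# Partial products after the first three steps in each direction: these
# running totals act as anchors for the final estimate.
print(prod([1, 2, 3]))  # 6   -> median estimate in the study: 512
print(prod([8, 7, 6]))  # 336 -> median estimate in the study: 2,250
```

Starting from a small running total anchors estimates low; starting from a large one anchors them higher, though both groups fell far short of the true 40,320.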
Availability bias (also commonly referred to as the availability heuristic ) refers to the tendency to think that examples of things that readily come to mind are more common than what is actually the case.
In other words, information that comes to mind faster influences the decisions we make about the future. And just like with the hindsight bias, this bias is related to an error of memory.
But instead of being a memory fabrication, it is an overemphasis on a certain memory.
In the workplace, if someone is being considered for a promotion but their boss recalls one bad thing that happened years ago but left a lasting impression, that one event might have an outsized influence on the final decision.
Another common example is buying lottery tickets because the lifestyle and benefits of winning are more readily available in mind (and the potential emotions associated with winning or seeing other people win) than the complex probability calculation of actually winning the lottery (Cherry, 2019).
A final common example used to demonstrate the availability heuristic describes how seeing several television shows or news reports about shark attacks (or anything else sensationalized by the news, such as serial killers or plane crashes) might make you think that such incidents are relatively common, even though they are not.
Regardless, this thinking might make you less inclined to go in the water the next time you go to the beach (Cherry, 2019).
As with most cognitive biases, the best way to overcome them is by recognizing the bias and being more cognizant of your thoughts and decisions.
And because we fall victim to this bias when our brain relies on quick mental shortcuts in order to save time, slowing down our thinking and decision-making process is a crucial step to mitigating the effects of the availability heuristic.
Researchers think this bias occurs because the brain is constantly trying to minimize the effort necessary to make decisions, and so we rely on certain memories – ones that we can recall more easily – instead of having to endure the complicated task of calculating statistical probabilities.
Two main types of memories are easier to recall: 1) those that more closely align with the way we see the world and 2) those that evoke more emotion and leave a more lasting impression.
This first type of memory was identified in 1973, when Tversky and Kahneman, our cognitive bias pioneers, conducted a study in which they asked participants if more words begin with the letter K or if more words have K as their third letter.
Although many more words have K as their third letter, 70% of participants said that more words begin with K, because recalling words by their first letter is not only easier but also more closely aligns with the way we organize words in memory (we retrieve words by their first letter far more readily than by their third).
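If you want to check the underlying word counts for yourself, a short script over a wordlist will do it. The sketch below assumes a Unix system dictionary at /usr/share/dict/words (an assumption; any plain-text wordlist works, and exact counts vary by list). Note, too, that Tversky and Kahneman's claim concerned words in English text, not dictionary entries, so a given wordlist may or may not reproduce the ratio:

```python
# Count words starting with 'k' versus words with 'k' as the third letter.
# Path is an assumption: /usr/share/dict/words is common on Unix systems.
with open("/usr/share/dict/words") as f:
    words = [line.strip().lower() for line in f]

k_first = sum(1 for w in words if w.startswith("k"))
k_third = sum(1 for w in words if len(w) >= 3 and w[2] == "k")

print(f"K first: {k_first}, K third: {k_third}")
```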
In terms of the second type of memory, the same duo ran an experiment 10 years later, in 1983, in which half the participants were asked to guess the likelihood that a massive flood would occur somewhere in North America, and the other half had to guess the likelihood of a flood occurring due to an earthquake in California.
Although the latter is much less likely, participants still said that this would be much more common because they could recall specific, emotionally charged events of earthquakes hitting California, largely due to the news coverage they receive.
Together, these studies highlight how memories that are easier to recall greatly influence our judgments and perceptions about future events.
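The flood study is Tversky and Kahneman’s (1983) conjunction fallacy demonstration, and the normative point behind it can be written as a one-line probability statement: a conjunction is never more probable than either of its conjuncts. Spelled out for the flood scenario:

$$P(\text{flood in California caused by an earthquake}) \le P(\text{flood somewhere in North America})$$

since every earthquake-caused California flood is also a flood somewhere in North America. Rating the specific, vivid scenario as more likely violates this rule.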
A final popular form of cognitive bias is inattentional blindness . This occurs when a person fails to notice a stimulus that is in plain sight because their attention is directed elsewhere.
For example, while driving a car, you might be so focused on the road ahead of you that you completely fail to notice a car swerve into your lane of traffic.
Because your attention is directed elsewhere, you aren’t able to react in time, potentially leading to a car accident. Experiencing inattentional blindness has its obvious consequences (as illustrated by this example), but, like all biases, it is not impossible to overcome.
Many theories seek to explain why we experience this form of cognitive bias. In reality, it is probably some combination of these explanations.
Conspicuity holds that certain sensory stimuli (such as bright colors) and cognitive stimuli (such as something familiar) are more likely to be processed, and so stimuli that don’t fit into one of these two categories might be missed.
The mental workload theory describes how when we focus a lot of our brain’s mental energy on one stimulus, we are using up our cognitive resources and won’t be able to process another stimulus simultaneously.
Similarly, some psychologists explain how we attend to different stimuli with varying levels of attentional capacity, which might affect our ability to process multiple stimuli simultaneously.
In other words, an experienced driver might be able to see that car swerve into the lane because they are using fewer mental resources to drive, whereas a beginner driver might be using more resources to focus on the road ahead and be unable to process the car swerving in.
A final explanation argues that because our attentional and processing resources are limited, our brain dedicates them to what fits into our schemas or our cognitive representations of the world (Cherry, 2020).
Thus, when an unexpected stimulus comes into our line of sight, we might not be able to process it on the conscious level. The following example illustrates how this might happen.
The most famous study to demonstrate the inattentional blindness phenomenon is the invisible gorilla study (Most et al., 2001). This experiment asked participants to watch a video of two groups passing a basketball and count how many times the white team passed the ball.
Participants are able to accurately report the number of passes, but what they fail to notice is a gorilla walking directly through the middle of the circle.
Because this would not be expected, and because our brain is using up its resources to count the number of passes, we completely fail to process something right before our eyes.
A real-world example of inattentional blindness occurred in 1995 when Boston police officer Kenny Conley was chasing a suspect and ran by a group of officers who were mistakenly holding down an undercover cop.
Conley was convicted of perjury and obstruction of justice because he supposedly saw the fight between the undercover cop and the other officers and lied about it to protect the officers, but he stood by his word that he really hadn’t seen it (due to inattentional blindness) and was ultimately exonerated (Pickel, 2015).
The key to overcoming inattentional blindness is to maximize your attention by avoiding distractions such as checking your phone. And it is also important to pay attention to what other people might not notice (if you are that driver, don’t always assume that others can see you).
By working on expanding your attention and minimizing unnecessary distractions that will use up your mental resources, you can work towards overcoming this bias.
As we know, recognizing these biases is the first step to overcoming them. But there are other small strategies we can follow in order to train our unconscious mind to think in different ways.
From strengthening our memory and minimizing distractions to slowing down our decision-making and improving our reasoning skills, we can work towards overcoming these cognitive biases.
An individual can evaluate his or her own thought process, also known as metacognition (“thinking about thinking”), which provides an opportunity to combat bias (Flavell, 1979).
This multifactorial process involves (Croskerry, 2003):
(a) acknowledging the limitations of memory, (b) seeking perspective while making decisions, (c) being able to self-critique, and (d) choosing strategies to prevent cognitive error.
Many strategies used to avoid bias that we describe are also known as cognitive forcing strategies, which are mental tools used to force unbiased decision-making.
The term cognitive bias was first coined in the 1970s by Israeli psychologists Amos Tversky and Daniel Kahneman, who used this phrase to describe people’s flawed thinking patterns in response to judgment and decision problems (Tversky & Kahneman, 1974).
Tversky and Kahneman’s research program, the heuristics and biases program, investigated how people make decisions given limited resources (for example, limited time to decide which food to eat or limited information to decide which house to buy).
As a result of these limited resources, people are forced to rely on heuristics or quick mental shortcuts to help make their decisions.
Tversky and Kahneman wanted to understand the biases associated with this judgment and decision-making process.
To do so, the two researchers relied on a research paradigm that presented participants with some type of reasoning problem with a computed normative answer (they used probability theory and statistics to compute the expected answer).
Participants’ responses were then compared with the predetermined solution to reveal the systematic deviations in the mind.
After running several experiments with countless reasoning problems, the researchers were able to identify numerous norm violations that result when our minds rely on these cognitive biases to make decisions and judgments (Wilke & Mata, 2012).
Allen, M. S., Robson, D. A., Martin, L. J., & Laborde, S. (2020). Systematic review and meta-analysis of self-serving attribution biases in the competitive context of organized sport. Personality and Social Psychology Bulletin, 46 (7), 1027-1043.
Casad, B. (2019). Confirmation bias . Retrieved from https://www.britannica.com/science/confirmation-bias
Cherry, K. (2019). How the availability heuristic affects your decision-making . Retrieved from https://www.verywellmind.com/availability-heuristic-2794824
Cherry, K. (2020). Inattentional blindness can cause you to miss things in front of you . Retrieved from https://www.verywellmind.com/what-is-inattentional-blindness-2795020
Dietrich, D., & Olson, M. (1993). A demonstration of hindsight bias using the Thomas confirmation vote. Psychological Reports, 72 (2), 377-378.
Fischhoff, B. (1975). Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1 (3), 288.
Fischhoff, B., & Beyth, R. (1975). I knew it would happen: Remembered probabilities of once—future things. Organizational Behavior and Human Performance, 13 (1), 1-16.
Furnham, A. (1982). Explanations for unemployment in Britain. European Journal of Social Psychology, 12 (4), 335-352.
Heider, F. (1982). The psychology of interpersonal relations . Psychology Press.
Inman, M. (2016). Hindsight bias . Retrieved from https://www.britannica.com/topic/hindsight-bias
Lang, R. (2019). What is the difference between conscious and unconscious bias? : Faqs. Retrieved from https://engageinlearning.com/faq/compliance/unconscious-bias/what-is-the-difference-between-conscious-and-unconscious-bias/
Luippold, B., Perreault, S., & Wainberg, J. (2015). Auditor’s pitfall: Five ways to overcome confirmation bias . Retrieved from https://www.babson.edu/academics/executive-education/babson-insight/finance-and-accounting/auditors-pitfall-five-ways-to-overcome-confirmation-bias/
Mezulis, A. H., Abramson, L. Y., Hyde, J. S., & Hankin, B. L. (2004). Is there a universal positivity bias in attributions? A meta-analytic review of individual, developmental, and cultural differences in the self-serving attributional bias. Psychological Bulletin, 130 (5), 711.
Miller, D. T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction?. Psychological Bulletin, 82 (2), 213.
Most, S. B., Simons, D. J., Scholl, B. J., Jimenez, R., Clifford, E., & Chabris, C. F. (2001). How not to be seen: The contribution of similarity and selective ignoring to sustained inattentional blindness. Psychological Science, 12 (1), 9-17.
Mussweiler, T., & Strack, F. (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm: A selective accessibility model. Journal of Experimental Social Psychology, 35 (2), 136-164.
Neff, K. (2003). Self-compassion: An alternative conceptualization of a healthy attitude toward oneself. Self and Identity, 2 (2), 85-101.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2 (2), 175-220.
Orzan, G., Zara, I. A., & Purcarea, V. L. (2012). Neuromarketing techniques in pharmaceutical drugs advertising. A discussion and agenda for future research. Journal of Medicine and Life, 5 (4), 428.
Pickel, K. L. (2015). Eyewitness memory. The handbook of attention , 485-502.
Pohl, R. F., & Hell, W. (1996). No reduction in hindsight bias after complete information and repeated testing. Organizational Behavior and Human Decision Processes, 67 (1), 49-58.
Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7 (5), 411-426.
Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In Advances in experimental social psychology (Vol. 10, pp. 173-220). Academic Press.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5 (2), 207-232.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185 (4157), 1124-1131.
Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review , 90(4), 293.
Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4), 297-323.
Walther, J. B., & Bazarova, N. N. (2007). Misattribution in virtual groups: The effects of member distribution on self-serving bias and partner blame. Human Communication Research, 33 (1), 1-26.
Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12 (3), 129-140.
Wegener, D. T., Petty, R. E., Detweiler-Bedell, B. T., & Jarvis, W. B. G. (2001). Implications of attitude change theories for numerical anchoring: Anchor plausibility and the limits of anchor effectiveness. Journal of Experimental Social Psychology, 37 (1), 62-69.
Wilke, A., & Mata, R. (2012). Cognitive bias. In Encyclopedia of human behavior (pp. 531-535). Academic Press.
by Terry Heick
Cognitive biases are a kind of ongoing cognitive ‘condition’: systematic tendencies to selectively search for and interpret data, often in ways that confirm one’s existing beliefs.
A cognitive bias is an inherent thinking ‘blind spot’ that reduces thinking accuracy and results in inaccurate, and often irrational, conclusions.
Much like logical fallacies, cognitive biases can be viewed as either causes or effects, but they can generally be reduced to broken thinking. Not all ‘broken thinking,’ blind spots, and failures of thought are labeled, of course. But some are so common that they are given names, and once named, they’re easier to identify, emphasize, analyze, and ultimately avoid.
See also The Difference Between Logical Fallacies And Cognitive Biases
And that’s where this list comes in.
Cognitive Bias –> Confirmation Bias
For example, consider confirmation bias.
In What Is Confirmation Bias? we looked at this very common thinking mistake: the tendency to overvalue data and observation that fits with our existing beliefs.
The pattern is to form a theory (often based on emotion) supported with insufficient data, and then to restrict critical thinking and ongoing analysis, which is, of course, irrational. Instead, you look for data that fits your theory.
While it seems obvious enough to avoid, confirmation bias is a particularly sinister cognitive bias, affecting not just intellectual debates but relationships, personal finances, and even your physical and mental health. Racism and sexism, for example, can both be deepened by confirmation bias. If you have an opinion on gender roles, it can be tempting to look for ‘data’ from your daily life that reinforces your opinion on those roles.
This is, of course, all much more complex than the above thumbnail. The larger point, however, is that a failure of rational and critical thinking is not just ‘wrong’ but erosive, even toxic, not only in academia but at every level of society.
See also Complete List Of Logical Fallacies With Examples
The Cognitive Bias Codex: A Visual Of 180+ Cognitive Biases
And that’s why a graphic like this is so extraordinary. In a single image, it delineates dozens and dozens of these ‘bad cognitive patterns,’ underscoring how commonly our thinking fails us and, as a result, where we might begin to improve. Why and how to accomplish this in a modern circumstance is at the core of TeachThought’s mission.
The graphic is structured as a circle with four quadrants categorizing the cognitive biases into four categories:
1. Too Much Information
2. Not Enough Meaning
3. Need To Act Fast
4. What Should We Remember?
We’ve listed each bias below, moving clockwise from ‘Too Much Information’ to ‘What Should We Remember?’ Obviously, this list isn’t exhaustive–and there are even subjectivities and cultural biases embedded within (down to some of the biases themselves–the ‘IKEA effect,’ for example). The premise, though, remains intact: What are our most common failures of rational and critical thinking, and how can we avoid them in pursuit of academic and sociocultural progress?
So take a look and let me know what you think. There’s even an updated version of this graphic with all of the definitions for each of the biases–which I personally love, but is difficult to read.
Image description: Wikipedia’s complete (as of 2021) list of cognitive biases, arranged and designed by John Manoogian III. Categories and descriptions originally by Buster Benson.
Too Much Information
We notice things already primed in memory or repeated often
Availability heuristic
Attentional bias
Illusory truth effect
Mere exposure effect
Context effect
Cue-dependent forgetting
Mood-congruent memory bias
Frequency illusion
Baader-Meinhof Phenomenon
Empathy gap
Omission bias
Base rate fallacy
Bizarre, funny, visually-striking, or anthropomorphic things stick out more than non-bizarre/unfunny things
Bizarreness effect
Humor effect
Von Restorff effect
Picture superiority effect
Self-relevance effect
Negativity bias
We notice when something has changed
Conservatism
Contrast effect
Distinction effect
Focusing effect
Framing effect
Money illusion
Weber-Fechner law
We are drawn to details that confirm our own existing beliefs
Confirmation bias
Congruence bias
Post-purchase rationalization
Choice-supportive bias
Selective perception
Observer-expectancy effect
Experimenter’s bias
Observer effect
Expectation bias
Ostrich effect
Subjective validation
Continued influence effect
Semmelweis reflex
We notice flaws in others more easily than we notice flaws in ourselves
Bias blind spot
Naive cynicism
Naive realism
Not Enough Meaning
We tend to find stories and patterns even when looking at sparse data
Confabulation
Clustering illusion
Insensitivity to sample size
Neglect of probability
Anecdotal fallacy
Illusion of validity
Masked man fallacy
Recency illusion
Gambler’s fallacy
Illusory correlation
Anthropomorphism
We fill in characteristics from stereotypes, generalities, and prior histories
Group attribution error
Ultimate attribution error
Stereotyping
Essentialism
Functional fixedness
Moral credential effect
Just-world hypothesis
Argument from fallacy
Authority bias
Automation bias
Bandwagon effect
Placebo effect
We imagine things and people we’re familiar with or fond of as better
Out-group homogeneity bias
Cross-race effect
In-group bias
Halo effect
Cheerleader effect
Positivity effect
Not invented here
Reactive devaluation
Well-traveled road effect
We simplify probabilities and numbers to make them easier to think about
Mental accounting
Appeal to probability fallacy
Normalcy bias
Murphy’s Law
Zero-sum bias
Survivorship bias
Subadditivity effect
Denomination effect
Magic number 7±2
We think we know what other people are thinking
Illusion of transparency
Curse of knowledge
Spotlight effect
Extrinsic incentive error
Illusion of external agency
Illusion of asymmetric insight
We project our current mindset and assumptions onto the past and future
Self-consistency bias
Restraint bias
Projection bias
Pro-innovation bias
Time-saving bias
Planning fallacy
Pessimism bias
Impact bias
Outcome bias
Hindsight bias
Rosy retrospection
Telescoping effect
Need To Act Fast
We favor simple-looking options and complete information over complex, ambiguous options
Less-is-better effect
Occam’s razor
Conjunction fallacy
Delmore effect
Law of Triviality
Bike-shedding effect
Rhyme as reason effect
Belief bias
Information bias
Ambiguity bias
To avoid mistakes, we aim to preserve autonomy and group status and avoid irreversible decisions
Status quo bias
Social comparison bias
Decoy effect
Reverse psychology
System justification
To get things done, we tend to complete things we’ve invested time and energy in
Backfire effect
Endowment effect
Processing difficulty effect
Pseudocertainty effect
Disposition effect
Zero-risk bias
IKEA effect
Loss aversion
Generation effect
Escalation of commitment
Irrational escalation
Sunk cost fallacy
To stay focused, we favor the immediate, relatable thing in front of us
Identifiable victim effect
Appeal to novelty
Hyperbolic discounting
To act, we must be confident we can make an impact and feel what we do is important
Peltzman effect
Risk compensation
Effort Justification
Trait ascription bias
Defensive attribution hypothesis
Fundamental attribution error
Illusory superiority
Illusion of control
Actor-observer bias
Self-serving bias
Barnum effect
Forer effect
Optimism bias
Egocentric bias
Dunning-Kruger effect
Lake Wobegon effect
Hard-easy effect
False consensus effect
Third-person effect
Social desirability bias
Overconfidence effect
What Should We Remember?
We store memories differently based on how they are experienced
Tip of the tongue phenomenon
Google effect
Next-in-line effect
Testing effect
Absent-mindedness
Levels of processing effect
We reduce events and lists to their key elements
Suffix effect
Serial position effect
Part-list cueing effect
Recency effect
Primacy effect
Memory inhibition
Modality effect
Duration neglect
List-length effect
Serial recall effect
Misinformation effect
Leveling and sharpening
Peak-end rule
We discard specifics to form generalities
Fading affect bias
Stereotypical bias
Implicit stereotypes
Implicit association
We edit and reinforce some memories after the fact
Spacing effect
Suggestibility
False memory
Cryptomnesia
Source confusion
Misattribution of memory
Creative Commons Attribution: Share-Alike
Founder & Director of TeachThought
Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.
Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as
active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)
and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.
In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.
Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment. Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.
For details on this history, see the Supplement on History.
Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.
Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.
Transit: “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)
Ferryboat: “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.
“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.
“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)
Bubbles: “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).
Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.
Weather: A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).
Disorder: A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).
Typhoid: A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).
Blur: A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).
Suction pump: In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).
Diamond: A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.
Rash: A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.
Candidate: Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as
a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)
A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.
Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.
What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as
a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)
Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features:
1. It is done for the purpose of making up one’s mind about what to believe or do.
2. The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
3. The thinking fulfills the relevant standards to some threshold level.
One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.
If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal.

As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b).

As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully.

As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.
In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.
Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).
Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.
Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:
1. suggestions, in which the mind leaps forward to a possible solution;
2. an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
3. the use of one suggestion after another as a leading idea, or hypothesis, to initiate and guide observation and other operations in collection of factual material;
4. the mental elaboration of the idea or supposition (reasoning, in the sense in which reasoning is a part, not the whole, of inference); and
5. testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107)
The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).
The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once (cf. Hitchcock 2017: 485). These component events might include:
1. noticing a difficulty,
2. defining the problem,
3. dividing the problem into manageable sub-problems,
4. formulating a variety of possible solutions to the problem or sub-problem,
5. determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem,
6. devising a plan of systematic observation or experiment that will uncover the relevant evidence,
7. carrying out the plan of systematic observation or experimentation,
8. noting the results of the systematic observation or experiment,
9. gathering relevant testimony and information from others,
10. judging the credibility of testimony and information gathered from others,
11. drawing conclusions from gathered evidence and accepted testimony, and
12. accepting a solution that the evidence adequately supports.
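To make the checklist conception concrete, here is a minimal sketch in Python (an illustration of the list above, not a formalism drawn from Dewey or Hitchcock) that treats the twelve component events as a repertoire a thinker may draw on selectively, in varying orders, and more than once:

```python
# The twelve component events listed above, as a simple checklist.
CHECKLIST = (
    "noticing a difficulty",
    "defining the problem",
    "dividing the problem into manageable sub-problems",
    "formulating a variety of possible solutions",
    "determining what evidence is relevant",
    "devising a plan of systematic observation or experiment",
    "carrying out the plan",
    "noting the results",
    "gathering relevant testimony and information from others",
    "judging the credibility of that testimony and information",
    "drawing conclusions from evidence and accepted testimony",
    "accepting a solution that the evidence adequately supports",
)

# A hypothetical trajectory through the Rash example: the events occur
# selectively, out of order, and with repetition (event 4 recurs),
# unlike a fixed five-step sequence.
trajectory = [1, 2, 4, 5, 4, 11, 12]
for event_number in trajectory:
    print(f"{event_number:2d}. {CHECKLIST[event_number - 1]}")
```

The sketch’s only point is structural: the events form a repertoire rather than an invariant sequence, which is what distinguishes the checklist conception from a phase analysis.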
Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.
If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.
By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.
Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.
Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.
Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).
Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).
On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.
A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.
Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.
Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.
Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions.
Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit, has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.
Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in (Glaser 1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).
The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5. The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.
Observational abilities: Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.
Emotional abilities: The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.
Questioning abilities: A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).
Imaginative abilities: Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.
Inferential abilities: The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit, Ferryboat and Disorder), others from something observed (as in Weather and Rash). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). Items testing inferential abilities constitute two of the five subtests of the Watson–Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).
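The contrast drawn here between formal validity and substantive licensing can be made vivid with a toy truth-table checker (a brute-force sketch offered purely for illustration; it is not drawn from any of the cited tests):

```python
from itertools import product

def formally_valid(premises, conclusion, n_vars=2):
    """Return True if the conclusion is true in every assignment of
    truth values under which all the premises are true."""
    for values in product([True, False], repeat=n_vars):
        if all(premise(*values) for premise in premises) and not conclusion(*values):
            return False
    return True

# The material conditional "if p then q".
conditional = lambda p, q: (not p) or q

# Modus ponens ("if p then q; p; therefore q") is formally valid.
print(formally_valid([conditional, lambda p, q: p], lambda p, q: q))  # True

# Affirming the consequent ("if p then q; q; therefore p") is not.
print(formally_valid([conditional, lambda p, q: q], lambda p, q: p))  # False
```

The inferences in Transit or Rash would fail such a test; their strength rests on substantive, domain-dependent rules (for example, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it), which no truth table can certify.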
Experimenting abilities: Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash. Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.
Consulting abilities: Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate. Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).
Argument analysis abilities: The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate. The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.
Judging skills and deciding skills: Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.
Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.
In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.
We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).
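The distinction between statistical significance and importance mentioned above lends itself to a numerical illustration. The following sketch (using SciPy’s ttest_ind on simulated data; the numbers are invented for illustration, not taken from any cited study) produces a difference that is highly significant yet trivially small:

```python
import random
from statistics import mean

from scipy.stats import ttest_ind

random.seed(0)
# Two large simulated groups whose true means differ by only half a
# point on a scale whose standard deviation is 15.
group_a = [random.gauss(100.0, 15.0) for _ in range(100_000)]
group_b = [random.gauss(100.5, 15.0) for _ in range(100_000)]

result = ttest_ind(group_a, group_b)
print(f"p-value: {result.pvalue:.1e}")  # far below 0.05: "significant"
print(f"mean difference: {mean(group_b) - mean(group_a):.2f}")  # ~0.5: trivial
```

With samples this large, almost any non-zero difference reaches statistical significance; whether half a point matters is a separate question of importance.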
According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).
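The noise audit that Kahneman, Sibony, and Sunstein recommend can likewise be sketched in a few lines. The ratings below are invented for illustration; the audit simply measures how much judges disagree when shown identical cases:

```python
from statistics import mean, pstdev

# Hypothetical 0-10 severity ratings of the same five case files by
# four judges; in a noise-free system each column would agree exactly.
ratings = {
    "judge_a": [7, 4, 8, 5, 6],
    "judge_b": [9, 6, 9, 7, 8],
    "judge_c": [5, 3, 6, 4, 5],
    "judge_d": [8, 5, 7, 6, 7],
}

for case_number, case in enumerate(zip(*ratings.values()), start=1):
    print(f"case {case_number}: mean {mean(case):.1f}, "
          f"between-judge spread (sd) {pstdev(case):.2f}")
```

Non-zero spread on the same evidence is noise in their sense: unwanted, unsystematic variability of judgments, distinct from a shared bias that would push every judge in the same direction.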
Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.
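The piece of background knowledge about gases invoked here can be stated compactly. The relation the student in Bubbles relies on is the ideal gas law (a standard physics fact, supplied here for illustration, not part of Dewey’s text):

```latex
PV = nRT \quad\Longrightarrow\quad V = \frac{nRT}{P}
```

Here n is the amount of gas and R is the gas constant; the volume V of an enclosed gas thus varies directly with its absolute temperature T and inversely with its pressure P, exactly the relation the student needed.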
Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment.
What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.
Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods.
Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.
McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).
McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.
The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.
It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.
Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favouring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995).
A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as
thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)
Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should
be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)
Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.
The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections.
A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.
What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint from problem solving and decision making, which are constructive.
Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History.
As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat , requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate , requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.
Learn how to identify and address bias in decision making with our guide to recognizing bias in problem solving and critical thinking.
In today's world, it is becoming increasingly important to recognize bias and how it can affect our decision-making. Bias can cloud our judgement, lead us to make decisions that are not in our best interests, and limit our ability to solve problems effectively. In this guide, we will explore the concept of recognizing bias and how it can be used as a tool for developing critical thinking and problem-solving skills. We will discuss the various types of biases, why recognizing them is important, and how to identify and counteract them.
Cognitive bias.
A cognitive bias is a systematic error in the way we process information and form judgments. This type of bias can lead to unfair judgments or decisions. Other common types of bias include cultural bias, which is the tendency to favor one’s own culture or group; and political bias, which is the tendency to favor one’s own political party or beliefs. In order to identify and address bias in oneself and others, it is important to be aware of potential sources of bias. This includes personal opinions, values, and preconceived notions. Being mindful of these potential sources of bias can help us become more aware of our own biases and recognize them in others.
It is also important to be open-minded and willing to consider alternative perspectives, and it is helpful to challenge our own assumptions and beliefs by questioning them and seeking out evidence that supports or refutes them. The potential implications of not recognizing or addressing bias are significant. If left unchecked, biases can lead to unfair decisions or judgments, as well as inaccurate conclusions. This can have serious consequences for individuals and organizations alike.
Strategies for identifying and addressing bias.
Recognizing bias in oneself and others is an important part of making informed decisions. There are several strategies that can be used to identify and address bias. One of the most effective strategies is to take a step back and look at the situation objectively. This involves examining the facts and assumptions that are being used to make decisions.
It can also involve assessing the potential impact of decisions on multiple stakeholders. By removing personal biases from the equation, it is possible to make more informed decisions. Another important strategy for identifying and addressing bias is to question the sources of information. It is important to consider the credibility of sources, as well as any potential biases that may be present.
Fact-checking sources and considering multiple perspectives can help identify any potential biases in the information being used. In addition, it is important to remain aware of our own biases. We all have preconceived notions about certain topics that can affect our decision-making process. By being mindful of our biases, we can avoid making decisions that are influenced by them. Finally, it is important to be open to other perspectives and willing to engage in meaningful dialogue with others.
What is bias.
Bias is a preference, often unconscious, that influences decision making and can lead to adverse outcomes. It is important to recognize bias because it can have a negative impact on our ability to make sound decisions and engage in problem solving and critical thinking. Bias can manifest itself in various ways, from subtle mental shortcuts to overt prejudices. Types of bias include confirmation bias, where we seek out information that confirms our existing beliefs; availability bias, where we base decisions on the information that is most readily available; and representativeness bias, where we assume that two events or objects are related because they share similar characteristics. Other forms of bias include the halo effect, where a single positive quality or trait can influence the perception of an entire person; and stereotyping, which is the tendency to make judgments about individuals based on their perceived membership in a certain group. It is important to recognize bias in ourselves and others so that we can make informed decisions and engage in problem solving and critical thinking.
Bias can have a profound effect on decisions, leading to outcomes that are not based on facts or evidence. Personal opinions and values can lead to biased decision-making. They can be shaped by past experiences, cultural background, and other personal factors. For example, someone's opinion about a certain topic may be based on what they have previously heard or read. Similarly, preconceived notions can also lead to biased conclusions. Cultural norms can also play a role in creating bias.
For instance, people may be more likely to believe information from a source they trust or respect, even if it is not based on fact. Similarly, people may be more likely to make decisions that conform to the expectations of their culture or society. In addition, people can also be influenced by their own prejudices or stereotypes. This type of bias can lead to unfair treatment of certain individuals or groups of people. Finally, it is important to be aware of the potential for confirmation bias, where people will seek out information that confirms their existing beliefs and disregard any contradictory evidence. By recognizing and understanding these sources of bias, people can make more informed decisions and engage in more effective problem solving and critical thinking.
In conclusion, recognizing and addressing bias is an essential part of problem solving and critical thinking. Bias can come from many sources, including our own beliefs, cultural norms, and past experiences. Knowing the types of bias and strategies for identifying and addressing them can help us make informed decisions and better engage in critical thinking. Taking time to reflect on our own biases is also important for making unbiased decisions.
Ultimately, recognizing and addressing bias will improve our problem-solving and critical thinking skills.
To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.
To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions, and it calls for a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.
This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.
To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.
Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. Working through a short checklist of these habits before each study session can help encourage metacognition while you study.
In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.
See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.
Confirmation bias.
One of the most common cognitive biases is confirmation bias , which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.
Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.
Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 6. Subjects were told to generate examples to test their hypothesis. What he found is that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
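Wason's finding can be restated computationally. The sketch below (in Python; the function names and test triples are illustrative inventions, not Wason's actual materials) shows why the confirming strategy fails: triples that a "doubling" hypothesis accepts also satisfy the true ascending rule, so testing them can never separate the two, while triples the hypothesis rejects are exactly the informative ones.

```python
def true_rule(seq):
    """Wason's hidden rule: any strictly ascending triple."""
    a, b, c = seq
    return a < b < c

def doubling_hypothesis(seq):
    """A typical subject's first guess: each number doubles the last."""
    a, b, c = seq
    return b == 2 * a and c == 2 * b

# A confirmation-style tester tries only triples the hypothesis accepts;
# a falsification-style tester also tries triples it rejects.
confirming = [(1, 2, 4), (3, 6, 12), (5, 10, 20)]
disconfirming = [(1, 2, 3), (2, 4, 7), (10, 20, 25)]

for seq in confirming + disconfirming:
    h, r = doubling_hypothesis(seq), true_rule(seq)
    # A test is informative only when hypothesis and rule disagree.
    print(seq, "hypothesis:", h, "rule:", r, "informative:", h != r)
```

All three confirming triples print informative: False; only the triples the hypothesis rejects expose the doubling guess as wrong.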
In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.
Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let’s say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.
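One simple way to think about that result is as a weighted average of an unanchored guess and the anchor. The toy model below is an illustration, not Tversky and Kahneman's model; the anchor weight, noise level, and unanchored guess of 35 are assumptions chosen only so the simulated medians land near the reported 25 and 45.

```python
import random
import statistics

def anchored_estimate(unanchored_guess, anchor, anchor_weight=0.35, noise_sd=5.0):
    """Toy model: the reported estimate is pulled toward the anchor in
    proportion to anchor_weight, plus individual noise. Both parameters
    are illustrative assumptions, not fitted values."""
    pulled = (1 - anchor_weight) * unanchored_guess + anchor_weight * anchor
    return pulled + random.gauss(0, noise_sd)

random.seed(1)
for anchor in (10, 65):
    estimates = [anchored_estimate(35, anchor) for _ in range(1001)]
    print(f"anchor {anchor}: median estimate {statistics.median(estimates):.0f}")
    # prints roughly 26 for the low anchor and 46 for the high one
```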
In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.
The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.
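The skew is easy to reproduce with a toy sampling model. In the sketch below, all numbers are invented for illustration (they are not real mortality or coverage statistics): memory is modeled as sampling stories in proportion to base rate times media coverage, so heavily covered but rare causes loom far larger than they should.

```python
import random
from collections import Counter

random.seed(0)

# Invented relative frequencies of causes of death (not real statistics).
true_rates = {"heart disease": 0.60, "cancer": 0.30, "car accident": 0.08,
              "violent crime": 0.015, "terrorism": 0.005}
# Invented media weights: how likely each cause is to make a vivid story.
coverage = {"heart disease": 0.2, "cancer": 0.5, "car accident": 1.0,
            "violent crime": 20.0, "terrorism": 50.0}

# Memory samples stories, not reality: recall is proportional to
# base rate times coverage.
weights = [true_rates[c] * coverage[c] for c in true_rates]
recalled = Counter(random.choices(list(true_rates), weights=weights, k=10_000))

for cause in true_rates:
    perceived = recalled[cause] / 10_000
    print(f"{cause:14} true {true_rates[cause]:.3f}  perceived {perceived:.3f}")
```

In this toy run, terrorism accounts for half a percent of the "true" deaths but over a quarter of what gets recalled, which is the availability distortion in miniature.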
Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.
Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true.
Sunk cost fallacy.
Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is the tendency to attach more value to things you have already invested resources in than those things actually have today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad”: continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don’t want to lose their investment even when they are clearly headed for failure and ought to cut their losses.
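Stated as a decision rule, avoiding the fallacy is simple: only prospective costs and benefits belong in the comparison. A minimal sketch, using hypothetical numbers for the failing business above:

```python
def should_continue(future_benefit, future_cost, sunk_cost=0):
    """Rational rule: compare only what lies ahead. sunk_cost is taken
    as an argument precisely to show it plays no role in the answer."""
    return future_benefit > future_cost

sunk = 200_000           # hypothetical: already spent, unrecoverable either way
future_benefit = 40_000  # hypothetical: expected gain from one more year
future_cost = 90_000     # hypothetical: expected expenses for one more year

print(should_continue(future_benefit, future_cost, sunk))  # False: cut losses
# Committing the sunk cost fallacy amounts to letting `sunk` flip this answer.
```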
A similar type of faulty reasoning leads to the gambler’s fallacy , in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
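A short Monte Carlo check makes the independence point concrete. The sketch below simulates a million fair coin flips and asks how often heads follows a run of five heads; the gambler's fallacy predicts less than half, but the simulation stays at about 0.5.

```python
import random

random.seed(42)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

streaks = heads_after = 0
for i in range(5, len(flips)):
    if all(flips[i - 5:i]):      # the previous five flips were all heads
        streaks += 1
        heads_after += flips[i]  # True counts as 1
print(heads_after / streaks)     # approximately 0.5, despite the streak
```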
There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.
In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.
Table 2.1 summarizes these common cognitive biases.
Bias | Description | Example |
---|---|---|
Confirmation bias | The tendency to search for, interpret, favor, and recall information that confirms or supports prior beliefs | As part of their morning routine, a person scans news headlines on the internet and chooses to read only those stories that confirm views they already hold. |
Anchoring bias | The tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something | When supplied with a random number and then asked to provide a number estimate in response to a question, people supply a number close to the random number they were initially given. |
Availability heuristic | The tendency to evaluate new information based on the most recent or most easily recalled examples | People in the United States overestimate the probability of dying in a criminal attack, since these types of stories are easy to vividly recall. |
Tribalism | The tendency for human beings to align themselves with groups with whom they share values and practices | People with a strong commitment to one political party often struggle to objectively evaluate the political positions of those who are members of the opposing party. |
Bandwagon fallacy | The tendency to do something or believe something because many other people do or believe the same thing | Advertisers often rely on the bandwagon fallacy, attempting to create the impression that “everyone” is buying a new product, in order to inspire others to buy it. |
Sunk cost fallacy | The tendency to attach a value to things in which resources have been invested that is greater than the value those things actually have | A business person continues to invest money in a failing venture, “throwing good money after bad.” |
Gambler’s fallacy | The tendency to reason that future chance events will be more likely if they have not happened recently | Someone who regularly buys lottery tickets reasons that they are “due to win,” since they haven’t won once in twenty years. |
As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?
Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?
Make sure that the decisions that matter are not made based on bias.
Though the concept of illusory superiority arguably dates back to Confucius and Socrates, it may come as a shock that its discussion in the form of the Dunning-Kruger Effect is almost 20 years old; and though it may simply be a result of an echo chamber created through my own social media, it seems to be popping up quite frequently in the news and posts that I’ve been reading lately—even through memes. For those of you unfamiliar with the phenomenon, the Dunning-Kruger Effect refers to a cognitive bias in which individuals with a low level of knowledge in a particular subject mistakenly assess their knowledge or ability as greater than it is. Similarly, it also refers to experts underestimating their own level of knowledge or ability.
But, then again, maybe it’s not my echo chamber—maybe it is part and parcel of our new knowledge economy (Dwyer, 2017; Dwyer, Hogan & Stewart, 2014) and the manner in which we quickly and effortlessly process information (right or wrong) with the help of the internet. In any case, given the frequency with which I seem to have encountered mention of this cognitive bias lately, coupled with the interest in my previous blog post “18 Common Logical Fallacies and Persuasion Techniques,” I decided it might be interesting to compile a similar list—this time, one of cognitive biases.
A cognitive bias refers to a “systematic error” in the thinking process. Such biases are often connected to a heuristic, which is essentially a mental shortcut—heuristics allow one to make an inference without extensive deliberation and/or reflective judgment, given that they are essentially schemas for such solutions (West, Toplak, & Stanovich, 2008). Though there are many interesting heuristics out there, the following list deals exclusively with cognitive biases. Furthermore, these are not the only cognitive biases out there (e.g., there’s also the halo effect and the just world phenomenon); rather, they are 12 common biases that, in my experience, affect how we make everyday decisions.
In addition to the explanation of the Dunning-Kruger Effect above, note that experts are often aware of what they don’t know and (hopefully) engage their intellectual honesty and humility in this fashion. In this sense, the more you know, the less confident you're likely to be—not out of lacking knowledge, but due to caution. On the other hand, if you know only a little about something, you see it simplistically—biasing you to believe that the concept is easier to comprehend than it may actually be.
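One way to get intuition for the pattern is a toy simulation of imperfect self-knowledge. The model below is purely illustrative (the blend weight, noise level, and uniform skill distribution are assumptions of mine, not Kruger and Dunning's method): each person's self-estimate mixes their true percentile with the population average, which is enough to make the bottom quartile overestimate and the top quartile underestimate.

```python
import random
import statistics

random.seed(7)

people = []
for _ in range(10_000):
    skill = random.uniform(0, 100)  # true percentile-like score
    # Self-estimate: part true skill, part "I'm probably about average,"
    # plus noise. The 0.4/0.6 split is an illustrative assumption.
    estimate = 0.4 * skill + 0.6 * 50 + random.gauss(0, 8)
    people.append((skill, estimate))

people.sort()  # sort by true skill
for q in range(4):
    group = people[q * 2_500:(q + 1) * 2_500]
    actual = statistics.mean(s for s, _ in group)
    perceived = statistics.mean(e for _, e in group)
    print(f"quartile {q + 1}: actual {actual:5.1f}, self-estimate {perceived:5.1f}")
```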
Just because I put the Dunning-Kruger Effect in the number one spot does not mean I consider it the most commonly engaged bias—it is an interesting effect, sure; but in my critical thinking classes, the confirmation bias is the one I constantly warn students about. We all favour ideas that confirm our existing beliefs and what we think we know. Likewise, when we conduct research, we all suffer from trying to find sources that justify what we believe about the subject. This bias brings to light the importance of, as I discussed in my previous post on “5 Tips for Critical Thinking,” playing devil’s advocate. That is, we must overcome confirmation bias and consider both sides (or, if there are more than two, all sides) of the story. Remember, we are cognitively lazy—we don’t like changing our knowledge (schema) structures and how we think about things.
Ever fail an exam because your teacher hates you? Ever go in the following week and ace the next one because you studied extra hard despite that teacher? Congratulations, you’ve engaged the self-serving bias. We attribute successes and positive outcomes to our doing, basking in our own glory when things go right; but, when we face failure and negative outcomes, we tend to attribute these events to other people or contextual factors outside ourselves.
Similar in ways to the availability heuristic (Tversky & Kahneman, 1974) and, to some extent, the false consensus effect, once you (truly) understand a new piece of information, that piece of information is now available to you and often becomes seemingly obvious. It might be easy to forget that there was ever a time you didn’t know this information and so, you assume that others, like yourself, also know this information: the curse of knowledge. However, it is often an unfair assumption that others share the same knowledge. The hindsight bias is similar to the curse of knowledge in that once we have information about an event, it then seems obvious that it was going to happen all along. I should have seen it coming!
As its name suggests, the optimism/pessimism bias means we have a tendency to overestimate the likelihood of positive outcomes, particularly if we are in good humour, and to overestimate the likelihood of negative outcomes if we are feeling down or have a pessimistic attitude. In either the case of optimism or pessimism, be aware that emotions can make thinking irrational. Remember one of my “5 Tips for Critical Thinking”: Leave emotion at the door.
Though labeled a fallacy, I see “sunk cost” as just as much in tune with bias as faulty thinking, given the manner in which we think in terms of winning, losing, and breaking even. For example, we generally believe that when we put something in, we should get something out—whether it’s effort, time, or money. With that, sometimes we lose… and that’s it—we get nothing in return. A sunk cost refers to something lost that cannot be recovered. Our aversion to losing (Kahneman, 2011) makes us irrationally cling to the idea of regaining even though it has already been lost (known in gambling as chasing the pot—when we make a bet and chase after it, perhaps making another bet to recoup the original [and hopefully more] even though, rationally, we should consider the initial bet as out-and-out lost). The appropriate advice of cutting your losses is applicable here.
Negativity bias is not totally separate from pessimism bias, but it is subtly and importantly distinct. In fact, it works according to similar mechanics as the sunk cost fallacy in that it reflects our profound aversion to losing. We like to win, but we hate to lose even more. So, when we make a decision, we generally think in terms of outcomes—either positive or negative. The bias comes into play when we irrationally weigh the potential for a negative outcome as more important than that of a positive outcome.
You may have heard the complaint that the internet will be the downfall of information dissemination; but, Socrates reportedly said the same thing about the written word. Declinism refers to a bias in favour of the past over and above “how things are going.” Similarly, you might know a member of an older generation who prefaces grievances with, “Well, back in my day” before following up with how things are supposedly getting worse. The decline bias may result from something I’ve mentioned repeatedly in my posts—we don’t like change. People like their worlds to make sense, they like things wrapped up in nice, neat little packages. Our world is easier to engage in when things make sense to us. When things change, so must the way in which we think about them; and because we are cognitively lazy (Kahneman, 2011; Simon, 1957), we try our best to avoid changing our thought processes.
The backfire effect refers to the strengthening of a belief even after it has been challenged. Cook and Lewandowsky (2011) explain it very well in the context of changing people’s minds in their Debunking Handbook. The backfire effect may work based on the same foundation as Declinism, in that we do not like change. It is also similar to negativity bias, in that we wish to avoid losing and other negative outcomes—in this case, one’s idea is being challenged or rejected (i.e., perceived as being made out to be “wrong”) and thus, they may hold on tighter to the idea than they had before. However, there are caveats to the backfire effect—for example, we also tend to abandon a belief if there’s enough evidence against it with regard to specific facts.
The fundamental attribution error is similar to the self-serving bias, in that we look for contextual excuses for our failures, but generally blame other people or their characteristics for their failures. It also may stem from the availability heuristic in that we make judgments based only on the information we have available at hand.
One of the best textbook examples of this integrates stereotyping: Imagine you are driving behind another car. The other driver is swerving a bit and unpredictably starts speeding up and slowing down. You decide to overtake them (so as to no longer be stuck behind such a dangerous driver) and as you look over, you see a female behind the wheel. The fundamental attribution error kicks in when you make the judgment that their driving is poor because they’re a woman (also tying into an unfounded stereotype). But what you probably don’t know is that the other driver has three children yelling and goofing around in the backseat, while she’s trying to get one to soccer, one to dance, and the other to a piano lesson. She’s had a particularly tough day and now she’s running late with all of the kids because she couldn’t leave work at the normal time. If we were that driver, we’d judge ourselves as driving poorly because of these reasons, not because of who we are. Tangentially, my wife is a much better driver than I am.
As we have seen through consideration of the self-serving bias and the fundamental attribution error, we have a tendency to be relatively kind when making judgments about ourselves. Simply, in-group bias refers to the unfair favouring of someone from one’s own group. You might think that you’re unbiased, impartial, and fair, but we all succumb to this bias, having evolved to be this way. That is, from an evolutionary perspective, this bias can be considered an advantage—favouring and protecting those similar to you, particularly with respect to kinship and the promotion of one’s own line.
As in the case of Declinism, to better understand the Forer effect (commonly known as the Barnum Effect), it’s helpful to acknowledge that people like their world to make sense. If it didn’t, we would have no pre-existing routine to fall back on and we’d have to think harder to contextualise new information. With that, if there are gaps in our thinking of how we understand things, we will try to fill those gaps in with what we intuitively think makes sense, subsequently reinforcing our existing schema(s). As our minds make such connections to consolidate our own personal understanding of the world, it is easy to see how people can tend to process vague information and interpret it in a manner that makes it seem personal and specific to them. Given our egocentric nature (along with our desire for nice, neat little packages and patterns), when we process vague information, we hold on to what we deem meaningful to us and discard what is not. Simply, we better process information we think is specifically tailored to us, regardless of ambiguity. Specifically, the Forer effect refers to the tendency for people to accept vague and general personality descriptions as uniquely applicable to themselves without realizing that the same description could be applied to just about everyone else (Forer, 1949). For example, when people read their horoscope, even vague, general information can seem like it’s advising something relevant and specific to them.
While heuristics are generally useful for making inferences by providing us with cognitive shortcuts that help us stave off decision fatigue, some forms of heuristics can make our judgments irrational. Though various cognitive biases were covered in this post, these are by no means the only biases out there—just the most commonly engaged, in my experience, with respect to everyday decision-making. If you’re interested in learning more about these and other cognitive biases, I recommend checking out yourbias.is. Remember, we make thousands of decisions every day, some more important than others. Make sure that the ones that do matter are not made based on bias, but rather on reflective judgment and critical thinking.
Cook, J. & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland. Retrieved from http://www.skepticalscience.com/docs/Debunking_Handbook.pdf
Dwyer, C.P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press; with foreword by former APA President, Dr. Diane F. Halpern.
Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43–52.
Forer, B. R. (1949). The fallacy of personal validation: A classroom demonstration of gullibility. Journal of Abnormal Psychology, 44, 118–121.
Kahneman, D. (2011). Thinking, fast and slow. Great Britain: Penguin.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77, 6, 1121–1134.
Simon, H. A. (1957). Models of man. New York: Wiley.
Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 4157, 1124–1131.
West, R. F., Toplak, M. E., & Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology, 100, 4, 930–941.
Christopher Dwyer, Ph.D., is a lecturer at the Technological University of the Shannon in Athlone, Ireland.
Diane F. Halpern
1 Department of Psychology, Claremont McKenna College, Emerita, Altadena, CA 91001, USA
2 Department of Psychology, Moravian College, Bethlehem, PA 18018, USA; dunn@moravian.edu
Most theories of intelligence do not directly address the question of whether people with high intelligence can successfully solve real world problems. A high IQ is correlated with many important outcomes (e.g., academic prominence, reduced crime), but it does not protect against cognitive biases, partisan thinking, reactance, or confirmation bias, among others. There are several newer theories that directly address the question about solving real-world problems. Prominent among them is Sternberg’s adaptive intelligence with “adaptation to the environment” as the central premise, a construct that does not exist on standardized IQ tests. Similarly, some scholars argue that standardized tests of intelligence are not measures of rational thought—the sort of skill/ability that would be needed to address complex real-world problems. Other investigators advocate for critical thinking as a model of intelligence specifically designed for addressing real-world problems. Yes, intelligence (i.e., critical thinking) can be enhanced and used for solving a real-world problem such as COVID-19, which we use as an example of contemporary problems that need a new approach.
The editors of this Special Issue asked authors to respond to a deceptively simple statement: “How Intelligence Can Be a Solution to Consequential World Problems.” This statement holds many complexities, including how intelligence is defined and which theories are designed to address real-world problems.
For the most part, we identify high intelligence as having a high score on a standardized test of intelligence. Like any test score, IQ can only reflect what is on the given test. Most contemporary standardized measures of intelligence include vocabulary, working memory, spatial skills, analogies, processing speed, and puzzle-like elements (e.g., Wechsler Adult Intelligence Scale Fourth Edition; see Drozdick et al. 2012). Measures of IQ correlate with many important outcomes, including academic performance (Kretzschmar et al. 2016), job-related skills (Hunter and Schmidt 1996), reduced likelihood of criminal behavior (Burhan et al. 2014), and for those with exceptionally high IQs, obtaining a doctorate and publishing scholarly articles (McCabe et al. 2020). Gottfredson (1997, p. 81) summarized these effects when she said the “predictive validity of g is ubiquitous.” More recent research using longitudinal data found that general mental abilities and specific abilities are good predictors of several work variables including job prestige and income (Lang and Kell 2020). Although assessments of IQ are useful in many contexts, having a high IQ does not protect against falling for common cognitive fallacies (e.g., blind spot bias, reactance, anecdotal reasoning), relying on biased and blatantly one-sided information sources, failing to consider information that does not conform to one’s preferred view of reality (confirmation bias), resisting pressure to think and act in a certain way, among others. This point was clearly articulated by Stanovich (2009, p. 3) when he stated that “IQ tests measure only a small set of the thinking abilities that people need.”
Most theories of intelligence do not directly address the question of whether people with high intelligence can successfully solve real world problems. For example, Grossmann et al. (2013) cite many studies in which IQ scores have not predicted well-being, including life satisfaction and longevity. Using a stratified random sample of Americans, these investigators found that wise reasoning is associated with life satisfaction, and that “there was no association between intelligence and well-being” (p. 944). (Critical thinking [CT] is often referred to as “wise reasoning” or “rational thinking.”) Similar results were reported by Wirthwein and Rost (2011), who compared life satisfaction in several domains for gifted adults and adults of average intelligence. There were no differences in any of the measures of subjective well-being, except for leisure, which was significantly lower for the gifted adults. Additional research in a series of experiments by Stanovich and West (2008) found that participants with high cognitive ability were as likely as others to endorse positions that are consistent with their biases, and they were equally likely to prefer one-sided arguments over those that provided a balanced argument. There are several newer theories that directly address the question about solving real-world problems. Prominent among them is Sternberg’s adaptive intelligence with “adaptation to the environment” as the central premise, a construct that does not exist on standardized IQ tests (e.g., Sternberg 2019). Similarly, Stanovich and West (2014) argue that standardized tests of intelligence are not measures of rational thought—the sort of skill/ability that would be needed to address complex real-world problems. Halpern and Butler (2020) advocate for CT as a useful model of intelligence for addressing real-world problems because it was designed for this purpose. Although there is much overlap among these more recent theories, often using different terms for similar concepts, we use Halpern and Butler’s conceptualization to make our point: Yes, intelligence (i.e., CT) can be enhanced and used for solving a real-world problem like COVID-19.
One definition of intelligence that directly addresses the question about intelligence and real-world problem solving comes from Nickerson (2020, p. 205): “the ability to learn, to reason well, to solve novel problems, and to deal effectively with the challenges—often unpredictable—that confront one in daily life.” Using this definition, the question of whether intelligent thinking can solve a world problem like the novel coronavirus is a resounding “yes” because solutions to real-world novel problems are part of his definition. This is a popular idea in the general public. For example, over 1000 business managers and hiring executives said that they want employees who can think critically based on the belief that CT skills will help them solve work-related problems (Hart Research Associates 2018).
We define CT as the use of those cognitive skills or strategies that increase the probability of a desirable outcome. It is used to describe thinking that is purposeful, reasoned, and goal directed: the kind of thinking involved in solving problems, formulating inferences, calculating likelihoods, and making decisions, when the thinker is using skills that are thoughtful and effective for the particular context and type of thinking task. International surveys conducted by the OECD (2019, p. 16) established “key information-processing competencies” that are “highly transferable, in that they are relevant to many social contexts and work situations; and ‘learnable’ and therefore subject to the influence of policy.” One of these skills is problem solving, which is one subset of CT skills.
The CT model of intelligence is comprised of two components: (1) understanding information at a deep, meaningful level and (2) appropriate use of CT skills. The underlying idea is that CT skills can be identified, taught, and learned, and when they are recognized and applied in novel settings, the individual is demonstrating intelligent thought. CT skills include judging the credibility of an information source, making cost–benefit calculations, recognizing regression to the mean, understanding the limits of extrapolation, muting reactance responses, using analogical reasoning, rating the strength of reasons that support and fail to support a conclusion, and recognizing hindsight bias or confirmation bias, among others. Critical thinkers use these skills appropriately, without prompting, and usually with conscious intent in a variety of settings.
One of the key concepts in this model is that CT skills transfer in appropriate situations. Thus, assessments using situational judgments are needed to assess whether particular skills have transferred to a novel situation where it is appropriate. In an assessment created by the first author (Halpern 2018), short paragraphs provide information about 20 different everyday scenarios (e.g., a speaker at the meeting of your local school board reported that when drug use rises, grades decline; so schools need to enforce a “war on drugs” to improve student grades); participants provide two response formats for every scenario: (a) constructed responses where they respond with short written responses, followed by (b) forced choice responses (e.g., multiple choice, rating or ranking of alternatives) for the same situations.
There is a large and growing empirical literature to support the assertion that CT skills can be learned and will transfer (when taught for transfer). See, for example, Holmes et al. (2015), who wrote in the prestigious Proceedings of the National Academy of Sciences that there was “significant and sustained improvement in students’ critical thinking behavior” (p. 11199) for students who received CT instruction. Abrami et al. (2015, para. 1) concluded from a meta-analysis that “there are effective strategies for teaching CT skills, both generic and content specific, and CT dispositions, at all educational levels and across all disciplinary areas.” Abrami et al. (2008, para. 1) included 341 effect sizes in a meta-analysis. They wrote: “findings make it clear that improvement in students’ CT skills and dispositions cannot be a matter of implicit expectation.” A strong test of whether CT skills can be used for real-world problems comes from research by Butler et al. (2017). Community adults and college students (N = 244) completed several scales including an assessment of CT, an intelligence test, and an inventory of real-life events. Both CT scores and intelligence scores predicted individual outcomes on the inventory of real-life events, but CT was a stronger predictor.
Heijltjes et al. (2015, p. 487) randomly assigned participants to either a CT instruction group or one of six other control conditions. They found that “only participants assigned to CT instruction improved their reasoning skills.” Similarly, when Halpern et al. (2012) used random assignment of participants to either a learning group where they were taught scientific reasoning skills using a game format or a control condition (which also used computerized learning and was similar in length), participants in the scientific skills learning group showed higher proportional learning gains than students who did not play the game. As the body of additional supportive research is too large to report here, interested readers can find additional lists of CT skills and support for the assertion that these skills can be learned and will transfer in Halpern and Dunn (Forthcoming). There is a clear need for more high-quality research on the application and transfer of CT and its relationship to IQ.
A pandemic occurs when a disease runs rampant over an entire country or even the world. Pandemics have occurred throughout history: at the time of writing this article, COVID-19 is a world-wide pandemic whose actual death rate is unknown but estimated with projections of several million over the course of 2021 and beyond (Mega 2020). Although vaccines are available, it will take some time to inoculate most or much of the world’s population. Since March 2020, national and international health agencies have created a list of actions that can slow and hopefully stop the spread of COVID (e.g., wearing face masks, practicing social distancing, avoiding group gatherings), yet many people in the United States and other countries have resisted their advice.
Could instruction in CT encourage more people to accept and comply with simple life-saving measures? There are many possible reasons to believe that by increasing citizens’ CT abilities, this problematic trend can be reversed for, at least, some unknown percentage of the population. We recognize the long history of social and cognitive research showing that changing attitudes and behaviors is difficult, and it would be unrealistic to expect that individuals with extreme beliefs supported by their social group and consistent with their political ideologies are likely to change. For example, an Iranian cleric and an orthodox rabbi both claimed (separately) that the COVID-19 vaccine can make people gay (Marr 2021). These unfounded opinions are based on deeply held prejudicial beliefs that we expect to be resistant to CT. We are targeting those individuals whose beliefs are less extreme and may be based on reasonable reservations, such as concern about the hasty development of the vaccine and the lack of long-term data on its effects. There should be some unknown proportion of individuals who can change their COVID-19-related beliefs and actions with appropriate instruction in CT. CT can be a (partial) antidote for the chaos of the modern world with armies of bots creating content on social media, political and other forces deliberately attempting to confuse issues, and almost all media labeled “fake news” by social influencers (i.e., people with followers that sometimes run to millions on various social media). Here are some CT skills that could be helpful in getting more people to think more critically about pandemic-related issues.
Early communications from national health agencies about the ability of masks to prevent the spread of COVID were not consistent. In many regions of the world, the benefits of wearing masks incited prolonged and acrimonious debates (Tang 2020). After the initial confusion, however, virtually all of the global and national health organizations (e.g., WHO, the National Health Service in the U.K., the U.S. Centers for Disease Control and Prevention) endorsed masks as a way to slow the spread of COVID (Cheng et al. 2020; Chu et al. 2020). Yet, as we know, some people do not trust governmental agencies and often cite the conflicting information that was originally given as a reason for not wearing a mask. There are varied reasons for refusing to wear a mask, but the one most often cited is that mandates violate civil liberties (Smith 2020). Reasoning by analogy is an appropriate CT skill for evaluating this belief (and a key skill in legal thinking). It might be useful to cite some of the many laws that already regulate our behavior, such as requiring health inspections for restaurants, setting speed limits, mandating seat belts when riding in a car, and establishing the age at which someone can consume alcohol. Individuals would be asked to consider how the mandate to wear a mask compares to these and other regulatory laws.
Another reason why some people resist the measures suggested by virtually every health agency concerns questions about whom to believe. Could training in CT change the beliefs and actions of even a small percentage of those opposed to wearing masks? Such training would include considering the following questions, with practice across a wide domain of knowledge: (a) Does the source have sufficient expertise? (b) Is the expertise recent and relevant? (c) Is there a potential for gain by the information source, such as financial gain? (d) What would the ideal information source be, and how close is the current source to the ideal? (e) Does the information source offer evidence that what they are recommending is likely to be correct? (f) Have you traced URLs to determine if the information in front of you really came from the alleged source? Of course, not everyone will respond in the same way to each question, so there is little likelihood that we would all think alike, but these questions provide a framework for evaluating credibility. Donovan et al. (2015) were successful using a similar approach to improve dynamic decision-making by asking participants to reflect on questions that relate to the decision. Imagine the effect of rigorous large-scale education in CT from elementary through secondary schools, as well as at the university level. As stated above, empirical evidence has shown that people can become better thinkers with appropriate instruction in CT. With training, could we encourage some portion of the population to become more astute at judging the credibility of a source of information? It is an experiment worth trying.
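To make this framework concrete, here is a minimal sketch of how the six questions above could be tallied in code. The condensed question wording, function name, and pass/fail scoring are our own illustrative assumptions, not part of any published CT curriculum:

```python
# Illustrative sketch only: the credibility questions from the paragraph
# above, condensed into a simple yes/no tally. The scoring is hypothetical.

CREDIBILITY_QUESTIONS = [
    "Does the source have sufficient expertise?",
    "Is the expertise recent and relevant?",
    "Is the source free of potential gain (e.g., financial gain)?",
    "Is the source close to the ideal information source?",
    "Does the source offer evidence that its recommendation is correct?",
    "Do the URLs trace back to the alleged source?",
]

def credibility_tally(answers: list[bool]) -> str:
    """Summarize yes/no answers: a framework for reflection, not a verdict."""
    passed = sum(answers)
    return f"{passed}/{len(answers)} credibility checks passed"

# Example: a source with expertise and evidence, but a financial stake.
print(credibility_tally([True, True, False, True, True, True]))  # 5/6 credibility checks passed
```

As the paragraph notes, readers will weigh these questions differently; the point of a tally like this is to prompt reflection, not to automate judgment.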
Historical records show that refusal to wear a mask during a pandemic is not a new reaction. The influenza pandemic of 1918 also included mask mandates, which drew public backlash. Then, as now, many people refused, even when they were told that wearing a mask was a symbol of “wartime patriotism” because the 1918 pandemic occurred during World War I (Lovelace 2020). CT instruction would include why and how to compute cost–benefit analyses. Estimates of “lives saved” by wearing a mask can be made meaningful with graphical displays that allow more people to understand large numbers. Gigerenzer (2020) found that people can understand risk ratios in medicine when the numbers are presented as frequencies instead of probabilities. If this format were used when presenting the likelihood of illness and death from COVID-19, could we increase the number of people who understand the severity of this disease? Small-scale studies by Gigerenzer have shown that it is possible.
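Gigerenzer’s format effect is easy to demonstrate. The sketch below, our own illustration rather than anything from his studies, states the same hypothetical risk as a percentage and as a natural-frequency statement; the 0.4% figure and the helper name are invented for the example:

```python
# Illustrative sketch: the same risk, stated as a probability and as a
# Gigerenzer-style natural frequency. The risk figure is hypothetical.

def natural_frequency(prob: float, reference: int = 10_000) -> str:
    """Express a probability as 'X out of N people' rather than a percentage."""
    count = round(prob * reference)
    return f"{count} out of {reference:,} people"

risk = 0.004  # a made-up 0.4% risk, used only for illustration

print(f"As a probability: {risk:.1%}")               # As a probability: 0.4%
print(f"As a frequency: {natural_frequency(risk)}")  # As a frequency: 40 out of 10,000 people
```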
The process of analyzing arguments requires that individuals rate the strength of support for and against a conclusion. By engaging in this practice, they must consider evidence and reasoning that may run counter to a preferred outcome. Kozyreva et al. (2020) call the deliberate failure to consider both supporting and conflicting data “deliberate ignorance”: avoiding or failing to consider information that could be useful in decision-making because it may collide with an existing belief. Applied to COVID-19, people would have to weigh the evidence for and against wearing a face mask as a way to slow the spread of this disease and, if they conclude that masks are not effective, consider the costs and benefits of refusing to wear one at a time when governmental health organizations are making masks mandatory in public spaces. Again, we wonder whether rigorous and systematic instruction in argument analysis would result in more positive attitudes and behaviors relating to wearing a mask and other real-world problems. We believe that it is an experiment worth doing.
We believe that teaching CT is a worthwhile approach for educating the general public in order to improve reasoning and motivate actions to address, avert, or ameliorate real-world problems like the COVID-19 pandemic. Evidence suggests that CT can guide intelligent responses to societal and global problems. We are NOT claiming that CT skills will be a universal solution for the many real-world problems that we confront in contemporary society, or that everyone will substitute CT for other decision-making practices, but we do believe that systematic education in CT can help many people become better thinkers, and we believe that this is an important step toward creating a society that values and practices routine CT. The challenges are great, but the tools to tackle them are available, if we are willing to use them.
Critical thinking is the ability to effectively analyze information and form a judgment.
To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources.
Critical thinking skills help you to identify credible sources and to evaluate and respond to arguments.
Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.
Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.
In academic writing, critical thinking can help you to determine whether a source is credible, relevant, and appropriate to cite.
Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.
Critical thinking can help you to identify reliable sources of information that you can cite in your research paper . It can also guide your own research methods and inform your own arguments.
Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.
Example: Good critical thinking in an academic context
You’re researching a new medical treatment and read a study whose results seem surprisingly positive. However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites. You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context
You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Example: Good critical thinking in a nonacademic context
You’re considering buying a home alarm system and read a review article that rates one model highly. However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason. You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context
You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.
There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.
However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.
When encountering information, ask yourself who produced it, why they produced it, and what evidence supports it. Critical thinking also involves being aware of your own biases, not only those of others; when you make an argument or draw your own conclusions, you can ask similar questions about your own writing.
Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.
Like information literacy, it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.
Critical thinking skills include the ability to evaluate sources, identify biases, analyze arguments, and form well-reasoned judgments.
You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.
A credible source should pass the CRAAP test: it should be current, relevant to your topic, written by an authority on the subject, accurate, and produced for a legitimate purpose.
Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively. Being information literate means that you can recognize when information is needed and can locate, evaluate, and use it effectively.
Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. We tend to recall information best when it amplifies what we already believe and, relatedly, to forget information that contradicts our opinions.
Although selective recall is a component of confirmation bias, it should not be confused with recall bias.
On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.
Our poor decision making can often be traced to heuristics and biases. In general, heuristics and biases describe a set of decision-making strategies and the way that we weigh certain types of information. The existing literature on cognitive biases and heuristics is extensive, but this post offers a user-friendly summary.
Central to this post’s topic is how cognitive heuristics and biases influence our decision making. We will also learn more about how to overcome them.
What are cognitive biases?
When considering the term ‘cognitive biases,’ it’s important to note that there is overlap between cognitive biases and heuristics. Sometimes these two terms are used interchangeably, as though they are synonyms; however, their relationship is nuanced.
In his book Thinking, Fast and Slow, Professor Daniel Kahneman (2011, p. 98) defines heuristics as
“a simple procedure that helps find adequate, though often imperfect, answers to difficult questions.”
Tversky and Kahneman (1974, p. 1130) define the relationship between biases and heuristics as follows:
“… cognitive biases that stem from the reliance on judgmental heuristics.”
Gonzalez (2017, p. 251) also described the difference between the two terms:
“Heuristics are the ‘shortcuts’ that humans use to reduce task complexity in judgment and choice, and biases are the resulting gaps between normative behavior and the heuristically determined behavior.”
Created by John Manoogian III and Buster Benson, the Cognitive Bias Codex is a useful tool for visually representing all of the known biases that exist to date.
The biases are arranged in a circle and can be divided into four quadrants, each dedicated to a specific group of cognitive biases: biases that arise from too much information, from not enough meaning, from the need to act fast, and from limits on what we remember.
The Cognitive Bias Codex is a handy visual tool that organizes biases in a meaningful way; however, it is worth pointing out that the codex lists heuristics and biases both as ‘biases.’
If you decide to rely on the Cognitive Bias Codex, then keep in mind the distinction between heuristics and biases mentioned above.
Confirmation bias. This bias is based on looking for or overvaluing information that confirms our beliefs or expectations (Edgar & Edgar, 2016; Nickerson, 1998). For example, a police officer who is looking for physical signs of lying might mistakenly classify other behaviors as evidence of lying.
Gambler’s fallacy. This false belief describes our tendency to believe that something will happen because it hasn’t happened yet (Ayton & Fischer, 2004; Clotfelter & Cook, 1993).
For example, when betting on a roulette table, if previous outcomes have landed on red, then we might mistakenly assume that the next outcome will be black; however, these events are independent of each other (i.e., the outcome of one spin does not affect the probabilities of the next).
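Because the claim of independence is easy to check, here is a small simulation, our own sketch rather than anything from the cited studies, showing that on a fair red/black wheel the chance of black is the same whether or not the three previous spins were red:

```python
import random

# Illustrative sketch: a fair wheel has no memory, so a run of reds
# does not change the probability that the next spin lands on black.
random.seed(1)
TRIALS = 200_000

history = []
blacks_overall = 0
spins_after_three_reds = 0
blacks_after_three_reds = 0

for _ in range(TRIALS):
    spin = random.choice(["red", "black"])
    if spin == "black":
        blacks_overall += 1
    if history[-3:] == ["red", "red", "red"]:
        spins_after_three_reds += 1
        if spin == "black":
            blacks_after_three_reds += 1
    history.append(spin)

print(f"P(black) overall:          {blacks_overall / TRIALS:.3f}")
print(f"P(black after three reds): {blacks_after_three_reds / spins_after_three_reds:.3f}")
# Both print roughly 0.500: the previous spins tell us nothing.
```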
Gender bias describes our tendency to assign specific behavior and characteristics to a particular gender without supporting evidence (Garb, 1997).
For example, complaints of pain are taken more seriously when made by male, rather than female, patients (Gawande, 2014); women are perceived as better caregivers than men (Anthony, 2004); specific clinical syndromes are more readily diagnosed in women than in men (Garb, 1997); and students often rate female lecturers lower than male lecturers (MacNell, Driscoll, & Hunt; 2014; Mitchell & Martin, 2018).
Group attribution error. This error describes our tendency to overgeneralize how a group of people will behave based on an interaction with only one person from that group (Pettigrew, 1979).
For example, a negative experience with someone from a different group (e.g., a different culture, gender, religion, political party, etc.) might make us say that all members of that group share the same negative characteristics. Group attribution error forms part of the explanation for prejudice in social psychology.
Gender bias in the workplace is a well-documented and researched area of cognitive bias. Women often do not occupy top senior positions. For example, in 2010, only 15.2% of top positions in US Fortune-500 companies were held by women (Soares, 2010). Women tend to earn less than their male counterparts, and women’s salaries differ according to their marital status.
For example, consider these statistics reported by Güngör and Biernat (2009, p. 232):
“ [In 2005] … 68.1% of married and 79.8% of single mothers in the U.S. participate in the workforce, but while non-mothers earn 90 cents to a man’s dollar, mothers earn 73 cents, and single mothers earn about 60 cents.”
The social desirability bias is a concern for anyone who uses self-report data. Companies that run internal surveys investigating topics that may cast an employee in a poor light must be aware of how the social desirability bias will affect the validity of their data.
Knowing that people adjust their answers to appear more socially desirable, investigators (such as researchers and clinicians) can try to reframe their questions to be less direct, use formal tests, or anonymize responses.
Another sphere of our lives where biases can have devastating effects is personal finance. According to Hershey, Jacobs-Lawson, and Austin (2012), there are at least 40 cognitive biases that negatively affect our ability to make sound financial decisions, thus hindering our ability to plan properly for retirement.
Below, you will find revealing insight into how biases affect our decision making.
Assume that there are three doors. Behind one door is a car; behind each of the other two is a mediocre prize.
You initially choose Door 1. Before revealing what’s behind your chosen door, the presenter opens a different door, Door 2, to reveal a mediocre prize. The presenter then gives you the option to either keep what’s behind your initially chosen door or change your choice, knowing what’s behind Door 2. What should you do now? Should you stay with your initial choice, Door 1, or should you switch to Door 3?
The correct answer is that you have the best chance of winning the car if you change your choice. This is called the Monty Hall problem. Here’s why you should switch: your initial pick has only a 1/3 chance of hiding the car, so there is a 2/3 chance the car is behind one of the other two doors. Because the presenter will always open a door hiding a mediocre prize, that entire 2/3 probability now rests on the one remaining unopened door.
Despite the statistics being in favor of switching, most people are hesitant to abandon their first choice and don’t accept the offer to change it.
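For readers who find the statistics hard to accept, a short simulation makes them concrete. This is an illustrative sketch of the game described above; the door labels, random seed, and trial count are arbitrary:

```python
import random

# Illustrative sketch of the Monty Hall game described above.

def play(switch: bool) -> bool:
    """Play one round; return True if the contestant wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    choice = random.choice(doors)
    # The presenter opens a door that hides a mediocre prize and
    # is not the contestant's current pick.
    opened = random.choice([d for d in doors if d != choice and d != car])
    if switch:
        choice = next(d for d in doors if d != choice and d != opened)
    return choice == car

random.seed(42)
N = 100_000
print(f"Win rate when staying:   {sum(play(False) for _ in range(N)) / N:.3f}")  # about 0.333
print(f"Win rate when switching: {sum(play(True) for _ in range(N)) / N:.3f}")   # about 0.667
```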
The Monty Hall problem is an excellent example of how our intuitions and heuristics lead us to make poor decisions. However, there are lots of other cognitive biases and heuristics that also affect our decision making.
Kahneman, Slovic, and Tversky (1982) list 13 biases that arise from three heuristics: representativeness, availability, and adjustment from an anchor.
To further illustrate the effect of cognitive bias, below are two popular experiments.
In the first experiment, Tversky and Kahneman (1974) found that our estimates are heavily influenced by the first number given to us. For example, participants were asked to estimate the percentage of African countries in the United Nations.
Before giving their answer, each participant had to spin a ‘Wheel of Fortune,’ which determined their initial starting percentage. The result of the wheel spin was random and meaningless. Despite this, participants’ estimates of the percentage of African UN member countries were pulled toward whatever random number they landed on: higher anchors produced higher estimates.
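One standard account of this result is anchoring and adjustment: people start from the anchor and adjust toward their own estimate, but not far enough. The sketch below is our own illustrative model of that account, not Tversky and Kahneman’s data; the adjustment rate and the figures are invented:

```python
# Illustrative model sketch of "anchor and adjust" (figures invented).

def anchored_estimate(anchor: float, unanchored: float, adjustment: float = 0.5) -> float:
    """Adjustment < 1.0 means adjustment is insufficient, so the anchor leaks through."""
    return anchor + adjustment * (unanchored - anchor)

# Two hypothetical participants who would both answer 30% with no anchor:
print(anchored_estimate(anchor=10, unanchored=30))  # 20.0, dragged down by a low anchor
print(anchored_estimate(anchor=65, unanchored=30))  # 47.5, dragged up by a high anchor
```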
In the second experiment, male students were asked to rate essays written by female authors (Landy & Sigall, 1974). The quality of the essays varied: some were poorly written, and others were well written.
Additionally, some of the essays were accompanied by a photograph of the author (who was either attractive or unattractive), and others were not. The male college students rated the quality of the essay and the talent of the author higher when the accompanying photograph showed an attractive author, and this effect was strongest for the poorly written essays.
In this study, the male students demonstrated the halo effect, applying the perceived attractiveness of the female author to the quality of the paper.
If you’ve been in a similar situation before, you can reflect on the outcomes of those previous decisions to learn how to overcome your biases.
An example of this is budgeting. We tend to underestimate how much money we need to budget for certain areas of our life. However, you can learn how much money to budget by tracking your expenditure for the last few months. Using this information from the past, you can better predict how much money you’ll need for different financial categories in the future.
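As a minimal sketch of that budgeting idea, the snippet below (with made-up monthly figures) predicts next month’s budget per category from the average of the last few months:

```python
# Minimal sketch: predict next month's budget per category from the
# average of tracked past spending. All figures are made up.

past_spending = {
    "groceries": [420, 480, 450],
    "transport": [120, 90, 150],
    "eating out": [200, 260, 230],
}

next_month_budget = {
    category: sum(months) / len(months)
    for category, months in past_spending.items()
}

for category, amount in next_month_budget.items():
    print(f"{category}: budget about {amount:.0f} next month")
```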
There is some evidence that we make better decisions and negotiations when we consult with other people who are objective, such as mediators and facilitators (Caputo, 2016).
Therefore, before making a decision, talk to other people to consider different viewpoints and have your own views challenged. Importantly, other people might spot your own cognitive biases.
When making a decision, try to see the weaknesses in your thinking regardless of how small, unlikely, or inconsequential these weaknesses might seem. You can be more confident in your decision if it withstands serious, critical scrutiny.
A final way to protect yourself from relying on your cognitive biases is to avoid making decisions under time pressure. Although it might not feel like it, there are very few instances when you need to make a decision immediately. When a decision has substantial consequences, give yourself time to step back, gather more information, and revisit your reasoning before committing.
In the last decade, research has looked at cognitive bias modification (CBM) since cognitive biases are associated with the severity of anxiety and depression. The relationship between cognitive biases and anxiety and depression is assumed to be causal; that is, cognitive biases cause an increase in the severity of symptoms.
CBM exercises are designed with this causal relationship in mind. If the cognitive bias is removed or reduced, then the severity of the symptoms should also lessen.
There are two categories of CBM exercises: those that retrain attention (attention bias modification) and those that retrain interpretation (interpretation bias modification).
At least six meta-analyses have examined this question, and they report conflicting findings (Beard, Sawyer, & Hofmann, 2012; Cristea, Kok, & Cuijpers, 2015; Hakamata et al., 2010; Hallion & Ruscio, 2011; Heeren, Mogoaşe, Philippot, & McNally, 2015; Mogoaşe, David, & Koster, 2014).
There are many reasons for these differences; for example, the types of studies included, the moderators included, the definition of the interventions, the outcome variable used, the clinical condition studied, and so forth. Therefore, the jury is still out on whether CBM affects symptom severity reliably.
There are many cognitive bias modification apps available for download. Before purchasing one, check whether its creator followed sound research principles, or did any research at all, when developing the app (Zhang, Ying, Song, Fung, & Smith, 2018).
Most bias modification apps aim to change attentional bias. For example, some train users to respond more quickly to happy faces than to sad or angry faces, on the hypothesis that repeated use will result in more positive moods.
The Cognitive Bias Cheatsheet is a useful way to remind oneself of the different cognitive biases that exist.
Here is a list of books relevant for anyone interested in cognitive biases.
Firstly, any list about biases would be remiss without Thinking, Fast and Slow by Daniel Kahneman (2011). In this book, Kahneman unpacks some of the most common biases that we experience when making decisions. (Available on Amazon )
In the same vein is The Drunkard’s Walk: How Randomness Rules Our Lives by Leonard Mlodinow (2009). This book addresses how humans misjudge the effect that randomness has on our decision making. (Available on Amazon )
Predictably Irrational by Dan Ariely (2008) is an excellent and very accessible book about how our behavior is often governed by seemingly random and illogical thought processes. The opening chapter is jaw dropping. (Available on Amazon )
Nassim Nicholas Taleb published a series of books – five, in fact – and I include two of them on this list: Fooled by Randomness (2005) and The Black Swan (2007). The entire series discusses various aspects of uncertainty. (Available on Amazon )
We’ve put together a list of our favorite TED talks on cognitive biases. If you want to learn more about cognitive biases, these talks are a great jumping-off point; one example is ‘Confirmation Bias’ by Nassor Al Hilal.
We have useful resources that you can use when tackling cognitive biases.
First, increasing awareness of Unhelpful Thinking Styles can change the way you think about yourself and your environment. Ultimately, users will increase their awareness of their cognitive biases, and through this awareness, be able to change their behavior.
Our Neutralizing Judgmental Thoughts worksheet is also useful for combating negative thoughts and biases. This exercise helps users apply the CLEAR acronym to adopt a less critical outlook when dealing with others.
The Core Beliefs Worksheet is a useful tool for reflecting on the origin and validity of our core beliefs. This technique might help us ‘step away’ from our biases.
An approach that is always beneficial is to understand and find ways to apply positive psychology to your everyday life, and this selection of positive psychology TED Talks is a good starting point.
We often rely on cognitive heuristics and biases when making decisions.
Heuristics can be useful in certain circumstances; however, heuristics and biases can result in poor decision making and reinforce unhealthy behavior.
There are many different types of cognitive biases, and all of us fall victim to one or more of them.
However, being aware of our biases and how they affect our behavior is the first step toward resisting them.