By Charlotte Ruhl, published May 04, 2021
Key takeaways
- Cognitive biases are unconscious errors in thinking that arise from problems related to memory, attention, and other mental limitations.
- These biases are the result of our brain's efforts to simplify the incredibly complex world in which we live.
- Confirmation bias, hindsight bias, self-serving bias, anchoring bias, the availability heuristic, the framing effect, and inattentional blindness are some of the most common examples of cognitive bias. Another example is the false consensus effect.
- Cognitive biases have a direct impact on our safety, our interactions with others, and the way we make judgments and decisions in our daily lives.
- Although these biases are unconscious, there are small steps we can take to train our minds to adopt a new way of thinking and lessen the effects of these biases.
Have you ever been so busy talking on the phone that you don't realize the light has turned green and it's your turn to cross the street?
Have you ever yelled, "I knew this was going to happen!" after your favorite baseball team gave up a huge lead and lost in the ninth inning?
Or have you ever noticed that you only read news that supports your own opinion?
These are just a few of the many cases of cognitive bias that we experience every day of our lives. But before we delve into these different biases, let's back up a bit and define what bias really is.
Table of Contents
- Definition
- The history of cognitive bias
- Examples
- Preventing bias
- References
What is cognitive bias?
A cognitive bias is an unconscious error in thinking that causes you to misinterpret information from the world around you and impairs the rationality and accuracy of your decisions and judgments. Biases are unconscious, automatic processes intended to make decision making faster and more efficient. Cognitive biases can be caused by a number of different things, such as heuristics (mental shortcuts), social pressures, and emotions.
Generally speaking, bias is a tendency to lean for or against a person, group, idea, or cause, often unfairly. Bias is natural, it's a product of human nature, and it doesn't just exist in a vacuum or in our heads, it affects the way we make decisions and act.
There are two main branches of bias in psychology: conscious and unconscious. Conscious bias or explicit bias is intentional: you are aware of your attitudes and the resulting behaviors (Lang, 2019).
Explicit bias can be good because it helps give you a sense of identity and can lead you to make good choices (for example, bias towards healthy foods).
However, explicit biases can often be dangerous when they take the form of conscious stereotyping.
Unconscious bias, or cognitive bias, by contrast, represents a set of unintentional biases: you are not aware of your attitudes or the behaviors that result from them (Lang, 2019).
Cognitive bias is often the result of your brain trying to simplify information processing: we receive about 11 million bits of information per second, but we can only process about 40 bits of information per second (Orzan et al., 2012).
As a result, we often rely on mental shortcuts (called heuristics) to make sense of the world relatively quickly. These errors therefore tend to arise from problems related to memory, attention, and other mental processes.
Cognitive bias can be beneficial because it doesn't require much mental effort and allows you to make decisions relatively quickly, but like conscious bias, unconscious bias can also take the form of a harmful prejudice that hurts a person or a group.
And while it may seem that unconscious bias has increased recently, particularly in the context of police brutality and the Black Lives Matter movement, this is not a new phenomenon.
The history of cognitive bias: a crash course
The term cognitive bias was first coined in the 1970s by Israeli psychologists Amos Tversky and Daniel Kahneman, who used it to describe people's faulty thought patterns in response to judgment and decision-making problems (Tversky & Kahneman, 1974).
Tversky and Kahneman's research program, the heuristics and biases program, examined how people make judgments when they have limited resources.
Due to these limited resources, people are forced to rely on heuristics or quick mental shortcuts to make their decisions.
Tversky and Kahneman wanted to understand the specific biases involved in this evaluation and decision-making process.
To do this, the two researchers relied on a research paradigm that presented participants with a type of reasoning problem that had a calculated normative response (they used probability theory and statistics to calculate the expected response).
Then the actual responses of the participants were compared to the solution given to reveal the biases that occurred in the mind.
After conducting multiple experiments with myriad thinking problems, researchers were able to identify numerous norm violations that arise when our minds rely on these cognitive biases to make decisions and make judgments (Wilke & Mata, 2012).
Examples
Thanks to Tversky and Kahneman (and the several other psychologists who paved the way), we now have a working catalog of our cognitive biases. These biases all arise from the brain's attempt to simplify a complex world and make information processing faster and easier. This section discusses some of the most common forms of cognitive bias.
Confirmation bias
Confirmation bias refers to the tendency to interpret new information as confirmation of your pre-existing beliefs and opinions.
Practical examples
Since Wason's 1960 experiment, real-world examples of confirmation bias have drawn attention.
This form of bias often seeps into the research world itself, as psychologists selectively interpret data or ignore unfavorable data to arrive at results that support their original hypothesis.
Confirmation bias is also incredibly prevalent online, especially on social media. We tend to read news articles online that support our beliefs and don't seek out sources that challenge them.
Various social media platforms like Facebook help amplify our confirmation bias by providing us with stories we are likely to agree with, further pushing us into these echo chambers of political polarization.
Some examples of confirmation bias are particularly pernicious, especially in the context of law. For example, a detective may identify a suspect early in an investigation and then seek out corroborating evidence while downplaying disconfirming evidence.
Try it
Research on confirmation bias dates back to 1960, when Peter Wason asked participants to identify a rule that applied to triples of numbers.
Participants were first told that the sequence 2, 4, 6 fit the rule; they then generated their own triples and were told whether each one fit. The rule was simple: any ascending sequence.
Not only did participants have an unusually difficult time discovering this rule, developing overly complicated hypotheses instead, but they also generated only triples that confirmed their pre-existing hypothesis (Wason, 1960).
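Wason's task is easy to simulate. The sketch below (a minimal illustration of the logic, not Wason's actual materials) shows why purely confirmatory testing fails: every triple generated from an overly narrow hypothesis such as "ascending by 2" also satisfies the true rule, so only a triple that violates the hypothesis can reveal the difference.

```python
def true_rule(triple):
    """Wason's actual rule: any strictly ascending sequence."""
    a, b, c = triple
    return a < b < c

def narrow_hypothesis(triple):
    """A typical participant hypothesis: numbers ascending by 2."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirmatory tests: triples chosen to FIT the narrow hypothesis.
# Both rules answer "yes" to each, so these trials teach the tester nothing.
confirming = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
for t in confirming:
    print(t, narrow_hypothesis(t), true_rule(t))

# A disconfirming test: a triple that VIOLATES the narrow hypothesis.
# The true rule still answers "yes" -- evidence the hypothesis is too narrow.
probe = (1, 2, 3)
print(probe, narrow_hypothesis(probe), true_rule(probe))
```

The disconfirming probe is the informative one: a single "yes" on a triple the hypothesis rejects is enough to show the hypothesis is wrong, while any number of confirming triples can never establish it.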
Explanations
But why does confirmation bias occur? In part, it is due to the effect of desire on our beliefs. In other words, desirable conclusions (those that support our beliefs) are more likely to be processed by the brain and labeled as true (Nickerson, 1998).
This motivational explanation is often combined with a more cognitive theory.
The cognitive explanation argues that since our minds can only focus on one thing at a time, it is difficult to process alternative hypotheses in parallel (see Information Processing for more information). Therefore, we only process information that is consistent with our beliefs (Nickerson, 1998).
Another theory explains the confirmation bias as a way to increase and protect our self-esteem.
As with the self-serving bias (see below), our minds choose to reinforce our pre-existing ideas because being right helps maintain our sense of identity, which is important for feeling safe in the world and maintaining positive relationships (Casad, 2019).
Although confirmation bias has obvious consequences, you can still work to overcome it if you are open-minded and willing to view situations from a different perspective than you might be used to (Luippold et al., 2015).
Even if this bias is unconscious, training your mind to be more flexible in its thought patterns will help mitigate the effects of this bias.
Hindsight bias
Hindsight bias refers to the tendency to perceive past events as more predictable than they really were (Roese & Vohs, 2012). There are both cognitive and motivational explanations for why we place such certainty in knowing the outcome of an event only after the event is complete.
Practical examples
When sports fans know the outcome of a game, they often question certain decisions made by coaches that they otherwise would not have questioned or guessed at.
Fans are also quick to claim that they knew their team would win or lose, but of course they only make that claim after their team has actually won or lost.
Although research has shown that hindsight bias is not necessarily mitigated simply by acknowledging it (Pohl & Hell, 1996), you can still make a conscious effort to remind yourself that you cannot predict the future and to consider alternative explanations.
It is important to do everything in our power to reduce this bias, because overconfidence in our ability to predict outcomes can lead us to make risky decisions with potentially dangerous results.
Try it
Building on Tversky and Kahneman's growing list of heuristics, researchers Baruch Fischhoff and Ruth Beyth (1975) were the first to study hindsight bias directly in an empirical setting.
The team asked participants to rate the probability of different outcomes from former US President Richard Nixon's visit to Beijing and Moscow.
After Nixon returned to the United States, participants were asked to recall the probability of each outcome that they originally assigned.
Fischhoff and Beyth found that when events actually occurred, participants vastly overestimated the initial probability they assigned to those events.
In the same year, Fischhoff (1975) introduced a new method for testing for hindsight bias, which is still used by researchers today.
Participants are given a short story with four possible outcomes and told that one of them is true. Then, when asked to assign the probability of each particular outcome, they routinely assign a higher probability to the outcome they were told to be true, regardless of how likely it actually is.
But hindsight bias is not exclusive to artificial settings. In 1993, Dorothee Dietrich and Matthew Olson asked college students to predict how the United States Senate would vote on Supreme Court nominee Clarence Thomas.
Before the vote, 58% of participants predicted he would be confirmed, but after he was confirmed, 78% of students said they had thought he would be confirmed, a prime example of hindsight bias. And this form of bias clearly extends beyond the world of research.
Explanations
From a cognitive perspective, hindsight bias can result from distortions in memories of what we knew or thought we knew before an event occurred (Inman, 2016).
It is easier to remember information that is consistent with our current knowledge, so our memories are distorted in a way that is consistent with what actually happened.
Motivational explanations of hindsight bias indicate that we are motivated to live in a predictable world (Inman, 2016).
When surprising results occur, our expectations are violated, and we may experience negative reactions as a result. We therefore rely on hindsight bias to avoid these negative reactions to unforeseen events and to assure ourselves that we really did know what was going to happen.
Self-serving bias
Self-serving bias refers to the tendency to take personal responsibility for positive outcomes and to blame external factors for negative outcomes.
You might rightly wonder how this differs from the fundamental attribution error (Ross, 1977), which identifies our tendency to overemphasize internal factors when explaining other people's behavior while attributing our own behavior to external factors.
The difference is that the self-serving bias has to do with valence, that is, how good or bad an event or situation is, and it applies only to events in which you are the actor.
In other words, if a driver cuts you off when the light turns green, the fundamental attribution error might lead you to think they are a bad person, without considering the possibility that they are late for work.
The self-serving bias, on the other hand, applies when you are the actor. In this example, you would be the driver cutting off the other car, which you attribute to being late (an external attribution for a negative event) rather than to being a bad person.
Practical examples
From sports to the workplace, self-serving bias is incredibly common. For example, athletes are quick to take responsibility for personal victories and attribute their successes to their hard work and mental strength, but when they lose they blame external factors such as unfair officiating or inclement weather (Allen et al., 2020).
In the workplace, people cite internal factors when they are hired for a job but external factors when they are fired (Furnham, 1982). And in the office itself, workplace conflicts are given external attributions, while achievements, be it a convincing presentation or a promotion, receive internal explanations (Walther & Bazarova, 2007).
Furthermore, the self-serving bias is more common in individualistic cultures, which emphasize self-esteem and individual goals, and is less common among people with depression (Mezulis et al., 2004), who are more likely to take responsibility for negative outcomes.
Overcoming this bias can be difficult because it comes at the expense of our self-esteem. Still, practicing self-compassion (treating yourself kindly even when you fall short or fail) can help reduce the self-serving bias (Neff, 2003).
explanations
The main explanation for the self-serving bias is that it protects our self-esteem (similar to one of the explanations for confirmation bias).
We readily claim credit for positive outcomes and blame external causes for negative ones in order to strengthen and preserve our individual egos, which we need for confidence and healthy relationships with others (Heider, 1982).
Another theory holds that the self-serving bias appears when events surprise us: when outcomes fall short of our expectations, we make external attributions, but when outcomes meet our expectations, we make internal attributions (Miller & Ross, 1975).
An extension of this theory is that we are inherently optimistic, so negative outcomes come as a surprise and receive external attributions as a result.
Anchoring bias
Anchoring bias is closely tied to the decision-making process and occurs when we rely too heavily on pre-existing or first-encountered information (the anchor) when making a decision.
For example, if you first see a t-shirt that costs $1,000 and then a second one that costs $100, you are more likely to see the second shirt as cheap than if you had first seen a shirt costing $120. Here, the price of the first shirt influences how you perceive the second.
Explanations
There are several theories that try to explain the existence of this bias.
One theory, known as anchoring and adjustment, holds that once an anchor is established, people do not adjust far enough away from it to arrive at their final answer, so their final guess or decision ends up closer to the anchor than it otherwise would have been (Tversky & Kahneman, 1992).
And when people experience a higher cognitive load (the amount of information working memory must hold at a given time, for example when making a difficult decision rather than an easy one), they are more susceptible to anchoring effects.
Another theory, selective accessibility, holds that although we assume the anchor is not the correct answer (or the right price, returning to the original example), when evaluating the second stimulus (the second t-shirt) we look for ways in which it is similar to or different from the anchor (e.g., the price is quite different), giving rise to the anchoring effect (Mussweiler & Strack, 1999).
A final theory is that providing an anchor changes a person's attitude toward being more favorable to the anchor, which then results in future responses having similar characteristics to the original anchor.
Although there are many different theories as to why we experience anchoring bias, they all agree that it affects our decisions in real ways (Wegener et al., 2001).
Try it
The first study to bring this bias to light was one of Tversky and Kahneman's (1974) early experiments. They asked participants to estimate, within five seconds, the product of the numbers 1 through 8, presented either as 1x2x3... or as 8x7x6...
The participants did not have enough time to calculate the actual answer, so they had to make an estimate based on their initial calculations.
They found that those who saw the small numbers first (i.e., 1x2x3...) gave a median estimate of 512, while those who saw the large numbers first gave a median estimate of 2,250 (the actual answer is 40,320).
This shows how the first calculations influenced the participant's final answer.
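The arithmetic behind this demonstration is easy to verify. The sketch below (a quick check of the numbers reported above, not part of the original study) computes the true product and compares it with the two median estimates:

```python
import math

# The true product of 1 x 2 x ... x 8 (i.e., 8 factorial).
actual = math.prod(range(1, 9))

# Median estimates reported by Tversky and Kahneman (1974).
ascending_estimate = 512    # group anchored on small first terms (1 x 2 x 3 ...)
descending_estimate = 2250  # group anchored on large first terms (8 x 7 x 6 ...)

print(actual)                        # 40320
print(actual / ascending_estimate)   # 78.75 -- low anchor, far larger underestimate
print(actual / descending_estimate)  # 17.92 -- high anchor, smaller underestimate
```

Both groups underestimate badly, but the group that starts from the small numbers (the lower anchor) lands more than four times further from the truth, which is exactly the anchoring effect.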
Availability heuristic
Availability bias (commonly known as the availability heuristic) refers to the tendency to think that examples that come easily to mind are more common than they actually are.
In other words, the information that comes to mind the fastest influences our decisions about the future. And just like hindsight bias, this bias is related to a memory error.
But instead of fabricating a memory, it's an overemphasis on a particular memory.
Practical examples
At work, if someone is being considered for a promotion but their boss remembers something bad that happened years ago but left a lasting impression, that event could have a huge impact on the final decision.
Another common example is someone buying lottery tickets because the lifestyle and rewards of winning (and the emotions associated with winning, or with other people's wins) come to mind more readily than the complex calculation of the probability of actually winning the lottery (Kirsch, 2019).
One last general example used to demonstrate the availability heuristic describes how watching TV shows or news reports about shark attacks (or anything sensational in the news, such as serial killers or plane crashes) can lead you to think that such incidents are relatively common, even though they are not.
Still, this mindset might make you less inclined to jump in the water the next time you visit the beach (Cherry, 2019).
Explanations
As with most cognitive biases, the best way to overcome them is to acknowledge the bias and become more aware of your thoughts and choices.
And because we fall victim to this bias when our brains rely on quick mental shortcuts to save time, slowing down our thinking and decision-making processes is a critical step in mitigating the effects of the availability heuristic.
Researchers believe this bias occurs because the brain is constantly trying to minimize the effort required to make decisions, so we rely on specific memories, the ones that are easiest to recall, rather than undertake the complicated task of calculating statistical probabilities.
There are two main types of memories that are easier to remember: 1) those that are more consistent with how we see the world, and 2) those that evoke more emotions and leave a more lasting impression.
Try it
The first type of memory was identified as early as 1973, when Tversky and Kahneman, our pioneers of cognitive bias, conducted a study asking participants whether more words begin with the letter K or more words have K as their third letter.
Although many more words have K as their third letter, 70% of participants reported that more words start with K, because recalling such words is not only easier but also more consistent with how they see the world (knowing the first letter of a word is far more common than knowing its third letter).
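The comparison behind this study is simple to express in code. The sketch below counts the two letter positions over a word list; the tiny sample list here is illustrative only (a real replication would use a full dictionary file, which you would have to supply):

```python
def count_k_positions(words, letter="k"):
    """Count words starting with `letter` vs. words with `letter` as third character."""
    first = sum(1 for w in words if w.lower().startswith(letter))
    third = sum(1 for w in words if len(w) >= 3 and w[2].lower() == letter)
    return first, third

# Illustrative sample only. Over English as a whole, third-position K
# is substantially more common than initial K, the opposite of what
# most participants reported.
sample = ["kite", "king", "know", "make", "bike", "lake", "joke"]
first, third = count_k_positions(sample)
print(first, third)  # 3 4
```

Even in this toy sample the third-letter count wins, yet the initial-K words are the ones that spring to mind, which is precisely the availability effect the study demonstrates.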
As for the second type of memory, ten years later, in 1983, the same duo conducted an experiment in which half of the participants estimated the probability of a massive flood somewhere in North America, while the other half estimated the probability of a flood caused by an earthquake in California.
Although the latter is much less likely, participants rated it as more probable because they could recall specific, emotionally charged California earthquakes, largely thanks to the media coverage such events receive.
Together, these studies show how memories that are easier to recall powerfully influence our judgments and perceptions about future events.
Inattentional blindness
A final popular form of cognitive bias is inattentional blindness. It occurs when a person fails to perceive a stimulus that is in plain sight because their attention is directed elsewhere.
For example, if you're driving a car, you may be so focused on the road that you don't even realize a car is cutting into your lane.
Because your attention is on something else, you cannot react in time, which could lead to a car accident. Inattentional blindness has obvious consequences (as this example shows), but like all biases, it is not impossible to overcome.
Explanations
Many theories try to explain why we experience this form of cognitive bias; in reality, it is probably a combination of them.
Conspicuity holds that certain sensory stimuli (e.g., bright colors) and cognitive stimuli (e.g., something familiar) are more likely to be processed, so stimuli that fall into neither category may be missed.
Mental workload theory describes how if we focus too much of our brain's mental energy on one stimulus, we will deplete our cognitive resources and be unable to process another stimulus at the same time.
Similarly, some psychologists explain how we perceive different stimuli with different attentional capacities, and this could affect our ability to process multiple stimuli simultaneously.
In other words, an experienced driver may notice a car swerving into the lane because driving consumes fewer of their mental resources, while a novice driver, who needs more resources to focus on the road ahead, may fail to process the swerving car.
A final explanation argues that because our attentional and processing resources are limited, our brain dedicates them to what fits our schemata, or our cognitive representations of the world (Cherry, 2020).
So when an unexpected stimulus enters our field of vision, we may be unable to process it consciously. The following study illustrates this.
Try it
The best-known study demonstrating inattentional blindness is the invisible gorilla experiment (Most et al., 2001). In this experiment, participants watched a video of two groups of people passing basketballs and counted the number of passes made by the team in white.
Participants could accurately count the passes, but what most failed to notice was a person in a gorilla suit walking through the middle of the scene.
Because the gorilla is unexpected, and because our brains are busy counting passes, we can completely fail to process what is right in front of our eyes.
A real-life example of inattentional blindness occurred in 1995, when Boston police officer Kenny Conley was chasing a suspect and ran past a group of officers who were mistakenly beating an undercover policeman.
Conley was convicted of perjury and obstruction of justice because jurors believed he had seen the fight between the undercover officer and the other officers and lied about it to protect them, but he stood by his account that he had not actually seen it (due to inattentional blindness) and was eventually exonerated (Pickel, 2015).
The key to overcoming inattentional blindness is to maximize your attention by avoiding distractions such as checking your phone. It is also important to pay attention to what others may fail to notice (if you are driving, do not assume other drivers can see you).
By working to expand your awareness and minimize unnecessary distractions that drain your mental resources, you can work to overcome this bias.
Preventing cognitive bias
As we now know, recognizing these biases is the first step toward overcoming them. But there are other small strategies we can follow to train our unconscious minds to think differently.
From boosting our memory and minimizing distractions to slowing down our decision making and improving our logical skills, we can work to overcome these cognitive biases.
Assessing your own thought process, known as metacognition ("thinking about thinking"), provides one way to combat bias (Flavell, 1979).
This multifactorial process includes (Croskerry, 2003):
(a) recognizing the limits of memory,
(b) seeking out other perspectives when making decisions,
(c) being able to critique oneself, and
(d) selecting strategies to avoid cognitive errors.
Many of the debiasing strategies we have described are also known as cognitive forcing strategies, mental tools used to compel unbiased decision making.
About the Author
Charlotte Ruhl is a member of the Class of 2022 at Harvard University. She is studying Psychology with a minor in African American Studies. On campus, Charlotte works in an implicit social cognition research lab, is an editor for the Undergraduate Law Review, and plays softball.
To reference this article:
Ruhl, C. (2021, May 04). Examples of cognitive biases. Simply Psychology. www.simplypsychology.org/cognitive-bias.html
APA style references
Allen, M. S., Robson, D. A., Martin, L. J., & Laborde, S. (2020). Systematic review and meta-analysis of self-serving attribution biases in the competitive context of organized sport. Personality and Social Psychology Bulletin, 46(7), 1027-1043.
Casad, B. (2019). Confirmation bias. Retrieved from https://www.britannica.com/science/confirmation-bias
Cherry, K. (2019). How the availability heuristic affects your decision making. Retrieved from https://www.verywellmind.com/availability-heuristic-2794824
Cherry, K. (2020). Inattentional blindness can cause you to miss things in front of you. Retrieved from https://www.verywellmind.com/what-is-inattentional-blindness-2795020
Dietrich, D., & Olson, M. (1993). A demonstration of hindsight bias using the Thomas confirmation vote. Psychological Reports, 72(2), 377-378.
Fischhoff, B. (1975). Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1(3), 288.
Fischhoff, B., & Beyth, R. (1975). I knew it would happen: Remembered probabilities of once-future things. Organizational Behavior and Human Performance, 13(1), 1-16.
Furnham, A. (1982). Explanations for unemployment in Britain. European Journal of Social Psychology, 12(4), 335-352.
Heider, F. (1982). The psychology of interpersonal relations. Psychology Press.
Inman, M. (2016). Hindsight bias. Retrieved from https://www.britannica.com/topic/hindsight-bias
Lang, R. (2019). What is the difference between conscious and unconscious bias?: FAQs. Retrieved from https://engageinlearning.com/faq/compliance/unconsciente-bias/what-is-the-difference-entre-consciente-y-inconsciente-bias/
Luippold, B., Perreault, S., & Wainberg, J. (2015). Auditor's pitfall: Five ways to overcome confirmation bias. Retrieved from https://www.babson.edu/academics/executive-education/babson-insight/finance-and-accounting/auditors-pitfall-five-ways-to-overcome-confirmation-bias/
Mezulis, A. H., Abramson, L. Y., Hyde, J. S., & Hankin, B. L. (2004). Is there a universal positivity bias in attributions? A meta-analytic review of individual, developmental, and cultural differences in the self-serving attributional bias. Psychological Bulletin, 130(5), 711.
Miller, D. T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction? Psychological Bulletin, 82(2), 213.
Most, S. B., Simons, D. J., Scholl, B. J., Jimenez, R., Clifford, E., & Chabris, C. F. (2001). How not to be seen: The contribution of similarity and selective ignoring to sustained inattentional blindness. Psychological Science, 12(1), 9-17.
Mussweiler, T., & Strack, F. (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm: A selective accessibility model. Journal of Experimental Social Psychology, 35(2), 136-164.
Neff, K. (2003). Self-compassion: An alternative conceptualization of a healthy attitude toward oneself. Self and Identity, 2(2), 85-101.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.
Orzan, G., Zara, I. A., & Purcarea, V. L. (2012). Neuromarketing techniques in pharmaceutical drugs advertising. A discussion and agenda for future research. Journal of Medicine and Life, 5(4), 428.
Pickel, K. L. (2015). Eyewitness memory. The Handbook of Attention, 485-502.
Pohl, R. F., & Hell, W. (1996). No reduction in hindsight bias after complete information and repeated testing. Organizational Behavior and Human Decision Processes, 67(1), 49-58.
Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7(5), 411-426.
Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In Advances in Experimental Social Psychology (Vol. 10, pp. 173-220). Academic Press.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207-232.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.
Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90(4), 293.
Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5(4), 297-323.
Walther, J. B., & Bazarova, N. N. (2007). Misattribution in virtual groups: The effects of member distribution on self-serving bias and partner blame. Human Communication Research, 33(1), 1-26.
Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129-140.
Wegener, D. T., Petty, R. E., Detweiler-Bedell, B. T., & Jarvis, W. B. G. (2001). Implications of attitude change theories for numerical anchoring: Anchor plausibility and the limits of anchor effectiveness. Journal of Experimental Social Psychology, 37(1), 62-69.
Wilke, A., & Mata, R. (2012). Cognitive bias. In Encyclopedia of Human Behavior (pp. 531-535). Academic Press.
Test yourself for bias
- Project Implicit (IAT test), from Harvard University
- Implicit Association Test, from the Social Psychology Network
- Test Yourself for Hidden Bias, from Teaching Tolerance
Listen
- How the Concept of Implicit Bias Came Into Being, with Dr. Mahzarin Banaji, Harvard University, author of Blindspot: Hidden Biases of Good People. 5:28 minutes; includes transcript
- Understanding Your Racial Biases, with John Dovidio, PhD, Yale University, from the American Psychological Association. 11:09 minutes; includes transcript
- Talking About Implicit Bias in Policing, with Jack Glaser, Goldman School of Public Policy, University of California Berkeley. 21:59 minutes
- Implicit Bias: A Factor in Health Communication, with Dr. Winston Wong, Kaiser Permanente. 19:58 minutes
- Bias, Black Lives, and Academic Medicine, with Dr. David Ansell on Your Health Radio (August 1, 2015). 21:42 minutes
Videos
- Uncovering Hidden Biases, a Google talk with Dr. Mahzarin Banaji, Harvard University
- Impact of Implicit Bias in the Justice System. 9:14 minutes
- Students Speak Up: What Bias Means to Them. 2:17 minutes
- Weight Bias in Health Care, from Yale University. 16:56 minutes
- Gender and Racial Bias in Facial Recognition Technology. 4:43 minutes
Journal articles
- Mitchell, G. (2018). An implicit bias primer. Virginia Journal of Social Policy & the Law, 25, 27-59.
- Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic Processes in Social Thinking and Behavior, 4, 265-292.
- Hall, W. J., Chapman, M. V., Lee, K. M., Merino, Y. M., Thomas, T. W., Payne, B. K., ... & Coyne-Beasley, T. (2015). Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: A systematic review. American Journal of Public Health, 105(12), e60-e76.
- Burgess, D., Van Ryn, M., Dovidio, J., & Saha, S. (2007). Reducing racial bias among health care providers: Lessons from social-cognitive psychology. Journal of General Internal Medicine, 22(6), 882-887.
- Boysen, G. A. (2010). Integrating implicit bias into counselor education. Counselor Education and Supervision, 49(4), 210-227.
- Christian, S. (2013). Cognitive biases and errors as cause, and journalistic best practices as effect. Journal of Mass Media Ethics, 28(3), 160-174.
- Whitford, D. K., & Emerson, A. M. (2019). Empathy intervention to reduce implicit bias in pre-service teachers. Psychological Reports, 122(2), 670-688.