Tag Archive | "Cognitive Bias"


The Dunning–Kruger effect

The Dunning–Kruger effect is a cognitive bias in which an unskilled person makes poor decisions and arrives at erroneous conclusions, but their incompetence denies them the metacognitive ability to realize their mistakes. The unskilled therefore suffer from illusory superiority, rating their own ability as above average, much higher than it actually is, while the highly skilled underrate their abilities, suffering from illusory inferiority. This leads to the perverse situation in which less competent people rate their own ability higher than more competent people. It also explains why actual competence may weaken self-confidence: because competent individuals falsely assume that others have an equivalent understanding. “Thus, the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others.”

The Dunning–Kruger effect was put forward by Justin Kruger and David Dunning. Similar notions have been expressed, albeit less scientifically, for some time. Dunning and Kruger themselves quote Charles Darwin (“Ignorance more frequently begets confidence than does knowledge”) and Bertrand Russell (“One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision.”). The Dunning–Kruger effect is not, however, concerned narrowly with high-order cognitive skills (much less their application in the political realm during a particular era, which is what Russell was talking about). Nor is it specifically limited to the observation that ignorance of a topic is conducive to overconfident assertions about it, which is what Darwin was saying. Indeed, Dunning et al. cite a study in which 94% of college professors ranked their work as “above average” relative to their peers, underscoring that the highly intelligent and informed are hardly exempt. Rather, the effect is about paradoxical defects in the perception of skill, in oneself and others, regardless of the particular skill and its intellectual demands, whether it is chess, golf, or driving a car.

Posted in Social biases | Comments (0)


Congruence bias

Congruence bias is a type of cognitive bias similar to confirmation bias. Congruence bias occurs due to people’s overreliance on direct testing of a given hypothesis, and their corresponding neglect of indirect testing.

Suppose that in an experiment, a subject is presented with two buttons and told that pressing one of them, but not the other, will open a door. The subject adopts the hypothesis that the left button opens the door. A direct test of this hypothesis is pressing the left button; an indirect test is pressing the right button. The latter is still a valid test: if the door remains closed, the left button must be the one that opens it.

We can apply this idea of direct and indirect testing to more complicated experiments in order to explain the presence of congruence bias in people. In such experiments, a subject will test his own, usually naive, hypothesis again and again instead of trying to disprove it.

The classic example of congruence bias is found in an experiment by Wason. The experimenter gave subjects the number sequence “2, 4, 6,” told them that this sequence followed a particular rule, and instructed them to find the rule underlying it. Subjects provided their own number sequences as tests, trying to ascertain which numbers the rule admitted and which it did not. Most subjects quickly decided that the underlying rule was “numbers ascending by 2” and offered as tests only sequences concordant with this rule, such as “3, 5, 7” or even “pi plus 2, plus 4, plus 6.” Each of these sequences follows the rule the experimenter had in mind, even though “numbers ascending by 2” is not the actual criterion. Because subjects succeeded at repeatedly testing the same singular principle, they naively believed their chosen hypothesis was correct.

When a subject offered the hypothesis “numbers ascending by 2” only to be told it was wrong, much confusion usually ensued. At this point, many subjects tried to change the wording of the rule without changing its meaning, and even those who switched to indirect testing had trouble letting go of the “+2” convention, producing candidate rules as idiosyncratic as “the first two numbers in the sequence are random, and the third number is the second number plus two.” Many subjects never realized that the experimenter’s actual rule was simply any list of ascending numbers, because they were unable to consider indirect tests of their hypothesis.
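The logic of Wason’s task can be sketched in a few lines of code. The two predicates below are hypothetical encodings of the experimenter’s rule and the subject’s naive hypothesis (the actual experiment, of course, used verbal feedback rather than functions):

```python
# A minimal sketch of Wason's 2-4-6 task.

def actual_rule(seq):
    """Experimenter's rule: any strictly ascending sequence."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def subject_hypothesis(seq):
    """Subject's naive hypothesis: numbers ascending by 2."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# Direct tests: sequences the hypothesis predicts will fit the rule.
for seq in [(3, 5, 7), (10, 12, 14)]:
    # Both predicates say "yes", so these tests cannot tell them apart.
    assert actual_rule(seq) and subject_hypothesis(seq)

# An indirect test: a sequence the hypothesis predicts will NOT fit.
# The experimenter would answer "yes", falsifying the hypothesis.
print(actual_rule((1, 2, 3)), subject_hypothesis((1, 2, 3)))  # True False
```

Only the indirect test yields a result on which the two predicates disagree, which is why it carries information the direct tests cannot provide.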

Wason attributed this failure of subjects to an inability to consider alternative hypotheses, which is the root of the congruence bias. Jonathan Baron explains that subjects could be said to be using a “congruence heuristic,” wherein a hypothesis is tested only by thinking of results that would be found if that hypothesis is true. This heuristic, which many people seem to use, ignores alternative hypotheses.

To avoid falling into the trap of the congruence bias, Baron suggests that the following two heuristics be used:

1. Ask “How likely is a yes answer, if I assume that my hypothesis is false?” Remember to choose a test that has a high probability of giving that answer if the hypothesis is true, and a low probability if it is false.

2. “Try to think of alternative hypotheses; then choose a test most likely to distinguish them — a test that will probably give different results depending on which is true.” The need for this heuristic can be seen in a doctor attempting to diagnose appendicitis. In that situation, a white blood cell count would not assist in the diagnosis, because an elevated count is associated with a number of maladies and so does not distinguish appendicitis from the alternatives.
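Baron’s second heuristic can be illustrated with a small sketch. The candidate tests and their probabilities below are invented purely for illustration; only the idea of choosing the most diagnostic test comes from the text:

```python
# Pick the test whose expected result best separates rival hypotheses.
# P(positive result | hypothesis), with made-up illustrative numbers.
tests = {
    # An elevated count is likely under either condition: not diagnostic.
    "white cell count":  {"appendicitis": 0.90, "other infection": 0.85},
    # A result that differs sharply between the hypotheses: diagnostic.
    "abdominal imaging": {"appendicitis": 0.95, "other infection": 0.10},
}

def diagnosticity(probs):
    """How far apart the hypotheses' predictions for this test are."""
    p1, p2 = probs.values()
    return abs(p1 - p2)

best = max(tests, key=lambda name: diagnosticity(tests[name]))
print(best)  # abdominal imaging
```

The white cell count gives nearly the same answer whichever hypothesis is true, so observing it barely changes what the doctor should believe; the heuristic directs attention to tests whose outcomes actually depend on which hypothesis holds.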

Posted in Decision-making and behavioral biases | Comments (0)


Neglect of probability

The “neglect of probability” bias is the tendency to disregard probability when making decisions that involve a degree of uncertainty. It is a simple way in which many people violate the normative rules of decision making.

This isn’t the only kind of cognitive bias related to probability; others include the gambler’s fallacy, the hindsight bias, and the neglect of prior base rates effect. But this bias differs notably from those: here the person completely disregards probability when making the decision, instead of applying probability incorrectly.

A study conducted in 1993 examined this bias by asking children the following question:

Susan and Jennifer are arguing about whether they should wear seat belts when they ride in a car. Susan says that you should. Jennifer says you shouldn’t… Jennifer says that she heard of an accident where a car fell into a lake and a woman was kept from getting out in time because of wearing her seat belt, and another accident where a seat belt kept someone from getting out of the car in time when there was a fire. What do you think about this?

One subject responded as below:

A: Well, in that case I don’t think you should wear a seat belt.
Q (interviewer): How do you know when that’s going to happen?
A: Like, just hope it doesn’t!
Q: So, should you or shouldn’t you wear seat belts?
A: Well, tell-you-the-truth we should wear seat belts.
Q: How come?
A: Just in case of an accident. You won’t get hurt as much as you will if you didn’t wear a seat belt.
Q: OK, well what about these kinds of things, when people get trapped?
A: I don’t think you should, in that case.

Here we can see that, in making the decision, the subject completely disregards the probability of an accident occurring relative to the probability of being harmed by the seat belt. A normative model for this decision would use expected-utility theory to decide which option is most likely to maximize utility: the change in utility under each possible outcome is weighted by the probability of that outcome, something the subject ignores.
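The calculation the subject skips can be sketched numerically. Every probability and utility below is invented for illustration; only the structure of the expected-utility comparison matters:

```python
# Expected-utility comparison for the seat-belt decision (toy numbers).

p_ordinary_crash = 1e-3   # crash in which a belt helps
p_trap_crash     = 1e-6   # rare fire/submersion crash in which it traps you

utility = {  # change in utility for each (choice, outcome) pair
    ("belt", "ordinary"):    -10,    # injured, but less severely
    ("no_belt", "ordinary"): -100,   # injured badly
    ("belt", "trap"):        -1000,  # trapped in the car
    ("no_belt", "trap"):     -10,    # escapes
}

def expected_utility(choice):
    # Weight each outcome's utility change by its probability.
    return (p_ordinary_crash * utility[(choice, "ordinary")]
            + p_trap_crash * utility[(choice, "trap")])

print(expected_utility("belt"))     # about -0.011
print(expected_utility("no_belt"))  # about -0.1
```

Even with the trap outcome weighted very heavily, its tiny probability means wearing the belt still maximizes expected utility; this is exactly the weighing the subject never performs.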

Another subject responded to the same question:

A: If you have a long trip, you wear seat belts half way.
Q: Which is more likely?
A: That you’ll go flyin’ through the windshield.
Q: Doesn’t that mean you should wear them all the time?
A: No, it doesn’t mean that.
Q: How do you know if you’re gonna have one kind of accident or the other?
A: You don’t know. You just hope and pray that you don’t.

Here again the subject disregards probability in making the decision, treating each possible outcome as equally likely.

It has been suggested that adults can suffer from this bias as well, especially in medical decisions made under uncertainty. This bias can lead actors to drastically violate expected-utility theory in their decision making, especially when one possible outcome has a much lower or higher utility but only a small probability of occurring (as in medical or gambling situations). In this respect, the neglect of probability bias is similar to the neglect of prior base rates effect.

In another interesting example of near-total neglect of probability, it was found that the average person was willing to pay $7 to avoid a 1% chance of a painful electric shock, but only $10 to avoid a 99% chance of the same shock. The suggestion is that probability is more likely to be neglected when the potential outcomes arouse strong emotion.
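The arithmetic behind this example is worth making explicit. The $7 and $10 figures come from the text; the linear baseline is a hypothetical benchmark for what probability-sensitive pricing would look like:

```python
# If probability were weighted linearly, willingness to pay to avoid a
# shock should scale with the chance of receiving it.

wtp = {0.01: 7.0, 0.99: 10.0}   # observed willingness to pay, in dollars

implied_ratio = wtp[0.99] / wtp[0.01]  # how much more the larger risk was worth avoiding
linear_ratio = 99                      # 0.99 / 0.01: the ratio linear weighting predicts

print(round(implied_ratio, 2), linear_ratio)  # 1.43 99
```

Subjects paid only about 1.4 times more to avoid a risk 99 times as large, consistent with probability being nearly ignored once the aversive outcome captures attention.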

Posted in Decision-making and behavioral biases | Comments (1)