Archive | July, 2010


Cryptomnesia

Cryptomnesia occurs when a forgotten memory returns without being recognised as such by the subject, who believes it is something new and original. It is a memory bias whereby a person may falsely recall generating a thought, an idea, a song, or a joke, not deliberately engaging in plagiarism but rather experiencing a memory as if it were a new inspiration. Sentences in scientific papers that are identical to sentences in some of the references used to write the paper often stem from cryptomnesia.

In the first empirical study of cryptomnesia, people in a group took turns generating category examples (e.g., kinds of birds: parrot, canary, etc.). They were later asked to create new exemplars in the same categories that were not previously produced, and also to recall which words they had personally generated. People inadvertently plagiarized about 3–9% of the time either by regenerating another person’s thought or falsely recalling someone’s thought as their own. Similar effects have been replicated using other tasks such as word search puzzles and in brainstorming sessions.

Research has distinguished between two kinds of cryptomnesia, though they are often studied together. The distinction between these two types of plagiarism is in the underlying memory bias responsible—specifically, is it the thought that is forgotten, or the thinker? The first type of bias is one of familiarity. The plagiarizer regenerates an idea that was presented earlier, but believes the idea to be an original creation. The idea that is reproduced could be another’s idea, or one’s own from a previous time. B. F. Skinner describes his own experience of self-plagiarism:

“One of the most disheartening experiences of old age is discovering that a point you just made—so significant, so beautifully expressed—was made by you in something you published long ago.”

The second type of cryptomnesia results from an error of authorship whereby the ideas of others are remembered as one’s own. In this case, the plagiarizer correctly recognizes that the idea is from an earlier time, but falsely remembers having been its originator. Various terms have been coined to distinguish these two forms of plagiarism: occurrence forgetting vs. source forgetting, and generation errors vs. recognition errors. The two types of cryptomnesia appear to be independent: no relationship has been found between the two error rates, and the two types are precipitated by different causes.

Cryptomnesia is more likely to occur when the ability to properly monitor sources is impaired. For example, people are more likely to falsely claim ideas as their own when they were under high cognitive load at the time they first considered the idea. Plagiarism increases when people are distanced from the original source of the idea, and decreases when participants are specifically instructed to pay attention to the origin of their ideas. False claims are also more prevalent for ideas originally suggested by persons of the same sex, presumably because the perceptual similarity of the self to a same-sex person exacerbates source confusion. Other studies have found that the timing of the idea is also important: if another person produces an idea immediately before the self does, the other’s idea is more likely to be claimed as one’s own, ostensibly because the person is too busy preparing for their own turn to properly monitor source information.

Posted in Memory errors | 0 Comments


Herd behavior

Herd behavior describes how individuals in a group can act together without planned direction. The term pertains to the behavior of animals in herds, flocks, and schools, and to human conduct during activities such as stock market bubbles and crashes, street demonstrations, sporting events, religious gatherings, episodes of mob violence and even everyday decision making, judgment and opinion forming.

The German philosopher Friedrich Nietzsche was the first to critique what he referred to as “herd morality” and the “herd instinct” in human society. Modern psychological and economic research has identified herd behavior in humans to explain the phenomenon of large numbers of people acting in the same way at the same time. The British surgeon Wilfred Trotter popularized the “herd behavior” phrase in his book, Instincts of the Herd in Peace and War (1914). In The Theory of the Leisure Class, Thorstein Veblen explained economic behavior in terms of social influences such as “emulation,” where some members of a group mimic other members of higher status. In “The Metropolis and Mental Life” (1903), early sociologist Georg Simmel referred to the “impulse to sociability in man”, and sought to describe “the forms of association by which a mere sum of separate individuals are made into a ‘society’”. Other social scientists explored behaviors related to herding, such as Freud (crowd psychology), Carl Jung (the collective unconscious), and Gustave Le Bon (the popular mind). Swarm theory, observed in non-human societies, is a related concept, and its occurrence in human society is being explored.

Benign herding behaviors may be frequent in everyday decisions based on learning from the information of others, as when a person on the street decides which of two restaurants to dine in. Suppose that both look appealing, but both are empty because it is early evening; so at random, this person chooses restaurant A. Soon a couple walks down the same street in search of a place to eat. They see that restaurant A has customers while B is empty, and choose A on the assumption that having customers makes it the better choice. And so on with other passersby into the evening, with restaurant A doing more business that night than B. This phenomenon is also referred to as an information cascade.
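The cascade dynamic is easy to make concrete with a small simulation. The sketch below is an illustration, not part of the original account: it assumes each diner follows the visibly busier restaurant and otherwise chooses at random, so a single early choice propagates through the whole evening.

    import random

    # Minimal information-cascade sketch (illustrative assumptions only):
    # each diner follows the busier restaurant; with a tie, they fall back
    # on a private, fifty-fifty preference.
    def simulate_evening(diners=20, seed=1):
        random.seed(seed)
        counts = {"A": 0, "B": 0}
        for _ in range(diners):
            if counts["A"] == counts["B"]:
                choice = random.choice(["A", "B"])    # no crowd signal yet
            else:
                choice = max(counts, key=counts.get)  # herd toward the crowd
            counts[choice] += 1
        return counts

    print(simulate_evening())  # e.g. {'A': 20, 'B': 0}: the first choice cascades

Once the counts diverge, no later diner’s private preference is ever consulted, which is exactly why the cascade conveys so little information about which restaurant is actually better.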

Posted in Social biases | 0 Comments


The Dunning–Kruger effect

The Dunning–Kruger effect is a cognitive bias in which an unskilled person makes poor decisions and arrives at erroneous conclusions, but their incompetence denies them the metacognitive ability to realize their mistakes. The unskilled therefore suffer from illusory superiority, rating their own ability as above average, much higher than it actually is, while the highly skilled underrate their abilities, suffering from illusory inferiority. This leads to the perverse situation in which less competent people rate their own ability higher than more competent people. It also explains why actual competence may weaken self-confidence: because competent individuals falsely assume that others have an equivalent understanding. “Thus, the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others.”

The Dunning–Kruger effect was put forward by Justin Kruger and David Dunning. Similar notions have been expressed, albeit less scientifically, for some time. Dunning and Kruger themselves quote Charles Darwin (“Ignorance more frequently begets confidence than does knowledge”) and Bertrand Russell (“One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision.”). The Dunning–Kruger effect is not, however, concerned narrowly with high-order cognitive skills (much less their application in the political realm during a particular era, which is what Russell was talking about). Nor is it specifically limited to the observation that ignorance of a topic is conducive to overconfident assertions about it, which is what Darwin was saying. Indeed, Dunning et al. cite a study in which 94% of college professors ranked their work as “above average” (relative to their peers), to underscore that the highly intelligent and informed are hardly exempt. Rather, the effect is about paradoxical defects in the perception of skill, in oneself and others, regardless of the particular skill and its intellectual demands, whether it is chess, golf, or driving a car.

Posted in Social biases | 4 Comments

[Image: Fakeface, a stick-figure line drawing of a face]

Pareidolia

Pareidolia (pronounced pa-ri-DOE-lee-ə) is a psychological phenomenon involving a vague and random stimulus (often an image or sound) being perceived as significant. Common examples include seeing images of animals or faces in clouds, the man in the moon, and hearing hidden messages on records played in reverse.

Evolutionary explanation

It is thought that there may be some kind of evolutionary advantage to this malfunctioning of the perceptual apparatus, particularly with regard to our tendency to see faces in commonplace objects. Carl Sagan hypothesized that, as a survival technique, human beings are “hard-wired” from birth to identify the human face. While this allows people to use only minimal details to recognize faces from a distance and in poor visibility, it can also lead them to interpret random images or patterns of light and shade as being faces. The evolutionary advantages of being able to identify friend from foe with split-second accuracy are numerous; prehistoric (and even modern) men and women who accidentally identified an enemy as a friend could face deadly consequences for the mistake. This is only one among many evolutionary pressures responsible for the development of humans’ modern facial recognition capability.

In 2009 a magnetoencephalography study found that objects incidentally perceived as faces evoke an early activation in the ventral fusiform cortex, at a time and location similar to that evoked by faces, whereas other common objects do not evoke such activation. This activation is similar to a slightly earlier peak seen for images of real faces. The authors suggest that face perception evoked by face-like objects is a relatively early process, and not a late cognitive reinterpretation phenomenon.

This study helps to explain why people identify the line drawing above as a “face” so quickly and without hesitation: precognitive processes are activated by the “face-like” object, which alert the observer to the emotional state and identity of the subject – even before the conscious mind begins to process – or even receive – the information. The “stick figure face,” despite its simplicity, conveys mood information (in this case, disappointment or mild unhappiness); it would be just as simple to draw a stick figure face that would be perceived (by most people) as hostile and aggressive. This robust and subtle capability is the result of eons of natural selection favoring people most able to quickly identify the mental state of, for example, threatening people, thus providing the individual an opportunity to flee and live to fight another day. In other words, processing this information subcortically (and therefore subconsciously) – before it is passed on to the rest of the brain for detailed processing – accelerates judgment and decision making when alacrity is paramount. This ability, though highly specialized for the processing and recognition of human emotions, also functions to determine the demeanor of wildlife.

Posted in Biases in probability and belief | 2 Comments


Gambler’s fallacy

The gambler’s fallacy, also known as the “Monte Carlo fallacy” or the “fallacy of the maturity of chances”, is the belief that if deviations from expected behaviour are observed in repeated independent trials of some random process then these deviations are likely to be evened out by opposite deviations in the future. For example, if a fair coin is tossed repeatedly and tails comes up a larger number of times than is expected, a gambler may incorrectly believe that this means that heads is more likely in future tosses. Such an expectation could be mistakenly referred to as being “due”. This is an informal fallacy. It is also known colloquially as the “law of averages”.

The gambler’s fallacy implicitly involves an assertion of negative correlation between trials of the random process and therefore involves a denial of the exchangeability of outcomes of the random process.
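Because the outcomes are exchangeable, a run of tails carries no information about the next toss. The simulation below is an illustration (not from the original text): it looks at every toss that follows five consecutive tails in a long sequence of fair coin flips and counts how often that toss comes up heads.

    import random

    # Empirical check that a fair coin has no memory (illustrative sketch).
    def heads_after_tail_streak(tosses=1_000_000, streak=5, seed=0):
        random.seed(seed)
        flips = [random.random() < 0.5 for _ in range(tosses)]  # True = heads
        follow_ups = [
            flips[i]
            for i in range(streak, tosses)
            if not any(flips[i - streak:i])  # previous `streak` flips all tails
        ]
        return sum(follow_ups) / len(follow_ups)

    print(heads_after_tail_streak())  # ~0.5: a tail streak does not make heads "due"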

The inverse gambler’s fallacy is the belief that an unlikely outcome of a random process (such as rolling double sixes on a pair of dice) implies that the process is likely to have occurred many times before reaching that outcome.

The reversal, the reverse gambler’s fallacy, is also a fallacy: a gambler may instead decide that tails is more likely, out of some mystical preconception that fate has thus far allowed for consistent results of tails; the false conclusion being: why change if the odds favor tails? Again, the fallacy is the belief that the “universe” somehow carries a memory of past results which tends to favor or disfavor future outcomes.

Posted in Biases in probability and belief | 0 Comments


Mere exposure effect

The mere exposure effect is a psychological phenomenon by which people tend to prefer things merely because they are familiar with them. In social psychology, this effect is sometimes called the familiarity principle. In studies of interpersonal attraction, the more often a person is seen by someone, the more pleasing and likeable that person appears to be.

Research

In the 1960s, a series of laboratory experiments by Robert Zajonc demonstrated that simply exposing subjects to an unfamiliar stimulus led them to rate it more positively than other, similar stimuli which had not been presented. Researchers have used words, Chinese characters, paintings, pictures of faces, geometric figures, and auditory stimuli in these experiments. In one variation, subjects were shown an image on a tachistoscope for a duration too brief to be perceived consciously. This subliminal exposure produced the same effect, though it is important to note that subliminal effects are generally weak and unlikely to occur without controlled laboratory conditions. According to Zajonc, the exposure effect can take place without conscious cognition: “preferences need no inferences.”

A meta-analysis of 208 experiments found that the exposure effect is robust and reliable, with an effect size of r = 0.26. The analysis found that the effect is strongest when unfamiliar stimuli are presented briefly. Mere exposure typically reaches its maximum effect within 10–20 presentations, and some studies even show that liking may decline after a longer series of exposures. For example, people generally like a song more after they have heard it a few times, but many repetitions can reduce this preference. A delay between exposure and the measurement of liking actually tends to increase the strength of the effect. Curiously, the effect is weaker for children, and for drawings and paintings as compared to other types of stimuli. One social psychology experiment showed that exposure to people we initially dislike makes us dislike them even more.

Advertising

Although the exposure effect appears to have a natural place in advertising, research has been mixed as to how effective it is at enhancing consumer attitudes toward particular companies and products. According to one study, higher levels of media exposure are associated with lower reputations for companies, even when the exposure is mostly positive. A subsequent review of the research concluded that exposure leads to ambivalence because it brings about a large number of associations, which tend to be both favorable and unfavorable. Exposure is most likely to be helpful when a company or product is new and unfamiliar to consumers.

Posted in Decision-making and behavioral biases | 0 Comments


Congruence bias

Congruence bias is a type of cognitive bias similar to confirmation bias. Congruence bias occurs due to people’s overreliance on direct testing of a given hypothesis, and their corresponding neglect of indirect testing.

Suppose that in an experiment, a subject is presented with two buttons, and is told that pressing one of those buttons, but not the other, will open a door. The subject adopts the hypothesis that the button on the left opens the door in question. A direct test of this hypothesis would be pressing the button on the left; an indirect test would be pressing the button on the right. The latter is still a valid test: if the door remains closed, the left button is proven to be the desired button, and if the door opens, the hypothesis is disproven.

We can take this idea of direct and indirect testing and apply it to more complicated experiments in order to explain the presence of a congruence bias in people. In an experiment, a subject will test his own (usually naive) hypothesis again and again instead of trying to disprove it.

The classic example of subjects’ congruence bias is found in an experiment by Wason. The experimenter gave subjects the number sequence “2, 4, 6,” telling them that it followed a particular rule, and instructed them to find the rule underlying the sequence. Subjects provided their own number sequences as tests to see if they could ascertain the rule dictating which numbers could be included in the sequence and which could not. Most subjects quickly decided that the underlying rule was “numbers ascending by 2,” and offered as tests only sequences concordant with this rule, such as “3, 5, 7,” or even “pi plus 2, plus 4, plus 6.” Each of these sequences followed the rule the experimenter was thinking of, even though “numbers ascending by 2” was not the actual criterion being used. But because subjects succeeded at repeatedly testing the same singular principle, they naively believed their chosen hypothesis was correct. When a subject offered the experimenter the hypothesis “numbers ascending by 2” only to be told it was wrong, much confusion usually ensued. At that point, many subjects attempted to change the wording of the rule without changing its meaning, and even those who switched to indirect testing had trouble letting go of the “+ 2” convention, producing rules as idiosyncratic as “the first two numbers in the sequence are random, and the third number is the second number plus two.” Many subjects never realized that the actual rule the experimenter was using was simply to list ascending numbers, because of their inability to consider indirect tests of their hypotheses.
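A minimal sketch of the task (an illustration; the rules and test sequences are taken from the description above) shows why direct tests are uninformative here: every sequence that fits “numbers ascending by 2” also fits the experimenter’s broader rule, so only an indirect test can falsify the naive hypothesis.

    # Wason's 2-4-6 task in miniature (illustrative sketch).
    def true_rule(seq):         # the experimenter's actual rule: ascending numbers
        return all(a < b for a, b in zip(seq, seq[1:]))

    def naive_hypothesis(seq):  # the subject's guess: ascending by 2
        return all(b - a == 2 for a, b in zip(seq, seq[1:]))

    direct_test = [3, 5, 7]     # conforms to the naive hypothesis
    indirect_test = [1, 2, 10]  # violates it: the test most subjects never try

    print(true_rule(direct_test), naive_hypothesis(direct_test))      # True True
    print(true_rule(indirect_test), naive_hypothesis(indirect_test))  # True False

The direct test gets a “yes” from both rules, so it cannot tell them apart; the indirect test also gets a “yes” from the experimenter, which falsifies “numbers ascending by 2” on the spot.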

Wason attributed this failure of subjects to an inability to consider alternative hypotheses, which is the root of the congruence bias. Jonathan Baron explains that subjects could be said to be using a “congruence heuristic,” wherein a hypothesis is tested only by thinking of results that would be found if that hypothesis is true. This heuristic, which many people seem to use, ignores alternative hypotheses.

To avoid falling into the trap of the congruence bias, Baron suggests that the following two heuristics be used:

1. Ask “How likely is a yes answer, if I assume that my hypothesis is false?” Remember to choose a test that has a high probability of giving some answer if the hypothesis is true, and a low probability if it is false.

2. “Try to think of alternative hypotheses; then choose a test most likely to distinguish them — a test that will probably give different results depending on which is true.” An example of the need for the heuristic could be seen in a doctor attempting to diagnose appendicitis. In that situation, assessing a white blood cell count would not assist in diagnosis, because an elevated white blood cell count is associated with a number of maladies.
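The white blood cell example can be put in numbers. The figures below are entirely hypothetical and only illustrate Baron’s point: when a test result is nearly as likely under the alternative hypothesis as under the favored one, observing it barely shifts belief between them.

    # Why a non-diagnostic test fails Baron's second heuristic (hypothetical numbers).
    p_elevated_given_appendicitis = 0.9   # assumed likelihood under the hypothesis
    p_elevated_given_other_illness = 0.8  # also high: many maladies raise the count
    prior = 0.5                           # start the two hypotheses even

    posterior = (p_elevated_given_appendicitis * prior) / (
        p_elevated_given_appendicitis * prior
        + p_elevated_given_other_illness * (1 - prior)
    )
    print(round(posterior, 3))  # 0.529: belief barely moves, so the test distinguishes little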

Posted in Decision-making and behavioral biases | 0 Comments


Neglect of probability

The “neglect of probability bias” is the tendency to disregard probabilities when making decisions which involve a degree of uncertainty. It is a simple way in which many people violate the normative rules for making decisions.

This isn’t the only kind of cognitive bias related to probability – others include the gambler’s fallacy, the hindsight bias, and the neglect of prior base rates effect – but it differs notably from them: here the person completely disregards probability when making the decision, instead of applying it incorrectly.

A study conducted in 1993 examined this bias by asking children the following question:

Susan and Jennifer are arguing about whether they should wear seat belts when they ride in a car. Susan says that you should. Jennifer says you shouldn’t… Jennifer says that she heard of an accident where a car fell into a lake and a woman was kept from getting out in time because of wearing her seat belt, and another accident where a seat belt kept someone from getting out of the car in time when there was a fire. What do you think about this?

One subject responded as below:

A: Well, in that case I don’t think you should wear a seat belt.
Q (interviewer): How do you know when that’s going to happen?
A: Like, just hope it doesn’t!
Q: So, should you or shouldn’t you wear seat belts?
A: Well, tell-you-the-truth we should wear seat belts.
Q: How come?
A: Just in case of an accident. You won’t get hurt as much as you will if you didn’t wear a seat belt.
Q: OK, well what about these kinds of things, when people get trapped?
A: I don’t think you should, in that case.

Here we can see that, in making the decision, the subject completely disregards the probability of an accident happening versus the probability of being hurt by the seat belt. In a normative model for this decision, we would use expected-utility theory to decide which option is most likely to maximize utility. This involves weighing the change in utility under each option by the probability of each outcome occurring, something the subject ignores.
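To make the normative calculation concrete, here is a minimal worked example with entirely hypothetical numbers; the only point is that the rare belt-entrapment outcome must be weighted by its tiny probability before it can be compared with the common case.

    # Expected-utility comparison for the seat-belt decision (all numbers hypothetical).
    p_crash = 1e-4                 # chance of a serious accident on a given trip
    p_trapped_given_crash = 0.01   # chance the belt itself traps you, given a crash

    u_unbelted_crash = -100.0      # badly hurt without a belt
    u_belted_crash = -20.0         # usually far less hurt with a belt
    u_belted_trapped = -100.0      # the rare case where the belt makes things worse

    eu_belt = p_crash * ((1 - p_trapped_given_crash) * u_belted_crash
                         + p_trapped_given_crash * u_belted_trapped)
    eu_no_belt = p_crash * u_unbelted_crash

    print(eu_belt, eu_no_belt)     # -0.00208 vs -0.01: wearing the belt wins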

Another subject responded to the same question:

A: If you have a long trip, you wear seat belts half way.
Q: Which is more likely?
A: That you’ll go flyin’ through the windshield.
Q: Doesn’t that mean you should wear them all the time?
A: No, it doesn’t mean that.
Q: How do you know if you’re gonna have one kind of accident or the other?
A: You don’t know. You just hope and pray that you don’t.

Here again the subject disregards probability in making the decision, treating each possible outcome as equally likely.

It has been suggested that adults can suffer from this bias as well, especially in medical decisions made under uncertainty. This bias could make actors drastically violate expected-utility theory in their decision making, especially when a decision must be made in which one possible outcome has a much lower or higher utility but only a small probability of occurring (e.g., in medical or gambling situations). In this respect, the neglect of probability bias is similar to the neglect of prior base rates effect.

In another example of near-total neglect of probability, it was found that the average person was willing to pay $7 to avoid a 1% chance of a painful electric shock, but only $10 to avoid a 99% chance of the same shock. Under expected-utility reasoning, willingness to pay should scale roughly with the probability of the shock, so a 99-fold increase in risk should command far more than a $3 premium. The suggestion is that probability is more likely to be neglected when the potential outcomes arouse strong emotion.

Posted in Decision-making and behavioral biases | 1 Comment


