Cognitive Bias

A cognitive bias (e.g. Ariely, 2008) is a systematic (non-random) error in thinking, in the sense that a judgment deviates from what would be considered desirable from the perspective of accepted norms or correct in terms of formal logic. The application of heuristics is often associated with cognitive biases, some of which, such as those arising from availability or representativeness, are ‘cold’ in the sense that they do not reflect a person’s motivation and are instead the result of errors in information processing. Other cognitive biases, especially those that have a self-serving function (e.g. optimism bias), are more motivated. Finally, some biases, such as confirmation bias, can be motivated or unmotivated (Nickerson, 1998).[1]

Psychologists study cognitive biases as they relate to memory, reasoning, and decision-making. Many kinds of cognitive biases exist. For example, confirmation bias is the tendency to seek out only information that matches what one already believes. Memory biases influence what one remembers and how easily. For example, people are more likely to recall events they find humorous and to better remember information they produce themselves. People are also more likely to regard as accurate those memories associated with significant events or emotions (such as the memory of what one was doing when a catastrophe occurred).[2]

Cognitive biases develop for several reasons. For example, errors in memory can affect how we think about a particular event, which in turn influences how we think about similar events and can lead to cognitive bias. Cognitive bias is also thought to help us process information more quickly. Cognitive biases can cause us to make inaccurate judgments, decisions, and interpretations, and because we are constantly making judgments and processing information, we are constantly at risk of cognitive bias. At one point or another, we have all been guilty of some type of cognitive bias. Although it is impossible to avoid cognitive biases completely, it is possible to understand what they are so that we can watch for them when they arise and adjust our judgments as needed.[3]


Overview of Cognitive Bias[4]

The notion of cognitive biases was introduced by Amos Tversky and Daniel Kahneman in 1972 and grew out of their experience of people's innumeracy, or inability to reason intuitively about the greater orders of magnitude. Tversky, Kahneman, and colleagues demonstrated several replicable ways in which human judgments and decisions differ from rational choice theory, and they explained these differences in terms of heuristics: mental shortcuts that provide swift estimates about the likelihood of uncertain occurrences (Baumeister & Bushman, 2010, p. 141). Heuristics are simple for the brain to compute but sometimes introduce "severe and systematic errors" (Tversky & Kahneman, 1974, p. 1125). For example, the representativeness heuristic is defined as the tendency to "judge the frequency or likelihood" of an occurrence by the extent to which the event "resembles the typical case" (Baumeister & Bushman, 2010, p. 141). The "Linda Problem" illustrates the representativeness heuristic (Tversky & Kahneman, 1983). Participants were given a description of "Linda" suggesting she might well be a feminist (e.g., she was said to be concerned about discrimination and social justice issues). They were then asked whether they thought Linda was more likely to be a
(a) "bank teller" or a
(b) "bank teller and active in the feminist movement."
A majority chose answer (b). This error (mathematically, answer (b) cannot be more likely than answer (a)) is an example of the "conjunction fallacy"; Tversky and Kahneman argued that respondents chose (b) because it seemed more "representative" of persons who might fit the description of Linda. The representativeness heuristic may lead to errors such as activating stereotypes and inaccurate judgments of others (Haselton et al., 2005, p. 726). Alternatively, critics of Kahneman and Tversky such as Gerd Gigerenzer argue that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases, but rather to conceive rationality as an adaptive tool that is not identical to the rules of formal logic or the probability calculus. Nevertheless, experiments such as the "Linda problem" grew into the heuristics and biases research program which spread beyond academic psychology into other disciplines including medicine and political science.
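
The arithmetic behind the conjunction fallacy is a one-line probability identity. As a brief sketch (standard probability notation, not drawn from the original studies), let A = "Linda is a bank teller" and B = "Linda is active in the feminist movement":

\[
P(A \land B) \,=\, P(A)\,P(B \mid A) \,\le\, P(A), \qquad \text{since } 0 \le P(B \mid A) \le 1.
\]

Feminist bank tellers form a subset of bank tellers, so the conjunction in answer (b) can never be more probable than answer (a) alone, no matter how "representative" of Linda the added detail seems.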


Types of Cognitive Bias[5]

Here are some of the most widespread cognitive biases:

  • The Bandwagon effect (aka herd mentality) describes the tendency to think or act in certain ways simply because other people do. Examples include the popularity of Apple products, the use of "in-group" slang and clothing styles, and watching the "Housewives of..." reality-TV franchise.
  • Confirmation bias involves the inclination to seek out information that supports our own preconceived notions. The reality is that most people don't like to be wrong, so they surround themselves with people and information that confirm their beliefs. The most obvious example these days is the tendency to follow news outlets that reinforce our political beliefs.
  • Illusion of Control is the propensity to believe that we have more control over a situation than we actually do. If we don't actually have control, we fool ourselves into thinking we do. Examples include rally caps in sports and "lucky" items.
  • The Semmelweis Reflex is the predisposition to deny new information that challenges our established views. Sort of the yang to the yin of the Confirmation bias, it exemplifies the adage "if the facts don't fit the theory, throw out the facts." An example is the Seinfeld episode in which George Costanza's girlfriend simply refuses to allow him to break up with her.
  • The Causation bias is the tendency to assume a cause-and-effect relationship in situations where none exists (or where there is merely a correlation or association). An example is believing someone is angry with you because they haven't responded to your email when, more likely, they are busy and just haven't gotten to it yet.
  • The Overconfidence effect involves unwarranted confidence in one's own knowledge. Research has shown that people who say they are "99% certain" are wrong 40% of the time. Examples include political and sports prognosticators.
  • The False Consensus effect is the penchant to believe that others agree with you more than they actually do. Examples include guys who assume that all other guys like sexist humor.
  • The Fundamental Attribution Error involves the tendency to attribute other people's behavior to their personalities and our own behavior to the situation. An example: when someone treats you poorly, you probably assume they are a jerk; but when you aren't nice to someone, it's because you're having a bad day.


See Also

Anchoring Bias
Confirmation Bias
Cognitive Computing
Dunning-Kruger Effect
Cognitive Analytics


References


Further Reading