Heuristics and Biases: An Excellent Introduction

My friend Eliezer Yudkowsky, an extraordinary thinker from the Singularity Institute for Artificial Intelligence, gave me a copy of his paper "Cognitive Biases Potentially Affecting Judgment of Global Risks."

It is the single best discussion of heuristics and biases within the framework of cognitive psychology I’ve read. I highly recommend you print Elie’s paper and read it slowly to better understand the errors in reasoning we fallible humans seem to make all too frequently. It is rare to find a paper so accessible to the lay reader and yet so grounded in experimental research.

Here are the heuristics Elie discusses. I quote liberally.

1. Availability

Suppose you randomly sample a word of three or more letters from an English text. Is it more likely that the word starts with an R ("rope"), or that R is its third letter ("park")? Most people guess the former, because words beginning with R come to mind more easily, yet R actually appears more often in the third position.

We judge the frequency or probability of an event by the ease with which examples of the event come to mind.

2. Hindsight bias

The "I-knew-it-all-along" effect. We think an event is much more predictable after we know the eventual outcome.

3. Black Swans

"Sometimes most of the variance in a process comes from exceptionally rare, exceptionally huge events.

Do what you can to prepare for the unanticipated. History is biased in favor of observed "acts of heroism". History books do not account for heroic preventive measures."

4. The conjunction fallacy

This is one of the most interesting. In short, we often violate the conjunction rule of probability, which states that p(A & B) is less than or equal to p(A). Longer examples illustrate this well, but I’ll skip them for now. One shorter explanation has to do with the amount of detail in the example: "Adding additional detail onto a story must render the story less probable, yet human psychology seems to follow the rule that adding additional detail can make the story more plausible." In one experiment, one group of subjects was asked how much they’d be willing to pay for terrorism insurance covering the flight from Thailand to the US, a second group was asked about coverage for the round-trip flight, and a third group about the complete trip to Thailand. The average willingness to pay decreased even as the scope of coverage increased, because the broader descriptions contained less vivid detail.
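To see the conjunction rule in action, here is a quick sketch in Python (the probabilities are made up for illustration, not taken from Elie’s paper): no matter how likely the added detail B is, the estimated frequency of "A and B" can never exceed that of A alone.

```python
import random

random.seed(0)  # reproducible illustration

# Draw many random "scenarios" and compare how often A happens at all
# versus A happening together with an extra detail B.
trials = 100_000
count_a = 0
count_a_and_b = 0
for _ in range(trials):
    a = random.random() < 0.30   # some event A (30% chance, made-up number)
    b = random.random() < 0.50   # an added detail B (50% chance, made-up number)
    count_a += a
    count_a_and_b += (a and b)

print(f"estimated p(A)     = {count_a / trials:.3f}")       # about 0.30
print(f"estimated p(A & B) = {count_a_and_b / trials:.3f}")  # about 0.15, never above p(A)
```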

"More generally, people tend to overestimate conjunctive probabilities and underestimate disjunctive probabilities. That is, people tend to overestimate the probability that, e.g., seven events of 90% probability will all occur. Conversely, people tend to underestimate the probability that at least one of seven events of 10% probability will occur." And as Elie says, in the start-up world, funders need to consider the chances of many individual events all going right (customer demand, good employees, etc) as well as the possibility that at least one critical failure will occur (CEO  dies, bank refuses a key loan, etc).

5. Confirmation bias

People seek confirming but not falsifying evidence for a given belief. This is best exemplified in the "2-4-6" task, in which subjects are told that the triple 2-4-6 fits a hidden rule and must discover the rule by proposing triples of their own; most propose only triples that fit their current hypothesis, and so never learn that the real rule is broader.
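Here is a toy version of that task in Python (the guesses are my own illustration, not from the original experiment): the hidden rule is just "any ascending triple," so a subject who only tests triples fitting their pet "add two each time" theory hears nothing but yes and never finds out the theory is too narrow.

```python
# Hidden rule in the classic task: any strictly ascending triple.
def fits_hidden_rule(triple):
    a, b, c = triple
    return a < b < c

# A confirmation-biased subject only tests triples that fit their "+2" theory...
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
# ...while a falsification strategy deliberately tries to break the theory.
falsifying_tests = [(1, 2, 10), (3, 3, 3), (6, 4, 2)]

print([fits_hidden_rule(t) for t in confirming_tests])  # [True, True, True]
print([fits_hidden_rule(t) for t in falsifying_tests])  # [True, False, False]
```

Only the falsifying tests reveal anything: (1, 2, 10) also passes, which shows the "add two" hypothesis was never required.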

Elie cites a study that examined six biases among students exposed to political literature for and against gun control and affirmative action:

  1. Prior attitude effect. Students who feel strongly about an issue will evaluate supportive arguments more favorably than contrary arguments.
  2. Disconfirmation bias. Subjects will spend more time and cognitive resources denigrating contrary arguments than supportive arguments.
  3. Confirmation bias. Subjects free to choose their information sources will seek out supportive rather than contrary sources.
  4. Attitude polarization. Exposing subjects to an apparently balanced set of pro and con arguments will exaggerate their initial polarization.
  5. Attitude strength effect. Subjects voicing stronger attitudes will be more prone to the above biases.
  6. Sophistication effect. Politically knowledgeable subjects, because they possess greater ammunition with which to counter-argue incongruent facts and arguments, will be more prone to the above biases.

6. Anchoring, adjustment, and contamination

Subjects take initial, uninformative numbers as their starting point and then adjust up or down until they reach an answer that sounds plausible. Salary negotiations, I’ve learned, are an example: the side that offers a number first has an advantage because it anchors the discussion in a certain price ballpark.
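A toy model of that adjust-until-plausible process (the numbers and the adjustment fraction are purely illustrative):

```python
# Toy model of anchoring with insufficient adjustment (all numbers illustrative).
def anchored_estimate(anchor, unanchored_answer, adjustment=0.4):
    # The subject moves from the anchor toward the answer they would have
    # given anyway, but closes only a fraction of the gap before stopping.
    return anchor + adjustment * (unanchored_answer - anchor)

print(anchored_estimate(anchor=10, unanchored_answer=50))   # 26.0 -- dragged low
print(anchored_estimate(anchor=100, unanchored_answer=50))  # 80.0 -- dragged high
```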

Elie notes that almost any information could work its way into a cognitive judgment — even if it’s clearly irrelevant or absurd.

7. The affect heuristic

"Subjective impressions of "goodness" or "badness" can act as a heuristic, capable of producing fast perceptual judgments, and also systematic biases." An experimenter asked subjects whether an airport should upgrade its equipment. When the measure was described as "Saving 150 lives" it had a mean support of 10.4. When described as "Saving 98% of 150 lives" it had a mean support of 13.6. Even "Saving 85% of 150 lives" had higher support than simply "Saving 150 lives".

Wow! Elie explains: "Saving 150 lives sounds diffusely good and is therefore only weakly evaluable, while saving 98% of something is clearly very good because it is so close to the upper bound on the percentage scale."
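The arithmetic makes the reversal even starker: the percentage framings describe fewer lives saved, yet they drew more support. A trivial check:

```python
# The three framings from the airport question, converted to absolute lives saved.
framings = {
    "Saving 150 lives": 150,
    "Saving 98% of 150 lives": 0.98 * 150,  # 147.0 lives
    "Saving 85% of 150 lives": 0.85 * 150,  # 127.5 lives
}
for description, lives in framings.items():
    print(f"{description!r}: {lives} lives saved")
```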

8. Scope neglect

We would pay about the same amount of money to save 2,000 birds drowning in oil as we would to save 20,000 or 200,000. The most widely accepted explanation for scope neglect is that of Kahneman et al., who say that we construct an image of a single exhausted bird, soaked in black oil and unable to escape, and assign a value to that one image (even though there may be thousands more). Two other hypotheses are purchase of moral satisfaction, which suggests that people spend enough money to create a "warm glow" in themselves, so the dollar amount has more to do with a person’s psychology than with the birds, and good cause dump, "which says that people have some amount of money they are willing to pay for 'the environment' and any question about environmental goods elicits this amount."

9. Calibration and overconfidence

Our best-case and expected-case scenarios are often indistinguishable.

10. Bystander apathy

This is more social psychology than heuristics and biases. The famous studies here suggest that people in large groups are less likely to act in emergencies. In other words, the more people standing around during an emergency, the less likely it is that any single person, or the group as a whole, will do anything about it. Being in a group diffuses individual responsibility.

Elie ends his paper with a final caution: "If you believe someone is guilty of a psychological error, then demonstrate your competence by first demolishing their consequential factual errors. If there are no factual errors, then what matters the psychology?…Do not lose track of the real-world facts of primary interest; do not let the argument become about psychology."

3 comments on “Heuristics and Biases: An Excellent Introduction”
  • Now everybody knows why doctoral theses focus on narrowly defined questions. It enables us to avoid all the abovementioned minefields to the best of our abilities during data collection and analysis, and not least, writing up. 😎

  • Max Bazerman has a fantastic book on biases in decision-making that covers these topics in depth. I’m borrowing Ramit’s copy, but you can borrow it from him whenever; I’m finished with it.

  • I’m currently reading “Fooled by Randomness,” which is another excellent (and accessible) work that points out most of the fallacies you mention above. What are the chances that someone would be halfway through a book on human fallibility when it comes to judging random events only to stumble onto a post like this? (Hint: it’s a lot higher than you think 🙂)
