Know yourself: Cognitive biases undermining the study of existential risks

Humans tend to make mistakes when thinking about existential risks. Part of the problem is that risk assessment is very difficult when risks are complex and far in the future. However, even if we acknowledge that difficulty and begin to diligently work towards averting risks, we still have to learn to think clearly.

This turns out to be another significant challenge. Humans tend to think in particular ways, and some of those tendencies make it very likely that we will make serious mistakes when we confront topics as challenging as existential risks. Over the last few decades, psychological research has uncovered a vast array of heuristics and biases. Some of these are particularly relevant to the study of existential risks:1

  • The availability heuristic means that if we can easily or quickly remember ideas or examples, then we tend to think that they are important and/or likely, regardless of their true importance or likelihood.
  • Humans deem Black Swan events (events that are surprising and have a major impact) to be far less likely and less impactful than they really are.
  • The conjunction fallacy causes humans to consider more detailed stories to be more likely, even though logic dictates that each additional detail makes a story less likely (see the short note after this list).
  • The confirmation bias means that humans search for (and are more likely to believe) evidence that confirms their existing beliefs rather than evidence that refutes them.
  • Anchoring makes people likely to keep their predictions very close to the first guess that enters their mind even if the source of that guess has nothing to do with the subject at hand (or is even explicitly fictional).
  • The affect heuristic means that people let their overall feeling about a subject color all of their judgments about it. A technology that poses great dangers is therefore assumed to be not very useful, and a useful technology is assumed to be not very risky. In reality, the riskiest technologies can also be among the most useful, and vice versa.
  • Scope neglect is when the scale of a problem (or solution) is not taken into account when assessing how bad (or good) it is. Famously, people report being willing to pay roughly the same amount to save 2,000 migrating birds as to save 200,000. Similar problems arise when people try to weigh the moral importance of one human death against the deaths of a billion.
  • The overconfidence effect is the well-documented tendency of humans to be far more confident in their knowledge and guesses than their actual accuracy warrants.
  • The bystander effect is the tendency of people to be less likely to act in an emergency when others are present. The larger the group of people who witness an emergency, the less likely it is that anyone acts. Existential risks have the largest possible group of human bystanders. The bystander effect means that we should expect very few people to naturally end up acting to avert these risks, regardless of their likelihood.
  • The need for closure is the human tendency to search for certainty and to avoid uncertainty and ambiguity. We have to overcome our discomfort with uncertainty if we want to think effectively about highly uncertain subjects like existential risks.
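
A short note on the conjunction fallacy mentioned above (this illustration is ours, not part of Yudkowsky's list): the underlying logic is a basic rule of probability. For any two claims A and B,

    P(A and B) ≤ P(A)    and    P(A and B) ≤ P(B)

A story that requires both A and B to be true can never be more probable than one that requires only A. Every extra detail is an extra conjunct, so each one can only lower (or at best leave unchanged) the probability of the whole story, even though vivid details usually make it feel more plausible.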

Yudkowsky also calls attention to another problematic mode of thinking with regard to existential risks. The authors of this post have confronted this thought pattern themselves and corroborate Yudkowsky’s description of it:

In addition to standard biases, I have personally observed what look like harmful modes of thinking specific to existential risks. The Spanish flu of 1918 killed 25-50 million people. World War II killed 60 million people. 10⁸ is the order of the largest catastrophes in humanity’s written history. Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking—enter into a “separate magisterium.” People who would never dream of hurting a child hear of an existential risk, and say, “Well, maybe the human species doesn’t really deserve to survive.”

Knowing about these problems in human thinking does not make you immune to them. In fact, a common finding in the psychological literature is that even when participants are taught how people tend to make mistakes on a task, they still make large mistakes. Similarly, added incentives (like monetary rewards) often don’t improve performance.

Knowledge alone isn’t enough. Motivation alone isn’t enough. Effective thinking requires dedicated practice and consistent external feedback. David Brin may have said it best: “Criticism is the only known antidote to error.”

Each of us must overcome these challenges within ourselves. Resist the urge to use these terms as weapons against the arguments of others. Criticize yourself most of all. Be an example of selfless and fearless thinking so that others may follow the trail you blaze.

Footnotes
  1. Yudkowsky, E. (2008). Cognitive biases potentially affecting judgment of global risks. In Bostrom, N. & Ćirković, M. M. (Eds.), Global Catastrophic Risks. Oxford University Press.

