What is an existential risk?

An existential risk is a possible future event that could cause human extinction or permanently destroy our ability to thrive.1 Some existential risks come from nature. Large-scale volcanic eruptions, asteroid impacts, and gamma-ray bursts could have rendered humans extinct in the past, and they remain a danger today.2 Over the next several million years, it is very likely that extremely dangerous natural events will occur, and they could be devastating for humanity if we are unprepared. However, such events are unlikely to occur within any single century.3

Other risks are of our own making. Nuclear war could have killed all humans in the late 20th century. That risk has since been greatly reduced, even though several nations still possess enough weapons to cause a global catastrophe. Furthermore, new discoveries in areas such as biotechnology, nanotechnology, and machine intelligence may unlock technologies of immense power. These technologies will be used to do tremendous good, but their potential to do great harm cannot be ignored. Given the pace of innovation today, it is reasonable to expect that we will create at least one new existential risk during the 21st century.4 We live at a time when the greatest danger to humanity comes not from nature, but from our own actions.

It’s very hard to think clearly about existential risks.5 Unless we’re very careful, we’re likely to come to mistaken conclusions about the risks we face. But this is not something we can afford to do poorly.6 Given that the future of humanity is at stake, existential risks deserve to be studied with great care and intensity.

  1. Bostrom, N., 2002. Existential risks. Journal of Evolution and Technology 9.
  2. Ćirković, M.M., Sandberg, A., Bostrom, N., 2010. Anthropic shadow: Observation selection effects and human extinction risks. Risk Analysis 30, 1495–1506.
  3. Sandberg, A., Matheny, J.G., Ćirković, M.M., 2008. How can we reduce the risk of human extinction? Bulletin of the Atomic Scientists.
  4. Sandberg, A., Bostrom, N., 2008. Global catastrophic risks survey. Future of Humanity Institute, Oxford University.
  5. Yudkowsky, E., 2008. Cognitive biases potentially affecting judgment of global risks. Global Catastrophic Risks 1, 86.
  6. Bostrom, N., 2003. Astronomical waste: The opportunity cost of delayed technological development. Utilitas 15, 308–314.

Ben Harack

I'm an aspiring omnologist who is fascinated by humanity's potential.