Existential risk
Existential risks are dangers that have the potential to destroy, or drastically restrict, human civilization. They are distinguished from other forms of risk both by their scope, affecting all of humanity, and by their severity, destroying or irreversibly crippling their target.
Natural disasters, such as supervolcanoes and asteroids, may pose existential risks if sufficiently powerful, though man-made events, such as catastrophic global warming, nuclear war, or bioterrorism, could also threaten the survival of intelligent life on Earth.
Despite the importance of existential risks, they are a difficult subject to study directly. Mankind has never been destroyed before, and while this does not mean that it will not be in the future, it does make modelling existential risks difficult, due in part to survivorship bias.
While individual threats, such as those posed by nuclear war or climate change, have been intensively studied on their own, the systematic study of existential risks did not begin until 2002.
Quotes
- Our approach to existential risks cannot be one of trial-and-error. There is no opportunity to learn from errors. The reactive approach — see what happens, limit damages, and learn from experience — is unworkable. Rather, we must take a proactive approach. This requires foresight to anticipate new types of threats and a willingness to take decisive preventive action and to bear the costs (moral and economic) of such actions.
- —Nick Bostrom
Chances of an existential catastrophe
Some risks, such as that from asteroid impact, with a 1-in-a-million chance of causing our extinction in the next century, have had their probabilities estimated with considerable precision (though later research suggested the actual rate of large impacts could be much higher than predicted). Similarly, the frequency of volcanic eruptions of sufficient magnitude to cause catastrophic climate change, like the Toba eruption, which almost caused the extinction of the human race, has been estimated at about one in every 50,000 years. The relative danger posed by other threats, however, is much more difficult to calculate. Although experts at the Global Catastrophic Risk Conference suggested a 19% chance of human extinction over the next century, there was considerable disagreement about the relative prominence of any particular risk.
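To relate these figures, the following Python sketch converts the per-year eruption frequency quoted above into a probability of at least one such event in the next century, under the simplifying assumption that each year is an independent trial:

```python
def prob_in_century(annual_rate: float, years: int = 100) -> float:
    """Probability of at least one event in `years` years, assuming independent years."""
    return 1 - (1 - annual_rate) ** years

# Toba-scale eruption: roughly one per 50,000 years (figure quoted above).
print(f"Supervolcanic eruption, next century: {prob_in_century(1 / 50_000):.2%}")  # ~0.20%

# Extinction-level asteroid impact: about 1-in-a-million per century (quoted above).
print(f"Asteroid-caused extinction, next century: {1 / 1_000_000:.4%}")  # 0.0001%
```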
There are significant methodological challenges in estimating these risks. Most attention has been given to risks to human civilization over the next 100 years, but forecasting over this length of time is very difficult. The types of threats posed by nature may prove relatively constant, though new risks could be discovered. Anthropogenic threats, however, are likely to change dramatically with the development of new technology; while volcanoes have been a threat throughout history, nuclear weapons have only been an issue since the 20th century. Historically, the ability of experts to predict the future over these timescales has proved very limited, though modern probabilistic forecasting methods, such as prediction markets, as well as more traditional approaches such as peer review, could increase the accuracy of prediction.
Man-made threats such as nuclear war or nanotechnology are even harder to predict, due to the inherent methodological difficulties of the social sciences. During the Cuban Missile Crisis, John F. Kennedy estimated that the chance of nuclear war was somewhere between one in three and one in two. Even so, it is generally hard to estimate the magnitude of the risk from this or other dangers, especially as both international relations and technology can change rapidly.
Existential risks pose unique challenges to prediction, even more than other long-term events, because of observation selection effects. Unlike with most events, the failure of catastrophic events to occur in the past is not evidence against their likelihood in the future: every world that has experienced such an event has no observers, so regardless of their frequency, no civilization observes existential risks in its history. These anthropic issues can be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or by directly evaluating the likely impact of new technology.
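As a rough illustration of that approach, the sketch below infers an impact rate from a crater count and surface age rather than from Earth's survival; the numbers are placeholders, not real lunar data:

```python
# Hypothetical crater statistics -- placeholders, not real lunar measurements.
n_craters = 40              # craters above some size threshold on a dated surface
surface_age_years = 3.0e9   # age of that surface in years

# The crater record is unaffected by whether observers survived, so a simple
# count-per-age estimate avoids the anthropic selection effect described above.
rate_per_year = n_craters / surface_age_years
p_next_century = 1 - (1 - rate_per_year) ** 100

print(f"Inferred impact rate: {rate_per_year:.2e} per year")
print(f"P(at least one such impact in the next century): {p_next_century:.2e}")
```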
Fermi paradox
Many extrasolar planets have been discovered, and there are likely to be many more Earth-like planets capable of supporting life. Given the relative rapidity with which life evolved on Earth, and the size of the observable universe, it seems a priori likely that intelligent life would have independently arisen on other planets. The absence of any sign of intelligent life beyond the Earth therefore forms an apparent paradox. Especially relevant is the absence of large-scale astro-engineering projects, which suggests that few civilizations survive to colonize space.
While a variety of explanations for the Fermi paradox exist, such as that the Earth may be part of a galactic zoo, one plausible explanation is that a Great Filter exists: an evolutionary step between the emergence of life on an Earth-like planet and the colonization of space that is exceedingly hard to take. If this filter is ahead of us – perhaps most civilizations destroy themselves in nuclear wars – then unless humanity is very unusual, it is likely to prevent us from colonizing space.
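A toy calculation, using a made-up planet count rather than an astronomical estimate, illustrates the shape of this inference: if no civilizations are visible, the per-planet probability of passing the filter must be very small.

```python
# Placeholder planet count -- not an astronomical estimate.
n_life_bearing_planets = 1e9

# If each such planet independently produces a visible, space-colonizing
# civilization with probability p, the expected number visible to us is
#     n_life_bearing_planets * p
# Observing none suggests p is roughly no larger than 1 / n_life_bearing_planets.
p_upper_bound = 1 / n_life_bearing_planets
print(f"Per-planet success rate consistent with zero observations: <~ {p_upper_bound:.0e}")
```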
Cognitive biases
Research into cognitive biases reveals a number of ways in which humans fall short of unbiased rationality, many of which affect the prediction of existential risks. For example, availability bias may make people underestimate the danger of existential risks, since no one has any experience of them. Equally, hindsight bias makes past events appear to have been more predictable than they actually were, leading to overconfidence in our ability to predict the future.
Conjunction bias occurs when people overestimate the likelihood of conjunctions; for example, judging an activist more likely to grow up into a feminist bank worker than into a bank worker. Equally, people underestimate the likelihood of disjunctions. The threat of existential risks is heavily disjunctive – nuclear war or climate change or bioterrorism or asteroids or solar flares or artificial intelligence – so people tend to underestimate its plausibility.
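A short sketch, using made-up per-century probabilities and assuming the risks are independent, shows why such a disjunction can be far more probable than any single term:

```python
# Illustrative only: the per-century probabilities below are placeholders,
# not estimates, and the risks are treated as independent for simplicity.
risks = {
    "nuclear war":            0.01,
    "runaway climate change": 0.01,
    "engineered pandemic":    0.01,
    "asteroid impact":        0.000001,
    "unaligned AI":           0.01,
}

p_none = 1.0
for p in risks.values():
    p_none *= 1 - p

p_any = 1 - p_none
print(f"Largest single risk:   {max(risks.values()):.2%}")
print(f"P(at least one risk):  {p_any:.2%}")  # noticeably larger than any single term
```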
There are many other biases that affect how likely people think existential disasters are, such as overconfidence and anchoring, and biases that affect whether or not people get involved, such as the bystander effect. A different type of bias is scope insensitivity. Rather than causing people to under- or over-estimate the likelihood of an existential disaster, scope insensitivity affects how bad people consider the extinction of the human race to be. While people may be motivated to donate money to alleviate a problem, the quantity they are willing to give does not scale linearly with the magnitude of the issue; for example, people are about as concerned about 200,000 birds getting stuck in oil as they are about 2,000, rather than a hundred times more concerned. Similarly, people are often more concerned about threats to individuals than to larger groups.
Potential importance of existential risk
Some scholars have strongly favored reducing existential risk on the grounds that it greatly benefits future generations. Derek Parfit argues that extinction would be a great loss because our descendants could potentially survive for a billion years before the increasing heat of the Sun makes the Earth uninhabitable. Bostrom argues that there is even greater potential in colonizing space: if our descendants colonize space, we may be able to support a very large number of people on other planets, potentially lasting for trillions of years. Reducing existential risk by even a small amount would therefore have a very significant impact on the expected number of people who will exist in the future.
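A minimal back-of-the-envelope version of this argument is sketched below; the number of future people and the size of the risk reduction are placeholders chosen purely for illustration.

```python
# All figures are placeholders, not estimates.
N = 1e16                 # hypothetical number of future people if humanity survives
p_extinction = 0.19      # the survey figure quoted earlier, used for illustration
delta = 0.001            # a 0.1 percentage-point reduction in extinction risk

expected_lives_before = (1 - p_extinction) * N
expected_lives_after = (1 - (p_extinction - delta)) * N

gain = expected_lives_after - expected_lives_before
print(f"Gain in expected future lives: {gain:.2e}")
# ~1e13 extra expected lives from a 0.1-point reduction, under these assumptions.
```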
Little has been written arguing against these positions, but some scholars would disagree. Exponential discounting might make these future benefits much less significant, and some philosophers doubt the value of ensuring the existence of future generations.
Some economists have also discussed the importance of existential risks, though most of the discussion goes under the name “catastrophic risk.” Martin Weitzman argues that most of the expected economic damage from climate change may come from the small chance that warming greatly exceeds the mid-range expectations, resulting in catastrophic damage. Richard Posner has argued that we are doing far too little, in general, about small, hard-to-estimate risks of large scale catastrophes.
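The following stylized calculation, with placeholder probabilities and damage figures rather than actual climate estimates, illustrates the point attributed to Weitzman that a thin catastrophic tail can dominate the expected loss:

```python
# Illustrative only: probabilities and damages are made-up placeholders.
scenarios = [
    # (probability, damage in trillions of dollars)
    (0.90, 5),      # mid-range warming, modest damage
    (0.09, 20),     # high warming
    (0.01, 2000),   # catastrophic tail
]

expected_damage = sum(p * d for p, d in scenarios)
tail_contribution = 0.01 * 2000

print(f"Expected damage: ${expected_damage:.1f} trillion")
print(f"Share from the 1% catastrophic tail: {tail_contribution / expected_damage:.0%}")
```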
See also
- Fermi paradox
- Outside Context Problem
- Doomsday argument
- Organizations formed to prevent or mitigate existential risks:
- Center for Responsible Nanotechnology — for safe, efficient nanotechnology
- Singularity Institute for Artificial Intelligence — for developing Friendly AI
- Foresight Institute — for safe nanotechnology and a society prepared to handle the consequences of such
- Center for Genetics and Society — for the relinquishment of genetic technologies which may irrevocably change the definition of "human"
- Svalbard Global Seed Vault — a doomsday seedbank to prevent important agricultural and wild plants from becoming rare or extinct in the event of a global disaster
- Future of Humanity Institute
The articles "Risks to civilization, humans and planet Earth
Risks to civilization, humans and planet Earth
Various existential risks could threaten humankind as a whole, have adverse consequences for the course of human civilization, or even cause the end of planet Earth.-Types of risks:...
" and "Human extinction
Human extinction
Human extinction is the end of the human species. Various scenarios have been discussed in science, popular culture, and religion . The scope of this article is existential risks. Humans are very widespread on the Earth, and live in communities which are capable of some kind of basic survival in...
" list a number of potential existential risks.
External links
- http://www.existentialrisk.com - A website about existential risk by Nick Bostrom.
- Articles and Essays
- Existential Risks: Analyzing Human Extinction Scenarios - The original essay by Nick Bostrom
- Astronomical Waste: The Opportunity Cost of Delayed Human Development – A paper by Nick Bostrom on the ethical importance of existential risk
- Cognitive biases potentially affecting judgment of global risks - A paper by Eliezer Yudkowsky discussing how various observed cognitive biases hamper our judgement of existential risk.
- Why the future doesn't need us, Wired.com, April 2000 - Bill Joy's influential call to relinquish dangerous technologies.
- Being present in the face of existential threat: The role of trait mindfulness in reducing defensive responses to mortality salience - http://medicine.journalfeeds.com/psychiatry/j-pers-soc-psychol/being-present-in-the-face-of-existential-threat-the-role-of-trait-mindfulness-in-reducing-defensive-responses-to-mortality-salience/20100728/