What is Existential Risk?

Article Details
  • Written By: Michael Anissimov
  • Edited By: Niki Foster
  • Last Modified Date: 19 November 2016

An existential risk is a disaster so great that it either wipes out all of humanity or permanently cripples us. Such a disaster may be natural, or man-made and either intentional or accidental. An existential risk may have existed for millennia, may have emerged only in recent decades, or may still lie in our future. Examples of existential risk include large asteroid strikes, nuclear war, and rogue artificial intelligence.

The concept of existential risk was first articulated in its current form by Dr. Nick Bostrom, an Oxford philosopher. He uses a risk chart similar to the following to explain existential risks:

                          Intensity of risk
Scope of risk    Negligible      Manageable           Terminal
Global           El Niño         deforestation        existential risk
Local            thunderstorm    economic downturn    hurricane
Personal         papercut        sprained ankle       you are shot

Existential risks are global and terminal, or at least near-terminal. An extremely contagious virus with a 99.9% lethality rate to which no one is immune is an example: even the roughly 0.1% of humanity who survived (a few million people scattered across the planet) might never be able to rebuild civilization.

Bostrom points out that our minds and institutions are ill-equipped to think about existential risk, because we have never experienced one; if we had, we wouldn't be here to think about it. Like a child who doesn't know that a stove is hot until he touches it, we have little experience with catastrophes on this level. The bubonic plague of medieval Europe and the Spanish flu of World War I offer a taste of what an existential disaster would be like: each killed tens of millions of people, sometimes striking down the healthy within days of infection.

In his canonical paper on the topic, Bostrom lists about a dozen existential risks and categorizes them based on their severity and recoverability. Some of the most plausible ones are listed here:

  • genetically engineered viruses
  • nanotechnological arms races
  • catastrophic nuclear war
  • out-of-control self-replicating robotics
  • superintelligent AI indifferent to humans
  • physics disaster in a particle accelerator
  • supervolcano eruption that blocks out the sun

Because of the extreme severity and irreversibility of existential risk, possible countermeasures are worth brainstorming and implementing. Even if the chance of a given existential threat becoming a reality is small, the immense stakes involved demand a serious avoidance program. For human-originating threats, countermeasures include sophisticated observation-and-alert systems and regulation of certain technologies to ensure that they are not used for mass destruction. Countries suspected of possessing weapons of mass destruction are sometimes even invaded by other countries worried about the long-term consequences, as the Iraq War vividly demonstrated.
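
The logic here is essentially an expected-value calculation: a small probability multiplied by enormous stakes still yields an enormous expected loss. As a rough illustration (the numbers below are assumptions chosen for the example, not figures from Bostrom's paper), suppose a threat has a 0.1% chance of occurring in a given century and would cost roughly 7 billion lives:

    expected loss = probability × stakes
                  = 0.001 × 7,000,000,000
                  = 7,000,000 lives

In expectation, that is comparable to a catastrophe that kills seven million people with certainty, which is why even very unlikely existential threats can justify serious prevention efforts. Bostrom argues the true stakes are far larger still, since extinction also forfeits all future generations.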
