Existential risk

[Image: Futurist humour]

An existential risk or existential threat is a potential development that could drastically (or even totally) reduce the capabilities of humankind.

Should I worry?

According to the Global Challenges Foundation, a typical person could be five times more likely to die in a mass extinction event than in a car crash.[1] The Global Priorities Project later issued a retraction of the statement.[2]
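
As a rough illustration of how such a comparison works (a minimal sketch with assumed inputs, not the Foundation's actual model): take a hypothetical 0.1% annual probability of a mass extinction event, a roughly 1-in-8,700 annual probability of dying in a car crash, and an 80-year lifetime, then compare the chance of each occurring at least once:

    # Illustrative arithmetic only; every input below is an assumption,
    # not a figure from the Global Challenges Foundation report.
    p_extinction_per_year = 0.001    # hypothetical 0.1% annual chance of a mass extinction event
    p_car_crash_per_year = 1 / 8700  # rough annual probability of dying in a car crash
    years = 80                       # approximate lifetime

    # Probability of at least one occurrence over a lifetime:
    # 1 minus the probability it never happens in any of the years.
    lifetime_extinction = 1 - (1 - p_extinction_per_year) ** years
    lifetime_car_crash = 1 - (1 - p_car_crash_per_year) ** years

    print(f"Lifetime extinction-event probability: {lifetime_extinction:.1%}")  # ~7.7%
    print(f"Lifetime car-crash probability: {lifetime_car_crash:.1%}")          # ~0.9%
    print(f"Ratio: {lifetime_extinction / lifetime_car_crash:.1f}x")            # ~8.4x

With these made-up inputs the ratio lands in the same ballpark as the quoted figure; the later retraction is a reminder that such estimates are extremely sensitive to the assumed annual probabilities.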

Dystopia and Risk map

Map created by Deku-shrub from the data below.

Key risks and scenarios

The following list of risks encompasses long-term dystopian scenarios in addition to traditionally defined existential risks.

Exotic

Conflict

  • Nuclear war
    • Examples: Anything from a single accidental detonation or a rogue state up to the collapse of MAD and multi-national warfare
    • Causing: Accelerating warfare or environmental collapse
  • Biological warfare or a catastrophic natural supervirus
    • Examples: Nation-state attacks through to a 28 Days Later-style ultra-transmissible and deadly plague
    • Causing: Destroying humanity, accelerating warfare, natural disasters or environmental collapse
  • Cyberwarfare, terrorism, rogue states
    • Examples: Contemporary nation-state hacking, WannaCry[4] type events, terrorist use of social media, terrorist attacks, contained biological or nuclear attacks
    • Causing: Accelerating warfare due to national existential risks[5], social collapse[6] or reactionary governments[7]

Environmental

  • Accelerating climate change
    • Examples: This is already happening :(
    • Causing: Further natural disasters, depletion of the world's resources or driving dangerous bio- or geo-engineering projects
  • Errant geo-engineering or GMOs
    • Examples: Solar radiation management, carbon dioxide removal projects
    • Causing: Depletion of resources, further environmental disasters, nano-pollution or environmental collapse
  • Natural or cosmic disasters
    • Examples: Asteroid strike, deadly cosmic radiation, ice shelf collapse, bee extinction, crop plagues, an ice age. Could even include the Sun's eventual red-giant death, proton decay and the final heat death of the universe.
    • Causing: Resource depletion, social or environmental collapse
  • Resource depletion
    • Examples: Fossil fuels, clean water, severe air-quality degradation, rare earths, localised overpopulation
    • Causing: Social or environmental collapse
  • Nanotechnology
    • Examples: Gray goo
    • Causing: Destruction of humanity, natural disasters or environmental collapse

Future computing and intelligence

Social

Attitudes

Stances that can be adopted towards existential risks include:

  • Inevitabilism - the view that, for example, "the victory of transhumanism is inevitable"
  • Precautionary - emphasising the downsides of action in areas of relative ignorance
  • Proactionary - emphasising the dangers of inaction in areas of relative ignorance
  • Techno-optimism - the view that technological development is likely to find good solutions to all existential risks
  • Techno-progressive - the view that existential risks must be actively studied and managed

Things that are not existential risks

  • Declining sperm counts in men, whilst problematic for fertility, will not spell the end of humanity[10]
  • Increasingly liberal sexual attitudes will not lead to a Sodom and Gomorrah[11] scenario
  • The coming of a religious apocalypse or end times is unlikely
  • Sex robots, no matter what Futurama says[12]
  • Overpopulation caused by life extension will not lead to Soylent Green-style scenarios, but could cause localised instability and resource depletion[13]

See also

External links

References