Week 4 - Our Final Century?

Existential Risks

The case for reducing existential risk

Importance/Scale

  • Humility

    • Suddenly

      • 1950s marked the start of a new age for humanity

        • For the first time in history, it became possible for a small number of decision-makers to wreak havoc on the whole world

      • Humanity had entered a new age, where we faced not only existential risks[2] from our natural environment, but also those of our own creation.

    • Unknown Unknowns

      • Risks we haven’t thought of yet

        • If you had asked people in 1900 what the greatest risks to civilisation were, they probably wouldn’t have suggested nuclear weapons, genetic engineering or artificial intelligence

  • Our first priority should be to survive.

    • So long as civilisation continues to exist, we’ll have the chance to solve all our other problems, and have a far better future. But if we go extinct, that’s it.

    • Improving technology? Helping the poor? Changing the political system?

  • Expected Value

    • Present

      • Population of the Earth in the middle of the century will be about 10 billion, so a 3% chance of everyone dying means the expected number of deaths is about 300 million (see the sketch below).

        • This is probably more deaths than we can expect over the next century due to the diseases of poverty, like malaria.24

    • Future

      • If civilisation ends, then we give up our entire future too.
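
A quick check of the expected-deaths arithmetic above, as a minimal sketch (the 3% chance and the 10 billion population are the figures quoted in this section):

```python
# Expected deaths = probability of the event × number of people affected.
population = 10_000_000_000  # ~10 billion people by mid-century (figure above)
p_extinction = 0.03          # 3% chance of everyone dying (figure above)

expected_deaths = p_extinction * population
print(f"{expected_deaths:,.0f}")  # 300,000,000 -> about 300 million deaths
```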

Probability of X-Risks

  • Near-Miss

    • During the Cuban Missile Crisis, a Soviet submarine carrying a nuclear torpedo was detected near Cuba by the American fleet.

    • The fleet began to drop dummy depth charges to force the submarine to surface. The Soviet captain, out of radio contact, thought they were real depth charges and that a third world war had started. He ordered a nuclear strike on the American fleet with the submarine’s nuclear torpedo.

    • Fortunately, the order needed the approval of the other senior officers on board. One of them, Vasili Arkhipov, refused, preventing a nuclear war.

  • Nuclear War

    • Nuclear Winter

    • The chance of a civilisation-ending nuclear war in the next century is probably over 0.3%. That would mean the risks from nuclear weapons are greater than all the natural risks put together (under 1 in 300, or about 0.33%, per century).

  • Climate Change

    • The most likely outcome is 2–4 degrees of warming,18 which would be bad, but survivable for our species.

    • However, some estimates give a 10% chance of warming over 6 degrees, and perhaps a 1% chance of warming of 9 degrees.

    • So, it seems like the chance of a massive climate disaster created by CO2 is perhaps similar to the chance of a nuclear war.

  • Engineered Pandemics

    • In 1918-1919, over 3% of the world’s population died of the Spanish Flu.19

    • If such a pandemic arose today, it might be even harder to contain due to rapid global transport.

    • It may become possible to genetically engineer a virus that’s as contagious as the Spanish Flu, but also deadlier, and which could spread for years undetected.

    • Easier to create than nuclear weapons

      • Nuclear weapons require huge factories and rare materials to make, which makes them relatively easy to control. Designer viruses might be possible to create in a lab by a couple of people with biology PhDs.

  • Artificial Intelligence

    • Powerful AI system could be used by one group to gain control of the world, or otherwise be misused

    • Unintended consequences

      • Since it would be difficult to predict what something smarter than us would do

    • Putting the estimates together: if there’s a 75% chance that high-level machine intelligence is developed in the next century, and roughly a 5% chance that such development leads to a major disaster, then the overall chance of a major AI disaster is 5% of 75%, which is about 4% (see the sketch at the end of this list).

  • Toby Ord: total existential risk this century at 1 in 6 (or about 17%)

    • Toby Ord estimated that if we sum all the natural risks together, they’re very unlikely to add up to more than a 1 in 300 chance of extinction per century.7

  • Will MacAskill: risk of extinction this century at around 1%
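
A quick check of the AI arithmetic above, as a minimal sketch (the 75% figure is quoted above; the 5% figure is the conditional chance of disaster that the calculation assumes):

```python
# Chance of a major AI disaster this century, per the estimate above:
# P(disaster) = P(high-level AI is developed) × P(major disaster | developed)
p_ai_developed = 0.75       # 75% chance of high-level machine intelligence
p_disaster_given_ai = 0.05  # 5% chance of a major disaster if it arrives

p_disaster = p_ai_developed * p_disaster_given_ai
print(f"{p_disaster:.2%}")  # 3.75% -> "about 4%"
```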

Neglect

  • Resources

    • Money, Workers, Political Attention, Scientific Study

  • Systematic Hindrance

    • pressures of the news and election cycle make it hard to focus on them

  • Reasons for Neglect

    • Prisoner’s dilemma

      • Not a responsibility of any single nation

      • Suppose the US invested heavily to prevent climate change. This benefits everyone in the world, but only about 5% of the world’s population lives in the US, so US citizens would only receive 5% of the benefits of this spending.

Solvability

  • Targeted Efforts

    • Engineered Pandemics

      • disease surveillance

    • Climate Change

      • developing better solar panels or introducing a carbon tax

    • AI

      • research into the “control problem” within computer science, to reduce the chance of unintended damage from powerful AI systems

    • Nuclear Security

      • Lower stockpiles

        • deterrence benefits of nuclear weapons could be maintained with far smaller stockpiles

        • reduce the risks of accidents, as well as the chance that a nuclear war, if it occurred, would end civilisation.

  • Broad Efforts

  • Research

  • Gradually reallocate resources from broad to targeted efforts

    • Since broad efforts help even if we’re not sure about the details of the risks, they’re more attractive the more uncertain you are. The closer we get to understanding the specific risks, the more we should focus on targeted efforts.

  • Cost-Effectiveness

    • cost per present life saved for the next one billion dollars invested in reducing existential risk would be under $100

    • save lives for only $1,000 per person

      • $100 billion spent on reducing existential threats could reduce the risk by over 1% over the next century

      • A one percentage point reduction in the risk would be expected to save about 100 million lives among the present generation (1% of about 10 billion people alive today). This would mean the investment would save lives for only $1,000 per person.27 (Arithmetic sketched below.)

    • If 10,000 talented young people focused their careers on these risks, they could achieve something like a 1% reduction in the risks

      • Each person would save 1,000 lives in their career
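
A minimal sketch of the cost-per-life arithmetic above, using only the figures quoted in this section (the $100-per-life and 10,000-careers estimates are not rederived here):

```python
# If $100 billion buys a one-percentage-point cut in existential risk,
# what does each life saved in the present generation cost?
spend = 100e9          # $100 billion (figure above)
population = 10e9      # ~10 billion people (figure above)
risk_reduction = 0.01  # one percentage point

lives_saved = risk_reduction * population  # 100,000,000 lives
cost_per_life = spend / lives_saved        # $1,000 per life
print(f"{lives_saved:,.0f} lives at ${cost_per_life:,.0f} each")
```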

Personal Reflection

  • Focusing solely on survival might be negative

    • Surviving 'negatively' might be worse than non-existence

Risks from Pandemics

Preventing Catastrophic Pandemics – 80,000 Hours

Importance

  • Some of the deadliest events in history have been pandemics

    • COVID-19 demonstrated that we’re still vulnerable to these events

    • The Plague of Justinian (541-542 CE)

      • killed around 6 million (about 3% of world population)4

      • contributed to reversing the territorial gains of the Byzantine empire.

    • The Black Death (1346-1353 CE)

      • killed 20–75 million people (about 10% of world population)

      • believed to have had profound impacts on the course of European history.

    • The Columbian Exchange (1500-1600 CE)

      • likely played a major role in the loss of around 80% of Mexico’s native population during the 16th century.

      • Some groups may have lost as much as 98% of their people to these diseases.5

    • The 1918 Influenza Pandemic (1918-1919 CE)

      • killed 50–100 million people (2.5%–5% of the world population).

  • Future Events could be even more lethal via Engineered Pandemics

    • Re-creation

      • Polio, the 1918 pandemic influenza strain, and most recently horsepox have all been recreated from scratch

    • Creation

      • Mirror Bacteria

        • “Living in an area contaminated with mirror bacteria could be similar to living with severe immunodeficiencies,” one of the scientists explained. “Any exposure to contaminated dust or soil could be fatal.”

    • Gain-of-function experiments

    • Information Hazards

      • And the scientific community has yet to establish strong enough norms to discourage and prevent the unrestricted sharing of dangerous findings, such as methods for making a virus deadlier

  • Risk

    • Natural pandemics are unlikely to be an existential threat

      • societies have recovered from pandemics that killed as much as 50% of the population, and perhaps more.1

      • None of the worst plagues we know about in history was enough to destabilise civilisation worldwide or clearly imperil our species’ future.

    • Greater than 1 in 10,000 chance of a biological catastrophe leading to human extinction within the next 100 years, and potentially as high as 1 in 100.

    • Catastrophic event (meaning an event in which 10% or more of the human population dies) by 2100: ~1–3%

    • Human extinction event: 1 in 50,000 to 1 in 100

    • Genetically engineered pathogen killing more than 1% of the population by 2100: 4–10%20

  • Source

    • Deliberate (Bad Actors)

      • What?

        • States or malicious actors with access to these pathogens could use them as offensive weapons

      • Examples

        • In 1763, British forces at Fort Pitt gave blankets from a smallpox ward to Native American tribes, aiming to spread the disease and weaken these communities. It’s unclear if this effort achieved its aims, though smallpox devastated many of these groups.

        • During World War II, the Japanese military’s Unit 731 conducted horrific human experiments and biological warfare in China. They used anthrax, cholera, and plague, killing thousands and potentially many more. The details of these events were only uncovered later.

        • In the 1980s and early 1990s, the South African government developed a covert chemical and biological warfare program known as Project Coast. The program aimed to develop biological and chemical agents targeted at specific ethnic groups and political opponents, including efforts to develop sterilisation and infertility drugs.

        • In 1984, followers of the Rajneesh movement contaminated salad bars in Oregon with Salmonella, causing more than 750 infections. It was an attempt to influence an upcoming election.

        • In 2001, shortly after the September 11 attacks, anthrax spores were mailed to several news outlets and two U.S. Senators, causing 22 infections and five deaths.

    • Accidents

      • What?

        • Dangerous pathogens engineered for research purposes could also be released accidentally through a failure of lab safety.

      • Examples

        • In 1977, an unusual flu strain emerged that disproportionately sickened young people and was found to be genetically frozen in time from a 1950 strain, suggesting a lab origin from a faulty vaccine trial.

        • In 1978, a lab leak at a UK facility resulted in the last smallpox death.

        • In 1979, an apparent bioweapons lab in the USSR accidentally released anthrax spores that drifted over a town, sickening residents and animals, and killing about 60 people. Though initially covered up, Russian President Boris Yeltsin later revealed it was an airborne release from a military lab accident.

        • In 2014, dozens of CDC workers were potentially exposed to live anthrax after samples meant to be inactivated were improperly killed and shipped to lower-level labs that didn’t always use proper protective equipment.

        • We don’t really know how often this kind of thing happens because lab leaks are not consistently tracked. And there have been many more close calls.

Neglect

Solvability

  • Fields

    • Industry

    • Research

    • Policy

  • Solutions

    • Personal Protective Equipment

      • For instance, it seems plausible that there may just be no viable way to design a virus that can penetrate sufficiently secure personal protective equipment or survive under far-UVC light.

  • Facilitators

    • We have better hygiene and sanitation than past eras, and these will likely continue to improve.

    • We can produce effective vaccinations and therapeutics.

    • We better understand disease transmission, infection, and effects on the body.

    • The human population is healthier overall.

  • Hindrances

    • Trade and air travel allow much faster and wider transmission of disease.8

    • Climate change may increase the likelihood of new zoonotic diseases.

    • Greater human population density may increase the likelihood that diseases will spread rapidly.

    • Much larger populations of domestic animals can potentially pass diseases on to humans.

Strategies for Improving Biosecurity

Concrete Biosecurity Projects (some of which could be big)

Early Detection Center

  • Pro

    • increases the amount of time we have to respond

      • (e.g. designing tailored countermeasures, using protective equipment, heading to bunkers, etc)

    • How?

      • small team of people collects samples from volunteer travelers around the world and then does a full metagenomic scan for anything that could be dangerous.[2]

Super PPE

  • Pro

    • ‘pathogen-agnostic’—we can stockpile it in advance of knowing what the threat is, in contrast to vaccines or many medical countermeasures

    • How?

      • produce a suit (and accompanying system) that is designed for severely immunocompromised people to lead relatively normal lives

Medical countermeasures

  • e.g. vaccines, antivirals, monoclonal antibodies

  • Pro

    1. producing targeted countermeasures against particularly concerning threats (or broad-spectrum countermeasures against a class of threats)

    2. creating rapid response platforms that are reliable even against deliberate adversaries.

  • Contra

    • tailored to existing pathogens (e.g. smallpox vaccines) and wouldn’t help against a novel threat

    • platforms that use viral vectors may accelerate viral engineering technology

BWC (Biological Weapons Convention) Strengthening

  • Pro

    • scouring open sources (i.e. publication records, job specs, equipment supply chains) could potentially make it difficult for a lab to get away with doing bad research, and thereby strengthen the treaty

    • Coordination

      • bilateral agreements and avoiding bureaucratic gridlock

Sterilization technology

  • Pro

    • broadly applicable, difficult to engineer around, and having little dual-use downside potential

      • relies on physical principles (e.g. ionizing radiation) or broadly antiseptic properties (e.g., hydrogen peroxide, bleach) rather than molecular details (e.g. gram-negative antibiotics)

    • How?

      • (e.g. UV light, materials science for antimicrobial surfaces, etc.)

Refuges

  • Pro

    • safeguard against catastrophic pandemics

    • e.g.

      • cycling teams of people in and out with extensive pathogen agnostic testing

      • adding a ‘civilization-reboot package’

      • capability to develop and deploy biological countermeasures from the protected space

    • some portion of the human population is always effectively in preemptive quarantine

Comparing it to AI

  • AI might be 10-100x more important than biosecurity

Expected value & Hits-based giving

Expected Value

Benefit

  • Expected value reasoning can help us answer questions when we're unsure of the answer

Expected value = Probability × Value

  • Combine Probability & Value

    • Low Probability × High Value

      • You should wear a seatbelt despite the fact that the odds of a crash are low for any given trip

  • Examples

    • Will you catch COVID if you go to the movies? Will you be more successful as an academic or a software engineer? Does taking a painkiller make you feel better?

    • You're a lifeguard at a busy beach when you notice a rare tide has resulted in 1,000 people being drawn out to sea.

      You only have time for one of two options:

      1. Run in and save 1 person chosen randomly, OR

      2. Call the Coast Guard, who have a 1% chance of saving everyone if they happen to already be in the area.

  • Option 1:

    100% × 1 life = 1 life saved

  • Option 2:

    1% × 1,000 lives = 10 lives saved
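
The same lifeguard comparison as a minimal Python sketch (an illustration of the expected-value formula above; the numbers are from the example, the code itself is not from the source):

```python
# Expected value = probability × value (here, value = lives saved).
def expected_value(probability: float, value: float) -> float:
    return probability * value

option_1 = expected_value(1.00, 1)     # rescue one person for certain
option_2 = expected_value(0.01, 1000)  # 1% chance the Coast Guard saves all

print(option_1, option_2)  # -> 1.0 10.0 expected lives saved
```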

Donating like a startup investor: Hits-based giving, explained

High Probability × Low Value

Low Probability × High Value ('hits-based giving')

  • e.g., investing in start-ups like Amazon or Google

  • Coefficient Giving (formerly Open Philanthropy) partially uses hits-based giving

  • Requirements

    • let go of the need for strong evidence

    • Trying not to "defer to expert opinion or conventional wisdom" is one way to ensure you're not accidentally missing promising opportunities

  • Benefit

    • most past cases of philanthropic “hits” were not evidence-backed in the sense of having strong evidence directly predicting success.

Example

  • Two charities, Alpha and Beta

    • Alpha has a 100% chance of helping 1,000 people

      • EV = 100% × 1,000 = 1,000

    • Beta has a 1% chance of helping 1,000,000 people

      • EV = 1% × 1,000,000 = 10,000

    • Donors that prefer certainty might donate to Alpha. But to others, Beta’s much higher expected value makes it a better donation opportunity (see the sketch below).
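
Applying the same formula to the two hypothetical charities (a minimal sketch; Alpha and Beta are the made-up examples above):

```python
# Rank donation options by expected number of people helped.
charities = {
    "Alpha": (1.00, 1_000),     # 100% chance of helping 1,000 people
    "Beta": (0.01, 1_000_000),  # 1% chance of helping 1,000,000 people
}

for name, (p, helped) in charities.items():
    print(name, p * helped)     # Alpha 1000.0 / Beta 10000.0

best = max(charities, key=lambda n: charities[n][0] * charities[n][1])
print(best)                     # Beta, on expected value alone
```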

Crucial considerations

Crucial consideration

What?

  • consideration that warrants a major reassessment of a cause or intervention

Benefit

  • Humility & Curiosity

    • The possibility of undiscovered crucial considerations raises a very serious challenge for any attempt to do good effectively on a large scale
