5 Future Risks

In Week 5 we cover biosecurity, AI risk, dystopian pathways, and other potentially overlooked risks.

Scale
  • What types of future existential risks are there?

    • Which specific variables decrease or increase x-risk?

    • How could those future x-risks cause an existential catastrophe? What would need to happen?

    • What kind of existential catastrophe could this cause?

    • How could this make us more vulnerable to future x-risks, i.e., act as a risk factor?

Universal

  • What do you think the relative risks of extinction are from groups trying to cause massive harm versus accidents?

  • Are we overlooking any existential risks not yet mentioned in our discussions?

Solvability
  • What do you think could be cost-effective interventions to reduce the likelihood of an existential catastrophe?

  • How has 'X' improved or worsened our capacity to deal with natural x-risks?

    • X e.g., globalization, technology, government forms, etc.

    • What does this imply for strategies to deal with x-risks?

    • Is technological advancement the cause of ever-increasing existential risk?

      • What does this mean for our relationship to future technological advancement?

  • What can we learn from dealing with natural or human-made x-risks that applies to future x-risks?

  • Exercise - Team Developing an Unknown Powerful Technology

  • Exercise - Global Conference

Universal

  • Assuming that (a) we are overlooking an existential risk, or (b) we cannot prevent a known existential risk, what can we do to minimize its negative impacts anyway?

  • Will the risk of existential catastrophe ever decrease, or will we always be under its threat?

  • How should we distribute our resources between present-day problems and existential risks?

    • Should we invest most of our resources in existential risk reduction because of its high expected value? (EV = low probability × huge value)
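The expected-value argument above can be made concrete with a toy calculation. All numbers below are hypothetical, chosen only to illustrate why a tiny probability multiplied by an enormous value can dominate a near-certain but modest benefit:

```python
def expected_value(probability: float, value: float) -> float:
    """EV = probability of success x value if successful."""
    return probability * value

# Hypothetical x-risk intervention: a one-in-a-million chance of
# averting a catastrophe, valued here as 10 billion future lives.
ev_xrisk = expected_value(1e-6, 10_000_000_000)

# Hypothetical present-day intervention: a near-certain benefit
# to 1,000 people.
ev_present = expected_value(0.99, 1_000)

print(ev_xrisk)    # 10000.0
print(ev_present)  # 990.0
```

Under these (entirely illustrative) numbers, the low-probability intervention has roughly ten times the expected value, which is the intuition the question asks you to interrogate.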

Taking Ideas Seriously
  • What do you think the world will look like in 50 years, given all these future existential risks?

  • How do you feel, knowing about these potential x-risks?

Official Discussion Guide

Discussion 1

  • What do you think the relative risks of extinction are from groups trying to cause massive harm versus accidents?

  • Soil samples from Mars are to be processed in a facility with a “one in a million” chance of letting an unsterilized particle escape (Ord, p. 160). However, BSL-4 labs have a long history of containment failures, and likely perform worse than their protocols aim for. In general, when humanity makes an effort to reduce a risk to “one in a million”, what do you think the risk is actually reduced to?

Discussion 2

  • What do you think can be done to solve the unilateralist’s curse, if anything?

  • Ord references a survey on page 141 of the top 300 researchers in ML, which found a 50% chance of AGI by 2061 and a 10% chance by 2025. What are some issues with this survey? What lessons should we take from it?

  • Of the possible existential catastrophes, which do you think would be worst, and why?
