6 The Risk Landscape
In this chapter, we quantify the risks covered so far. We discuss existential risk factors, such as a great power war, which raise overall risk without being existential catastrophes themselves, and existential security factors, which decrease risk without directly targeting any single hazard.
Exercise - Prioritizing between x-risks
Choose an x-risk and apply the variables of prioritization below
Which one should we prioritize and why?
How are these prioritization-tools helpful?
Do they complement or influence each other?
Variables of Prioritization
Importance, Tractability, Neglectedness (ITN); see the worked sketch after this list
Soon, Sudden, Sharp
Soon vs. late; sudden vs. slow; sharp vs. with warning
Stage - Origin, Scaling, Endgame
Origin - How does the catastrophe get started?
Scaling - How does the catastrophe reach a global scale?
Endgame - How does the catastrophe finish the job?
X-Factors - Risk vs. Security
Overlap
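Since ITN is essentially a multiplicative scoring rule, a toy calculation can make it concrete. The sketch below is illustrative only: the risk names and 1-10 scores are hypothetical placeholders, not Ord's estimates.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    importance: float    # scale of what is at stake
    tractability: float  # how much progress marginal work buys
    neglectedness: float # how few resources are already devoted to it

    def itn_score(self) -> float:
        # The three factors multiply: a zero on any one of them
        # drives the marginal value of extra work to zero.
        return self.importance * self.tractability * self.neglectedness

# Hypothetical 1-10 placeholder scores, purely for illustration.
risks = [
    Risk("Unaligned AI", importance=9, tractability=4, neglectedness=8),
    Risk("Engineered pandemics", importance=8, tractability=5, neglectedness=6),
    Risk("Great power war", importance=7, tractability=3, neglectedness=4),
]

for risk in sorted(risks, key=lambda r: r.itn_score(), reverse=True):
    print(f"{risk.name}: {risk.itn_score():.0f}")
```

The multiplicative structure is what makes ITN useful for prioritization: it favors risks that score reasonably on all three axes over risks that score extremely on only one.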
What are some examples of risk or security factors? Put another way, how can we become more or less vulnerable to existential risk?
Exercise
Create a causal chain of the risk or security factors you consider most important, showing how the chain leads to (or guards against) x-risk; see the sketch after the examples below.
Example
Factors, e.g., great power war, global economic stagnation, environmental collapse, breakdown in international order
Risk factor, e.g., capitalism intensifies the race for power -> neglect of AI safety -> x-risk
Security factor, e.g., global regulation requiring safe technology -> improved AI safety -> reduced x-risk
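To make the exercise more hands-on, a causal chain can be represented as an ordered list of factors, each labelled as raising or lowering risk. A minimal sketch, using hypothetical factor names drawn from the examples above:

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class Factor:
    name: str
    kind: Literal["risk", "security"]  # raises or lowers x-risk

def show_chain(chain: list[Factor]) -> None:
    # Render the chain with '+' for risk factors and '-' for security factors.
    print(" -> ".join(f"{f.name} [{'+' if f.kind == 'risk' else '-'}]" for f in chain))

# Risk-factor chain from the example above.
show_chain([
    Factor("Race for power", "risk"),
    Factor("Neglect of AI safety", "risk"),
    Factor("Existential risk", "risk"),
])

# Security-factor chain from the example above.
show_chain([
    Factor("Global regulation of risky technology", "security"),
    Factor("Improved AI safety", "security"),
])
```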
Are there fundamental factors contributing to several security and risk factors?
Could it be more cost-effective to invest in security/risk factors rather than in the x-risks themselves?
What do the ideas imply, if taken seriously?
What are effective ways to prevent the harm from x-risks?
Targeted vs Fundamental
Would you rather invest resources in targeting the immediate causes of existential risk or the root causes?
Targeted vs Portfolio
Should one invest across a broad portfolio of risks or concentrate on specific ones?
Part 1
Ord estimates the chance of achieving a future close to the best possible one at about 50%. What do people think of this?
Ord writes that the strongest case for broad interventions is for those that are especially neglected. What broad interventions are highly neglected?
What are other concrete risk/security factors? (Ord gives: great power war, global economic stagnation, environmental collapse, breakdown in international order)
Part 2
General discussion on the Soon-Sudden-Sharp framework (how is it useful, where is it useful, etc.)
Given how saturated education already is, how much effort should go into awareness-raising, education, and meta-work to get more people working on these issues, compared to direct work?
Ord mentions wanting to increase the amount of funding going towards x-risk reduction by 100x. What seem to be the biggest bottlenecks? How do the bottlenecks interact with each other?
How much should comparative advantage and personal fit factor into career decisions, given the disparate likelihoods of each risk (and the levels of uncertainty around those numbers)?