8 Our Potential
Our potential covers ways in which we might excel as humans. It discusses the time scales we could endure, the potential of space colonization, and what we might value.
Choice
'What we do with our future is up to us. Our choices determine whether we live or die; fulfill our potential or squander our chance at greatness. We are not hostages to fortune. While each of our lives may be tossed about by external forces—a sudden illness, or outbreak of war—humanity’s future is almost entirely within humanity’s control.'
What is your view on the above position of 'choice'?
Utopia - How good could a 'best' world be?
Evolution of our Species
Should we transform, evolve or even replace ourselves?
Populating the Universe
Should we settle beyond planet Earth?
Should we populate the universe with as many happy lives as possible?
What needs to happen to increase the probability of this becoming reality?
Do we need technology to fulfill this vision?
What is it all about? What should we focus on? What is the purpose of life?
Reality
How do you think humanity will actually develop?
'A key insight motivating existential security is that there appear to be no major obstacles to humanity lasting an extremely long time, if only that were a key global priority.'
How should we frame x-risk so that it encourages us and society to take it seriously? How do we market x-risk?
What is your motivation for working on x-risk, if any?
How can existing attitudes towards, e.g., climate change and social justice be leveraged for x-risk?
What do you consider good strategies for reducing x-risk?
What did you gain from your readings and reflections that you consider most helpful?
Discussion 1
The biggest takeaway from the book
Think of at least 3 non-EA/longtermist friends who might find this book interesting, and recommend it to them.
Of the people who are convinced to work on x-risk reduction, how many are primarily motivated by our future potential (as opposed to thinking extinction would be really bad and wanting to prevent that)?
Similarly, in getting more people involved, which approach seems more persuasive/compelling: “We might all die, wouldn’t that suck?” vs. “Think about how incredible the future could be if we survive the next few centuries.”
Discussion 2
Is accelerating space colonization a potential solution to addressing risks from transformative technologies on Earth?
Where do we go from here? How do we make good things happen? How do we save the world?
What role should/will social media play? What about other large-scale actions? I’m interested in what’s happening currently and how EA/longtermists should/shouldn’t try similar forms of awareness-raising (e.g. protests/marches for future generations)
How can/should existing attitudes towards future generations, climate change, and social justice issues more broadly be leveraged to improve x-risk reduction work?
Same question but about cool new technologies like AI, synthetic bio, etc.
When you imagine the long-term future (>150 years from now), what do you imagine? I’m interested in swapping specific images, stories, etc.
On p. 234, Ord says that a Long Reflection lasting a time that is short relative to cosmic time scales is well worth it. Is that the right time scale to compare against, or is the future reproductive rate of the human race the relevant one? Every year we wait is another year spent at Earth’s population carrying capacity rather than growing exponentially, which exponentially decreases the total potential population.
To put some math to this: if there is no carrying capacity for humans once we take to the stars, we could expect exponential growth of the human population of the form population(t) = A * exp(t/tau) until the death of the Universe at time T. The total population that will ever have lived is then int_0^T A * exp(t/tau) dt = A*tau*(exp(T/tau) - 1). So if we shift T -> T - 5*tau, we’ve eliminated >99% of the total potential population, since exp(-5) ≈ 0.007.
On the other hand, exp(100) is already about the volume of the reachable Universe in units of AU^3, so we are *not* going to grow exponentially for very many “tau” times. This suggests that pushing up against fundamental physical limits will matter more than getting a head start on exponential growth.
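The arithmetic in the last two paragraphs is easy to sanity-check. Below is a minimal Python sketch (not from the book; the ~16.5 billion light-year comoving radius assumed for the reachable Universe is a rough round figure) verifying both the exp(-5) loss fraction and the roughly 100 e-foldings needed to fill the reachable volume.

```python
import math

# Fraction of the total potential population lost by starting exponential
# growth 5*tau later: total(T) = A*tau*(exp(T/tau) - 1), so for T >> tau
# the ratio total(T - 5*tau) / total(T) is approximately exp(-5).
remaining = math.exp(-5)
print(f"fraction remaining after a 5*tau delay: {remaining:.4f}")   # ~0.0067
print(f"fraction eliminated: {1 - remaining:.2%}")                  # ~99.33%

# Rough check of the exp(100) claim: volume of the reachable Universe
# in AU^3, assuming a ~16.5 Gly comoving radius (a rough round figure).
AU_PER_LY = 63_241                       # astronomical units per light-year
radius_au = 16.5e9 * AU_PER_LY           # reachable radius in AU
volume_au3 = (4 / 3) * math.pi * radius_au ** 3
print(f"reachable volume: ~{volume_au3:.1e} AU^3")                  # ~4.8e45
print(f"e-foldings to fill it: ln(V) = {math.log(volume_au3):.0f}") # ~105
```

With ln(V) coming out around 105, exponential growth would saturate after roughly a hundred e-folding times, which is consistent with the point above that physical limits, not a head start, dominate the total.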