Talking Politics Guide to ... Existential Risk


David talks to Martin Rees about how we should evaluate the greatest threats facing the human species in the twenty-first century. Does the biggest danger come from bio-terror or bio-error, climate change, nuclear war or AI? And what prospects does space travel provide for a post-human future?

Talking Points:

Existential risk is risk that cascades globally, causing a severe setback to civilization. We are now so interconnected and so empowered as a species that humans could be responsible for this kind of destruction.

  • There are natural existential risks too, such as asteroids. But what is concerning about the present moment is that humans have the ability to affect the entire biosphere.
  • This is a story about technology, but it’s also about global population growth and the depletion of resources.

There are four categories of existential risk: climate change, bio-terror/bio-error, nuclear weapons, and AI/new technology.

  • Climate change has a long tail, meaning that the risk of total catastrophe is non-negligible.
  • Bio-terror/bio-error is becoming more of a risk as technology advances. It’s hard to predict the consequences of the misuse of biotech, and our social order is more vulnerable than it used to be: overwhelmed hospitals could lead to a societal breakdown.
  • Machine learning has not yet reached the level of existential risk. Real stupidity, not artificial intelligence, will remain our chief concern in the coming decades. Still, AI could make certain kinds of cyber-attacks much worse.
  • The nuclear risk has changed since the Cold War. Today there is a greater risk that nuclear weapons go off in a particular region, although global catastrophe is less likely.

These threats are human-made. Solving them is also our responsibility.

  • We can’t all move to Mars. Earth problems have to be dealt with here.
  • There are downsides to tech, but we will also need it. Martin describes himself as a technical optimist but a political pessimist.

Further Learning:

And as ever, recommended reading curated by our friends at the LRB can be found here: lrb.co.uk/talking

