
Content provided by Dr Kerry McInerney and Dr Eleanor Drage. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Dr Kerry McInerney and Dr Eleanor Drage or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://player.fm/legal.

Transhumanism and Existential Risk with Josh Schuster and Derek Woods

37:36
Manage episode 349604149 series 2933689

Ever worried that AI will wipe out humanity? Ever dreamed of merging with AI? These are the primary concerns of transhumanism and existential risk, movements you may not have heard of, but whose key followers include Elon Musk and Nick Bostrom, author of Superintelligence. Joshua Schuster and Derek Woods, however, point out serious problems with transhumanism's dreams and fears, including its privileging of human intelligence above all other species, its assumption that genocides matter less than mass extinction events, and its failure to think historically when speculating about the future. They argue that if we really want to make the world and its technologies less risky, we should instead encourage cooperation and participation in social and ecological issues.


88 episodes

