Content provided by Nonzero. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Nonzero or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

In Defense of AI Doomerism (Robert Wright & Liron Shapira)

1:17:34
 
This is a free preview of a paid episode. To hear more, visit nonzero.substack.com
0:24 Why this pod’s a little odd
2:50 Ilya Sutskever and Jan Leike quit OpenAI—part of a larger pattern?
10:20 Bob: AI doomers need Hollywood
16:26 Does an AI arms race spell doom for alignment?
20:40 Why the “Pause AI” movement matters
24:54 AI doomerism and Don’t Look Up: compare and contrast
27:23 How Liron (fore)sees AI doom
33:18 Are Sam Altman’s concerns about AI safety sincere?
39:46 Paperclip maximizing, evolution, and the AI will to power question
51:34 Are there real-world examples of AI going rogue?
1:07:12 Should we really align AI to human values?
1:15:27 Heading to Overtime

Robert Wright (Nonzero, The Evolution of God, Why Buddhism Is True) and Liron Shapira (Pause AI, Relationship Hero). Recorded May 06, 2024. Additional segment recorded May 15, 2024.

Twitter: https://twitter.com/NonzeroPods

738 episodes
