A.I. News: Open Letter To Pause, Existential Risks to Humanity, Its Supporters & Deniers (S5, E8)

46:04
Content provided by Jen Gaita Siciliano. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Jen Gaita Siciliano or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

In the episode "A.I. News: Open Letter To Pause, Existential Risks to Humanity, Its Supporters & Deniers (S5, E8)," I discuss the open letter "Pause Giant AI Experiments," recently published by the Future of Life Institute, and present the arguments for taking its risk analysis more seriously. With signatories including Elon Musk, Emad Mostaque, Steve Wozniak, Max Tegmark, Tristan Harris, and Aza Raskin, I share some of their positions, drawing on recent articles, podcasts, and videos that discuss the dilemma.
#deeplearning #AIrevolution #humanextinction #generativeAI #blackbox #dontlookup #LivBoeree #DanielSchmachtenberger #TristanHarris #AzaRaskin #MaxTegmark #EliezerYudkowsky #CenterForHumaneTechnology #LexFridman #Moloch #machinelearning #AGI
References:
Pause Giant AI Experiments: An Open Letter
https://futureoflife.org/open-letter/pause-giant-ai-experiments/
The A.I. Dilemma - March 9, 2023 by Center for Humane Technology
https://youtu.be/xoVJKj8lcNQ
Meditations on Moloch by Scott Alexander
https://slatestarcodex.com/2014/07/30/meditations-on-moloch/
Misalignment, AI & Moloch | Daniel Schmachtenberger and Liv Boeree
https://youtu.be/KCSsKV5F4xc
Pausing AI Developments Isn't Enough. We Need to Shut it All Down
https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/
Live: Eliezer Yudkowsky - Is Artificial General Intelligence too Dangerous to Build?
https://www.youtube.com/live/3_YX6AgxxYw?feature=share
The 'Don't Look Up' Thinking That Could Doom Us With AI
https://time.com/6273743/thinking-that-could-doom-us-with-ai/
Max Tegmark: The Case for Halting AI Development | Lex Fridman Podcast #371
https://youtu.be/VcVfceTsD0A
Please visit my website at: www.jengaitasiciliano.com
Don't forget to subscribe to the Not As Crazy As You Think YouTube channel @SicilianoJen
Connect:
Instagram: @jengaita
LinkedIn: @jensiciliano
Twitter: @jsiciliano