
“Updates on the EA catastrophic risk landscape” by Benjamin_Todd

4:15
 
Content provided by EA Forum Team. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by EA Forum Team or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://player.fm/legal.

Around the end of Feb 2024 I attended the Summit on Existential Risk and EAG: Bay Area (GCRs), during which I did 25+ one-on-ones about the needs and gaps in the EA-adjacent catastrophic risk landscape, and how they’ve changed.

The meetings were mostly with senior managers or researchers in the field who I think are worth listening to (unfortunately I can’t share names). Below is how I’d summarise the main themes in what was said.

If you have different impressions of the landscape, I’d be keen to hear them.

  • There's been a big increase in the number of people working on AI safety, partly driven by a reallocation of effort (e.g. Rethink Priorities starting an AI policy think tank); and partly driven by new people entering the field after its newfound prominence.
  • Allocation in the landscape seems more efficient than in the past – it's harder to identify [...]

---

First published:
May 6th, 2024

Source:
https://forum.effectivealtruism.org/posts/YDjH6ACPZq889tqeJ/updates-on-the-ea-catastrophic-risk-landscape

---

Narrated by TYPE III AUDIO.
