
Content provided by The Nonlinear Fund. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by The Nonlinear Fund or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

EA - Navigating Risks from Advanced Artificial Intelligence: A Guide for Philanthropists [Founders Pledge] by Tom Barnes

Welcome to The Nonlinear Library, where we use text-to-speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Navigating Risks from Advanced Artificial Intelligence: A Guide for Philanthropists [Founders Pledge], published by Tom Barnes on June 21, 2024 on The Effective Altruism Forum.

This week, we are releasing new research on advanced artificial intelligence (AI), the opportunities and risks it presents, and the role donations can play in positively steering its development. As with our previous research into areas such as nuclear risks and catastrophic biological risks, our report on advanced AI provides a comprehensive overview of the landscape, outlining for the first time how effective donations can cost-effectively reduce risks. You can find the technical report as a PDF here, or read a condensed version here.

In brief, the key points from our report are:

1. General, highly capable AI systems are likely to be developed in the next couple of decades, and may emerge within the next few years.
2. Such AI systems will radically upend the existing order, presenting a wide range of risks, up to and including catastrophic threats.
3. AI companies, funded by big tech, are racing to build these systems without the caution or restraint appropriate to the stakes at play.
4. Governments are under-resourced, ill-equipped, and vulnerable to regulatory capture by big tech companies, leaving a worrying gap in our defenses against dangerous AI systems.
5. Philanthropists can and must step in where governments and the private sector are missing the mark.
6. We recommend special attention to funding opportunities that (1) boost global resilience, (2) improve government capacity, (3) coordinate major global players, and (4) advance technical safety research.
Funding Recommendations

Alongside this report, we are sharing some of our latest recommended high-impact funding opportunities. The Centre for Long-Term Resilience, the Institute for Law and AI, the Effective Institutions Project, and FAR AI are four promising organizations we have recently evaluated and recommend for further funding, covering our four respective focus areas. We are in the process of evaluating more organizations and hope to release further recommendations. Furthermore, Founders Pledge's Global Catastrophic Risks Fund supports critical work on these issues. If you would like to make progress on a range of catastrophic risks, including from advanced AI, then please consider donating to the Fund!

About Founders Pledge

Founders Pledge is a global non-profit empowering entrepreneurs to do the most good possible with their charitable giving. We equip members with everything needed to maximize their impact, from evidence-led research and advice on the world's most pressing problems, to comprehensive infrastructure for global grant-making, alongside opportunities to learn and connect. To date, our members have pledged over $10 billion to charity and donated more than $950 million. We're grateful to be funded by our members and other generous donors. founderspledge.com

Thanks for listening. To help us out with The Nonlinear Library or to learn more, please visit nonlinear.org