EA - University groups as impact-driven truth-seeking teams by anormative

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: University groups as impact-driven truth-seeking teams, published by anormative on March 14, 2024 on The Effective Altruism Forum.

A rough, untested idea that I'd like to hear others' thoughts about. This is mostly meant as a broader group strategy framing, but it might also have interesting implications for what university group programming should look like.

EA university group organizers are often told to "backchain" our way to impact:

What's the point of your university group? "To create the most impact possible, to do the greatest good we can."
What do you need in order to create that? "Motivated and competent people working on solving the world's most pressing problems."
And as a university group, how do you make those people? "Find altruistic people, share EA ideas with them, and provide an environment where they can upskill."
What specific things can you do to do that? "Intro Fellowships to introduce people to EA ideas, career planning and 1-1s for upskilling."

This sort of strategic thinking is useful at times, but I think it can also be somewhat pernicious, especially when it naively justifies the status quo strategy over other possible strategies.[1] It might instead be better to consider a wide variety of framings and figure out which is best.[2] One strategy framing I want to propose, and that I would be interested in testing, is viewing university groups as "impact-driven truth-seeking teams."

What this looks like

An impact-driven truth-seeking team is a group of students trying to figure out what they can do with their lives to have the most impact. Imagine a scrappy research team where everyone is trying to answer the research question "how can we do the most good?" Nobody has figured out the answer yet, nobody is a purveyor of any sort of dogma, and everyone is in it together to figure out how to make the world as good as possible with the limited resources we have.

What does this look like? I'm not all that sure, but it might have some of these elements:

- An intro fellowship that serves as an introduction to cause prioritization, philosophy, epistemics, etc.
- Regular discussions or debates about contenders for "the most pressing problem of our time"
- More of a focus on getting people to research and present arguments themselves than on having conclusions presented to them to accept
- Active cause prioritization
- Live Google Docs with arguments for and against certain causes
- Spreadsheets attempting to calculate possible QALYs saved, possible x-risk reduction, etc. (a toy sketch of this kind of estimate appears after this list)
- Possibly (maybe) even trying to do novel research on open research questions
- No doubt some of the elements we identified before in our backchaining are important too: career planning and upskilling
- Testing fit, doing cheap tests, upskilling, getting experience
- I'm sure there's much more that could be done along these lines that I'm missing or that hasn't been thought of yet at all
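The spreadsheet item above is the most mechanical suggestion in the list, so here is a minimal sketch, in Python, of the kind of back-of-envelope comparison such a spreadsheet might encode. Everything here is hypothetical: the function names are mine, and every number is a placeholder chosen only to show the shape of the calculation, not a real cost-effectiveness figure.

```python
# Toy back-of-envelope cost-effectiveness comparison.
# All numbers are made-up placeholders, not real estimates.

def cost_per_qaly(total_cost_usd: float, qalys_gained: float) -> float:
    """Dollars spent per quality-adjusted life year (QALY) gained."""
    return total_cost_usd / qalys_gained

def expected_qalys_from_xrisk(risk_reduction: float, qalys_at_stake: float) -> float:
    """Expected QALYs saved: absolute reduction in catastrophe probability
    multiplied by the QALYs that would be lost if the catastrophe occurred."""
    return risk_reduction * qalys_at_stake

if __name__ == "__main__":
    # Hypothetical global-health intervention: $50,000 buys 10 QALYs.
    health_cost = cost_per_qaly(total_cost_usd=50_000, qalys_gained=10)

    # Hypothetical x-risk project: $50,000 shaves 1e-8 off the probability
    # of a catastrophe that would cost 10 billion life-years.
    xrisk_qalys = expected_qalys_from_xrisk(risk_reduction=1e-8, qalys_at_stake=1e10)
    xrisk_cost = cost_per_qaly(total_cost_usd=50_000, qalys_gained=xrisk_qalys)

    print(f"Health intervention: ${health_cost:,.0f} per QALY")
    print(f"X-risk project:      ${xrisk_cost:,.0f} per expected QALY")
```

Estimates like these are dominated by their placeholder inputs, so the value for a group lies less in the output numbers than in having a shared, inspectable structure for the assumptions.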
Another illustrative picture: imagine that, instead of university groups being marketing campaigns for Doing Good Better, each could be a mini-80,000 Hours research team,[3] trying to start from first principles and build its way up, assisted by the EA movement but not constrained by it.

Cause prio for its own sake for the sake of EA

Currently, the modus operandi of EA university groups seems to be selling the EA movement to students by convincing them of arguments to prioritize the primary EA causes. It's important to realize that the EA Handbook serves as an introduction to the movement called Effective Altruism[4] and to the various causes it has already identified as impactful, not as an introductory course in cause prioritization. It seems to me that this is the root of much of the unhealthy epistemics that can arise in university groups.[5] I don't think that students in my proposed team should sto...