[42] Charles Sutton - Efficient Training Methods for Conditional Random Fields
Content provided by The Thesis Review and Sean Welleck. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by The Thesis Review and Sean Welleck or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
Charles Sutton is a Research Scientist at Google Brain and an Associate Professor at the University of Edinburgh. His research focuses on deep learning for generating code and helping people write better programs. Charles' PhD thesis is titled "Efficient Training Methods for Conditional Random Fields", which he completed in 2008 at UMass Amherst. We start with his thesis work on structured models for text and compare and contrast it with today's large language models. From there, we discuss machine learning for code and the future of language models in program synthesis.
- Episode notes: https://cs.nyu.edu/~welleck/episode42.html
- Follow The Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter
- Find out more about the show at https://cs.nyu.edu/~welleck/podcast.html
- Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
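For listeners unfamiliar with the thesis topic, the sketch below illustrates the quantity a linear-chain conditional random field models and that efficient training methods must repeatedly compute: the conditional log-probability log p(y | x) of a label sequence given an input, normalized by the partition function Z(x) via the forward algorithm. This is a generic, illustrative example, not code from the thesis or the episode; the emission/transition score arrays and function names are stand-ins chosen for the demonstration.

```python
# Minimal linear-chain CRF sketch (illustrative only, not the thesis's code).
import numpy as np

def crf_log_likelihood(emissions, transitions, labels):
    """log p(y | x) for a linear-chain CRF.

    emissions:   (T, K) array, score of each of K labels at each of T steps
                 (in practice computed from input features x).
    transitions: (K, K) array, score of moving from label i to label j.
    labels:      length-T integer array, the gold label sequence y.
    """
    T, K = emissions.shape

    # Unnormalized score of the gold path: emission plus transition scores.
    score = emissions[0, labels[0]]
    for t in range(1, T):
        score += transitions[labels[t - 1], labels[t]] + emissions[t, labels[t]]

    # log Z(x) via the forward algorithm in log space (sums over all K^T paths).
    alpha = emissions[0]  # (K,)
    for t in range(1, T):
        # alpha_new[j] = logsumexp_i(alpha[i] + transitions[i, j]) + emissions[t, j]
        scores = alpha[:, None] + transitions + emissions[t][None, :]
        m = scores.max(axis=0)
        alpha = m + np.log(np.exp(scores - m).sum(axis=0))
    log_Z = np.logaddexp.reduce(alpha)

    return score - log_Z  # log p(y | x)

# Toy usage: 4 time steps, 3 labels, arbitrary scores.
rng = np.random.default_rng(0)
emissions = rng.normal(size=(4, 3))
transitions = rng.normal(size=(3, 3))
print(crf_log_likelihood(emissions, transitions, np.array([0, 1, 1, 2])))
```

Training a CRF means maximizing this log-likelihood over many sequences, which is exactly where the efficiency questions studied in the thesis arise, since the forward pass must be run for every training example on every gradient step.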