[25] Tomas Mikolov - Statistical Language Models Based on Neural Networks
Content provided by The Thesis Review and Sean Welleck. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by The Thesis Review and Sean Welleck or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
Tomas Mikolov is a Senior Researcher at the Czech Institute of Informatics, Robotics, and Cybernetics. His research has covered topics in natural language understanding and representation learning, including Word2Vec, as well as complexity. Tomas's PhD thesis is titled "Statistical Language Models Based on Neural Networks", which he completed in 2012 at the Brno University of Technology. We discuss compression and recurrent language models, the backstory behind Word2Vec, and his recent work on complexity & automata.

Episode notes: https://cs.nyu.edu/~welleck/episode25.html

Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html

Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview