
The State Space Model Revolution, with Albert Gu

1:44:44
 

Nathan hosts Albert Gu, assistant professor at CMU and co-founder of Cartesia AI, to discuss the groundbreaking Mamba architecture. In this episode of The Cognitive Revolution, we explore the state space model revolution, diving into the technical details of Mamba and Mamba 2. Join us for an insightful conversation on the future of AI architectures and their potential to transform the field.
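
For listeners who want a concrete picture of the recurrence discussed in the episode, here is a minimal sketch of a classical linear state space model in Python: the state update h_t = A h_{t-1} + B x_t with readout y_t = C h_t. This is only an illustration, not Mamba's actual implementation; the function name ssm_scan and the matrices A, B, C are placeholder assumptions, and the sequential loop stands in for the selective, hardware-aware scan that Mamba uses in practice.

import numpy as np

def ssm_scan(x, A, B, C):
    # Classical linear SSM: h_t = A h_{t-1} + B x_t, y_t = C h_t.
    # Mamba's selective SSM makes these parameters input-dependent and
    # computes the recurrence with a parallel scan; this toy loop only
    # shows the constant-size state carried across the sequence.
    h = np.zeros(A.shape[0])       # hidden state (fixed size, independent of sequence length)
    ys = []
    for x_t in x:                  # process the input sequence left to right
        h = A @ h + B * x_t        # fold the new input into the state
        ys.append(C @ h)           # read an output from the current state
    return np.array(ys)

# Example: a 4-dimensional state over a length-10 scalar input sequence.
A = 0.9 * np.eye(4)
B = np.ones(4)
C = np.ones(4) / 4
print(ssm_scan(np.sin(np.arange(10)), A, B, C))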

Apply to join over 400 founders and execs in the Turpentine Network: https://hmplogxqz0y.typeform.com/to/JCkphVqj

RECOMMENDED PODCAST:

Patrick McKenzie (@patio11) talks to experts who understand the complicated but not unknowable systems we rely on. You might be surprised at how quickly Patrick and his guests can put you in the top 1% of understanding for stock trading, tech hiring, and more.

Spotify: https://open.spotify.com/show/3Mos4VE3figVXleHDqfXOH

Apple: https://podcasts.apple.com/id1753399812

SPONSORS:

Oracle Cloud Infrastructure (OCI) is a single platform for your infrastructure, database, application development, and AI needs. OCI has four to eight times the bandwidth of other clouds, offers one consistent price, and nobody does data better than Oracle. If you want to do more and spend less, take a free test drive of OCI at https://oracle.com/cognitive

The Brave Search API can be used to assemble a data set to train your AI models and to help with retrieval augmentation at inference time, all while remaining affordable with developer-first pricing. Integrating the Brave Search API into your workflow translates to more ethical data sourcing and more human-representative data sets. Try the Brave Search API for free for up to 2,000 queries per month at https://bit.ly/BraveTCR

Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with the click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off: https://www.omneky.com/

Squad gives you access to global engineering without the headache and at a fraction of the cost: head to https://choosesquad.com/ and mention “Turpentine” to skip the waitlist.

CHAPTERS:

(00:00:00) About the Show

(00:05:39) State Space Models

(00:13:05) Intuition and inspiration

(00:18:27) Surprises

(00:22:33) Sponsors: Oracle | Brave

(00:24:41) Biological inspiration

(00:25:19) Mamba breakthrough

(00:30:59) How does the state work?

(00:36:44) What is the size of the state?

(00:39:05) Training vs. Inference (Part 1)

(00:42:04) Sponsors: Omneky | Squad

(00:43:51) Training vs. Inference (Part 2)

(00:43:51) Sequence Models

(00:49:20) Mamba inference

(00:57:53) Mamba 2 vs. Mamba 1

(01:16:05) Overtraining and the future of SSMs

(01:17:44) Training efficiency vs inference efficiency

(01:20:52) Hybrid models

(01:25:04) Scaling Attention Layers

(01:30:23) Optimizing State

(01:34:09) The extrapolation abilities of the SSMs

(01:36:37) Sequence parallelism with Mamba 2

(01:39:20) Why are you publishing all this?

(01:40:46) Cartesia and Together

(01:41:54) Outro
