801: Merged LLMs Are Smaller And More Capable, with Arcee AI's Mark McQuade and Charles Goddard

1:17:05
 

Merged LLMs are the future, and in this episode Jon Krohn explores how with Mark McQuade and Charles Goddard of Arcee AI. Learn how to combine multiple LLMs without adding bulk, train more efficiently, and weigh different expert approaches. Discover how smaller models can outperform larger ones and how open-source projects deliver big enterprise wins. This episode is packed with must-know insights for data scientists and ML engineers. Don’t miss out!

Interested in sponsoring a SuperDataScience Podcast episode? Email natalie@superdatascience.com for sponsorship information.

In this episode you will learn:

• Explanation of Charles' job title: Chief of Frontier Research [03:31]

• Model merging: combining multiple LLMs without increasing model size [04:43]

• Using MergeKit for model merging (see the sketch after this list) [14:49]

• Evolutionary model merging: optimizing merge recipes with evolutionary algorithms [22:55]

• Commercial applications and success stories [28:10]

• Comparison of Mixture of Experts (MoE) vs. Mixture of Agents [37:57]

• Spectrum Project for efficient training by targeting specific modules [54:28]

• Future of Small Language Models (SLMs) and their advantages [01:01:22]
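For listeners who want to try model merging themselves, here is a minimal sketch of driving MergeKit (Arcee AI's open-source merging toolkit) from Python. MergeKit is configured with a YAML file and run via its mergekit-yaml command-line tool; the model names, layer ranges, and interpolation weight below are illustrative placeholders, and the sketch assumes MergeKit is installed (pip install mergekit).

```python
# Minimal sketch: merging two Mistral-7B fine-tunes with MergeKit's SLERP method.
# Model names, layer ranges, and the weight t are illustrative placeholders.
import pathlib
import subprocess

config = """\
merge_method: slerp            # spherical linear interpolation of weights
base_model: mistralai/Mistral-7B-v0.1
slices:
  - sources:
      - model: mistralai/Mistral-7B-v0.1
        layer_range: [0, 32]
      - model: teknium/OpenHermes-2.5-Mistral-7B
        layer_range: [0, 32]
parameters:
  t: 0.5                       # 0.0 = all base model, 1.0 = all second model
dtype: bfloat16
"""

pathlib.Path("merge-config.yml").write_text(config)

# mergekit-yaml reads the config and writes the merged model to ./merged-model;
# the result is the same size as either parent, not the sum of the two.
subprocess.run(["mergekit-yaml", "merge-config.yml", "./merged-model"], check=True)
```

The merged checkpoint loads like any other Hugging Face model; swapping merge_method to ties or dare_ties (also supported by MergeKit) changes the merging strategy without changing this workflow.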

Additional materials: www.superdatascience.com/801
