
Content provided by Emily Laird. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Emily Laird or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

Transformers Mini Series: How do Transformers Process Text?

6:56
 

Manage episode 425727052 series 3578824

In this episode of Generative AI 101, we explore how Transformers break text down into tokens. Imagine sorting a big, colorful pile of Lego blocks into individual pieces so you can build something cool—that is what tokenization does for AI models. Emily explains what tokens are, how they work, and why they're the magic behind GenAI's impressive outputs. Learn how Transformers assign numerical values to tokens and process them in parallel, allowing them to understand context, detect patterns, and generate coherent text. Tune in to discover why tokenization matters for tasks like language translation and text summarization.
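To make the idea concrete: production Transformers use learned subword tokenizers (such as BPE or WordPiece), but a toy word-level sketch is enough to show the two steps the episode describes—splitting text into tokens and mapping each token to a numerical ID. The function names below are illustrative, not from any particular library.

```python
# Toy tokenization sketch: split text into word tokens, then map each
# distinct token to an integer ID, since models operate on numbers.
def build_vocab(texts):
    """Assign a unique integer ID to every distinct lowercase token."""
    vocab = {}
    for text in texts:
        for token in text.lower().split():
            if token not in vocab:
                vocab[token] = len(vocab)
    return vocab

def tokenize(text, vocab):
    """Convert a sentence into the list of token IDs a model would see."""
    return [vocab[token] for token in text.lower().split()]

corpus = ["Transformers process text as tokens"]
vocab = build_vocab(corpus)
ids = tokenize("Transformers process text", vocab)
print(ids)  # [0, 1, 2] — the model sees numbers, not words
```

A real Transformer then looks each ID up in an embedding table and processes all positions in parallel, which is what lets it pick up context and patterns across the whole sequence at once.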

Connect with Emily Laird on LinkedIn


36 episodes



 
