Content provided by Nickle. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Nickle or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

Is it possible to turn ChatGPT, Bard, or any LLM from an amnesiac goldfish into a memory mammoth?

1:00:23
Manage episode 372107452 series 3463837

What if your ChatGPT, Bard, or any other LLM could remember the things you said last month? For example, that you were planning to buy a home near your kids' school, the story you told your son last week, or the gift ideas your wife wouldn't like. That would be amazing, right? However, with the current short memory limit, also known as the context window, such capabilities remain a dream.

But Dr. Burtsev's research has come to our rescue! Thanks to his breakthrough, LLMs can now accurately recall up to 1 million tokens, equivalent to several books' worth of information, bringing us much closer to that dreamlike chat agent.
Want to know more about what this means for the rest of us? Tune in to this episode, where Dr. Burtsev joins us to discuss his inspiration, how he made it possible, and many other insightful thoughts and ideas about interactive learning, brain-inspired machine learning algorithms, AGI, the Turing test, and, of course, AI safety.

Here is the paper: Scaling Transformer to 1M tokens and beyond with RMT
https://arxiv.org/abs/2304.11062
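To give a rough feel for the idea behind the paper, here is a toy sketch of the Recurrent Memory Transformer (RMT) approach: a long input is split into fixed-size segments, a small set of memory tokens is attached to each segment, and the model's outputs at those memory positions are carried over as the memory for the next segment. Everything below is illustrative, not the authors' code; the `transformer`-like mixing step is a hypothetical stand-in (a single attention-style matrix), and the segment sizes are toy values.

```python
import numpy as np

SEG_LEN = 8      # tokens per segment (toy value; real RMT segments are much larger)
MEM_SLOTS = 4    # number of memory tokens carried between segments
DIM = 16         # embedding size

rng = np.random.default_rng(0)
# Stand-in for transformer weights (assumption, not the real model).
W = rng.standard_normal((DIM, DIM)) / np.sqrt(DIM)

def process_segment(segment_emb, memory):
    """Run one segment with memory tokens prepended; return (outputs, new_memory)."""
    x = np.concatenate([memory, segment_emb], axis=0)   # [memory; segment]
    # Stand-in for a transformer layer: softmax attention over all tokens.
    scores = x @ x.T / np.sqrt(DIM)
    attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
    attn /= attn.sum(axis=-1, keepdims=True)
    y = attn @ x @ W
    # Outputs at the memory positions become the memory for the next segment.
    return y[MEM_SLOTS:], y[:MEM_SLOTS]

# Process an arbitrarily long token stream one segment at a time.
long_input = rng.standard_normal((5 * SEG_LEN, DIM))    # 5 segments of embeddings
memory = np.zeros((MEM_SLOTS, DIM))                     # initial empty memory
outputs = []
for start in range(0, len(long_input), SEG_LEN):
    seg_out, memory = process_segment(long_input[start:start + SEG_LEN], memory)
    outputs.append(seg_out)

full_output = np.concatenate(outputs)
```

The point of the recurrence is that attention cost per step stays bounded by one segment (plus a few memory slots), while information can still propagate across the whole stream through the memory tokens, which is what lets the approach scale to very long inputs.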
Find Dr. Burtsev's profile here
https://lims.ac.uk/profile/?id=114
Here are various other resources mentioned during the show:
Mikhail's LinkedIn page and information about the IGLU contest (Interactive Grounded Language Understanding)
https://www.linkedin.com/posts/mikhai...
The Society of Mind, Marvin Minsky
https://isbndb.com/book/9780671657130
The Human Brain Project
https://www.humanbrainproject.eu/en/b...
Yann LeCun, JEPA: A Path Towards Autonomous Machine Intelligence
https://www.reddit.com/r/MachineLearn...
Mindstorms in Natural Language-Based Societies of Mind, Jürgen Schmidhuber
