Flow State with VS Code AI

37:37
 
Bret and Nirmal are joined by Continue.dev co-founder Nate Sesti to walk through an open source replacement for GitHub Copilot.

Continue lets you use both open source and closed source LLMs in the VS Code and JetBrains IDEs, adding AI to your coding workflow without leaving the editor.

You've probably heard about GitHub Copilot and other AI code assistants. The Continue team has created a completely open source alternative, or maybe a superset, of these existing tools: along with being open source, it's highly configurable and lets you choose multiple models for code completion and chat inside VS Code and JetBrains, with more editors coming soon.

This show builds on our recent Ollama episode. If you'd rather not send your code to a hosted model, Continue can use Ollama in the background to run a local LLM for you instead.
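As a rough illustration of what that local path looks like, here's a minimal sketch against Ollama's documented local API (this is not Continue's actual code); it assumes Ollama is running on its default port and that you've already pulled a model such as llama3.

```typescript
// Hypothetical sketch: one completion request against a locally running Ollama
// server (default endpoint http://localhost:11434). Tools like Continue can point
// at this same local server instead of a cloud-hosted LLM API.
// Assumes Node 18+ (built-in fetch) and that `ollama pull llama3` was already run.
async function completeLocally(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt, stream: false }),
  });
  const data = await res.json();
  return data.response; // Ollama returns the generated text in the "response" field
}

completeLocally("Write a Dockerfile HEALTHCHECK for an nginx container").then(console.log);
```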

Be sure to check out the live recording of the complete show from May 16, 2024, on YouTube (Ep. 266), which includes demos.

★Topics★
Continue.dev Website

Creators & Guests: Bret Fisher, Nirmal Mehta, and guest Nate Sesti

★Chapters★

  • (00:00) - Introduction
  • (01:52) - Meet Nate Sesti, CTO of Continue
  • (02:40) - Birth and Evolution of Continue
  • (03:56) - Continue's Features and Benefits
  • (22:24) - Running Multiple Models in Parallel
  • (26:38) - Best Hardware for Continue
  • (32:45) - Other Advantages of Continue
  • (36:08) - Getting Started with Continue

You can also support my free material by subscribing to my YouTube channel and my weekly newsletter at bret.news!

Grab the best coupons for my Docker and Kubernetes courses.
Join my cloud native DevOps community on Discord.
Grab some merch at Bret's Loot Box.
Homepage: bretfisher.com
