Content provided by Jeff Wilser. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Jeff Wilser or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
Why Does AI Hallucinate? Can It Be Fixed? w/ EyeLevel.AI CEO Neil Katz

1:08:35
 
Manage episode 424656930 series 3503527
Content provided by Jeff Wilser. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Jeff Wilser or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

As most people who have played with AI know, it can make stuff up, or what’s often referred to as “hallucinate.” (I like to think of it as “bullshitting.”)

It’s one of the trickiest problems vexing the entire AI space. Why does AI do this? How widespread is the problem? What are the solutions, and is it even something that CAN be solved?

To unravel all this, we speak with Neil Katz, the founder of EyeLevel.AI, a company developing solutions to make AI more accurate (and less likely to hallucinate) for private companies. They're building what they call the "truth serum" for AI.
We dive deep into the world of AI hallucinations, one of the least understood--and most important--topics in the space.
I very much enjoyed this conversation.
Find Neil and EyeLevel.AI at:
https://www.eyelevel.ai/

45 episodes
