Interview with Kai Waehner

 
In this episode, Ralph and I interview a former colleague of mine, Kai Waehner, who has extensive experience in data streaming and real-time event processing. Kai walks through his top five trends for data streaming with Kafka and Flink: data sharing, data contracts for governance, serverless stream processing, multi-cloud adoption, and generative AI in real-time contexts. We discuss how generative AI can only give accurate answers when it is fed current data, and why real-time data integration matters for contextual recommendations, using travel and flight cancellations as an example. We also look at the role of Flink as a stream processor in keeping data accurate and fresh for semantic search and generative AI applications.
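
To make the flight-cancellation example concrete, here is a minimal sketch of that pattern: consume fresh status events from a Kafka topic and push the latest state into the prompt sent to a language model, so the answer reflects what just happened rather than what the model saw at training time. The topic name, message layout, and the call_llm stub are illustrative assumptions, not details from the episode.

```python
# Hypothetical sketch: enrich an LLM prompt with the latest flight status
# read from a Kafka topic, so recommendations reflect real-time events.
import json
from kafka import KafkaConsumer  # pip install kafka-python

def call_llm(prompt: str) -> str:
    """Placeholder for whatever model endpoint is actually used."""
    return f"[model response to: {prompt[:60]}...]"

consumer = KafkaConsumer(
    "flight-status",                      # assumed topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

latest_status: dict[str, dict] = {}       # flight number -> most recent event

for message in consumer:
    event = message.value                 # e.g. {"flight": "LH123", "status": "CANCELLED"}
    latest_status[event["flight"]] = event

    if event.get("status") == "CANCELLED":
        # Fresh context goes straight into the prompt instead of relying on
        # stale training data or a nightly batch load.
        prompt = (
            f"Flight {event['flight']} was just cancelled. "
            "Suggest rebooking options and notify the traveller."
        )
        print(call_llm(prompt))
```

In the setup discussed in the episode, the enrichment step would typically run inside a stream processor such as Flink rather than a hand-rolled consumer loop, but the data flow is the same: event in, fresh context out, prompt built from current state.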

We then turn to streaming databases and whether the market is ready to embrace them. We discuss why data contracts and data governance are needed to understand how data flows through systems, and why the responsibility for creating embeddings sits with the data engineering team. We also cover integrating large language models with other applications over technologies like Kafka, with examples of how generative AI can plug into existing business processes. The conversation touches on the lakehouse concept and the separation of compute and storage for real-time analytics. Kai also highlights Confluent's approach to building Kafka in a cloud-native way and its focus on the streaming side, while emphasizing the need for stream processing solutions that are accessible to ordinary database users.
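
As a rough illustration of the data-contract idea, the sketch below validates each event against a JSON Schema before it is allowed to flow downstream to consumers such as embedding jobs or analytics. The schema, field names, and rejection handling are assumptions made for the example, not a prescription from the episode.

```python
# Illustrative data contract: every event on the stream must satisfy this
# schema before downstream consumers (embedding jobs, analytics) see it.
from jsonschema import ValidationError, validate  # pip install jsonschema

ORDER_CONTRACT = {
    "type": "object",
    "properties": {
        "order_id": {"type": "string"},
        "amount": {"type": "number", "minimum": 0},
        "currency": {"type": "string", "enum": ["USD", "EUR"]},
    },
    "required": ["order_id", "amount", "currency"],
    "additionalProperties": False,
}

def enforce_contract(event: dict) -> bool:
    """Return True if the event honours the contract, else route it aside."""
    try:
        validate(instance=event, schema=ORDER_CONTRACT)
        return True
    except ValidationError as err:
        # In a real pipeline this would go to a dead-letter topic for review.
        print(f"Contract violation: {err.message}")
        return False

good = {"order_id": "42", "amount": 19.99, "currency": "EUR"}
bad = {"order_id": "43", "amount": -5}

print(enforce_contract(good))  # True
print(enforce_contract(bad))   # False: negative amount, missing currency
```

In Kafka deployments this role is typically played by a schema registry enforcing Avro, Protobuf, or JSON Schema compatibility at produce time; the plain validation above just shows the same contract idea in a few lines.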
