#076 - AI Roles Demystified: A Guide for Infrastructure Admins with Myles Gray

49:13
 
Content provided by Duncan Epping, Frank Denneman, and Johan van Amersfoort. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Duncan Epping, Frank Denneman, and Johan van Amersfoort or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://player.fm/legal.

In this conversation, Myles Gray discusses the AI workflow and its personas: the responsibilities of data scientists and developers in building and deploying AI models, the role of infrastructure administrators, and the challenges of deploying models at the edge. He explains quantization and its trade-off against model accuracy, walks through the pipeline for deploying models, and contrasts unit testing (verifying a single module or function within an application) with integration testing (verifying the interaction between components or applications). The conversation also covers MLflow and other tools for storing and managing ML models, the emergence of smaller models as an answer to the resource demands of large ones, the collaboration between personas needed for security and governance in AI projects, and the data governance policies that keep data quality and consistency in check.
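The quantization discussion boils down to trading a little accuracy for a much smaller footprint, which matters at the edge. As a rough, hypothetical illustration of what that can look like (not taken from the episode, and assuming a PyTorch model handed over by the data science team), dynamic quantization converts the linear layers' weights to int8, and the before/after size can be compared:

```python
import io

import torch
import torch.nn as nn

# A small stand-in network; in practice this would be the trained model handed
# over by the data science team (purely illustrative, not from the episode).
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Dynamic quantization converts the Linear layers' weights from fp32 to int8,
# shrinking the artifact for edge deployment at some cost in accuracy.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def size_mb(m: nn.Module) -> float:
    """Approximate serialized size of a model's state dict, in megabytes."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes / 1e6

print(f"fp32 model: {size_mb(model):.2f} MB")
print(f"int8 model: {size_mb(quantized):.2f} MB")
```

Whichever technique is used, the quantized model's accuracy would be checked against a held-out dataset before it is promoted to the edge.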

Takeaways

  • The AI workflow involves multiple personas, including data scientists, developers, and infrastructure administrators.
  • Data scientists play a crucial role in developing AI models, while developers are responsible for deploying the models into production.
  • Infrastructure administrators need to consider the virtualization layer and ensure efficient and easy consumption of infrastructure components.
  • Deploying AI models at the edge typically requires quantization to reduce model size, along with consideration of form factor, scale, and connectivity.
  • The pipeline for deploying models involves steps such as unit testing, scanning for vulnerabilities, building container images, and pushing to a registry.
  • Unit testing verifies the functionality of a single module or function within an application in isolation.
  • Integration testing verifies the interaction between different components or applications, ensuring the system works as a whole (see the testing sketch after this list).
  • MLflow and other tools are used to store and manage ML models (see the MLflow sketch after this list).
  • Smaller models are emerging as a solution to the resource constraints of large models.
  • Collaboration between different personas is important for ensuring security and governance in AI projects.
  • Data governance policies are crucial for maintaining data quality and consistency.
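As referenced in the testing takeaway above, here is a minimal sketch of the unit-vs-integration distinction in an inference-service context. The module, function names, and the toy "model" are illustrative assumptions, not from the episode; in a deployment pipeline these tests would run before vulnerability scanning and container builds.

```python
# test_model_service.py -- run with: pytest test_model_service.py

# --- code under test: a hypothetical inference module ----------------------

def preprocess(text: str) -> list[str]:
    """Lowercase and tokenize an input string."""
    return text.lower().split()

def predict(tokens: list[str]) -> str:
    """Toy stand-in for a model: flag anything containing the word 'error'."""
    return "alert" if "error" in tokens else "ok"

def handle_request(payload: dict) -> dict:
    """What a request handler would do: preprocess, predict, wrap the result."""
    tokens = preprocess(payload["text"])
    return {"label": predict(tokens)}

# --- unit tests: one function, in isolation ---------------------------------

def test_preprocess_lowercases_and_splits():
    assert preprocess("Disk ERROR detected") == ["disk", "error", "detected"]

def test_predict_flags_errors():
    assert predict(["disk", "error"]) == "alert"

# --- integration test: the pieces working together --------------------------

def test_handle_request_end_to_end():
    assert handle_request({"text": "Disk ERROR detected"}) == {"label": "alert"}
```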
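And, as referenced in the MLflow takeaway, a minimal sketch of logging and registering a model with MLflow, assuming a scikit-learn model and a reachable tracking server. The URI, experiment, and model names are placeholders, not details from the episode.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Placeholder tracking server and names; in practice these come from the
# platform/infrastructure team and the project, not from this episode.
mlflow.set_tracking_uri("http://mlflow.example.internal:5000")
mlflow.set_experiment("edge-anomaly-detector")

# Train a toy model so there is something to register.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)
model = LogisticRegression(max_iter=200).fit(X, y)

with mlflow.start_run():
    # Record parameters and metrics alongside the model artifact.
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # Store the model and create a new version in the model registry, where a
    # deployment pipeline can later fetch it for packaging into a container image.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="edge-anomaly-detector",
    )
```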

Chapters

  • 00:00 Understanding the AI Workflow and Personas
  • 03:24 The Role of Data Scientists and Developers in Deploying AI Models
  • 08:47 The Responsibilities of Infrastructure Administrators
  • 15:25 Challenges of Deploying Models at the Edge
  • 20:29 The Pipeline for Deploying AI Models
  • 24:45 Unit Testing vs. Integration Testing
  • 28:22 Managing ML Models with MLflow and Other Tools
  • 32:17 The Emergence of Smaller Models
  • 39:58 Collaboration for Security and Governance in AI Projects
  • 46:32 The Importance of Data Governance

Disclaimer: The thoughts and opinions shared in this podcast are our own and those of our guests, and not necessarily those of Broadcom or VMware by Broadcom.
