Inspur’s AI Inferencing and Open Source Project – Conversations in the Cloud – Episode 202

Archived series ("Inactive feed" status)

When? This feed was archived on April 30, 2021 at 18:09 (3y ago). The last successful fetch was on March 18, 2021 at 15:32 (3y ago).

Why? Inactive feed status. Our servers were unable to retrieve a valid podcast feed for a sustained period.

What now? You might be able to find a more up-to-date version using the search function. This series will no longer be checked for updates. If you believe this to be in error, check that the publisher's feed link below is valid and contact support to request that the feed be restored, or to raise any other concerns.

Episode 262490950 · Series 33174
Content provided by Intel Corporation. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Intel Corporation or its podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined at https://player.fm/legal.

In this Intel Conversations in the Cloud audio podcast, Vangel Bojaxhi, Global AI & HPC Director at Inspur, joins the show to discuss AI inferencing applications and Inspur’s new solution. Vangel describes the collaboration between Intel and Inspur on delivering optimized AI solutions to customers, as well as Inspur’s open source projects for deep learning inference. As an Intel Select Solution, Inspur’s AI Inferencing configuration is fully optimized, tested, and ready to deploy, reducing deployment time and cost for end users while ensuring scalability.

Explore Inspur’s AI Inferencing and other solutions at:
inspursystems.com

Discover TF2, an open source deep learning inference engine based on FPGAs, at:
github.com/TF2-Engine/TF2

Learn about Intel Select Solutions and other performance-optimized configurations at:
intel.com/selectsolutions


157 episodes
