How Open Source Transformers Are Accelerating AI - CitC Episode 261
Manage episode 306329631 series 1180916
Content provided by Intel Corporation. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Intel Corporation or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
Jeff Boudier from Hugging Face joins host Jake Smith to talk about the company’s open source machine learning transformers library (formerly known as “pytorch-pretrained-bert”). Jeff talks about how transformers have accelerated the proliferation of natural language processing (NLP) models and their future use in object detection and other machine learning tasks. He goes into detail about Optimum, an open source library to train and run models on specific hardware, like Intel Xeon CPUs, and the benefits of the Intel Neural Compressor, which is designed to help deploy low-precision inference solutions. Jeff also announces Hugging Face’s new Infinity solution, which integrates the inference pipeline to achieve results in milliseconds wherever Docker containers can be deployed. For more information, visit: https://hf.co/ Follow Jake on Twitter at: https://twitter.com/jakesmithintel
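For listeners curious what the transformers library discussed in the episode looks like in practice, here is a minimal sketch using its high-level pipeline API. The task name and example sentence are illustrative; on first use the library downloads a default pretrained checkpoint for the task.

```python
# Minimal sketch of the Hugging Face transformers "pipeline" API.
# Requires: pip install transformers torch
from transformers import pipeline

# Builds a ready-to-use classifier around a default pretrained
# sentiment model (downloaded on first use).
classifier = pipeline("sentiment-analysis")

result = classifier("Open source transformers are accelerating AI.")
print(result)  # a list of dicts with 'label' and 'score' keys
```

The same one-line pipeline call works for other tasks (e.g. question answering or summarization) by changing the task name, which is part of why the library has spread NLP models so quickly.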
296 episodes