Content provided by Hewlett Packard Enterprise. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Hewlett Packard Enterprise or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
Can you make AI sustainable?

Duration: 32:04
 

In this episode, we look at the challenges AI technology faces in becoming, and then remaining, sustainable.
The benefits of AI are unquestionable: from improved medical assistance and increased efficiency in the workplace, to autonomous transportation and next-level gaming experiences. But the more expansive the abilities of AI become, the more data storage is required.

That data storage uses a lot of energy. In fact, it has been predicted that AI servers could be using more energy than a country the size of the Netherlands by 2030.

For HPE Chief Technologist Matt Armstrong-Barnes, the rapid growth of AI in recent years has had an environmental impact, and he believes that's down to people rushing into training large language models without thinking about longevity or the need for future change. That, in turn, has led to data being stored that is no longer needed.

Sustainability is also a main focus for Arti Garg, Lead Sustainability & Edge Architect in the Office of the CTO at Hewlett Packard Enterprise. Like Matt, Arti has kept a keen eye on the exponential growth of AI data storage and its effect on the environment, and agrees that the key to a more sustainable future lies in how we train models.

However, whilst training models well is important, the tech itself is a key component in more efficient AI. Shar Narasimhan is the Director of Product Marketing for NVIDIA's data center GPU portfolio. He believes that openly available model optimisations, combined with chipsets, CPUs, GPUs and intelligent data centers optimised for AI, are a key piece of the puzzle in avoiding energy wastage and making AI more sustainable all round.

Sources and statistics cited in this episode:
Global AI market prediction - https://www.statista.com/statistics/1365145/artificial-intelligence-market-size/#:~:text=Global%20artificial%20intelligence%20market%20size%202021%2D2030&text=According%20to%20Next%20Move%20Strategy,nearly%20two%20trillion%20U.S.%20dollars.
AI could use as much energy as a small country report - https://www.cell.com/joule/fulltext/S2542-4351(23)00365-3?_returnURL=https%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2FS2542435123003653%3Fshowall%3Dtrue
Industry responsible for 14% of earth’s emissions - https://www.emerald.com/insight/content/doi/10.1108/JICES-11-2021-0106/full/html
Number of AI startups - https://tracxn.com/d/explore/artificial-intelligence-startups-in-united-states/__8hhT66RA16YeZhW3QByF6cGkAjrM6ertfKJuKbQIiJg/companies
AI model energy use increase - https://openai.com/research/ai-and-compute
European Parliament report into AI energy usage - https://www.europarl.europa.eu/RegData/etudes/STUD/2021/662906/IPOL_STU(2021)662906_EN.pdf

Technology Untangled
