Content provided by ISG. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by ISG or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
39: AI Ethics & Responsible AI Development

32:47
Manage episode 380266464 series 2500074
Artificial intelligence (AI) and generative AI are swiftly permeating our daily lives. From customer service chatbots to healthcare diagnoses, from loan applications to hiring decisions, the disruption is here. Many believe AI is fundamentally changing the way we do things. With this premise of large-scale change widely accepted, the question becomes: will the change be for the better?

It's critical to address the ethics of AI: whether this technological disruption will reinforce existing inequalities, reduce them, or create new ones entirely. To do so, we must understand how and where AI is most likely to cause or exacerbate inequities. With these vulnerabilities in mind, leaders must thoughtfully consider how to proactively prevent inequity as AI is adopted globally.

In this episode of Imagine Your Future, we examine the ethical challenges posed by AI and discuss the need for responsible development to mitigate its potentially damaging effects. Hosts Steve Hall and Karen Collyer speak with Dr. Neelam Raina, Associate Professor of International Development and Design at Middlesex University, for her perspective. Tune in to hear them discuss the potential impacts of disruptive technologies and key ethical considerations surrounding data usage.

41 episodes
