
ERASED: How AI Bias Impacts Beauty, Identity & Belonging

34:31
 
Content provided by Dr. Stacey Denise | The Neuroaesthetic MD. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Dr. Stacey Denise | The Neuroaesthetic MD or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

In the final episode of our AI & identity series, Dr. Stacey Denise speaks with Douglas Moore Jr., a data scientist and responsible AI advocate, about how machines are trained to see beauty, emotion, and culture. From bias in datasets to the ethics of design, this is a conversation for anyone who wants tech to feel more human.

Topics We Explore:

  • How AI models learn bias — and how it feels when they get you wrong
  • The challenge of training machines to "see" beauty, culture, and emotion
  • Why neuroaesthetic design matters for mental and emotional wellness
  • What explainability and fairness really look like in responsible AI
  • The non-negotiables of ethical, inclusive, and emotionally intelligent tech

Connect with Douglas Moore Jr. on IG

Subscription, links & Email List

Socials
