
Is A.I. better at avoiding bias?

51:15
 
Content provided by Anchormen. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Anchormen or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

This podcast episode kicks off with Jeroen and Ron talking about how algorithms can become biased, which they discuss using the gender-biased hiring example. How can you avoid black-box algorithms and force a neural network to represent its decision-making process?

Next, they touch upon the accuracy of face and emotion recognition and how this relates to the 'dream' of Artificial General Intelligence (AGI). Can machines actually point to places where humans haven't gone yet? (Spoiler: AlphaGo Zero)

What can companies learn from this? Who takes responsibility for avoiding bias and for keeping the (training) data set balanced and unbiased? Jeroen and Ron explain why precision and recall are better metrics than plain accuracy for checking whether your algorithm or data set is biased, and how recommendation engines combined with post-processing can help avoid the pitfalls of collaborative filtering.
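To illustrate the precision-and-recall point, here is a minimal sketch (not from the episode) assuming a hypothetical binary "hire" classifier and a gender attribute in the data. It shows how a skew that overall accuracy hides becomes visible when precision and recall are computed per group; all labels and group names below are made up for illustration.

```python
# Minimal sketch (hypothetical data): per-group precision and recall for a
# binary "hire" classifier, showing a bias that overall accuracy hides.

from collections import defaultdict

def precision_recall_by_group(y_true, y_pred, groups):
    """Return {group: (precision, recall)} for binary labels (1 = hire)."""
    counts = defaultdict(lambda: {"tp": 0, "fp": 0, "fn": 0})
    for t, p, g in zip(y_true, y_pred, groups):
        if p == 1 and t == 1:
            counts[g]["tp"] += 1
        elif p == 1 and t == 0:
            counts[g]["fp"] += 1
        elif p == 0 and t == 1:
            counts[g]["fn"] += 1
    result = {}
    for g, c in counts.items():
        precision = c["tp"] / (c["tp"] + c["fp"]) if (c["tp"] + c["fp"]) else 0.0
        recall = c["tp"] / (c["tp"] + c["fn"]) if (c["tp"] + c["fn"]) else 0.0
        result[g] = (precision, recall)
    return result

# Hypothetical labels: the model finds most qualified candidates in one group
# but misses most in the other, yet overall accuracy still looks respectable.
y_true = [1, 1, 1, 0, 0, 1, 1, 1, 0, 0]
y_pred = [1, 1, 1, 0, 0, 1, 0, 0, 0, 0]
groups = ["m", "m", "m", "m", "m", "f", "f", "f", "f", "f"]

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(f"overall accuracy: {accuracy:.2f}")  # 0.80 -- looks fine
for g, (prec, rec) in precision_recall_by_group(y_true, y_pred, groups).items():
    print(f"group {g}: precision={prec:.2f}, recall={rec:.2f}")
# group m: precision=1.00, recall=1.00
# group f: precision=1.00, recall=0.33  <- the skew that accuracy hides
```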


