Content provided by Christian Hubbs and Stephen Donnelly. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Christian Hubbs and Stephen Donnelly or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
96: Is there Bias in Facial Recognition?

Duration: 35:15
Archived series ("Inactive feed" status)

When? This feed was archived on July 09, 2018 00:00 (6y ago). Last successful fetch was on July 01, 2020 20:55 (4y ago)

Why? Inactive feed status. Our servers were unable to retrieve a valid podcast feed for a sustained period.

What now? You might be able to find a more up-to-date version using the search function. This series will no longer be checked for updates. If you believe this to be in error, check that the publisher's feed link below is valid, then contact support to request that the feed be restored or to raise any other concerns.

Manage episode 253002193 series 1415998

A recent NIST study investigates bias in many commercially available facial-recognition algorithms. The results provide very interesting findings that challenge the dominant narrative surrounding AI and bias.

Links

NIST Face Vendor Recognition Test

The Critics Were Wrong: NIST Data Shows that the Best Facial Recognition Algorithms Are Neither Racist Nor Sexist

77: The State and Facial Recognition

Follow us and leave a rating!

Skill Share

Patreon

iTunes

Homepage

Twitter @artlyintelly

Facebook

Learn Economics at Liberty Classroom


112 episodes

