AI Policing the Welfare State

40:28
 
Content provided by Larry and Arianna Backer. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Larry and Arianna Backer or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://player.fm/legal.

Every year the city of Rotterdam provides welfare benefits to its residents, usually about 30,000 individuals and families. Between 2017 and 2021, the city used a machine learning algorithm to assign residents a risk score meant to estimate the probability that they would commit welfare fraud. The problem? The factors behind the score are not predictors of fraud but of the very need for welfare assistance. So why did the city do it that way?
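
To make the proxy problem concrete, here is a minimal, purely hypothetical Python sketch (numpy and scikit-learn), not a reconstruction of Rotterdam's actual model: when a risk classifier is trained on investigation outcomes that already skew toward people who need help (here, a made-up "language skill" variable), it learns to treat markers of need as markers of fraud risk.

# Purely illustrative sketch (not the actual Rotterdam system): a toy
# risk-scoring model trained on biased investigation outcomes. All feature
# names and numbers here are invented for the example.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 30_000  # roughly the number of recipients mentioned in the episode blurb

# Hypothetical features that track *need* for assistance, not dishonesty.
language_skill = rng.uniform(0.0, 1.0, n)   # 0 = low language proficiency
single_parent = rng.integers(0, 2, n)
age = rng.integers(18, 80, n)

# Training labels come from past investigations. In this toy world,
# investigators checked low-language-proficiency residents far more often,
# while actual misconduct is equally rare everywhere.
investigated = rng.random(n) < (0.05 + 0.4 * (1.0 - language_skill))
misconduct = rng.random(n) < 0.02
fraud_label = investigated & misconduct

X = np.column_stack([language_skill, single_parent, age])
model = GradientBoostingClassifier(random_state=0).fit(X, fraud_label)

# Every resident still gets a "fraud risk" score, and the model has learned
# that a marker of need (low language skill) looks like a marker of risk.
risk = model.predict_proba(X)[:, 1]
top = np.argsort(risk)[-100:]               # the 100 "riskiest" residents
print(f"population mean language skill:    {language_skill.mean():.2f}")
print(f"flagged group mean language skill: {language_skill[top].mean():.2f}")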
