
Data poisoning: how artists are trying to sabotage generative AI

26:07
 
Content provided by The Conversation. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by The Conversation or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://player.fm/legal.

Content created with the help of generative artificial intelligence is popping up everywhere, and it’s worrying some artists. They’re concerned that their intellectual property may be at risk when generative AI tools are built by scraping the internet for data and images, regardless of whether the developers had permission to do so.


In this episode, we speak with a computer scientist about how some artists are using a novel technique called data poisoning to sabotage generative AI systems that scrape their work, and why he thinks the root of the issue is an ethical problem at the heart of computer science.


Featuring Daniel Angus, professor of digital communication at Queensland University of Technology in Australia. Plus an introduction from Eric Smalley, science and technology editor at The Conversation in the US.


This episode was written and produced by Katie Flood with assistance from Mend Mariwany. Eloise Stevens does our sound design, and our theme music is by Neeta Sarl. Gemma Ware is the executive producer. Full credits available here. A transcript will be available shortly. Subscribe to a free daily newsletter from The Conversation.

Further reading



Hosted on Acast. See acast.com/privacy for more information.
