Content provided by Max Bodach and Foundation for American Innovation. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Max Bodach and Foundation for American Innovation or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
What Should Be Done About Misinformation? w/Renée DiResta

1:14:06
Manage episode 435597157 series 3530279
The recent riots in the United Kingdom raise new questions about online free speech and misinformation. Following the murder of three children in Southport, England, false rumors spread across social media about the killer’s identity and religion, igniting simmering resentment over the British government’s handling of immigration in recent years. X, formerly Twitter, has come under fire for allowing the rumors to spread, and the company’s owner Elon Musk has publicly sparred with British politicians and European Union regulators over the issue.

The incident is the latest in an ongoing debate abroad and in the U.S. about free speech and the real-world impact of online misinformation. In the U.S., politicians have griped for years about the content policies of major platforms like YouTube and Facebook—generally with conservatives complaining the companies are too censorious and liberals bemoaning that they don’t take down enough misinformation and hate speech.

Where should the line be? Is it possible for platforms to respect free expression while removing “harmful content” and misinformation? Who gets to decide what is true and false, and what role, if any, should the government play? Evan is joined by Renée DiResta, who studies and writes about adversarial abuse online. Previously, she was a research manager at the Stanford Internet Observatory, where she researched and investigated online political speech and foreign influence campaigns. She is the author of Invisible Rulers: The People Who Turn Lies into Reality and recently wrote an op-ed on the topic in The New York Times.

80 episodes
