Content provided by Stanford Women in Data Science (WiDS) initiative, Professor Margot Gerritsen, and Chisoo Lyons. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Stanford Women in Data Science (WiDS) initiative, Professor Margot Gerritsen, and Chisoo Lyons or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

Karen Hao | Covering AI and Ethics Washing in the Tech Industry

34:20
 

Karen Hao trained as a mechanical engineer and then joined a Silicon Valley startup, thinking that technology was the best means to create social change. While surrounded by smart people who were also passionate about using technology for social change, she soon discovered there were no incentives or pathways to accomplish this. “When you're inside a technology company and you're thinking this is going to help change the world, you're often blind to unintended consequences of your work,” she says.

She decided to transition to a career in journalism, where she could create social change by raising awareness about the social impacts of technologies like AI and the ways big tech companies engage in “ethics washing” to protect their profits. She is intrigued by how incentives shape the work that gets done at a systemic level. Every tech giant, she says, has people inside who deeply care about ethics, but that doesn't mean the company is willing to change how its profitable technologies work, and employees are disincentivized to push for change because they could be fired.

A high-profile example is Google's Ethical AI team, which was doing strong work critiquing some of Google's practices and tried to publish a paper on the topic. Google refused to let the team publish it, censored the research, and then fired both team leads. It later came out that this was not an isolated instance of academic censorship: Google had told many other researchers to strike a positive tone when discussing technology the company was developing.

For her article “How Facebook Got Addicted to Spreading Misinformation,” she conducted a nine-month investigation into Facebook's Responsible AI team, which was supposed to understand and mitigate the unintended consequences of Facebook's algorithms. She found the team focused only on the unintended consequences whose mitigation is compatible with Facebook's growth, such as AI bias. It completely ignored the most consequential harms of Facebook's algorithms (misinformation amplification and polarization exacerbation, especially in the wake of the January 6 Capitol riot) because addressing them would undermine that growth. At times Facebook was not only ignoring or negligent about the issues its algorithms might be causing, but also purposely undermining efforts to fix them because of this tension with the company's growth.

She was glad to see policymakers cite her article at a recent congressional hearing, and she hopes Congress has the political will to regulate companies like Facebook. It is also important, she says, for every new generation of Facebook employees to become educated about these issues so they will hold the company accountable. She thinks AI research has shifted over the last five years toward taking more responsibility for societal impacts, and that part of this evolution is being driven by people on the inside who raised awareness and advocated for change.

One of Karen’s inspirations to go into journalism was Rachel Carson's book Silent Spring, which sparked a widespread environmental movement. She strives to write stories that activate the same level of change, transforming both the cultural discussion and the policy around important issues.

RELATED LINKS
Connect with Karen Hao on LinkedIn and Twitter
Find out more about Karen on her website
Find out more about MIT Technology Review
Connect with Margot Gerritsen on Twitter (@margootjeg) and LinkedIn
Find out more about Margot on her Stanford Profile
