Content provided by Stanford Women in Data Science (WiDS) initiative, Professor Margot Gerritsen, and Chisoo Lyons. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Stanford Women in Data Science (WiDS) initiative, Professor Margot Gerritsen, and Chisoo Lyons or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
Natalie Evans Harris | Creating A Shared Code Of Ethics To Guide Ethical and Responsible Use of Data

30:31
 

During her career at the National Security Agency, on Capitol Hill, and at the White House, Natalie Evans Harris saw that while we collected troves of data, we didn't have strong frameworks and governance in place to protect people in a data-driven world. “Data has been used to intrude in our lives. Things are happening based upon data that nobody communicated to the public was actually happening,” she explained in a conversation with Margot Gerritsen, Stanford professor and host of the Women in Data Science podcast.

Data ethics and responsible use of data are essentially about building trust, and there is a gap in public understanding of what sharing data means. Two things have to happen if people are going to allow a company to use their data: individuals have to trust that what the company is doing with that data is something they're okay with, and the company has to be able to prove that it is using the data responsibly. A company can have the best products on the market, but if people don't trust it or don't understand what it is doing with their data, they won't allow it to use that data. And then innovation stops.

She believes the biggest problem is that we do not have a shared vision of what ethical practice means. We shouldn't put broad-impact laws in place to govern responsible use of data while we are still trying to define that vision. To change business practices, we have to change the expectations placed on companies, so that they are not only incentivized to build ethical and responsible business models but also penalized when they violate those expectations.

Harris has been advocating for a data science “code of ethics”: a shared vision to guide behavior and a foundation on which to develop best practices. Companies are now taking this code of ethics and adapting it to their businesses around principles like informed consent, transparency, fairness, and diversity, then publicizing the practices they put in place to align with those principles. That is how a shared vision starts to take shape.

She also sees a transformation underway in the relationship between technology and people. For a long time, technology was a passive presence in our lives; now, with AI, machine learning, and other data-driven systems, there is tension around what technology can do versus what humans should do. Until people know and understand what is happening with their data, and until companies can express transparently and thoughtfully what they are doing with it, that tension will persist. She hopes this code of ethics can start to ease it.

RELATED LINKS
Connect with Natalie Evans Harris on Twitter (@QuietStormnat) and LinkedIn
Find out more about Natalie on her personal website
Connect with Margot Gerritsen on Twitter (@margootjeg) and LinkedIn
Find out more about Margot on her Stanford Profile
Find out more about Margot on her personal website
Read more about BrightHive and Beeck Center

Listen and Subscribe to the WiDS Podcast on Apple Podcasts, Google Podcasts, Spotify, Stitcher


52 episodes

