Content provided by Debra J. Farber (Shifting Privacy Left). All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Debra J. Farber (Shifting Privacy Left) or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

S2E17 - Noise in the Machine: How to Assess, Design & Deploy 'Differential Privacy' with Damien Desfontaines (Tumult Labs)

46:07
 

Manage episode 362233138 series 3407760

In this week’s episode, I speak with Damien Desfontaines, also known by the pseudonym “Ted,” a Staff Scientist at Tumult Labs, a startup leading the way on differential privacy. Earlier in his career, Damien led an anonymization consulting team at Google; he specializes in making it easy to safely anonymize data. He earned his PhD at ETH Zurich, where he wrote his thesis, and holds a Master's degree in Mathematical Logic and Theoretical Computer Science.

Tumult Labs’ platform makes differential privacy useful by making it easy to create innovative, privacy-enabled data products that can be safely shared and used widely. In this conversation, we focus on differential privacy techniques: what’s next in their evolution, common vulnerabilities, and how to implement differential privacy in your platform.

When it comes to protecting personal data, Tumult Labs takes a three-stage approach: Assess, Design, and Deploy. Damien takes us on a deep dive into each, with use cases.
Topics Covered:

  • Why there's such a gap between academia and the corporate world
  • How differential privacy's strong privacy guarantees are a result of strong assumptions; and why the biggest blockers to DP deployments have been education & usability
  • When to use "local" vs "central" differential privacy techniques
  • Advancements in technology that enable the private collection of data
  • Tumult Labs' Assessment approach to deploying differential privacy, where a customer defines its 'data publication' problem or question
  • How the Tumult Analytics platform can help you build differential privacy algorithms that satisfy 'fitness for use' requirements
  • Why using gold standard techniques like differential privacy to safely release, publish, or share data has value far beyond compliance
  • How data scientists can make the analysis & design more robust to better preserve privacy; and the tradeoff between utility on very specific tasks & number of tasks that you can possibly answer
  • Damien's work helping the IRS & DOE deploy differential privacy to safely publish and share data publicly via the College Scorecards project
  • How to address security vulnerabilities (i.e. potential attacks) to differentially private datasets
  • Where you can learn more about differential privacy
  • How Damien sees this space evolving over the next several years
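To make the "local" vs. "central" distinction and the role of noise concrete, here is a minimal sketch of the central model's simplest mechanism: a trusted curator computes a true count and adds Laplace noise calibrated to the query's sensitivity before releasing it. This is a generic illustration, not Tumult Analytics code; the names `laplace_sample` and `dp_count` are hypothetical.

```python
import math
import random

def laplace_sample(scale: float) -> float:
    # Draw one sample from Laplace(0, scale) via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    # A count query has sensitivity 1: adding or removing one person's
    # record changes the count by at most 1. Laplace noise with
    # scale = 1 / epsilon therefore gives epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_sample(1.0 / epsilon)
```

Smaller values of epsilon mean larger noise and stronger privacy; in the local model, by contrast, each participant would add noise to their own record before it ever reaches the curator, which is why local DP typically needs far more data to achieve comparable accuracy.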

Resources Mentioned:

Guest Info:


Privado.ai
Privacy assurance at the speed of product development. Get instant visibility w/ privacy code scans.
Shifting Privacy Left Media
Where privacy engineers gather, share, & learn
Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.
Copyright © 2022 - 2024 Principled LLC. All rights reserved.


Chapters

1. S2E17 - Noise in the Machine: How to Assess, Design & Deploy 'Differential Privacy' with Damien Desfontaines (Tumult Labs) (00:00:00)

2. Introducing Damien Desfontaines, PhD (00:01:15)

3. Why there's such a gap between academia and the corporate world (00:03:34)

4. How differential privacy's strong privacy guarantees are a result of strong assumptions; and why the biggest blockers to DP deployments have been education & usability (00:05:19)

5. When to use "local" vs "central" differential privacy techniques (00:08:03)

6. Damien describes advancements in technology that enable the private collection of data (i.e., multi-party computation, secure computation, federated learning) that can be used with local DP (00:11:56)

7. Damien describes Tumult Labs' Assessment approach to deploying differential privacy, where a customer would define its 'data publication' problem or question. (00:14:32)

8. Damien describes how the open source Tumult Analytics platform can help you build differential privacy algorithms that satisfy 'fitness for use' requirements (00:17:08)

9. Why using gold-standard techniques like differential privacy to safely release, publish, or share data goes beyond compliance to unlock the value of company data (00:19:13)

10. What's involved with deploying differentially private algorithms via Tumult Labs' platform (00:20:37)

11. Damien's litmus test for when it's appropriate to use differential privacy (00:21:49)

12. How data scientists can make the analysis & design more robust to better preserve privacy; and the tradeoff between utility on very specific tasks and number of tasks that you can possibly answer (00:26:25)

13. Damien describes his work helping the IRS & DOE deploy differential privacy to safely publish and share data publicly via the College Scorecards project (00:30:27)

14. Damien discusses security vulnerabilities (i.e. potential attacks) to differentially private datasets (00:33:02)

15. Where you can learn more about differential privacy (00:37:24)

16. How Damien sees this space evolving over the next several years (00:40:18)

63 episodes
