Content provided by Larry Magid. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Larry Magid or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
Symantec's Paige Hanson on National Cybersecurity Awareness Month
ConnectSafely CEO Larry Magid speaks with Michelle DeLaune, senior vice president and COO of the National Center for Missing & Exploited Children (NCMEC), about the center's work and new technology being used at Facebook to quickly identify and remove child sexual abuse images, commonly referred to as "child pornography."
The technology uses artificial intelligence and machine learning to identify images that likely violate Facebook's child nudity or sexual exploitation of children policies. In a blog post, Facebook Global Head of Safety Antigone Davis wrote that the company is using "artificial intelligence and machine learning to proactively detect child nudity and previously unknown child exploitative content when it’s uploaded." Davis told Reuters that the technology "helps us prioritize" and "more efficiently queue" problematic content for the company's trained team of reviewers.
Facebook said that it had removed 8.7 million pieces of content, "99% of which was removed before anyone reported it."
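To make the "prioritize" and "more efficiently queue" idea concrete, here is a minimal sketch of a review triage queue. It assumes a hypothetical classifier score for each upload; it is an illustration of the general approach, not Facebook's actual system.

```python
# Minimal sketch: route uploads by a (hypothetical) policy-violation score.
# Highest-risk items reach human reviewers first; near-certain violations
# are removed automatically, before anyone reports them.
import heapq
from dataclasses import dataclass, field


@dataclass(order=True)
class ReviewItem:
    priority: float                      # negated score, so riskiest pops first
    content_id: str = field(compare=False)


class ReviewQueue:
    def __init__(self, auto_action_threshold: float = 0.99):
        self._heap: list[ReviewItem] = []
        self.auto_action_threshold = auto_action_threshold

    def submit(self, content_id: str, score: float) -> str:
        """Route an upload based on its classifier score (0.0 to 1.0)."""
        if score >= self.auto_action_threshold:
            return "removed"             # proactive removal, no report needed
        heapq.heappush(self._heap, ReviewItem(-score, content_id))
        return "queued"

    def next_for_review(self) -> str | None:
        """Hand the riskiest queued item to a trained reviewer."""
        return heapq.heappop(self._heap).content_id if self._heap else None
```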
Facebook and many other tech companies have long used photo-matching software called PhotoDNA to identify new copies of known sexual abuse images. PhotoDNA was developed by Microsoft and made available to other tech companies in cooperation with NCMEC.
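PhotoDNA itself is proprietary, but the general photo-matching idea is to reduce each image to a compact signature and compare it against signatures of already-identified abuse images. The sketch below uses a simple "average hash" as a stand-in signature (an assumption for illustration only, not PhotoDNA's algorithm) and relies on the Pillow imaging library.

```python
# Illustrative stand-in for hash-based photo matching (not PhotoDNA).
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit signature: grayscale, shrink, threshold at the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Count differing bits between two signatures."""
    return bin(a ^ b).count("1")


def matches_known(candidate: int, known_hashes: set[int], max_distance: int = 5) -> bool:
    """Flag an image whose signature is close to any signature of known abuse imagery."""
    return any(hamming(candidate, h) <= max_distance for h in known_hashes)
```

A small Hamming-distance tolerance lets the match survive minor edits such as resizing or recompression, which is the point of perceptual matching over exact file hashes.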
In addition to discussing NCMEC's work with Facebook, DeLaune talks about NCMEC's other work, including its prevention programs, and offers sound advice to parents on how to keep their children safe and what to do if they or their children encounter sexual abuse images.
The podcast runs 14 minutes.