NCMEC's COO Michelle DeLaune on how Facebook combats so-called 'child porn'

ConnectSafely CEO Larry Magid speaks with Michelle DeLaune, senior VP and COO of the National Center for Missing & Exploited Children (NCMEC), about the center's work and new technology being used at Facebook to very quickly identify and remove child sexual abuse images, commonly referred to as "child pornography." The technology uses artificial intelligence and machine learning to identify images likely to violate Facebook's policies on child nudity or the sexual exploitation of children. In a blog post, Facebook Global Head of Safety Antigone Davis wrote that the company is using "artificial intelligence and machine learning to proactively detect child nudity and previously unknown child exploitative content when it's uploaded." Davis told Reuters that the technology "helps us prioritize" and "more efficiently queue" problematic content for the company's trained team of reviewers. Facebook said it had removed 8.7 million pieces of content, "99% of which was removed before anyone reported it."

Facebook and many other tech companies have long used photo-matching software called PhotoDNA to identify new copies of existing sexual abuse images. PhotoDNA was developed by Microsoft and made available, in cooperation with NCMEC, to other tech companies.

In addition to discussing NCMEC's work with Facebook, DeLaune talked about NCMEC's other work, including its prevention programs, and offered sound advice to parents on how to keep their children safe and what to do if they or their children encounter sexual abuse images.
