More than 41 million videos of child sexual abuse were reported to the National Center for Missing and Exploited Children in 2019. Just five years earlier, fewer than 350,000 videos were reported. Many were reported more than once across multiple platforms as users shared the illegal content.
Facebook reported almost 60 million photos and videos, the Times report states. Of that content, only 29.2 million items were considered "violating."
Google reported 3.5 million total videos and photos in about 449,000 reports. Imgur reported 260,000 photos and videos based on 74,000 reports. Apple was apparently one of the lower-reporting companies, tipping 3,000 images and no videos.
Last August, Facebook open-sourced the algorithms it uses to identify child sexual exploitation and other graphic content on its platform. So even though it tops this list, that may be because it is doing more than other companies to find such content.
Antigone Davis, Facebook’s global head of safety, said the company "would continue to develop the best solutions to keep more children safe."
Some cloud storage services, including Amazon’s, don’t scan for illegal content. Content on Apple’s messaging app is encrypted, so Apple can’t scan it to find illegal material.
Facebook is considering moving toward encryption, but it is taking a lot of flak for it. Meanwhile, a draft bill to create a national commission on online child exploitation prevention would reduce legal protections for websites.