Tech companies monitor content on their platforms for child sexual abuse material (CSAM) and, if any is found, are required to report it to the National Center for Missing and Exploited Children (NCMEC). Many companies employ content moderators who review content flagged as potential CSAM.
A training document directs content moderators to ‘err on the side of an adult’ when they don’t know the age of someone in a photo or video suspected to be CSAM.
The policy was created for Facebook content moderators working at Accenture and is discussed in a California Law Review article published in August.
The policy applies when a content moderator cannot determine whether the subject of a suspected CSAM photo is a minor. In such cases, moderators are instructed to assume the subject is an adult, which allows more images to go unreported to NCMEC.
Meta confirmed the policy to the New York Times last week and explained its reasoning. Antigone Davis, Meta’s head of safety, confirmed the policy in an interview and said it stemmed from privacy concerns for those who post sexual imagery of adults. Davis said the consequences of falsely flagging child sexual abuse could be ‘life-changing’ for users.
Accenture did not immediately reply to a request for comment. Facebook pointed to Davis’s quotes in the New York Times; Accenture also declined to comment to the Times.