
In the wake of the child sexual abuse scandal engulfing Facebook and other social media platforms, India's Information and Communications Technology (ICT) Minister Ravi Shankar Prasad has called on the tech giants to do more to tackle the problem.

Tech companies monitor content on their platforms for child sexual abuse material (CSAM), and any they find they are legally required to report to the National Center for Missing and Exploited Children (NCMEC). Many companies employ content moderators who review content flagged as potential CSAM.

A training document directs content moderators to 'err on the side of an adult' when they can't determine someone's age in a photo or video suspected to be CSAM.

The policy was made for Facebook content moderators working at Accenture and is discussed in a California Law Review article from August.

The policy applies when a content moderator is unable to determine whether the subject in a suspected CSAM photo is a minor. In such situations, content moderators are instructed to assume the subject is an adult, allowing more images to go unreported to NCMEC.

The New York Times reported on the policy last week, and the company confirmed it and explained its reasoning.

Antigone Davis, head of safety for Meta, confirmed the policy in an interview and said it stemmed from privacy concerns for those who post sexual imagery of adults. Davis said the consequences of falsely flagging child sexual abuse could be 'life-changing' for users.

Accenture didn't immediately reply to a request for comment and declined to comment to the New York Times. Facebook pointed to Davis's quotes in the Times.



Written by Nuked
