Thursday, May 19, 2022

Facebook moderators ‘err on the side of an adult’ when unsure of age in potential abuse photos


A key responsibility of technology companies is to monitor the content on their platforms for child sexual abuse material (CSAM); if such material is found, they are required by law to report it to the National Center for Missing and Exploited Children (NCMEC). Many companies employ content moderators who review content flagged as potential CSAM and determine whether it should be reported to NCMEC.

However, Facebook has a policy that could lead to underreporting of child sexual abuse, according to a new report from The New York Times. A Facebook training document instructs content moderators to “err on the side of an adult” when they don’t know the age of someone in a photo or video suspected of being CSAM, the report says.

The policy applies to Facebook content moderators who work at Accenture and is discussed in a California Law Review article from August:

Interviewees also described a policy called “bumping up” that each of them personally disagreed with. The policy applies when a content moderator cannot readily determine whether the subject in a suspected CSAM photo is a minor (“B”) or an adult (“C”). In such situations, content moderators are instructed to assume the subject is an adult, thereby allowing more images to go unreported to NCMEC.

Here’s the company’s reasoning for the policy, from The New York Times:

Antigone Davis, head of safety for Meta, confirmed the policy in an interview, saying it arose out of concerns about the privacy of those who post sexual images of adults. “The sexual abuse of children online is abhorrent,” said Ms. Davis, stressing that Meta uses a multi-layered, rigorous review process that flags far more images than any other tech company. She said the consequences of falsely flagging child sexual abuse could be “life-changing” for users.

Facebook (which now falls under Meta’s corporate umbrella) and Accenture did not immediately respond to a request for comment from The Verge. Accenture declined to comment to The New York Times.
