Leaked document shows that Facebook may not be reporting enough photos of child abuse

A training document used by Facebook’s content moderators raises questions about whether the social network underreports images of possible child sexual abuse, The New York Times reports. The document reportedly tells moderators to “side with an adult” when reviewing images, a practice that moderators have objected to but company leaders have championed.

At issue is how Facebook moderators should handle images in which the subject’s age isn’t immediately apparent. That decision could have significant implications, as suspected child abuse images are reported to the National Center for Missing and Exploited Children (NCMEC), which refers images to law enforcement. In contrast, images depicting adults can be removed from Facebook if they violate the rules, but they are not reported to outside authorities.

But, as The NYT points out, there’s no reliable way to determine age from a photo. Moderators have reportedly been trained to use a more than 50-year-old method to identify “the progressive stages of puberty,” but the methodology “isn’t designed to determine a person’s age.” And since Facebook’s guidelines instruct moderators to assume that people whose age they can’t determine are adults, moderators suspect many images of children are slipping through.

This is further complicated by the fact that Facebook’s contract moderators, who work for outside companies and don’t get the same benefits as full-time employees, may have only seconds to make a decision and may be penalized for making the wrong call.

Facebook, which reports more child sexual abuse material to NCMEC than any other company, says erring on the side of adults is meant to protect users’ privacy and to prevent false reports that could limit authorities’ ability to investigate actual cases of abuse. The company’s head of safety, Antigone Davis, told the paper that reports which turn out to be false could also carry legal consequences for the company. Notably, not every company shares Facebook’s philosophy on this subject: Apple, Snap and TikTok are all reportedly taking “the opposite approach” and reporting images when they are unsure of a subject’s age.

