Stop Child Sexual Abuse on Facebook
Mark Zuckerberg, CEO, Facebook
Newly leaked documents from Facebook show that the company is encouraging its moderators to avoid legally required reporting of child sexual abuse images by "bumping up" the age of potential victims. Most major tech companies take the opposite approach to child sexual abuse images, prioritizing child safety and reporting images they are unsure about so they can be investigated further.
Child sexual abuse images are rampant on Facebook: last year the company made more than 27 million reports of them. Yet Facebook moderators are trained to identify these images using a 50-year-old puberty scale that was never designed to estimate age and can classify a 13-year-old as a "fully developed" adult. Moderators also report receiving poor performance reviews if they fail to "bump up" the ages of people in sexual images.
Facebook must immediately change these harmful practices and follow the industry standard, or it will leave millions of children at risk of sexual exploitation.
To:
Mark Zuckerberg, CEO, Facebook
From:
[Your Name]
Dear Mark Zuckerberg,
Newly leaked documents from Facebook have revealed that your company may be failing to recognize and protect millions of child sexual abuse victims. The policy of asking content reviewers to "bump up" the ages of people in the sexual imagery they review inevitably allows millions of sexual images of children to spread undetected on Facebook. Most of your competitors take the opposite approach to child sexual abuse images, prioritizing child safety and reporting images they are unsure about so they can be investigated further. As parents and people who care about child safety online, we call on Facebook to make the following immediate changes:
- Update the training for content reviewers to modern, evidence-based methods of age estimation
- End the policy of treating all "unclear" photos as depicting adults, and train reviewers to instead apply a reasonable decision-making framework
- Stop penalizing reviewers who judge that photos likely depict children