Google: Scan Android Devices for CSAM
Hiroshi Lockheimer, SVP of Android at Google

The online trade in child sexual abuse material (CSAM) is growing -- more than 65 million images were reported last year alone. Android is the most popular personal-device operating system that does not find and report these heinous images. Google must commit to finding and reporting CSAM on Android.
Apple recently announced an important new step in the fight against child sexual abuse: it will scan all new iPhones for child sexual abuse material (CSAM) and report it. This change means abuse can be discovered much sooner -- not days or weeks later, after images have already been uploaded or shared -- and could save child victims and protect their privacy. These scans are performed entirely by machines that look exclusively for the “digital fingerprints” of known abusive material and flag matches as potentially illegal -- a solution that balances the welfare and privacy of kids with that of iPhone users.
Android phones, whose operating system is built and maintained by Google, have no equivalent on-device scanning. Abusive images are detected only after users upload photos to a service -- allowing millions of images to be shared stealthily and victims to go undetected for longer.
Google, stop failing kids and start scanning for CSAM on Android devices
To:
Hiroshi Lockheimer, SVP of Android at Google
From:
[Your Name]
Dear Mr. Lockheimer,
Apple recently announced it will scan iPhones for images of child sexual abuse, a massive global issue with more than 65 million images reported last year. This change will help identify child victims sooner and protect them from the repeated trauma of their abuse images spreading far and wide. As parents, we are calling on you to implement the same narrowly focused scanning and reporting on Android devices. In doing so, you would protect children from harm without violating the privacy of Android users.