<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<oembed>
	<type>rich</type>
	<version>1.0</version>
	<provider_name>Action Network</provider_name>
	<provider_url>https://actionnetwork.org</provider_url>
	
	<html>&lt;link href=&#39;https://actionnetwork.org/css/style-embed-v3.css&#39; rel=&#39;stylesheet&#39; type=&#39;text/css&#39; /&gt;&lt;script src=&#39;https://actionnetwork.org/widgets/v6/petition/google-scan-android-devices-for-csam?format=js&amp;amp;source=widget&#39;&gt;&lt;/script&gt;&lt;div id=&#39;can-petition-area-google-scan-android-devices-for-csam&#39; style=&#39;width: 100%&#39;&gt;&lt;!-- this div is the target for our HTML insertion --&gt;&lt;/div&gt;</html>
	<author_name>ParentsTogether</author_name>
	<author_url>https://actionnetwork.org/groups/the-ma-pa-project</author_url>
	<title>Google: Scan Android Devices for CSAM</title>
	<thumbnail_url>https://can2-prod.s3.amazonaws.com/petitions/photos/000/296/158/normal/iStock-536316013.jpg</thumbnail_url>
	<description>The trade in child sexual abuse material (CSAM) online is growing -- more than 65 million images were reported last year alone. Android is the most popular personal device OS that doesn’t find and report these heinous images. Google must commit to finding and reporting CSAM on Android. Apple recently announced an important new step in the fight against child sexual abuse: it will scan all new iPhones for child sexual abuse material (CSAM) and report it. This change means abuse can be discovered much sooner -- not days or weeks after images are uploaded or shared -- and could save child victims while protecting their privacy. These scans are done entirely by machines that look exclusively for a “digital fingerprint” of abusive material and flag it as potentially illegal -- a solution that balances the welfare and privacy of kids with that of iPhone users. Android phones, whose operating system is made by Google, don’t have the same on-device scanning in place. Users must upload photos to a service before abusive images can be detected -- allowing millions of images to be shared stealthily and victims to go undetected for longer. Google: stop failing kids and start scanning for CSAM on Android devices.</description>
	<url>https://actionnetwork.org/petitions/google-scan-android-devices-for-csam</url>
</oembed>