Google, like all responsible service providers across the tech industry, takes the protection of children seriously. It uses “proprietary technology to deter, detect, remove and report offences,” including the identification of child sexual abuse material (CSAM) in Gmail messages. When you sign up for Gmail, you agree to Google’s terms and conditions, which allow such searches to happen. Now, the U.S. Court of Appeals for the Second Circuit has ruled that this does not mean a further search, carried out after details of the initial findings are forwarded to law enforcement, can bypass Fourth Amendment protections.
How Google Detects CSAM In Gmail Messages
Google describes the measures it takes to identify and report child sexual abuse material in some detail online. These include specialist teams at Google as well as technological solutions such as machine-learning classifiers and hash-matching. It’s the latter, hash-matching, that sits at the center of this new appeals court ruling. Think of a hash as a digital fingerprint left behind by any image or video file; like fingerprints, hashes are unique to each specific file. That means Google can detect the hashes associated with known CSAM images and videos within Gmail messages without actually viewing the offensive and illegal material itself. “When we find CSAM, we report it to the National Center for Missing and Exploited Children,” Google said, “which liaises with law enforcement agencies around the world.” It’s sad to report that this has proved remarkably successful; sad because so many images have been identified, but positive because it means law enforcement can take action against the people distributing them.
To recap, then, Google’s terms of service prohibit using any of its platforms, including Gmail, to store or share CSAM content. Hash-matching technology enables Google to detect such content in Gmail messages without a human having to read the email and without ever actually viewing the image itself, just the hash.
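The hash-matching idea described above can be sketched in a few lines of code. This is a simplified illustration only: Google’s actual systems rely on proprietary matching technology, and the known-hash database here is entirely hypothetical; a plain SHA-256 digest stands in for whatever hash format such systems really use.

```python
import hashlib

# Hypothetical database of hashes of known prohibited files.
# (This entry is the SHA-256 digest of the bytes b"example file",
# used purely for illustration.)
KNOWN_HASHES = {
    hashlib.sha256(b"example file").hexdigest(),
}

def file_hash(data: bytes) -> str:
    """A file's digital 'fingerprint': identical bytes always
    produce the identical hash."""
    return hashlib.sha256(data).hexdigest()

def matches_known_content(data: bytes) -> bool:
    """Flag a file whose hash appears in the known-content list,
    without any human ever viewing the file itself."""
    return file_hash(data) in KNOWN_HASHES
```

The key property the court ruling turns on is visible here: the matching process reveals only that a file’s fingerprint equals that of a previously identified file, not what the file depicts.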
Gmail Child Abuse Material Report Led To Law Enforcement Overreach, Court Rules
As picked up by reporters at TechDirt, a Second Circuit appeals court has ruled on a case appealed from the United States District Court for the Northern District of New York. The case revolved around a man who was convicted of possession of CSAM images but who appealed on the basis that the law enforcement warrant was “tainted by previous unconstitutional intrusions.”
The detected CSAM hash had been passed to the National Center for Missing and Exploited Children and then to law enforcement to investigate and potentially prosecute. Law enforcement, however, was found to have made a visual examination of the child abuse image rather than relying on the hash alone. “They went beyond the scope of Google’s private algorithmic search,” TechDirt reported, “in that they learned more than the hash value for the Maher file image; they learned exactly what was depicted in that image.”
And that’s where the court ruling comes in. Because the examination was made before a warrant was obtained, the government could not claim that law enforcement was merely the beneficiary of a private search: Google had never actually viewed the Gmail CSAM image in question, so the first time anyone other than the perpetrator saw it was when investigators opened it. Sadly, law enforcement could easily have obtained a warrant, with the hash itself supplying probable cause, but for whatever reason opted not to do so until after the additional search.
The Gmail Search Ruling
So, Google’s terms of service state that it may review content and share it with a third party if necessary to comply with any applicable law, such as when it has actual knowledge of CSAM on its platform. The court ruling holds that the perpetrator’s “reasonable expectation of privacy” for that content with regard to government access is not overridden by Google’s terms. As TechDirt explained so eloquently, “agreeing to share things with private company third parties is not nearly the same thing as agreeing to share the same things with the government at any point the government desires to access content or communications.”
The good news is that the conviction stands, as a good-faith exception applied to the search on this occasion. The better news is that the ruling sends a message to law enforcement not to overstep the mark, and to follow the correct search warrant procedure, when it comes to material found in Gmail. We all want to see such perpetrators brought to justice, and procedural errors that might prevent that from happening must be avoided.