
Google’s Child Abuse Detection Tools Can Also Identify Illegal Drawings of Children

Posted in law, mobile phones

Apple was hit with a wave of criticism earlier this year when it announced plans to scan iPhones to stop the distribution of Child Sexual Abuse Material (CSAM). Critics fretted that Apple’s hash-checking system could be co-opted by governments to spy on law-abiding iPhone users. In response to the backlash, Apple might end up making changes to that program, but Google has its own way of spotting CSAM, and it might be even more intrusive for those who use all of Google’s cloud services.

The specifics on Google’s CSAM scanning come by way of a warrant issued in early 2020 and spotted by Forbes. According to the filing, Google detected CSAM in Google Drive, its cloud storage platform. And here’s where things get a little weird: the warrant stemming from this report targeted digital artwork, not a photo or video depicting child abuse.

Apple’s system, under its “Expanded Protections for Children” banner, compares hashes of images on iDevices against a list of hashes of known child abuse material. This should prevent false positives, and it doesn’t require Apple to look at any of the files on your phone. The issue cited most often with this approach is that Apple is still scanning personal files on your smartphone, and it could become a privacy nightmare if someone managed to substitute different hashes. Apple says this isn’t possible, though.
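To make the hash-matching idea concrete, here is a minimal sketch of checking files against a list of known hashes. Note the heavy simplification: Apple’s actual system uses a perceptual hash (NeuralHash) and cryptographic blinding so matches aren’t revealed on-device, whereas this toy example uses plain SHA-256 and invented function names purely for illustration.

```python
# Toy illustration of hash-list matching. This is NOT Apple's algorithm:
# the real system uses a perceptual hash (NeuralHash) plus cryptographic
# protocols; here we use a plain SHA-256 lookup to show the general idea.

import hashlib

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hashes(data: bytes, known_hashes: set[str]) -> bool:
    """True only if the file's hash is in the known-hash list."""
    return file_hash(data) in known_hashes

# Hypothetical known-hash list built from one example file.
known = {file_hash(b"flagged-example")}

print(matches_known_hashes(b"flagged-example", known))  # exact copy matches
print(matches_known_hashes(b"harmless photo", known))   # anything else does not
```

Because a cryptographic hash only matches exact copies, this approach can’t flag a file that merely resembles known material; that is why a perceptual hash, which tolerates resizing and re-encoding, is used in practice.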
