Apple has been scanning iCloud Mail for CSAM since 2019

As Apple prepares to start scanning iPhones and iPads for child sexual abuse material (CSAM) with the release of iOS 15, new details have emerged revealing that the company already scans iCloud Mail for CSAM.

According to a new report from 9to5Mac, the iPhone maker has been scanning iCloud Mail for CSAM since 2019, though it has not yet begun scanning iCloud Photos or iCloud backups for such material.

The news outlet decided to investigate the matter further after The Verge spotted an iMessage thread, related to Epic's lawsuit against Apple, in which the company's anti-fraud chief Eric Friedman said that Apple is "the greatest platform for distributing child porn".

While Friedman's statement certainly wasn't meant to be seen publicly, it does raise the question of how Apple could know this without scanning iCloud Photos.

Scanning iCloud Mail

9to5Mac's Ben Lovejoy reached out to Apple to find out more about the matter, and the company confirmed that it has been scanning both outgoing and incoming iCloud Mail attachments for CSAM since 2019. As emails sent using iCloud Mail aren't encrypted, scanning attachments as mail passes through Apple's servers is not a difficult thing to do.
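Apple has not described how its mail scanning works. Services that scan attachments server-side typically compare each file against a database of hashes of known illegal images (perceptual-hash systems such as PhotoDNA are the industry norm, so that re-encoded copies still match). The minimal sketch below illustrates only the general hash-matching idea, using a plain cryptographic hash and a placeholder hash list; none of it reflects Apple's actual implementation.

```python
import hashlib

# Hypothetical hash list for illustration only; real systems hold
# perceptual hashes of known images, supplied by organizations like NCMEC.
# (The single entry here is just the SHA-256 of the bytes b"test".)
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def attachment_matches(data: bytes) -> bool:
    """Return True if this attachment's hash appears in the known-bad list."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES

def scan_message(attachments: list[bytes]) -> bool:
    """Scan every attachment of one message; flag the message on any match."""
    return any(attachment_matches(a) for a in attachments)
```

Because the scan runs on plaintext already passing through the provider's servers, it requires no changes on the user's device, which is what distinguishes it from the on-device photo scanning Apple announced for iOS 15.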

In addition to scanning attachments, Apple also told 9to5Mac that it does some limited scanning of other data, though the company didn't specify what this data is. It did, however, say that this "other data" doesn't include iCloud backups.

Back in January of last year, Apple's chief privacy officer Jane Horvath said at a tech conference that the company uses screening technology to look for illegal images and that it disables accounts if evidence of CSAM is found.

While Friedman's statement initially sounded as if it was based on hard data, it likely wasn't. Instead, he made the inference that since the company previously didn't scan iCloud Photos or iCloud backups, more CSAM would likely exist on Apple's platform compared with other cloud services that actively scan photos for CSAM.

We'll likely find out more about how Apple plans to combat CSAM on its platform once the company rolls out its Child Safety photo scanning with the release of iOS 15 this fall.

Via 9to5Mac