Apple has confirmed to me that it has been scanning iCloud Mail for CSAM since 2019. Neither iCloud Photos nor iCloud backups, however, have ever been scanned.
The explanation came after I questioned a remark by the company’s anti-fraud chief, Eric Friedman, who said that Apple was “the largest platform for disseminating child porn.” That raised an obvious question: how could the company know this if it wasn’t scanning iCloud Photos?
There are also a handful of other clues that Apple was scanning for CSAM. An archived version of Apple’s child safety page stated (emphasis added):
Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.
In January 2020, the company’s chief privacy officer said much the same thing:
Jane Horvath, Apple’s chief privacy officer, said at a tech conference that the company uses screening technology to look for the illegal images. The company says it disables accounts if Apple finds evidence of child exploitation material, although it does not specify how it discovers it.
Apple declined to comment on Friedman’s remark, but did confirm that iCloud Photos had never been scanned.
Apple scans iCloud Mail
Apple confirmed that it has been scanning both outgoing and incoming iCloud Mail for CSAM attachments since 2019. Because email is not encrypted, scanning attachments as messages pass through Apple’s servers is a trivial task.
Apple also said it performs some limited scanning of other data, but would not say what that data is, beyond indicating that the scale is tiny. It did confirm that iCloud backups are not included in this “other data.”
Although Friedman’s remark sounds definitive – as if it were based on hard evidence – it now appears that it was not. From what we know, Apple files only a few hundred CSAM reports each year, so email scanning alone would not provide evidence of a large-scale problem on Apple’s servers.
The likely explanation is that other cloud providers were scanning photos for CSAM while Apple was not. If other services were disabling accounts for uploading CSAM, but iCloud Photos wasn’t (because Apple wasn’t scanning there), the logical inference would be that more CSAM exists on Apple’s platform than anywhere else. Friedman was most likely doing nothing more than drawing that inference.