Apple says that its technology is good enough that the false-positive rate is less than one in a trillion, which is great from a technical standpoint. That means there is almost no chance that images will be flagged unless they are actually known CSAM. Philosophically, however, that’s irrelevant.
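The reason the claim can be that narrow is that the system compares image fingerprints against a database of known material rather than interpreting what is in a photo. Here is a minimal sketch of that match-against-known-hashes idea; all names are hypothetical, and I've used a plain cryptographic digest as a stand-in, whereas Apple's actual system uses a perceptual hash (NeuralHash) combined with on-device threshold cryptography, which this toy version omits entirely:

```python
import hashlib

def digest(data: bytes) -> str:
    """Hex digest standing in for a perceptual image fingerprint."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-in for the database of fingerprints of known,
# already-identified images supplied by child-safety organizations.
KNOWN_HASHES = {digest(b"known-image-bytes")}

def is_flagged(photo_bytes: bytes) -> bool:
    # A photo is flagged only on a match against the known set;
    # the scan learns nothing about photos that don't match.
    return digest(photo_bytes) in KNOWN_HASHES
```

A real perceptual hash, unlike the exact digest above, is designed to survive resizing and re-encoding, which is where near-miss collisions can occur, and why Apple quotes a false-positive rate at all.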
Apple has apparently been working on this technology for a while. Jane Horvath, who heads up Apple's privacy efforts, appeared on a panel at CES in 2020 and talked about how Apple was working on the ability to scan iCloud Photo Libraries for this type of material. It's notable that the executive in charge of privacy was describing an effort that arguably infringes on user privacy.
There’s also a certain irony that, when she spoke publicly on the topic, Horvath made clear that Apple believes “building back doors into encryption is not the way we are going to solve those issues.”
Except, that’s sort of what happened. Or, more accurately, that’s what it looks like. To be fair, there’s quite a difference between the two, but when it comes to earning the trust of your users, there’s little distinction.
To that end, it’s not a surprise that people started to complain that Apple was violating user privacy by scanning their photo libraries and looking at images to make sure none of them were illegal. Of course, that’s not what’s happening at all. In fact, the noise serves mostly to distract from what I think is actually a much larger problem.
No one at Apple is going to be able to view photos of your cat, or even of your children. Your iPhone isn’t going to suddenly gather information about all of your photos and report back to the mother ship.
That said, it’s worth mentioning that if you’re using iCloud Photos, that data is encrypted but Apple holds a key, meaning it is able to turn your photos over if subpoenaed by law enforcement.
The thing is, privacy isn’t the problem. At least, not in terms of Apple looking at your Photo library. The problem is that Apple, more than any other company, has made a promise about protecting user data. Tim Cook, the company’s CEO, regularly reminds us that Apple believes “privacy is a fundamental human right.”
For example, the data on your iPhone is encrypted, and not even Apple can access it without your passcode. The messages you send via iMessage are end-to-end encrypted, meaning that only the person you send them to can view them.
Sure, we can all agree that CSAM is repulsive and should be erased from the internet. I haven’t heard anyone argue differently.
But, once you make an exception, it’s hard to justify not making another, and another, and another. If your argument is “we don’t have the technology to do that,” it’s pretty easy to resist pressure to hand over user data. It’s a lot harder to argue “well, we could do that, but we don’t think it rises to our standard.” The problem is that, at some point, someone will come along and override that standard with a law or a court order.
Maybe there are technical reasons that won’t ever happen. Maybe Apple has enough force of will that it really will only ever make an exception for CSAM. Still, a backdoor is a backdoor, and you can’t have truly privacy-protective encryption as long as any backdoor exists.
On the other hand, maybe there should be a backdoor, though I think Apple would argue that this isn’t actually one. That argument doesn’t matter, however, if a backdoor is what it looks like to the people whose data you promised to keep private. Once you break that trust, you’ve lost your most valuable asset.
I reached out to Apple but did not immediately receive a response to my questions.