
Guessing Apple's CSAM motivations

Aug 8, 2021 Tags: #security #surveillance

With all the gloom’n’doom scenarios about CSAM misuse having been pointed out, and outrage about the change in value proposition heating up, I am missing one interesting angle in the Apple CSAM discussion:

Implementation of any security control/technology is an exercise in security budget allocation. What is the expected change in security posture/risk resistance, and how much will it cost? Are there more important risks we could address with the same budget? In other words, is this the best thing you can do right now among the known things to do?

Apple has put its money, which comes from customers buying its products and services, into technology that puts all customers at greater risk. This technology might protect some vulnerable customer groups, but that is yet to be proven.

With the little I know, CSAM as a technology:

CSAM enables one narrow use-case: matching stored images against The Blacklist, which contains evidence of previous illicit activities as collected, classified and then delivered to Apple for matching. This use-case is useful for talking to law enforcement and showing compliance, but is there any real impact? The idea that potential abusers will first consume known abusive imagery and then switch to doing something worse seems about as sound to me as the “marijuana is a gateway drug” theory and the “jail all drug users” theory. Neither has proven true nor worked to stop drug crime, as far as I know.
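To make the narrowness of that use-case concrete, here is a minimal sketch of the matching flow as described above. It is not Apple’s actual design: the function name is hypothetical, an exact SHA-256 digest stands in for the perceptual hash the real system reportedly uses, and the private set intersection and threshold machinery are not modeled. Only a match count is ever produced.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of blacklist matching, not Apple's NeuralHash/PSI design.
// A set of hashes of known images is delivered to the device, local photos are
// hashed the same way, and only the number of matches is surfaced.
func countBlacklistMatches(photos: [Data], blacklist: Set<String>) -> Int {
    photos.reduce(0) { count, photo in
        // Placeholder: an exact cryptographic digest stands in for the
        // perceptual hash the real system is said to use.
        let digest = SHA256.hash(data: photo)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return blacklist.contains(hex) ? count + 1 : count
    }
}
```

Even in this toy form, the limitation is visible: the check can only recognize images that were already known, collected and hashed upstream; it says nothing about new material.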

So why would Apple invest a lot of money into increasing the attack surface on its users to protect (with unknown efficiency) some small fraction of its audience?

  1. Social signalling: this is straight agenda-signalling - “we are making baby steps to do the right thing, even though it hurts one of the core value propositions of our business”. There are commercial reasons for that, there are social reasons for that, and child protection as a plot device for weakening security has been around for a while. If this is the case, we are to see more crazy ideas like this, and we are to see the inevitable decline of Apple as an innovator in consumer technology, because the innovation is now pointed at causes outside of any meaningful feedback loop.
  2. Proactive compliance operation: Apple believes it will be bent into “put backdoors in all end-to-end encrypted data exchange and let law enforcement into every Apple privacy feature”, and Apple needs to create a precedent with law enforcement of “being compliant without directly exposing customer data”. Making law enforcement bring in incriminating evidence first and having Apple only answer “this user has N pieces of this evidence” sounds like the least hurtful outcome for everyone. If Apple anticipates being bent into compliance, this is a good evasion technique that will protect privacy in the long term, where everyone else gets bent into direct compliance (law enforcement directly accessing customer data) and Apple will be king with its “we work on the basis of law enforcement bringing in incriminating evidence first”.
  3. They are mad: Many leading companies have gone mad in the past, so why wouldn’t the most valuable company in the world today go mad to free up space for someone else?
  4. I am mad: I don’t understand the mechanics well enough, and CSAM will enable law enforcement to collect proof of abuse that has just happened or is still in progress. That makes a bit more sense, and I’d be delighted to know more, but remember the “evasion reaction” concern?

My hopes are, of course, for version 2, because that is the smart move for the good of Apple’s customers in the long run. However, CSAM is still a problem from the very different angle I started with:

CSAM is a huge body of work that costs thousands of hours of human labor.

At the same time, Apple has had visible problems with regular platform/application security work for long enough, with an (unproven, but well-suspected) track record of actual deaths as a result. Does Apple deal with the causes of platform weaknesses that are within its control? We don’t know, but evidence of its failures is there.

Does Apple invest security personnel time and money into increasing the attack surface on every Apple user with CSAM and the “Find My” Bluetooth network? Definitely yes.

It is not really correct to compare the two activities directly (they require different personnel to operate and are not mutually exclusive), but:

We live in very interesting times.