On Apple and the FBI

If you pay attention to tech/policy stories, then surely you know about the Apple/FBI situation. Though this story has been broadly covered, I don’t think we’re having the right debate. And the right debate is, of course, very subtle. So here goes my attempt to nail that subtlety.

What’s Going On?

  • The FBI wants access to a particular criminal/terrorist’s iPhone. They have a warrant.
  • The iPhone is locked, and if the FBI tries a few bad PIN codes, the phone will erase its data as a defense mechanism. iPhones are also programmed to slow down PIN attempts after a few bad guesses, so even if the auto-erase feature were disabled, it would still take the FBI years to grind through enough PIN codes (see the back-of-the-envelope sketch after this list).
  • Changing the iPhone’s behavior – say, to allow as many PIN attempts as fast as possible – is doable via a software update, but iPhones are programmed to accept only software updates blessed (cryptographically signed) by Apple.
  • The FBI wants to compel Apple to program and bless this new behavior so they can software-update the phone and guess the PIN quickly, without triggering the self-destruct.
  • The FBI is happy with a very narrow solution: the updated behavior can be hard-coded to function only on that particular iPhone, and the FBI is willing never to touch the new operating system itself. They’re content with having Apple effectively extract the data for them.
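To make the brute-force point concrete, here’s a back-of-the-envelope sketch. The 80 ms per guess and the one-hour delay schedule are my assumptions based on public reporting, not Apple’s exact numbers, but any figures in that ballpark make the conclusion obvious:

```python
# Rough sketch: how long does it take to try every PIN when the
# hardware imposes a long delay after a handful of bad guesses?
# The numbers below are illustrative, not Apple's exact policy.

def brute_force_time(pin_space, per_try_s=0.08, delay_after=5, delay_s=3600):
    """Worst-case seconds to try every PIN, assuming a fixed delay
    on every attempt after the first `delay_after` tries."""
    penalized = max(pin_space - delay_after, 0)
    return pin_space * per_try_s + penalized * delay_s

for digits in (4, 6):
    space = 10 ** digits
    years = brute_force_time(space) / (3600 * 24 * 365)
    print(f"{digits}-digit PIN, worst case: {years:,.1f} years")
```

Even a 4-digit PIN comes out to over a year in the worst case, and a 6-digit PIN to more than a century. Without Apple blessing a modified update, brute force is simply not an option.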

Some say the FBI could find other avenues

Is this the only way the FBI can get at this data? Is this data even that valuable? It’s a bit dubious, in my opinion. The FBI already has iCloud backups straight from Apple servers, phone call metadata and texts from Verizon, etc. Is there really some key data on the device left to discover? Doubtful.

Also, hardware-security experts are arguing that, given a few hundred thousand dollars, the FBI could find a way to bypass the iPhone’s restriction that a software update has to be blessed by Apple. This seems possible, though I can imagine it being difficult for the FBI to develop that specific expertise on short notice.

All in all, I’d say it’s pretty clear the FBI doesn’t strictly need Apple to comply. What’s probably happening is that the FBI is using this as a test case for the general principle that they should be able to compel tech companies to assist in police investigations. And that’s pretty smart, because it’s a pretty good test case: Apple obviously wants to help prevent terrorist attacks, so they’re left making the slippery-slope argument in the face of an FBI investigation of a known terrorist. Well done, FBI, well done.

So this is a backdoor? That bad guys can use, too?

This is where I break with other privacy advocates. It’s a significant overstatement to claim that the FBI’s request could provide them with the technical means to penetrate other iPhones. I call BS when Tim Cook says:

In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI has explicitly stated that they’d be happy with Apple performing this software update without ever shipping the software to the FBI, and, as an additional constraint, with Apple tailoring the update so it functions only on that one iPhone in particular.
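How would that tailoring work? Here’s a toy illustration – the names are hypothetical and this is obviously not Apple’s actual code – but note that Apple’s update signing is already personalized per device (the signature covers a device-unique identifier, the ECID), so binding a build to exactly one phone is plausible:

```python
import hmac

# Placeholder for the target phone's burned-in unique identifier
# (the ECID); not a real value.
TARGET_ECID = bytes.fromhex("00000000deadbeef")

def should_relax_pin_limits(device_ecid: bytes) -> bool:
    """Enable the relaxed PIN behavior only on the single target device.
    On every other phone, this build behaves exactly like stock firmware."""
    # Constant-time comparison, so the check itself leaks nothing.
    return hmac.compare_digest(device_ecid, TARGET_ECID)
```

On any phone whose identifier doesn’t match, the delays and auto-erase stay in force. That’s why I don’t buy the “unlock any iPhone” framing.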

There’s a key difference here between this FBI request – access to a single device in physical custody with a warrant – and prior demands from FBI/NSA – access to any encrypted channel, with or without physical custody of a device. The latter requires engineering all encrypted channels to provide law-enforcement access and is so complex that it’s almost guaranteed to create new security holes, especially with respect to foreign governments aiming for broad surveillance. The former is doable if Apple wanted to engineer this capability into their phones. Not completely without risk – in particular when devices are confiscated at customs and such – but much more doable.

So … slippery slope or not?

Technically speaking, I don’t think so. Apple granting this request will not technically enable the FBI to get into other phones.

But legally speaking? I’m a little bit out of my depth here, but from everything I’m reading, there seems to be a clear legal slippery-slope risk. If Apple can be compelled to program and bless code that weakens the phone’s security, then maybe courts will force Apple to help in other ways. Update a criminal’s phone remotely, maybe, because that criminal is on the run? Or wholesale give the FBI the capability to perform software updates themselves? That last one would amount to a remote built-in backdoor, and would introduce unacceptable security risks for everyone.

So why are technologists all worked up?

So technologists are all worked up. I’m pretty worked up. This is a big deal. I’m on Apple’s side, but not for Apple’s stated reasons. We’re not dealing with a universal backdoor request, and we’re misleading the public if we say that.

The three reasons why this is a big deal are:

  1. there is that legal slippery slope, as discussed above.
  2. starting with the PATRIOT Act, the US government seems to be increasingly in the business of bypassing due process – National Security Letters, for example. What if the FBI’s next request to Apple is done in secret, with a gag order so Apple can’t talk about it? What if the FBI’s next request is for the all-out ability to update any phone with any software they choose, without ever looping Apple in again? Is this our one and only chance to stop this behavior before it goes dark?
  3. foreign governments making the same requests without due process because they have no such thing. Yeah. Oy. What do we do about them? Can Apple really be in the position of deciding which governments have reasonable due process?

What happens next?

Legally speaking, I have no idea, but I worry the FBI will win this one.

So, technically speaking, I think what happens next is that Apple begins to engineer phones such that they can no longer assist the FBI, even if compelled by court order. Here’s my specific bet: right now Apple can update a phone’s entire software stack to reconfigure a particular phone’s behavior, including the number of PIN tries and the delays between them – the most secure parts of the phone. I bet Apple will move towards making the most sensitive parts of that stack updatable only under very specific conditions (see the sketch after this list):

  1. the update wipes user data, or
  2. user data is kept only if the phone is successfully unlocked first.
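In sketch form, the policy I’m betting on looks something like this (all names hypothetical; this is a thought experiment, not Apple’s actual update code):

```python
from enum import Enum, auto

class UpdateDecision(Enum):
    APPLY = auto()             # install, user data untouched
    APPLY_AFTER_WIPE = auto()  # install, but destroy user data first

def decide_update(touches_security_firmware: bool,
                  owner_unlocked_phone: bool) -> UpdateDecision:
    """Ordinary updates go through as they do today. Updates that touch
    the sensitive stack (PIN-try counters, delay logic) require the owner
    to unlock the phone first – otherwise the data is the price."""
    if not touches_security_firmware:
        return UpdateDecision.APPLY
    if owner_unlocked_phone:
        return UpdateDecision.APPLY
    return UpdateDecision.APPLY_AFTER_WIPE
```

The point of such a design is that a court order could still compel Apple to sign whatever update the FBI wants, but installing it on a locked phone would destroy exactly the data the FBI is after.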

The interesting question will be whether Apple will be legally allowed to engineer their phones this way. This will be such a fascinating and critically important discussion.

And we, technologists, fans of civil liberties and freedom, privacy advocates, we should find more subtle arguments than calling everything a backdoor and, by the transitive property of backdoor evilness, calling every law enforcement action evil. Yes, law enforcement has broken the public’s trust time and time again. Yes, the FBI is clearly playing this one to set a precedent. And yes, we should be incredibly thankful that Apple and others are standing up for user security.

Yet we have important and real issues to confront. How does law enforcement evolve in the age of universal unbreakable encryption? What should be the law-enforcement role of third-party organizations, when those third parties have access to our most intimate secrets? If we do choose, as a people, to compel third parties to assist law enforcement when served with a warrant, I hope we also couple that with the extension of Fourth Amendment protections to data we choose to store with those third parties.

This isn’t as simple as “backdoors!” And it isn’t as simple as “terrorism!” Like Tim Cook said, I’m glad we’re having this debate in public. I hope it stays in public.

