Put away the tin-foil: The Apple unlock case is complicated enough
 


Apple and the FBI are fighting. The {Twitter, blog, media}-’verses have exploded. And FUD, confusion, and conspiracy theories have been given free rein.

Rather than going into deep technical detail, or pontificating over the moral, legal, and ethical issues at hand, I thought it might be useful to discuss some of the more persistent misinformation and misunderstandings I’ve seen over the last few days.

Background

On February 16, 2016, Apple posted A Message to Our Customers, a public response to a recent court order, in which the FBI demands that Apple take steps to help them break the passcode on an iPhone 5C used by one of the terrorists in the San Bernardino shooting last year.

All this week, Twitter, and blogs, and tech news sites, and mainstream media have discussed this situation. As it’s a very complex issue, with many subtle aspects and inscrutable technical details, these stories and comments are all over the map. The legal and moral questions raised by this case are significant, and not something I’m really qualified to discuss.

However, I am comfortable ranting at a technical level. I’ve already described this exact problem in A (not so) quick primer on iOS encryption (and presented a short talk at NoVA Hackers). One of the best posts particular to the current case can be found at the Trail of Bits Blog, which addresses many of the items I discuss here in more technical detail.

Apple (and many others) have been calling this a “back door,” which may or may not be an overstatement. It’s certainly a step down a slippery slope, whether you consider this a solution to a single-phone case, or a general solution to any future cases brought by any government on the planet. But, again, I’m not interested in discussing that.

Emotions are high and knowledge is scarce, which leads to all kinds of crazy ideas, opinions, and general assumptions being repeated all over the internet. I’m hoping that I can dispel some of these, or at least reduce the confusion, or at a bare minimum help us be more aware of what we’re all thinking, so that we can step back and consider the issues rationally.

Technical Overview

First, a very high-level description of how one unlocks an iPhone. This is a very complicated system, and the blog posts (and slide deck) that I linked to above provide much better detail than what I’ll go into here. But hopefully this diagram and a short bullet list can give enough detail that the rest of the post will make at least some sense.

[Figure: Simplified passcode logical flow]

To unlock an iPhone (or iPad or iPod Touch):

- You enter a passcode, which the device “tangles” with a Unique ID (UID) fused into the processor to derive an encryption key. The UID can’t be read by software, so this step can only happen on the device itself.
- The key derivation is deliberately slow, taking about 80 milliseconds per attempt.
- If the derived key unlocks the keys protecting the user’s data, the phone unlocks.
- If not, a bad guess counter increments, triggering escalating timeout delays, and (if the user enabled the option) the device wipes its keys after 10 failed guesses.

In newer devices (the iPhone 5S, iPad Air, and later), the bad guess counter and timeout delays are managed by the Secure Enclave (SE), a separate processor on the SoC with its own software.
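
To make that flow a little more concrete, here’s a rough Python model of the behavior described above. Everything in it is illustrative (the constants, names, and KDF are stand-ins, not Apple’s real implementation), but it captures the two properties that matter: the key derivation is tangled with a UID that software can never read, and the bad guess counter drives the delays and the optional wipe.

```python
import hashlib
import time

# Illustrative constants only, not Apple's actual values.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}  # delay (seconds) after N bad guesses
WIPE_AFTER = 10          # with "Erase Data" enabled, keys are destroyed here
UID = b"fused-into-soc"  # burned into the processor; unreadable by any software

failed_guesses = 0

def derive_key(passcode: str) -> bytes:
    # Stand-in for the real UID-entangled KDF, tuned to take ~80 ms per attempt.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), UID, 100_000)

def try_unlock(passcode: str, stored_key: bytes) -> bool:
    global failed_guesses
    if failed_guesses >= WIPE_AFTER:
        raise RuntimeError("device wiped: encryption keys destroyed")
    if derive_key(passcode) == stored_key:
        failed_guesses = 0
        return True   # derived key unlocks the keys protecting user data
    failed_guesses += 1
    time.sleep(DELAYS.get(failed_guesses, 0))  # escalating timeout
    return False

stored = derive_key("1234")
print(try_unlock("0000", stored))  # False, and the counter is now 1
print(try_unlock("1234", stored))  # True
```

On older devices all of this logic lives in the operating system, which is exactly why booting modified code defeats it.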

So, if you want to unlock an iPhone but don’t know the passcode, how can you get in? There are basically three options:

- Brute-force the passcode on the device itself, fighting the escalating delays (and risking the 10-guess wipe).
- Physically extract the UID from the silicon, by decapping the chip and examining it with an electron microscope.
- Modify the operating system so the delays and wipe no longer apply, then brute-force the passcode at full speed.

The last is the easiest, but Apple didn’t make it that easy. To modify the operating system on the device, you first have to defeat the default full-disk encryption, which is also based on the UID (and beyond the scope of this post), so we’re back to a microscope attack.

Or you could boot the phone from an external image. Unfortunately for hackers and law enforcement (but good for iOS users), the iPhone won’t boot just any external image: the image has to be signed by Apple.

This is exactly what the FBI is asking Apple to do (and, incidentally, a boot ROM bug in the iPhone 4 and earlier allowed hackers to do this too, which is how we know it’s possible). The basic approach is this:

- Build a custom boot image that disables the escalating delays and the 10-guess wipe, and that accepts passcode guesses electronically instead of through the touchscreen.
- Sign that image with Apple’s secret key, so the phone will agree to boot it.
- Boot the target phone from the image and brute-force the passcode.

Beginning with the iPhone 5S, some of the passcode processing functionality moved into the Secure Enclave, so this attack would need to be modified to remove the lockouts from the SE as well.

Note that these methods still require the passcode to be brute-forced. For a 4-digit PIN, that takes under 15 minutes, but for a strong passcode it can take days, months, or even years (or centuries). So even with a signed boot image, this attack is far from a silver bullet.
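
The arithmetic behind those estimates is simple: at roughly 80 milliseconds per on-device guess, the size of the passcode space is everything. A quick back-of-the-envelope calculation (the rate is the figure discussed below; the passcode spaces are just examples):

```python
SECONDS_PER_GUESS = 0.080  # ~80 ms per UID-entangled attempt

def worst_case(keyspace: int) -> str:
    seconds = keyspace * SECONDS_PER_GUESS
    for unit, size in [("years", 31_557_600), ("days", 86_400),
                       ("hours", 3_600), ("minutes", 60)]:
        if seconds >= size:
            return f"{seconds / size:,.1f} {unit}"
    return f"{seconds:.0f} seconds"

print(worst_case(10**4))   # 4-digit PIN:            13.3 minutes
print(worst_case(10**6))   # 6-digit PIN:            22.2 hours
print(worst_case(26**6))   # 6 lowercase letters:    286.0 days
print(worst_case(62**8))   # 8 mixed alphanumerics:  ~550,000 years
```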

Confusion!

That’s (basically) the attack that Apple is being asked to perform. Now to address some of the more confusing points and questions circulating this week:

Just crack the passcode on a super fast password cracking machine! That can’t be done, because the keys derived from the passcode depend on the Unique ID (UID) embedded within the SoC. This UID cannot be extracted, either by software or by electronic methods, so the passcode-based keys can never be generated on an external system. The brute-force attack must take place on the device being targeted, and the device takes about 80 milliseconds per guess.
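
To put numbers on that: without the UID, an off-device attacker isn’t guessing passcodes at all, they’re guessing the derived key itself. A rough comparison (the cracking-rig speed is an invented, deliberately generous figure; the 256-bit key size matches Apple’s published documentation):

```python
# On-device: the search space is the passcode space, throttled by the slow KDF.
on_device_minutes = 10**4 * 0.080 / 60           # 4-digit PIN: ~13 minutes

# Off-device: without the UID, the search space is the 256-bit key itself,
# even for a wildly optimistic trillion-guesses-per-second cracking rig.
off_device_years = 2**256 / 10**12 / 31_557_600  # ~3.7e57 years

print(f"on device:  {on_device_minutes:.1f} minutes")
print(f"off device: {off_device_years:.1e} years")
```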

Look at the BUGS, MAN! Yes, iOS has bugs. Sometimes it seems like a whole lot of bugs. Every major version of iOS has been jailbroken. But all these bugs depend on accessing an unlocked device. None of them help with a locked phone.

What about that lockscreen bypass we saw last {week / month / year}? These bypasses seem to pop up with distressing frequency, but they’re nothing more than bugs in (essentially) the “Phone” application. Sometimes they’ll let you see other bits of unencrypted data on a device, but they never bypass the actual passcode. The bulk of the user data remains encrypted, even when these bugs are triggered.

But {some expensive forensics software} can do this! Well, maybe it can, and maybe it can’t. Forensics software is very closely held, and some features are limited to specific devices, and specific operating system versions. One system that got some press last year exploited a bug in which the bad guess counter wasn’t updated fast enough, and so the system could reboot the phone before the guess was registered, allowing for thousands of passcode guesses. (Also, as far as we know, all those bugs have been fixed, so this only works with older versions of iOS).
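
That reported bug boils down to an ordering mistake: the guess was evaluated before the failure was recorded, so an attacker who rebooted the phone at just the right instant never spent an attempt. Here’s a sketch of that bug class, reconstructed from the public reports (not actual iOS code):

```python
import hashlib

def derive_key(passcode: str) -> bytes:
    return hashlib.sha256(passcode.encode()).digest()  # stand-in for the slow KDF

def vulnerable_try(passcode: str, stored_key: bytes, nvram: dict) -> bool:
    if derive_key(passcode) == stored_key:
        return True
    # The attacker reboots/cuts power HERE, before the failure reaches storage,
    # then tries again: thousands of "free" guesses, no delays, no wipe.
    nvram["failed_guesses"] += 1
    return False

def fixed_try(passcode: str, stored_key: bytes, nvram: dict) -> bool:
    nvram["failed_guesses"] += 1        # charge for the attempt up front...
    if derive_key(passcode) == stored_key:
        nvram["failed_guesses"] = 0     # ...and refund it only on success
        return True
    return False
```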

If Apple builds this, then Bad Guys (or the FBI, which to some may be the same thing) can use this everywhere! Well, not necessarily. Apple could put a check in the external image that verifies some unique identifier on the phone (a serial number, ECID, IMEI, or something similar). Because this would be hard-coded in a signed boot image, any attempts to change that code to work on a different phone would invalidate the signature, and other phones would refuse to boot. (What is true, though, is that once Apple has built the capability, it would be trivial to re-apply it to any future device, and they could quickly find themselves needing a team to unlock devices for law enforcement from all around the world…but that goes back into the cans of worms I’m not going to get into today).
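
Here’s a sketch of how such a per-device check could work. I’ve used a shared-secret HMAC to keep it short (real boot-image signatures are asymmetric, with the phone holding only Apple’s public key), and every name in it is hypothetical, but the property is the same: the device identifier sits inside the signed data, so changing it breaks the signature.

```python
import hmac
from hashlib import sha256

APPLE_KEY = b"apple-signing-secret"  # stand-in for Apple's private signing key

def sign_image(code: bytes, ecid: str) -> bytes:
    # Apple signs the code *and* the target device's identifier together.
    return hmac.new(APPLE_KEY, code + ecid.encode(), sha256).digest()

def boot_rom(code: bytes, embedded_ecid: str, sig: bytes, device_ecid: str) -> None:
    expected = hmac.new(APPLE_KEY, code + embedded_ecid.encode(), sha256).digest()
    if not hmac.compare_digest(sig, expected):
        raise RuntimeError("signature invalid: image was modified")
    if embedded_ecid != device_ecid:
        raise RuntimeError("wrong device: refusing to boot")
    print("booting image")

image, target = b"patched-ios", "ECID-0001"
sig = sign_image(image, target)
boot_rom(image, target, sig, "ECID-0001")   # boots on the target phone
boot_rom(image, target, sig, "ECID-0002")   # raises: wrong device
# Editing the embedded ECID to "ECID-0002" would invalidate the signature instead.
```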

NSA. ‘Nuff said. Who knows? (more on that below)

What about the secret key? Isn’t it likely that the Advanced Persistent Threat has it anyway? If the secret key has been compromised, then, yeah, we’re back to the state we were in with the iPhone 4 and hacker-generated boot images. But the attacker still needs to brute-force the passcode on the target device. And, frankly, if that key has leaked, then Apple has far, far bigger problems on their hands.

Later devices which use the Secure Enclave are safe!! Possibly. Possibly not. I see a few possibilities here:

- The SE’s firmware can’t be changed without erasing the secrets it protects, in which case newer devices really are immune to this particular attack.
- Apple can update the SE’s firmware (signed, just like the boot image) without wiping anything, in which case the same attack works with one extra step: patch the lockouts out of the SE, too.
- We simply don’t know, because Apple hasn’t publicly documented how (or whether) SE firmware updates affect the keys inside.

Can anyone outside of Apple do this?

So, what about the NSA? Or China? Or zero-day merchants? Surely they have a way to do this, right?

We don’t know.

If there’s a way to do this, it would require one (or more) of the following:

- An exploitable bug in the boot ROM or operating system of the locked device (like the boot ROM bug that let hackers boot their own images on the iPhone 4).
- A copy of Apple’s secret signing key.
- A direct physical attack on the chip to extract the UID.

All of these are theoretically possible, but none seem terribly likely (other than an OS-level bug on older devices). In fact, the most reasonable (and least disturbing) possibility, to me, is the direct physical attack on the chip. Even that might be preventable, but I Am Not A Chip Designer and could only speculate on how.

Bottom Line

There’s still a lot we don’t know. In fact, I think this list is pretty much what I wrote in 2014:

- Can the Secure Enclave’s firmware be updated at all, and if so, does an update erase the keys and counters it protects?
- Are the escalating delays and the 10-guess wipe enforced in hardware, or only in software?
- Is the UID truly impossible to extract, even with a physical attack on the chip?

Much of this we’ll never know, unless Apple explicitly tells us in a new update to their iOS Security Guide. And even then, we’ll probably only have their word for it, because many of these questions can’t be independently verified without that trusted external boot image.