Just like the perennial discussion on location-based services and Apple’s ability to track you, the question of accessing an iOS device’s data when the device is locked seems to come up every few months. This time around, the discussion was inspired by a CNET article, with the sensational title “Apple deluged by police demands to decrypt iPhones.”

The article seemed to be built around a single paragraph in a blurry copy of a search warrant affidavit from ATF, which stated that the writer “contacted Apple” and was told by “an employee […] who is part of their Apple Litigation Group” that Apple “has the capabilities to bypass the security software” on the iPhone.

That’s it. That’s all we know. An ATF agent reports having talked to a single person at Apple, who told him that they can “bypass the security software” on iOS devices. And from that tenuous hold, the Twitters exploded with “See! I TOLD you Apple had a back door!” and other related Fear, Uncertainty, and Doom.

But is any of it warranted? What could Apple really be doing, and is that any different from what we already know? Let’s review what we know, and don’t know, about iOS security, passcodes, and encryption.

Filesystem Access via Boot Images

iOS devices will only boot from a drive with a boot image properly signed by Apple. This image is (usually) on the device itself, but Device Firmware Update (DFU) mode allows the device to boot from an external drive via USB. For older devices, a bug in the bootrom allowed unsigned images to be booted. That’s since been fixed, but it’s always been an “open secret” that Apple could probably still boot a device over DFU (since, obviously, they can create a signed external boot image).

Once booted off an external drive, the device’s internal storage can be mounted and any unprotected information read. Most built-in Apple apps don’t add extra encryption (to my knowledge, only the Mail application separately encrypts its data at this time). One reason is that some data needs to be accessible while the device is locked: inbound SMS messages and phone call information have to be written to disk, the Contacts list needs to be available for displaying the names of inbound callers (and for making outbound calls), and so on. So there’s a fair amount of data that can be retrieved at this stage.
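
To make that concrete, here’s a minimal sketch in modern Swift of the developer-facing side of that trade-off (the paths and data are made up for illustration): data that has to be written while the phone is locked gets a weak protection class, and that’s exactly the data an external boot image can read.

```swift
import Foundation

// Illustrative paths only -- not the real locations of these databases.
let inbox = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("inbox.db")
let notes = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("private.dat")

do {
    // NSFileProtectionNone: the per-file key is wrapped only by a
    // UID-derived key, so the file is readable whenever the device is
    // booted -- including from an external, Apple-signed boot image.
    // Inbound SMS and call logs need something like this, because they
    // arrive while the phone is locked.
    try Data("inbound SMS".utf8).write(to: inbox, options: .noFileProtection)

    // NSFileProtectionComplete: the per-file key is (indirectly) protected
    // by the user's passcode, so the file is unreadable while locked.
    try Data("private note".utf8).write(to: notes, options: .completeFileProtection)
} catch {
    print("write failed: \(error)")
}
```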

So far, we’ve simply replicated what commercial forensics providers do: boot off an external drive, and find the “easily extracted” data. The difference is that forensics tools depend on the bootrom bug (and thus can’t extract data from the iPhone 4S or 5), while Apple doesn’t need any stinking bugs and can do this magic with any device.

Encryption

“But wait, iOS devices also have crypto! Crypto that uses ALL the bits! And this article PROVES that Apple can bypass that! They must have a back door.”

Well, yes, there are multiple layers of cryptographic support, but again, there’s no proof that Apple has any kind of way to get around that. Let’s start with the device’s unique ID (UID). This isn’t the same as the “UDID” that’s been used by app developers to track devices and their users. This is a deeper ID that’s “fused into the application processor during manufacturing.” Apple says that “No software or firmware can read [the UID] directly; they can only see the results of encryption or decryption operations performed using them” (see the excellent iOS Security overview paper, last updated October 2012).

This UID is used as the basis for all the rest of the keys on the system. At the lowest level, it’s used to derive the overall disk key, which provides built-in full disk encryption for the iOS device. This means you can’t simply remove the flash memory from one iPhone and move it to another: the key it was encrypted with stays behind in the first device.
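
A toy illustration of that point, with a random CryptoKit key standing in for each device’s UID-derived disk key (the real hardware obviously doesn’t work this way, but the effect is the same):

```swift
import CryptoKit
import Foundation

// Random keys stand in for each device's UID-derived disk key.
let deviceA = SymmetricKey(size: .bits256)
let deviceB = SymmetricKey(size: .bits256)

let block = Data("a filesystem block".utf8)
let sealed = try! AES.GCM.seal(block, using: deviceA)

// Readable on the device that encrypted it...
let onA = try? AES.GCM.open(sealed, using: deviceA)
// ...but useless when moved to another device, because the key it needs
// stayed behind in the first device's silicon.
let onB = try? AES.GCM.open(sealed, using: deviceB)
print(onA != nil, onB == nil)   // true true
```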

Additional encryption protection (alluded to above) can be added to a file’s data if the developer requests it, simply by setting an attribute when writing the data to disk. These files have their own encryption keys (it gets complicated – you really need to read that Apple paper. And when you do, keep this HITB 2011 presentation open in another window, it’ll help…). The keys for all the files are themselves protected with class-level keys (now we’re getting kind of hierarchical and/or meta), and those keys are stored in a keybag, which is itself encrypted using yet another key.
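
The hierarchy is easier to see in code than in prose. Here’s a conceptual sketch of the wrapping chain, with random CryptoKit keys standing in for the real ones; Apple’s actual keybag format is considerably more involved than this:

```swift
import CryptoKit
import Foundation

// Random keys stand in for the real ones.
let fileKey   = SymmetricKey(size: .bits256)   // unique per file
let classKey  = SymmetricKey(size: .bits256)   // one per protection class
let keybagKey = SymmetricKey(size: .bits256)   // derived from passcode + UID

do {
    // The file key is wrapped by its class key and stored in the file's
    // metadata; the class key is wrapped by the keybag key and stored in
    // the keybag.
    let wrappedFileKey  = try AES.KeyWrap.wrap(fileKey, using: classKey)
    let wrappedClassKey = try AES.KeyWrap.wrap(classKey, using: keybagKey)

    // Unlocking runs the chain in reverse: passcode + UID unlock the
    // keybag, the keybag yields class keys, class keys yield file keys.
    let classAgain = try AES.KeyWrap.unwrap(wrappedClassKey, using: keybagKey)
    _ = try AES.KeyWrap.unwrap(wrappedFileKey, using: classAgain)
} catch {
    print("wrap/unwrap failed: \(error)")
}
```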

That last key, the one protecting the keybag, is derived from the user’s passcode and the aforementioned device-unique UID. Because the UID is tied to the device, any brute-force attempt to break the passcode has to happen on that device. And because “The UID is unique to each device and is not recorded by Apple or any of its suppliers” (again, the iOS Security paper), it is not possible to move any of these operations to another system, or to speed them up in any way.
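
A rough sketch of why that matters, with the caveat that this is not Apple’s actual algorithm: iterated keyed hashing stands in for the real derivation, and a random SymmetricKey stands in for the UID, which on real hardware no software can ever read out.

```swift
import CryptoKit
import Foundation

// Stand-in for the UID; on a real device it never leaves the AES engine.
let uid = SymmetricKey(size: .bits256)

// Conceptual only: tangle the passcode with the UID through many
// iterations. Apple calibrates the real derivation so that one attempt
// takes roughly 80 ms on the device itself.
func derivePasscodeKey(_ passcode: String) -> SymmetricKey {
    var state = Data(passcode.utf8)
    for _ in 0..<100_000 {
        state = Data(HMAC<SHA256>.authenticationCode(for: state, using: uid))
    }
    return SymmetricKey(data: state)
}

// Every guess has to go through the (unreadable, unexportable) UID, so a
// brute-force attack can only run on the device being attacked: no rainbow
// tables, no GPU clusters, no offloading.
let guess = derivePasscodeKey("0000")
```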

Possibilities?

So how could Apple “bypass” security? Several possibilities have been speculated on:

  1. They could have an escrow keybag that only they know about. True, this is possible. But this security system has been subject to some pretty heavy scrutiny; if there’s a hidden escrow keybag, it’s very well hidden, and nobody has discovered a mechanism for creating and updating it.
  2. There could be a back door in the crypto. Not likely, again, given the third-party research into the system. If there’s a back door, it’s an “NSA-LEVEL” hole and way beyond anything Apple would be doing.
  3. They could have a way to extract the UID after all. One person on Twitter said that “sending me marketing material a la It’s secure because the vendor says it is is THIS close to insulting my mother.” Okay, fair point. But this is also a very technically detailed bit of marketing material, with far more insight and transparency than just about any other vendor provides. And, again, pretty much everything in that paper has been verified by many other security researchers. Why would Apple risk everything and have a flat-out lie in this paper? It just doesn’t fit.

Finally, I think it’s important to apply Occam’s Razor to the situation. If any of these back doors existed, it would take Apple maybe 10 minutes to completely unlock a phone, and the alleged “7-month backlog” wouldn’t exist (unless they had thousands and thousands of confiscated devices to process).

Now, there is one final way that Apple might be able to get at the encrypted data on a locked phone: iCloud. If the user has iCloud backups enabled, then there’s a real possibility that Apple can access that data. After all, you can restore an iCloud backup to a different device, and you can change your iCloud password without losing the data in the backup, so the backup keys can’t be locked to either the device or the password. But that also shouldn’t take much time at all, so it’s probably not what’s behind the 7-month backlog.

Conclusions

So, to sum up:

  • Apple almost certainly gets confiscated iPhones sent to them by law enforcement
  • With the proper search warrant, etc., Apple will do what they can to extract data from those phones
  • They almost certainly can boot the phones from a legit, signed external drive, and gain access to much of the unencrypted data on the phone (damned near everything, unfortunately)
  • If they want to get at data protected by a passcode, then they can start a brute-force attack, just as researchers and forensics tool companies have been doing for years
  • If the user’s passcode is strong (5-6 alphanumeric characters), this could take months, if not years, to complete (see the back-of-the-envelope numbers after this list)
  • If the device was backed up to iCloud, it’s possible that all bets are off and the data would be easily retrieved from backup
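
Some quick numbers behind that passcode estimate, using the roughly-80-milliseconds-per-attempt figure from Apple’s iOS Security paper and a 36-character lowercase-alphanumeric alphabet:

```swift
import Foundation

let secondsPerGuess = 0.08   // ~80 ms per attempt, per the iOS Security paper

for length in 4...6 {
    let combinations = pow(36.0, Double(length))
    let worstCaseDays = combinations * secondsPerGuess / 86_400
    print("length \(length): up to \(Int(worstCaseDays.rounded())) days")
}
// length 4: up to 2 days
// length 5: up to 56 days
// length 6: up to 2016 days (about five and a half years)
```

Four characters fall in a couple of days; six characters push past five years in the worst case.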

Is any of this new? Any of it at all? Nope. Not a single item in that CNET article told us anything we didn’t already know, except maybe the length of the backlog. Which, really, should be a good demonstration that there isn’t any kind of magic back door, and that if you use a strong passcode and avoid iCloud backups, the data on your phone should be secure against just about anything, including being sent home to Cupertino.