
The FBI is striking at the heart of Apple’s security system



The entire industry has followed the iPhone's security model. What happens if it breaks?


Twenty years ago, if you had asked a cryptographer how to secure your hard drive, she would have told you to keep it as far as possible from corporations like Apple. Corporations have legal departments and federal contracts, she’d argue, so if the feds want them to do something, eventually they’re going to do it. If the cypherpunks ever agreed on anything, they agreed on that. As long as someone else has control over the software running on your device, you’ll never be safe.

Now, Apple is being forced to face that logic head on.

Last night, a federal judge ordered the company to help the FBI break into a phone used by one of the San Bernardino killers, prompting a blistering response from Tim Cook. It’s put Apple at the center of one of the most important legal fights in the company’s history. For more than a decade, Apple has succeeded by violating every one of the old cypherpunk taboos — tightly restricting software and centralizing control over its hardware — and made some of the most secure devices on the market because of it. But that success has now led the company to a difficult question: having amassed all that power over the world’s iPhones, can Apple keep hold of it?


In simple terms, the FBI is asking Apple to build a new version of iOS with weakened passcode protections, one that disables the automatic erase that can follow ten failed attempts and the escalating delays between guesses, and to push that weakened OS directly to the suspect’s phone. That “crackable” version of iOS does not currently exist, as Tim Cook pointed out this morning, but it seems likely that it could be built according to the FBI’s specifications. Crucially, the new OS would need to be signed by Apple, which is why the feds need the company’s cooperation in all this. And since the 5C lacks the hardware protections found in more expensive iPhones, that signature is all the FBI will need.
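
To make the ask concrete, here is a minimal sketch, in Swift, of the kind of guard logic at issue. The names and the exact delay schedule are illustrative, not Apple’s actual implementation; the point is that these limits live in software, which is exactly why a signed software update can remove them.

```swift
import Foundation

// A minimal sketch (not Apple's code) of the protections at issue:
// escalating delays between failed passcode attempts, plus an optional
// erase of the encryption keys after ten failures.
struct PasscodeGuard {
    var failedAttempts = 0
    let eraseAfterTenFailures: Bool

    // Seconds to wait before the next attempt is allowed; the schedule
    // here is illustrative.
    var currentDelay: TimeInterval {
        switch failedAttempts {
        case 0...4: return 0
        case 5:     return 60
        case 6:     return 5 * 60
        case 7, 8:  return 15 * 60
        default:    return 60 * 60
        }
    }

    // Returns true when the device should wipe its keys, which is what
    // makes brute-forcing a passcode impractical.
    mutating func recordFailedAttempt() -> Bool {
        failedAttempts += 1
        return eraseAfterTenFailures && failedAttempts >= 10
    }
}
```

The order effectively asks for a build in which the delay is always zero and the wipe never fires, so that passcodes can be submitted electronically and guessed at full speed.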

That scheme relies on one of Apple’s biggest security features: its ability to push software directly to a user’s device. In this case, the FBI is asking for a local firmware update rather than an over-the-air patch, but the same trusted signature is behind both systems, and it’s traditionally been a crucial force for stronger security on the iPhone. While Android phones wait months or even years for a patch, Apple is able to fix iPhone bugs as soon as it finds them. The company’s ability to get software onto your device is unchallenged. At the same time, it makes a scenario like this very difficult to protect against. As long as Apple has the power to sign and push a new version of iOS — and with it, a new version of all the features built into iOS — there’s the possibility of using that power to create a backdoor.
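
Roughly speaking, the trust model works like this: the phone will only install an image carrying a valid signature from a key it already trusts, one whose public half is, in practice, anchored in Apple’s hardware. Below is a simplified sketch of that check using Swift’s CryptoKit; the function and parameter names are hypothetical, and the real pipeline involves per-device personalization that this leaves out.

```swift
import CryptoKit
import Foundation

// Illustrative sketch of the trust model, not Apple's actual update pipeline:
// a device only accepts a firmware image whose signature verifies against a
// vendor public key it already trusts (in practice, anchored in the boot ROM).
func shouldInstall(firmwareImage: Data,
                   signature: Data,
                   vendorKey: P256.Signing.PublicKey) -> Bool {
    guard let ecdsaSignature = try? P256.Signing.ECDSASignature(rawRepresentation: signature) else {
        return false
    }
    // No valid signature from the vendor's private key means no install,
    // which is why the FBI needs Apple itself to sign the weakened build.
    return vendorKey.isValidSignature(ecdsaSignature, for: firmwareImage)
}
```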


Compare that to a desktop computer running Windows 7, encrypted with third-party software like Symantec’s Drive Encryption. (This scenario very nearly happened in the Silk Road case: Ross Ulbricht was running disk encryption on his computer, but the feds reached him before he could close the laptop to engage it.) You could compel Microsoft to sign a firmware update, but that couldn’t do anything about the third-party software itself. You could serve Symantec with a warrant, but unless it had preinstalled a backdoor, it would be left facing a wall of encrypted data. Even if both companies worked together, there’s just no pathway to force the computer to accept an update. Without the seamless bond between firmware, encryption, and Apple’s central servers, it’s a lot harder to wrest control away from the user.

There are still plenty of weaknesses in systems like that, and for the most part, they haven’t presented a real problem for agencies like the FBI. People tend not to install extra disk encryption software, and when they do, it can often be circumvented by malware. The advantage of Apple’s centralized system is that people actually use it, and it can be patched aggressively enough that malware isn’t as big a problem. It has only one weakness: an attack assisted by Apple itself. This fight will determine how serious that weakness is.

Apple’s newer devices aren’t completely unprotected. For any currently available iPhone other than the 5C, passcode protections are built directly into the hardware, in a dedicated part of the processor (introduced with the A7 chip) known as the Secure Enclave. That means the FBI couldn’t use the same technique to break into an iPhone 6, but it doesn’t mean the system would be impossible to break. Security researchers (including a former Apple employee) have speculated that the Enclave itself could be vulnerable to a firmware update, and one could imagine an equivalent order demanding that Apple push the firmware needed to undo the Enclave’s protections. The basic stakes of the issue remain the same: if Apple can be compelled to turn against its users, it’s hard to imagine any level of protection that will hold up.
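
For a sense of why the hardware matters, here is a simplified, hypothetical sketch of key entanglement. The data-protection key is never derived from the passcode alone; it’s mixed with a secret unique to the device that never leaves the Secure Enclave, so every passcode guess has to be run on the phone itself, at whatever pace the hardware allows. HKDF and the `deviceUID` parameter below are stand-ins for the device’s actual, deliberately slow key-derivation hardware.

```swift
import CryptoKit
import Foundation

// Hypothetical sketch: the data-protection key depends on both the passcode
// and a device-unique secret. `deviceUID` is a placeholder; the real UID key
// is fused into the silicon and is never readable by software. HKDF stands in
// for the device's actual, deliberately slow key-derivation function.
func deriveProtectionKey(passcode: String, deviceUID: SymmetricKey) -> SymmetricKey {
    HKDF<SHA256>.deriveKey(inputKeyMaterial: deviceUID,
                           salt: Data(),
                           info: Data(passcode.utf8),
                           outputByteCount: 32)
}
```

Because the derivation can only happen on the device, even a perfect copy of the encrypted data is useless off the phone; the open question is whether the firmware enforcing that slowness can itself be replaced.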

If Apple does lose that fight, the consequences will reach far beyond the iPhone and far beyond the United States. Apple’s security setup is already the gold standard for most companies, not just in mobile technology but across all technology. By now, both Windows 10 and Chrome push down updates and patches every bit as aggressively as Apple. Android is moving toward iOS-style monthly patches as fast as the carriers will let it. The cypherpunks lost the battle over centralized software control. Apple won. But in doing so, it may have opened the door to something much worse. The same order served to Apple can now just as easily be served to Google or Microsoft. It could be served by another government, whether China, Russia, or Pakistan. Apple’s loss would cascade down through the industry. Everyone who’s followed in its footsteps would be vulnerable. In 2016, that’s everyone.