For the last six days, Apple has been waging the legal fight of its life over a phone used by alleged San Bernardino attacker Syed Farook. The case centers on whether the government can compel Apple to rewrite the phone’s security protections to allow the FBI access to the data inside. But while the phone itself remains the center of the legal fight, this weekend’s conversations focused on data that had already been pulled from the phone into Farook’s iCloud account.
That iCloud account contains backups of Farook’s phone up until six weeks before the attack, everything from iMessages to email drafts. Investigators already have that data, and because the phone belongs to San Bernardino County, they didn’t even need a warrant to get it. But on October 19th, those backups stopped, and the last six weeks of activity can only be found on the phone itself, which is exactly why the FBI has been so intent on getting the phone unlocked. In theory, another automatic backup could have pushed that data up to iCloud, but a forensic error reset the account’s password, making further backups impossible. Now, that error has taken center stage in the fight over Farook’s phone, and Apple and the FBI have spent the weekend fighting over exactly what it means.
iCloud backups have taken center stage in Apple's legal fight
While the two sides emphasize very different parts of the story, the basic events are undisputed. A few days after the shooting, someone with access to Farook’s iCloud account triggered a password reset, hoping to gain access to the phone’s cloud backups. That effort appears to have been successful, giving investigators direct access to anything stored in the iCloud account associated with the phone. What they didn’t realize is that resetting the password also shut off any possibility of triggering further backups from the phone. It’s a clear screw-up, obvious to anyone experienced in iPhone forensics. It’s still unclear whether the FBI or the county bears more responsibility (naturally, neither side is eager to take the blame), but nobody is claiming the move was anything but a mistake.
The iCloud screwup is embarrassing, but won't cancel out a search warrant
At the same time, it’s genuinely unclear whether that blunder cost investigators any evidence; there’s no concrete reason to think it did. No one has claimed to know why Farook’s phone stopped backing up, but the simplest explanation is that Farook himself turned the backups off. If that’s true, they would have stayed off no matter what Apple did. Apple has no centralized control over whether a phone backs up, so while its engineers were actively helping FBI agents investigate the phone, that help was limited to trying to satisfy the normal conditions needed to trigger a backup, conditions like power and Wi-Fi access. It’s possible the account had hit its storage limit, or was suffering from a backup bug that could have been bypassed. The company says there’s no technical evidence pointing to why the backups stopped, so from the server side, there’s no way to know. But without any such evidence, it’s difficult to believe technicians could have induced a backup after two months of radio silence.
It’s unlikely any of this will matter in court. The legal question is simple: can the FBI compel Apple to break the security on Farook’s phone? As long as the FBI is locked out, it doesn’t much matter whether other avenues were closed off along the way. The iCloud screwup is embarrassing, but it doesn’t nullify the bureau’s search warrant any more than a policeman accidentally bolting the front door would affect his right to crawl in through the window.
There’s also data in the phone’s local memory that isn’t included in the iCloud backup, as the FBI pointed out in its Saturday statement. There’s been some confusion on this point too, but a quick turn through Apple’s developer guidelines makes it clear that the bureau is right. iCloud backs up anything developers store in the Documents folder — which typically means any data generated by the user — but whether a given piece of data ends up there is entirely a question of how the app is written. Private messaging apps like Signal and Wickr, for instance, purposefully keep much of their cache data stored locally, out of iCloud’s reach. To find those files, you would need to unlock the phone and scan its local memory. According to the court, investigators have probable cause to see files like that; the question is just how much they can compel Apple to do to help them decrypt the data. It’s either unduly burdensome or it’s not.
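The mechanism behind that choice is a per-file opt-out: on Apple platforms, a developer flags a file with the NSURLIsExcludedFromBackupKey resource value to keep it out of iCloud backups. As a rough illustration of the resulting behavior — a Python toy model, not Apple's code, with made-up file names:

```python
# Toy model of iCloud's backup selection: everything an app writes under
# Documents/ gets backed up unless the developer has flagged the file as
# excluded (the real flag is the NSURLIsExcludedFromBackupKey resource value).
def backed_up(files):
    """files: list of (path, excluded_from_backup) pairs -> paths iCloud would copy."""
    return [path for path, excluded in files if not excluded]

app_files = [
    ("Documents/notes.txt", False),        # ordinary user data: backed up
    ("Documents/secure_cache.db", True),   # a Signal-style local-only cache
]
```

Under this model, only notes.txt would ever reach iCloud; the flagged cache stays on the phone, which is precisely the kind of data investigators can only reach by unlocking the device.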
An aggressive fight over public perception, playing out at double the usual speed
While the legal question hasn’t changed, the optics have. Last week, cable news was blasting out variations on the headline "Apple refuses to unlock terrorist’s phone"; this week, they’ll at least have the option of talking about investigators bungling a lead instead. It’s one of the most aggressive fights over public perception we’ve seen, playing out at double the usual speed. That fight began with the FBI’s legally superfluous motion to compel on Thursday: the filing generated lots of headlines, coming just three days after the initial order to aid in decrypting the phone, but until Apple’s lawyers file an official response to that first order, motions like this are legally meaningless. The fight continued with Apple’s iCloud statements on Friday, deftly shifting the focus to the FBI’s forensic fumble.
FBI director Comey struck back Sunday night, asserting that the whole thing was just about serving survivors of the attack, serendipitously timed with the announcement of a new brief from victims of the attack, siding with the FBI. Of course, we still don’t know how many victims are actually signing on to the brief — it could be a majority, or as few as two people — but the headlines came nonetheless: "San Bernardino victims oppose Apple."
There’s a real issue in play here, beyond the furious spin. The basis of strong encryption is that data can be made inaccessible to anyone without the key. It’s about creating an inaccessible foundation, and building security on top of it. For the iPhone, that space has traditionally been local storage, set apart from more easily accessible iCloud storage. Signal and Wickr store data on the phone itself, encrypted with keys tied to the iPhone’s passcode and the app, because they don’t want anyone else to have access. The only weakness is the phone hardware itself, which is vulnerable to malicious code signed by Apple. It’s not a big opening, but for anyone serious about security, it’s a worrying one. In the months to come, the courts will decide if that window is large enough to crawl through.
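Apple's published security model works roughly this way: file encryption keys are derived by entangling the user's passcode with a secret fused into the phone's hardware, so the key never exists anywhere off the device. A minimal sketch of that idea — in Python, with PBKDF2 standing in for the device's actual key-derivation function and device_uid for the hardware key:

```python
import hashlib
import os

def derive_key(passcode: str, device_uid: bytes, iterations: int = 100_000) -> bytes:
    # Entangle the passcode with a per-device secret; brute-forcing the
    # passcode therefore requires running the derivation on the phone itself.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, iterations)

uid = os.urandom(32)          # hypothetical hardware UID, never exposed off-chip
key = derive_key("1234", uid)
```

A wrong passcode, or the right passcode on different hardware, yields a different key — which is why the FBI needs Apple-signed code running on this particular phone rather than a copy of its storage.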