The fight between Apple and the FBI over the security protections on the San Bernardino iPhone has been fierce for the past few weeks, but it’s mostly been a PR battle thus far. From a legal and procedural standpoint, only two things had actually happened until yesterday: the FBI filed a motion to compel Apple to help it bypass security restrictions on the phone, and a magistrate judge ordered the company to comply. But yesterday Apple filed a motion to vacate Judge Sherri Pym’s order, which lays bare the company’s actual legal argument against building a special one-off version of iOS that would allow the FBI to unlock the phone with a brute-force attack on its passcode.
So let’s pull it apart and see what’s going on here. While the PR machines on both sides are operating at full tilt, the actual substantive issue in this case is pretty simple: Does the government have the authority to order Apple to help unlock the phone based on statute or precedent? That’s the only question anyone’s trying to answer, since almost all the other facts in the case break decisively in the government’s favor: the government owned the phone, there’s a warrant, the guy was a terrorist asshole, etc., etc., etc. Apple doesn’t even really bring any of that up. The entire brief is focused on whether the government has the power to make Apple help law enforcement.
Terrorism doesn't really even come up
And reading through it, a few things are clear:
- This document feels hastily written and filed for the benefit of the press and public, not as a complete legal argument — it has typos, the language is surprisingly casual, and the arguments are ordered from easiest to understand to the most complex, not from strongest to weakest.
- An unspoken theme running through the entire brief is the fact that Apple is clearly happy to turn over the contents of iCloud backups to the government. This renders any real privacy claims moot; it's trivially easy to sketch out a legislative compromise where Congress will allow Apple to protect strong on-device encryption so long as every user is required to back their devices up to a server accessible to law enforcement.
- It will be surprising if the magistrate judge rules against the government on this set of law and facts by following Apple's complicated argument that relies on a close reading of a wiretapping statute from 1994, an interpretation of a 1977 Supreme Court case that itself interprets the All Writs Act from 1789, and lands squarely on major questions of First and Fifth Amendment jurisprudence. One way or another, this case will end up at the Supreme Court; whether or not the lower courts want to go out on a limb before that happens is totally up for debate.
Okay, so let’s walk through this thing.
Introduction and background
This section is what I mean about this brief being written as much for the press and public as for the judge — the brief opens with a combination of simple and sweeping language that’s designed to be quoted. Let’s quote some now:
"This is not a case about one isolated iPhone."
These are the first words in the entire brief, and really the most important policy argument that Apple is making. If the government can force Apple to assist in unlocking this one iPhone, it can force the company to assist in unlocking every iPhone. And then it can presumably force Apple to assist in loading in all kinds of other shady software: a version of iOS that keeps the microphone or camera on all the time, for instance, or which logs all keystrokes.
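It's worth pausing on why the brute-force attack the FBI wants is so easy once Apple's protections are gone. iOS normally wipes the phone after ten wrong guesses and imposes escalating delays between attempts; strip those out, and a numeric passcode space is tiny. A minimal sketch (the passcode `"7294"` here is made up for illustration):

```python
# Why the FBI's requested iOS version matters: without the auto-wipe and
# escalating delays, a short numeric passcode falls to exhaustive guessing.
from itertools import product

def passcode_space(digits: int) -> int:
    """Number of possible numeric passcodes of a given length."""
    return 10 ** digits

def brute_force(target: str) -> str:
    """Try every numeric passcode of the target's length until one matches."""
    for attempt in product("0123456789", repeat=len(target)):
        guess = "".join(attempt)
        if guess == target:
            return guess
    raise ValueError("passcode not found")

print(passcode_space(4))   # a 4-digit passcode has only 10,000 possibilities
print(brute_force("7294"))
```

Ten thousand guesses is nothing for a machine making attempts electronically, which is exactly why the FBI also wants Apple to remove the artificial delay between passcode entries.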
This is a great policy argument — too bad it's not really a legal argument
This is a great policy argument! The problem is that policy arguments are the weakest of all at this stage, and that’s why this case is going to end up at the Supreme Court. The government has the facts and a good chunk of the law on its side, and Apple has (to me) the clear policy advantage. Magistrate judges don't make policy like this, you know? Anyway, the policy arguments continue throughout this opening section, and they’re worth reading. Here’s my favorite — I am a sucker for the "burdens on liberty" language here:
Despite the context of this particular action, no legal principle would limit the use of this technology to domestic terrorism cases—but even if such limitations could be imposed, it would only drive our adversaries further underground, using encryption technology made by foreign companies that cannot be conscripted into U.S. government service — leaving law-abiding individuals shouldering all of the burdens on liberty, without any offsetting benefit to public safety.
Apple’s saying that it can break the iPhone, sure, but then criminals will just find other tools the government can’t break, and regular people will be more open to attacks. Again, this is a terrific policy argument. It’s just not a legal argument about the specific authority the government has in this case. So let’s skip this preamble and the background section and go into Apple’s actual arguments.
The All Writs Act
There's been a lot of discussion of the All Writs Act, which is a piece of law that dates to the founding of the country in 1789. It allows courts to issue whatever legal orders they need to issue in order to do their jobs. Tech people love to cite the dates of laws to make them seem antiquated and ancient, but the All Writs Act is pretty basic stuff, and it's not like, on some dusty scroll or something. It's just a couple of short sentences, located at 28 USC §1651:
(a) The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.
(b) An alternative writ or rule nisi may be issued by a justice or judge of a court which has jurisdiction.
That's the whole thing. Would it matter if this law was dated 2016 or 3 BCE? Does anyone complain that the First Amendment was written and ratified in 1789? Any time someone says something in a shocked tone about the FBI relying on a law from 1789 in a high-tech case, roll your eyes. You are almost certainly watching the local news.
the problem with the All Writs Act is that it's so broad
Anyhow, the real problem with the All Writs Act is that it's so very broad. "Courts can do anything as long as it's legal" is not a particularly targeted or well-defined application of state power. And because Congress has never passed a law that specifically deals with the FBI asking a tech company to install a hacked version of its own OS that bypasses lock-code restrictions, the FBI is relying on the All Writs Act as the basis of the court's power to make Apple comply.
So in order to fight back, Apple has to attack the basis of the All Writs order, and say that this request is illegal. And that brings us to CALEA.
CALEA, or the Communications Assistance for Law Enforcement Act, is a 1994 statute that basically says telecom service providers and equipment-makers have to assist law enforcement with surveillance under proper conditions. You can read the entire statute yourself if you like: it's at 47 USC §1001, and the really important bit is at §1002.
this argument is a twisting logic puzzle
Apple's CALEA argument is a twisting logic puzzle of statutory authority that's presented here as being really simple. It's fascinating to me that it's the first real argument in the brief; you usually put the strongest argument first. But what it lacks in strength it makes up for in apparent simplicity:
- Apple says the government could have passed a law called CALEA II that would have required tech companies to install back doors in their products.
- Tech companies hated CALEA II and the Obama administration dropped it.
- There's a subsection in CALEA that says the government can't force equipment manufacturers into specific designs.
- The FBI is asking for a specific design here.
- If Congress wanted the government to have this power, it would have passed CALEA II.
- So it's illegal for the courts to give the government this power using the All Writs Act.
That sounds great, right? It's pretty simple on its face. But here's the real basis for it, and what makes it a logic puzzle — the actual text of CALEA at 47 USC §1002(b)(1):
This subchapter does not authorize any law enforcement agency or officer—
(A) to require any specific design of equipment, facilities, services, features, or system configurations to be adopted by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services; or
(B) to prohibit the adoption of any equipment, facility, service, or feature by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services.
Did you catch it? It says that "this subchapter does not authorize" the cops to ask for specific designs, not that asking for specific designs is entirely prohibited or illegal. And what's more, what counts as a "specific design" is itself open to interpretation in this case: the FBI isn't telling Apple how to actually design a new version of iOS, it's just asking for a specific capability. Does that count? Who knows!
So if the FBI can convince the court that it's not asking for a "specific design," this CALEA argument doesn't count. And if it is asking for a specific design, then it can go find the authority to ask for it someplace else in the law, because all CALEA says is that this subchapter doesn't authorize that sort of thing.
You get the feeling Congress should maybe pass a law that actually deals with this
This is where you should be getting the feeling Congress should maybe pass a law that reflects the past 22 years of technological development. If there's one single, blindingly obvious thing about this case, it's that Congress should be involved.
Okay, moving on to the second argument: New York Telephone Co., everyone's favorite 1977 Supreme Court case.
United States v. New York Telephone Co.
This one is way simpler: in 1977, the Supreme Court ruled that the government could use the All Writs Act to compel New York Telephone Company to install a device that recorded the phone numbers dialed on a pair of phones suspected of being used in criminal activity. The court used a three-part test: first, the phone company wasn't "so far removed" from the case; second, the help required was "meager" and the phone company was a public utility; and lastly, the FBI had tried to do it on its own but couldn't accomplish the surveillance without help.
Apple's responses to this are simple:
- Apple says it is very much "so far removed" from the facts of this case. "Apple is a private company that does not own or possess the phone at issue, has no connection to the data that may or may not exist on the phone, and is not related in any way to the events giving rise to the investigation."
- Apple is not a public utility, and has no duty to serve the public in this way. "Apple is a private company that believes that encryption is crucial to protect the security and privacy interests of citizens who use and store their most personal data on their iPhones," it says.
- This help isn't "meager," it's an "unprecedented and oppressive burden." Apple says that this isn't just clipping a pen register onto a couple of phones, it's requiring the company to design and install an entirely new version of iOS. And since every prosecutor in the country will line up to ask for phone unlocks if this goes through, Apple will have to spin up an entire team of people to handle the request volume.
- The government probably has other ways to unlock the phone, and needs to demonstrate that it can't go on without Apple's help. This is an implicit reference to the NSA, which is thought to be in possession of several zero-day exploits that would allow alternative versions of iOS to be loaded onto the phone — basically, jailbreaks. "Moreover, the government has not made any showing that it sought or received technical assistance from other federal agencies with expertise in digital forensics, which assistance might obviate the need to conscript Apple to create the backdoor it now seeks."
Apple is implying that the FBI should just call the NSA and have it unlock the phone
Don't call us; call your friends.
Okay. That's the real stuff. Now, onto the crazy part: Apple saying the First and Fifth Amendment don't allow the government to force it to do anything.
Crazytown: the First Amendment (and, sort of, the Fifth)
Here's the outline of the First Amendment argument:
- Code is speech.
- Speech is free.
- The FBI wants to compel Apple to write some code and sign it so it runs on this phone.
- Apple doesn't want to, and says how much it loves encryption all the time.
- That means the FBI wants to compel speech from Apple that is specifically against Apple's viewpoint.
- FIRST AMENDMENT ERROR
I have been a firm member of the "code is speech" camp for most of my adult life, but this strikes me as being the farthest possible stretch of that concept. Code is speech, but it is also the fundamental tool of the digital economy, and getting people to use their tools is not some crazy infringement of fundamental liberty. Storage unit landlords who are ordered to cut padlocks for the cops do not make free speech arguments, you know?
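The reason the FBI needs Apple at all, rather than writing its own hacked iOS, is the signing step: an iPhone refuses to boot firmware that doesn't verify against Apple's key. Here's a toy analogy of that gatekeeping, with the obvious caveats that Apple's real mechanism uses asymmetric cryptography and a hardware-rooted boot chain, and that the key and firmware names below are entirely made up — HMAC with a shared secret just stands in for the general idea:

```python
# Toy analogy, NOT Apple's actual mechanism: a device only boots firmware
# whose signature verifies against the manufacturer's key. HMAC with a
# secret key stands in for Apple's real asymmetric code signing.
import hashlib
import hmac

APPLE_SIGNING_KEY = b"hypothetical-secret-key"  # stand-in for Apple's private key

def sign_firmware(firmware: bytes, key: bytes = APPLE_SIGNING_KEY) -> bytes:
    """Produce a signature over the firmware image (only the key-holder can)."""
    return hmac.new(key, firmware, hashlib.sha256).digest()

def device_will_boot(firmware: bytes, signature: bytes) -> bool:
    """The device recomputes the signature and refuses code that doesn't verify."""
    return hmac.compare_digest(sign_firmware(firmware), signature)

official = b"official iOS build"
hacked = b"GovtOS, unlocked edition"

print(device_will_boot(official, sign_firmware(official)))  # signed: boots
print(device_will_boot(hacked, b"\x00" * 32))               # unsigned: rejected
```

That's the whole compelled-speech hook: the hacked firmware is worthless without Apple's signature on it, so any order to produce it is necessarily an order for Apple to vouch for code it finds abhorrent.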
All freshman dorm rooms, begin debating this immediately
I honestly haven't begun to wrap my head around the various implications of deciding this argument in either way. I am calling on all freshman dorm rooms to begin debating this immediately.
Lastly, Apple makes a very short (literally: 62 words) Fifth Amendment argument, which is basically a half-court shot at the buzzer: if you agree with Apple that asking it to help unlock this iPhone is unfair, highly burdensome, illegal, and probably even a little immoral, then, well, you probably also agree that Apple's right to due process has been violated.
You are probably also Tim Cook, but, hey, he's paying the lawyers here.
Reading through this, it's clear: what Apple cares about, really and truly, is protecting the strength of the iOS software keys and the encryption that guards them. Signing a fake version of iOS that lets the government brute-force passcodes weakens those keys, and opens Apple up to a world in which the government requests endlessly customized versions of the software, which would be an enormous burden and increase the chances that someone else will steal those keys and release their own hacked versions of iOS for nefarious purposes.
And it's important to note that Apple is not arguing that it should keep your data away from the government forever. If the government has a warrant, it can come to Apple and get your iCloud backups. (It can get your Microsoft Azure and Google Drive and Dropbox backups too — every cloud service has to give things up.)
The right answer is for Congress to write a new law that makes it very clear what Apple's responsibilities are
From a distance, it's clear that the right answer here is not really for the court to glue together a bunch of old laws written before situations like this were possible and order anyone to do anything. The right answer is for Congress to write a new law that makes it very clear what Apple's responsibilities are, and what the cops can and can't ask for. And potential compromises abound: it's entirely too easy to see a world in which Apple loses this case, builds new versions of iPhones and iOS in which breaking the phone in this way isn't possible, but works with Congress on a law that allows the cops access to the data they want through iCloud anyway. All Apple has to do is increase the size of the free iCloud storage tier and require phones to back up constantly, you know? A lot of people would love that, actually. Win-win.
I'm not saying that's what will happen, just that someone has to propose a real solution that works for Apple and the cops, and that's the only door Apple's really leaving open with this argument. We'll have plenty of time to watch it unfold — this case certainly isn't going away any time soon.