Why can't Apple spend its way out of security vulnerabilities?

A high-profile iOS attack has raised new questions about the company’s bug bounty

[Image: the glass cube at Apple's Fifth Avenue store, 2011]

Apple has a lot of money.

On most days, it’s the largest company in the world by market cap, and a surprising amount of that money is already in the bank. At the close of Q1 this year, the company had $55 billion in easily accessible cash along with another $178 billion in long-term securities that could be cashed in if the need arose. With the core business already at its peak, there’s no obvious place to put that money, which is why it’s so easy for Tim Cook to pour cash into speculative areas like health-tracking or Frank Ocean albums.

So from the outside, it’s surprising that the company is being outspent on security. Apple’s recently launched bug bounty program caps out at $200,000, just a fraction of the millions that are regularly spent for iOS exploits on the black market. Exploit traders are paying more for the same bugs, even as Apple is holding onto billions in the bank.

Those concerns are particularly relevant because yesterday, one of those vulnerabilities was discovered in the wild. Spyware from an Israeli firm called NSO Group was discovered targeting a human rights activist named Ahmed Mansoor in the United Arab Emirates. The spyware was able to remotely jailbreak an iPhone, a capability that’s never been seen before in active use. The vulnerabilities were patched in yesterday’s iOS update (which you should really install), but the attack has raised new questions about how vulnerable iPhones could be to remote attackers.

We still don’t know how the firm learned about the vulnerabilities, but there’s a good chance they bought them on the black market, where one trader recently offered $1 million for a similar bug.

That’s led some critics to ask the obvious question: Why doesn’t Apple pay more? It’s not like the company can’t afford it.

For industry veterans, it’s a surprising question to hear. It’s been conventional wisdom for decades that an attacker will always outspend a defender for a given bug, and most bug bounty payouts reflect that. Google’s equivalent bug bounties for Android are about a quarter of Apple’s payout, topping out at $50,000. The Chrome payouts are more aggressive, with a standing $100,000 reward for any way of planting persistent malware on a Chromebook or Chromebox from guest mode. Microsoft has one of the most aggressive bug bounty programs in the industry, but last year’s payouts still topped out at $125,000.

It’s difficult to say how much the equivalent prices would be in an exploit marketplace, but they’re almost certainly higher, a fact many of the people involved with corporate bounties freely admit. The point is to offer a modest reward for doing the right thing, not to outspend the black market.

The more aggressive strategy — matching the black market price — is largely seen as unworkable for practical reasons. Apple is a far larger business than the companies it would be bidding against, but it’s also looking for something much broader. Spyware firms like NSO Group want a way to break into an iPhone — but as long as it’s working, they only need one. That lets them pay top dollar to make sure it stays secret and unpatched. "Offense prices are not just paying for the vulnerability or exploit," says Luta Security CEO Katie Moussouris, one of the industry’s leading advocates for bug bounties. "They are paying for the exclusivity and longevity of use of the bug against their targets."

Apple’s task is much harder. As soon as this week’s vulnerabilities were reported, Apple patched them — but there are plenty of other bugs left. While spyware companies see an exploit purchase as a one-time payout for years of access, Apple’s bounty has to be paid out every time a new vulnerability pops up. With this vulnerability patched, NSO Group and countless others will be looking for other ways to get a foothold against iOS security, and each patch can point the way to new variants that also need fixing. Last year’s Stagefright vulnerability in Android inspired a number of downstream vulnerabilities, either variations on the initial attack or blind spots in the deployed patches. Yesterday’s update got the most press attention, but it was the seventh security update Apple has pushed so far this year, and each new version has fixed dozens of reported bugs, many of them building on previous patches.

That doesn’t mean security is impossible — but it means you can’t get there with bounties alone. The only long-term solution is a robust internal security team that spends its time looking for bugs. Because there are still more bugs than any single team can find, a bug bounty program is the best way to catch the remainder, but it’s only a supplement to the rest of the security work.

If Apple really did put its enormous cash reserves behind catching every bug, the result might have unintended consequences for its own security workforce. Building and deploying patches is hard work, every bit as delicate and creative as finding vulnerabilities. Companies need dedicated teams to do that work — but with skyrocketing prices for iOS vulnerabilities, why not put in a few months to find an exploit, turn it in for the bounty, and then spend the rest of the year working on your tan? "If Apple or other defense bounties tried to outbid or even match offense bug prices, they may lose the employees they need most to fix the issues," Moussouris says.

Instead, we’ve ended up with a strange division between the sketchy-but-lucrative exploit markets and legit-but-cheap bug bounties. The same researchers participating in Apple’s bug bounty could make more money selling the same finds to an exploit broker. But they don’t, either because they believe in making software that doesn’t break, or because they’ve got mortgages that can’t be paid in cash.

Or if we’re really lucky, they might believe in siding with the faces over the boots, and making sure a nasty vulnerability doesn’t fall into the wrong hands. For all the abstract talk of markets and incentives, the Mansoor case is also a reminder of how much is at stake in getting that system right. It’s hard to say how much damage might have been caused if Mansoor had clicked on the spyware link — how many arrests, how much real pain inflicted on real people to prop up a corrupt and brutal state. The hope is that, when the next researcher finds the next bug, that thought matters more than the money.
