
How Apple could miss a privacy-invading FaceTime security flaw



Sometimes you need real users to spot the weirdest bugs


A major flaw in Apple’s FaceTime feature that allowed callers to eavesdrop on call recipients was widely publicized yesterday, close to three months after the flaw may have been introduced. Apple has made a point of billing itself as the privacy-conscious adult among the tech giants, and it usually employs strict security measures and a meticulous approach to hunting bugs. So in this case, why didn’t Apple catch such a major flaw before it went public?

Part of the problem is the rough history of the feature itself. The flaw involves FaceTime’s new group chatting feature, which Apple pulled from later iOS 12 betas and delayed until this past October. Three months is a long time for a bug this bad to be active, but it also means users haven’t had much time to discover this kind of weird behavior. (Apple has pulled the Group FaceTime feature pending a patch. We’ve reached out to Apple for comment about when and how it first heard about the bug, or whether the team independently discovered it.)

“You’re like, ‘Wow, how did that get through testing?’”

Another problem is the nature of the bug, which put it just outside the reach of conventional bug testing. Jake Williams, founder of Rendition Infosec, says the most basic form of bug testing is an automated process called “fuzzing,” which he says involves sending an improperly formatted input to see if the system breaks (e.g. inserting a 20-character password instead of the 15-character maximum). But the FaceTime bug dealt with a chain of unusual UI maneuvers rather than a particular input, so it would have passed through a fuzzing test unnoticed.
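The fuzzing idea Williams describes can be sketched in a few lines. The sketch below is purely illustrative: `check_password` is a hypothetical stand-in for the system under test, and the 15-character limit is borrowed from the article’s own example; a real fuzzer would target actual input-parsing code, not a toy validator.

```python
import random
import string

MAX_PASSWORD_LEN = 15  # hypothetical limit, mirroring the article's example


def check_password(password: str) -> bool:
    """Toy stand-in for the system under test.

    A real fuzz target would be an actual input-handling routine;
    here, over-long input simply raises, simulating a crash.
    """
    if len(password) > MAX_PASSWORD_LEN:
        raise ValueError("input exceeded buffer")  # the kind of failure fuzzing hunts for
    return password.isprintable()


def fuzz(trials: int = 1000) -> list[str]:
    """Throw randomly sized and shaped inputs at the target, recording any that crash it."""
    crashes = []
    for _ in range(trials):
        candidate = "".join(random.choices(string.printable, k=random.randint(0, 30)))
        try:
            check_password(candidate)
        except Exception:
            crashes.append(candidate)
    return crashes


crashing_inputs = fuzz()
# every recorded crash came from an input over the 15-character limit
assert all(len(c) > MAX_PASSWORD_LEN for c in crashing_inputs)
```

Note what this approach can and can’t catch: it exercises single malformed inputs, so a length-limit crash surfaces quickly, but a bug triggered only by a specific sequence of UI actions (like the FaceTime flaw) never enters the search space.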

The bug would have been more likely to turn up in quality assurance testing, which involves real-world use cases with real users. But calling your own phone number after starting a call with someone else is relatively rare, so it could have easily slipped through the cracks. Williams isn’t surprised that a random person found it before Apple’s own security team did.

“I think that’s how the bug got out the door,” he says. “We’ve seen a number of these over the years that you step back, and you’re like, ‘Wow, how did that get through testing?’ But then as you think about it, you’re like, ‘Well, it’s a fringe scenario and why would somebody have tested that? It’s one of those weird logic bugs.’”

“There are very creative hacker minds out there.”

Apple’s security team likely isn’t to blame, says Katie Moussouris, CEO and founder of Luta Security. Instead, the problem could lie in Apple’s response to bug reports from the public. She notes that Apple is among the mere 6 percent of Forbes Global 2000 companies that actually have a published way to report bugs.

“I think what they missed here was an opportunity to train their support staff and their social media staff on fast routing of security bugs, or potential security bugs, to the right team,” she says.

The bug was apparently reported by a curious teen eight days before it went public. The teen’s mom says she reported the bug through Apple support, and when she didn’t receive a response, she emailed and faxed a formal notice to the company. Apple, Moussouris speculates, might have seen these reports but focused on investigating the bug and trying to recreate it, instead of opening a line of communication with the reporter.

“This is a member of the public that is trying to report what was a quite serious security and privacy hole, and they struggled at the beginning to find the right contact,” she says. “So to them, Apple’s looking like a black box that’s not responsive, because they’ve already tried all these channels that seem logical. I think that Apple was likely in the process of doing its investigation, but it wasn’t really helped out by the fact that there were considerable delays at the beginning.”

That said, it’s unclear whether Apple ever sent a message saying it had received the report in the first place. Moussouris says the ISO standard for vulnerability disclosure only requires that the company acknowledge receiving the report, and notes that bugs can take a long time to remedy. Google’s Project Zero, which hunts for zero-day bugs, gives companies 90 days to fix a reported flaw before disclosing it publicly. The reporter of the FaceTime bug might not have known what action Apple was taking on its end.

Some of this mystery could be remedied if Apple and other companies set clear expectations around reporting, telling researchers that they’re looking into the bug and to keep the disclosure private.

“Ideally, organizations have tried to eliminate as many bugs from their own code before shipping as possible,” she says. “But after the fact, you know there are going to be bugs left over. There are very creative hacker minds out there, and, as you can tell, very creative teenagers.”