Apple’s hired contractors are listening to your recorded Siri conversations, too

Just like Alexa and Google Assistant


Photo by James Bareham / The Verge

Apple is paying contractors to listen to recorded Siri conversations, according to a new report from The Guardian, with a former contractor revealing that workers have heard accidental recordings of users’ personal lives, including doctor’s appointments, addresses, and even possible drug deals.

According to that contractor, Siri interactions are sent to workers, who listen to the recordings and grade them on a variety of factors, such as whether the request was intentional or a false positive that accidentally triggered Siri, and whether the response was helpful.

Apple isn’t very transparent about the recording process, or about who listens to the recordings

But Apple doesn’t explicitly say that it has humans listening to the recordings, and whatever admissions it does make to that end are likely buried deep in a privacy policy that few (if any) Siri users have ever read. Apple does note on its privacy page that “To help them recognize your pronunciation and provide better responses, certain information such as your name, contacts, music you listen to, and searches is sent to Apple servers using encrypted protocols,” but nowhere does it mention that human workers will be listening to and analyzing that data.

In a statement to The Guardian, the company acknowledged that “A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” Apple also noted that less than 1 percent of daily activations are analyzed under this system.

The fact that humans listen to voice assistant recordings isn’t exactly news; both Amazon (for Alexa) and Google (for Assistant) have been revealed to run similar programs in which human workers listen to recorded conversations to improve those assistants. It makes sense: a smart assistant obviously can’t tell the difference between a false positive and an actual query (if it could, it wouldn’t be a false positive), and anyone who’s used one can tell you that false positives are still very, very common at this stage of their evolution.

But until recently, it wasn’t clear just how extensively any of these three companies were listening in on customers.

Apple’s system may also be more concerning for a few reasons, starting with the pervasiveness of Apple products. Where Alexa is largely limited to smart speakers, and Google Assistant to speakers and phones, Siri also ships on Apple’s hugely popular Apple Watch, which sits on millions of people’s wrists every waking moment. Plus, Siri on an Apple Watch activates any time a user raises their wrist, not just when it thinks it has heard the “Hey, Siri” wake phrase.

The pervasiveness of Siri on Apple hardware may make the problem worse than it is with Google or Amazon

According to The Guardian’s source, that proliferation has led to some very personal conversations making their way to complete strangers working for Apple: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

Additionally, as The Guardian notes, while Amazon and Google allow customers to opt out of some uses of their recordings, Apple doesn’t offer a similar privacy-protecting option, outside of disabling Siri entirely. That’s a particularly bad look, given that Apple has built so much of its reputation on selling itself as the privacy company that defends your data in ways that Google and Amazon don’t. Effectively telling customers that the only way to be sure a random stranger won’t hear their accidentally triggered Siri recordings is to stop using Siri entirely is a mixed message from a company that supposedly puts privacy at a premium.

Short of giving up smart assistants altogether, there likely isn’t much Siri users can do to avoid the issue other than being careful about what they say around their iPhones and HomePods (unless public pressure pushes Apple to add an opt-out option). Still, it’s a good reminder that when you agree to use these products, you’re often giving up a lot more privacy than you think.