
Deleting your Siri voice recordings from Apple’s servers is confusing — here’s how

It’s way too hard


Illustration by Alex Castro / The Verge

Last night, Apple joined Google in suspending its program of having human graders listen to user voice commands recorded by its voice assistant. Apple didn’t specify whether the recordings themselves would stop. I asked and haven’t received a clear answer.

For all of Apple’s well-earned reputation for protecting privacy, sometimes the actual controls it provides users to manage their data are weak, opaque, or nonexistent. It’s ironic because Apple has a much better set of default technologies and policies when it comes to user data: in general, Apple tries to avoid having your data in the first place and makes it easier for you to stop sharing it with others.

But this issue of Siri recordings being saved on its servers — albeit anonymized — has revealed a new problem, one that Apple is going to need to do a better job of handling as it shifts more and more of its business to services. Because Apple doesn’t traffic in user data, the company doesn’t have the same experience Google, Amazon, and even Facebook do in offering users control over the data it does collect, and it certainly doesn’t have the same experience in dealing with privacy concerns when they do arise.

Everybody else has a web portal with easy ways to control and delete data

Amazon, Google, and even Facebook each have a specific web site where you can review data privacy settings for their assistants, delete data, and generally get information on what each company knows about you. Here they are, with the full URL written out (you should avoid blindly clicking any link that purports to take you directly to your account settings):

We have written up guides with more detailed instructions for deleting your data from both the Google Assistant and Amazon Alexa.

Apple does not offer a privacy portal website for your Siri data, nor any particular settings screen for it in an app. Its general privacy page is a big set of very clear explanations of what Apple’s policies are, but it has no specific information about your data and no checkboxes to delete it. The only thing you can do from Apple’s website is download or delete all of your data.

In part, this is a result of Apple’s unusual, device-focused infrastructure. It’s harder for Apple to make a web-based privacy portal when it focuses so much effort on keeping data on discrete devices.

Still, Amazon and Google make it relatively easy to delete your voice data from their servers. Google also allows you to turn off voice logging for its assistant at the links above, although doing so may break some features.

The day after this story was originally published, Amazon decided to give users the option to disable human review of their voice logs, but it does not (and never has) let you turn off the saving of your recordings by default. In short, you can delete them as often as you like, but you cannot prevent their upload with a setting.

Apple also doesn’t offer the ability to use Siri without your voice getting saved to its servers. Apple stresses that your recorded utterances are not associated with your Apple account, but that is cold comfort if you’re truly worried about a human contractor potentially hearing private information the HomePod accidentally heard in your house.

You can’t use Siri without having your voice recorded on Apple’s servers

It gets worse: while you can delete your utterances from Apple’s servers, the process for doing so is so completely unintuitive that the only way you could possibly learn how to do it is to Google it and find an article like this.

It’s possible the future update Apple promised last night will allow you to use Siri without having your voice saved on its servers. But read Apple’s statement carefully and you’ll see the opt-out is for “grading,” not necessarily recording: “Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

Apple’s most recent iOS security guide is a masterclass in explaining to consumers and security experts alike how an operating system keeps data private. But check out page 69, where Apple details its data retention policies for your voice recordings:

User voice recordings are saved for a six-month period so that the recognition system can utilize them to better understand the user’s voice. After six months, another copy is saved, without its identifier, for use by Apple in improving and developing Siri for up to two years. A small subset of recordings, transcripts, and associated data without identifiers may continue to be used by Apple for ongoing improvement and quality assurance of Siri beyond two years. Additionally, some recordings that reference music, sports teams and players, and businesses or points of interest are similarly saved for purposes of improving Siri. 

“What happens on your iPhone stays on your iPhone” has apparently become “What happens on Siri stays on Apple’s servers, potentially forever.”

How to delete your voice recordings from Apple’s servers

Photo by James Bareham / The Verge

Here’s how to delete your recorded voice remarks on an iPhone — but you’ll need to repeat similar processes on every Apple device you own. What you’re going to do is delete all of the information Apple gets from Siri, including recordings of your voice. But the way you do that isn’t by going to the Privacy section of your settings. Instead, do this:

  • Go to “Settings” > “Siri & Search”
  • Turn off all the ways there are to activate Siri. There are two: “Listen for ‘Hey Siri’” and “Press Side Button for Siri.”
  • When you turn off the last way to activate Siri, that effectively turns Siri off. You’ll get a warning that there is one more step you need to take to delete your data from Apple’s servers.
  • Go to “Settings” > “General” > “Keyboard.” Scroll down to where you see “Enable Dictation.” When you tick that to off, you’ll get a warning that if you ever want to use it again, you’ll have to go through some re-uploading.

Deleting your data involves checking nearly random boxes that don’t say anything close to “delete my data”

You might have been wondering why I am so strident that Apple has a problem here, but think about how weird the above steps really are. These are dark patterns on multiple levels.

First, the “Privacy” section of the iPhone’s settings doesn’t have anything related to deleting your Siri data. Second, the way you disable Siri isn’t to just uncheck “Use Siri,” but instead to uncheck what seem like nothing more than different options for activating Siri. Third and most egregious: there is no way to know that unchecking those options is how you delete data you might not want Apple to have. To figure that out, you have to click the “Our approach to privacy” link on Apple’s privacy page or, as I mentioned above, Google it.

And even if you get through all that, here’s the kicker: you have to go through all those steps on all of your Apple devices to delete Siri’s data, but turning Siri or Dictation back on means that data logging starts anew.

The only way regular users have to stop Apple from keeping voice recordings is to turn off Siri and Dictation and never use either of them again

Recently, 9to5Mac pointed to a downloadable iOS profile created by a security researcher that should stop server-side logging. It looks relatively innocuous to my untrained eye, but it’s never a good idea to just install profiles from the internet, so I recommend against it.

Enterprise users and schools have the option to build and install a profile themselves, using Apple’s Configurator tool, that supposedly disables server-side logging for Siri. Configurator is designed to help administrators manage small fleets of Apple devices, and it’s technically against Apple’s terms of service for consumers to use it on their own phones. It’s also fairly easy to click the wrong box in this tool and mess up your phone, so again I recommend against it.

Unfortunately, the only way regular users have to stop Apple from keeping voice recordings is to turn off Siri and Dictation and never use either of them again. Apple takes a strong privacy stance, but this is definitely an area where it should be doing better by its users.

This is all terrible. But here is the good news: Apple’s data practices are much, much more private than what Google or Amazon do. It doesn’t track you for advertising purposes across the entire web. It doesn’t want to know your location or what you’ve purchased.

Apple collects less information about you than the rest — but not zero

That is all great! But it is not a replacement for clear and obvious privacy settings for the data Apple has about you. Because Apple surely does know some things! It has an advertising business inside the App Store. It knows what Apple products you own. There’s that iCloud loophole, wherein it really can turn over your synced data to governments if legally required, just like everybody else.

Apple collects vastly less data about you than Facebook, Google, or Amazon do — but it’s not nothing. And the surprising-not-really-surprising revelation that it is storing recordings of your voice just like Google and Amazon is proof of that.

Apple has had fewer privacy scandals than everybody else in big tech (although there have still been some big ones). Apple is also trying to build technology that’s private by design. But Apple has a blind spot when it comes to giving users control over the data it actually does collect, and it has created some bad user interfaces because of it. So the very good news in all of this is that the complaint here is fundamentally about product design — something Apple ostensibly knows something about.

Privacy is not an all-or-nothing thing when it comes to technology. All the other big tech giants have learned that the hard way and had to radically improve the tools they offer to users to manage their data.

Now it’s Apple’s turn.

Correction, 12:15AM ET, August 2nd. The original version of this article said that it was possible to set a default where Alexa wouldn’t store your voice recordings on its servers. That was incorrect — you could only opt out of some human review of those recordings. Google is the only company of the three that allows you to, by default, set its Assistant to never store your voice recordings on its servers. The relevant section has been updated, and I regret the error.