Apple has said that it will temporarily suspend its practice of using human contractors to grade snippets of Siri voice recordings for accuracy. The move follows a report in The Guardian in which a former worker detailed the program, claiming that contractors “regularly hear confidential medical information, drug deals, and recordings of couples having sex” as part of their job.
“We are committed to delivering a great Siri experience while protecting user privacy,” an Apple spokesperson says in a statement to The Verge. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
Apple did not comment on whether, in addition to pausing the program in which contractors listen to Siri voice recordings, it would also stop saving those recordings on its servers. Currently, the company says it keeps recordings for six months, then removes identifying information from a copy that it may retain for two years or more.
The purpose of the grading program is to help improve the accuracy of Siri voice recognition and prevent accidental triggers. “A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple initially told The Guardian. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
But the company’s terms of service were less than clear about the reality that humans outside Apple are listening in on Siri, noting only that “certain information such as your name, contacts, music you listen to, and searches is sent to Apple servers using encrypted protocols.”
Apple also didn’t offer users any way to opt out beyond disabling Siri altogether. Competing voice assistants from companies like Amazon and Google also use human review to improve their accuracy, but those companies have allowed users to opt out.