Just like every other tech company that got caught with its hand in the cookie jar this year — hey there, Amazon, Apple, Google, and Facebook — we recently learned that Microsoft had been quietly letting human contractors listen to your Skype translations and Cortana voice recordings. That’s right: the reviewers aren’t just AI.
But unlike Apple and Google, each of which halted human review of some of these recordings after the revelations, Microsoft appears to be merely updating its privacy policy to admit that yes, in fact, humans do review some of these recordings. One caveat here: Microsoft is only doing this for Skype’s translation feature, not ordinary Skype calls. The company is, however, analyzing voice snippets from Cortana requests and exchanges, presumably across all platforms including PC, where users may be more likely to search the web for sensitive topics.
Motherboard spotted the changes, which you can also read for yourself here, here, and here. Here are the key phrases that might clue you in:
Our processing of personal data for these purposes includes both automated and manual (human) methods of processing. Our automated methods often are related to and supported by our manual methods.
And:
To build, train, and improve the accuracy of our automated methods of processing (including AI), we manually review some of the predictions and inferences produced by the automated methods against the underlying data from which the predictions and inferences were made. For example, we manually review short snippets of a small sampling of voice data we have taken steps to de-identify to improve our speech services, such as recognition and translation.
And:
When you talk to Cortana or other apps that use Microsoft speech services, Microsoft stores a copy of your audio recordings (i.e., voice data) [...] This may include transcription of audio recordings by Microsoft employees and vendors, subject to procedures designed to prioritize users’ privacy, including taking steps to de-identify data, requiring non-disclosure agreements with vendors and their employees, and requiring that vendors meet the high privacy standards set out in European law and elsewhere.
It’s true that systems built using machine learning, like most modern voice-recognition and natural-language-processing systems, generally need to be audited by humans in order to improve — it’s not clear how a machine would identify its own false positives unless a human points them out, annotates the data, and feeds it back into the system. And to Microsoft’s credit, it offers a privacy dashboard where you can retroactively delete your voice data.
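To make that audit loop concrete, here is a minimal sketch of the human-in-the-loop cycle described above: sample a small fraction of model outputs for review, then fold human-corrected labels back into the training data. All names and data structures here are illustrative assumptions, not Microsoft's actual pipeline.

```python
import random


def sample_for_review(predictions, sample_rate=0.01, seed=42):
    """Select a small, reproducible sample of predictions for human review.

    In practice the sampled snippets would also be de-identified
    before any reviewer sees them.
    """
    rng = random.Random(seed)
    return [p for p in predictions if rng.random() < sample_rate]


def incorporate_corrections(training_data, reviewed):
    """Fold human-corrected labels back into the training set.

    Only snippets where the reviewer disagreed with the model
    (i.e. false positives / misrecognitions) add new training examples.
    """
    for item in reviewed:
        if item["human_label"] != item["model_label"]:
            training_data.append(
                {"audio_id": item["audio_id"], "label": item["human_label"]}
            )
    return training_data
```

The updated training set would then be used to retrain the model, closing the loop; the sampling step is why only "short snippets of a small sampling" of voice data ever reach a human.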
(Also, Cortana seems like it’s on the outs.)
But the scandal, with all of these tech companies, was that they never made it clear that humans (read: outsourced contractors) would be listening to extremely personal details, like people speaking their exact street address, confidential medical info, or sex noises into a voice assistant’s microphone, and never let us proactively opt out if we decided that wasn’t something we wanted to bring into our homes.
Apple says it’ll have a future update that lets its customers opt out. Will other companies do the same?
Comments
From the link in that quote… at the end of the article.
By sciwiz on 08.14.19 3:24pm
Calling this a scandal devalues the word scandal. Why exactly is having someone hear audio recordings of your search requests any different than the fact that Google knows every search request you’ve made using Chrome, Google Search, or your Android device?
Also…
Who is doing that? Why would you make sex noises at a virtual assistant?
By derek.tonkin on 08.14.19 4:33pm
Google isn’t a person. Your search data is held in encrypted form by Google and no Google employee has access to them. These recordings, on the other hand, are being heard by real people.
By ujwalsoni on 08.14.19 4:42pm
My God I hope you do not actually believe this.
What on earth would make you think that Google, who was one of the companies caught in this "scandal" of having humans review audio searches for QA is not doing the same thing with normal, typed searches? It seems like the real issue is that it is spoken as opposed to typed.
By derek.tonkin on 08.16.19 2:37pm
Because the phone wrongly thought it heard the activation keyword for the assistant while it was on your nightstand next to you getting freaky, then didn’t manage to detect any meaningful spoken commands, and so saved the recording and passed it on for manual review?
By trost79muh on 08.14.19 8:34pm
Welcome to machine learning data validation/testing.
By Demios on 08.14.19 8:27pm
Usually when you welcome someone to something, it’s because you’ve invited them and they’ve gone there of their own free will.
Unless you are a Bond villain.
By badasscat1 on 08.14.19 9:58pm
Sure, take a figure of speech that implies "of course, that’s mostly how ML works" literally.
By Demios on 08.16.19 11:28am
Oh come on! We all know that these people have nothing to listen to… no one except those two lads over at Windows Central uses Cortana!
By iWan_rulZ on 08.15.19 3:37am
Quit Apple
Quit Amazon
Quit Google
Quit Microsoft
Where are the cries???
By YCSMD on 08.15.19 11:19am
Why is anyone surprised by this? This is not a "scandal."
By daxus on 08.15.19 3:46pm
And there goes customers’ trust, into the garbage. First people blamed consumers for not reading the privacy policy, but now, even when customers do read the privacy policy, companies change the policy without informing them. I think this is unethical and not cool.
I don’t feel comfortable about it at all. I mean, there is a possibility that they have read our conversations. With that in mind, I am deleting my Skype account once and for all.
If you want to delete your Skype account, follow this guide: https://www.purevpn.com/internet-privacy/how-to-delete-skype-account
By techjunkie2018 on 08.17.19 10:21am