Amazon’s Alexa can handle patient information now — what does that mean for privacy?

A lot comes down to data use agreements between Alexa and its partners


Last week, Amazon announced that Alexa-enabled devices are now able to handle patient information. It’s an exciting update that could make aspects of health care more accessible, but questions remain around privacy and how Amazon will (or won’t) be able to use the data it now has access to.

Under the Health Insurance Portability and Accountability Act of 1996 (HIPAA), health care workers can typically share a patient’s health information only with the patient and others within the health care system. That means a business like Amazon, which is not explicitly a health care company, wouldn’t ordinarily be able to handle this data. Now, Amazon says that Alexa is able to follow HIPAA guidelines, and it has already invited six health companies to develop voice programs (or “skills”) using its Alexa system. Customers of those organizations can use Alexa to make an appointment or access personal medical information like blood sugar readings.

Now that Alexa is allowed to handle this data, what can it do with it? The answer depends on the specific agreements between Amazon and a given partner — which means there’s a lot we don’t know yet. In the case of Livongo, the company whose voice skill lets users check their blood sugar reading, Alexa is mostly a way to transmit the information. When a patient asks Alexa to check their blood sugar reading, the device pulls that data from the Livongo cloud and tells the patient. But the patient information is stored with Livongo, and Amazon can’t do anything with it, says Livongo chief product officer Amar Kendale. (The Verge reached out to all six companies developing voice skills. Many were not available or redirected data use questions to Amazon, whose reps did not answer questions as of press time.)

In other cases, the situation could be different. Pamela Hepp, co-chair of the Cybersecurity and Data Privacy Group at Buchanan Ingersoll & Rooney, says that it is possible that patient information could be shared and used to train one of Amazon’s artificial intelligence algorithms. Again, it depends on the agreement. The HIPAA Privacy Rule does require written authorization before someone’s health information can be used for marketing, adds Hepp, but what constitutes “marketing” is not as straightforward as one might think. For example, a company could theoretically use a patient’s information to tell them about new services, even if those services aren’t related to the patient’s health needs. In these cases, Alexa could be used to communicate that information, though Alexa couldn’t market Amazon’s own products.

It’s also important to note that there is no official certification process for becoming HIPAA compliant. “There’s no Good Housekeeping seal of approval” or formal process to prove that someone is now HIPAA compliant, according to Hepp. Rather, it is a self-implemented process.

Companies must be able to follow various HIPAA requirements, like the Privacy Rule, Breach Notification Rule, and Security Rule. Of these, the Security Rule can be the most difficult to satisfy. It covers areas like encryption and access controls governing who can see which types of information. To comply with the Security Rule, companies need a “robust security infrastructure” in place, explains Charlotte Tschider, a health law expert at DePaul University. This can be expensive for large companies, especially if they’re using older technology that needs to be updated.

Though there is no agency that certifies whether a company is actually following HIPAA, both Tschider and Hepp point out that the Office for Civil Rights, which is part of the Department of Health and Human Services, and the Federal Trade Commission are able to investigate allegations of HIPAA violations.

A lot of good could come out of this change, says Tschider. “This could be really, really beneficial for consumers,” she adds. “Historically, the harder it is to see a doctor, the less likely someone is to get care. The less access someone has to things that could improve their health, the less likely they are to tweak it.”

Kendale of Livongo, for instance, hopes that the new Alexa skill will make it easier for patients with diabetes to make smart nutritional choices, especially when they’re away from their blood glucose monitor. For example, a patient standing in the kitchen can ask Alexa for their last blood sugar reading and use that information to choose between two foods.

Providence St. Joseph Health is another company working with Amazon; its Alexa voice skill lets customers find an urgent care center nearby and make a same-day appointment. “I personally believe that voice is going to be a big, big deal in healthcare,” says Aaron Martin, chief innovation officer at Providence St. Joseph. The organization chose to build this particular skill because “we wanted to find the broadest use case possible,” Martin says. “The hope is that we get a lot of patients using the skill and it allows them to iterate and learn much faster about how to use the skill in a healthcare standpoint.” (Martin declined to comment on Providence St. Joseph’s data use agreement with Amazon.)

Still, Hepp points out that the more “entry points” there are into a medical system, the more risk there is for a cybersecurity breach. And Tschider says she’s concerned about the details of what is in those data use agreements: “I’m concerned about how a very large organization that also sells me stuff is going to use my health information.”