
Google’s DeepMind and UK hospitals made illegal deal for health data, says watchdog


The ruling concerns a 2015 agreement the AI subsidiary made with UK hospitals that has since been replaced


A deal between UK hospitals and Google’s AI subsidiary DeepMind “failed to comply with data protection law,” according to the UK’s data watchdog. The Information Commissioner's Office (ICO) made its ruling today after a year-long investigation into the agreement, which saw DeepMind process 1.6 million patient records belonging to UK citizens for the Royal Free Trust — a group of three London hospitals.

The deal was originally struck in 2015 and has since been superseded by a new agreement. At the time, DeepMind and the Royal Free said the data was being shared to develop an app named Streams, which would alert doctors if patients were at risk of a condition called acute kidney injury. An investigation by the New Scientist revealed that the terms of the agreement were broader than originally implied. DeepMind has since made new deals to deploy Streams in other UK hospitals.

Today, the ICO said it had found “a number of shortcomings” with the agreement, particularly that patients had not been fully briefed on how their personal data would be used. In a press statement, the UK’s information commissioner Elizabeth Denham said that the “price of innovation does not need to be the erosion of fundamental privacy rights.”

Patients were not asked if they consented to having their medical data processed by DeepMind. The information shared included details of drug overdoses, abortions, and whether individuals were HIV positive. DeepMind and the Royal Free have argued that patients had given “implied consent” to sharing, because this information would be used to deliver “direct care” via the Streams app.

Today’s ruling suggests that the two institutions did not go far enough. “Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening,” said Denham.

The contract was always clear that no private data would ever be shared with DeepMind’s parent company Google, which bought the firm in 2014. Nor would machine learning or AI tools be used to analyze this information. (DeepMind is, however, involved in two separate deals with UK hospitals to develop AI-powered algorithms for improving cancer treatment and eye disease.)

DeepMind says it welcomes the ICO’s “thoughtful resolution” of the case, and admits it made a number of mistakes during its original deal. The company says it should have done a better job of explaining the deal to patients and the public, and that it “underestimated the complexity of the NHS and of the rules around patient data.”

In a blog post by the ICO, the watchdog stated that in the rush to innovate, institutions like the Royal Free Trust should not forget to follow the law. The Trust has been asked to sign a new agreement committing it to act in accordance with the law and commission an audit of the 2015 trial. “When you’re setting out to test the clinical safety of a new service, remember that the rules are there for a reason,” writes Denham.