DeepMind Health received NHS data on inappropriate legal basis, says Dame Fiona Caldicott

16 May 2017

Google’s artificial intelligence arm, DeepMind, received NHS patient data on an inappropriate legal basis, according to the National Data Guardian.

Dame Fiona Caldicott set out her concerns in a letter sent in February about Royal Free London NHS Foundation Trust’s sharing of patient data with DeepMind to test an application. Streams, an acute kidney injury alert app, was deployed at the trust in January this year.

First revealed by Sky News on Monday evening, the letter was sent to Stephen Powis, medical director at the Royal Free.

Dame Fiona says in the letter that she informed Royal Free and DeepMind in December that she “did not believe that when the patient data was shared with Google DeepMind, implied consent for direct care was an appropriate legal basis”.

“My considered opinion therefore remains that it would not have been within the reasonable expectations of patients that their records would have been shared for this purpose”.

The trust’s agreement with DeepMind hit the headlines in April 2016, when New Scientist reported that the AI firm had been given access to five years’ worth of data covering 1.6 million patients, most of whom had not had acute kidney injury.

The scale of the data transfer has been criticised by privacy groups and has led to an Information Commissioner’s Office (ICO) inquiry. On Tuesday, an ICO spokesperson said the investigation is “close to conclusion”.

The crux of the issue is the definition of “direct care”, for which it is accepted that data can be shared without seeking express consent each time. DeepMind argued that in developing the app it was providing such care to patients, and so had implied consent to use the data.

https://twitter.com/lexanderjmartin/status/864183460478541825

Dame Fiona’s letter makes clear she disagrees. “My view is that when work is taking place to develop new technology this cannot be regarded as direct care, even if the intended end result when the technology is deployed is to provide direct care”.

The letter continued: “Implied consent is only an appropriate legal basis for the disclosure of identifiable data for the purposes of direct care if it aligns with people’s reasonable expectations, i.e. in a legitimate relationship.”

She did not dispute the app’s ability to work successfully, and said that she recognised the usefulness of further guidance to companies working with new technologies and patient data.

Nicola Perrin, head of Understanding Patient Data, said such technologies offer potential to provide better patient care, “but there must be appropriate governance so that everyone can have confidence that patient data is being used responsibly”.

Perrin said Dame Fiona “raises an important question about the legal basis for using patient data to test a new technology, to ensure it is safe before it is introduced in clinical practice. Such testing is essential, but there must be clarity about the regulatory framework and transparency for patients.”

A spokesperson for the Royal Free said Streams is working successfully, and that the trust was proud of its work with DeepMind.

“We took a safety-first approach in testing Streams using real data. This was to check that the app was presenting patient information accurately and safely before being deployed in a live patient setting.”

“Real patient data is routinely used in the NHS to check new systems are working properly before turning them fully live. No responsible hospital would ever deploy a system that hadn’t been thoroughly tested. The NHS remained in full control of all patient data throughout.”

The spokesperson added that the trust “take seriously the conclusions of the NDG, and are pleased that they have asked the Department of Health to look closely at the regulatory framework and guidance provided to organisations taking forward this type of innovation, which is essential to the future of the NHS”.

On Monday, DeepMind said on its website that the company should have publicised its intentions before starting work with the Royal Free.

“We should also have done more to engage with patients and the public at that time, including by proactively providing information about how patient data would be processed, and the safeguards around it.”

Privacy campaign group medConfidential has been critical of DeepMind’s involvement with NHS trusts.

medConfidential’s Phil Booth said in a statement on Dame Fiona’s letter: “Every flow of patient data in and around the NHS must be safe, consensual and transparent. Patients should know how their data is used, including for possible improvements to care using new digital tools”.

“Such gross disregard of medical ethics by commercial interests – whose vision of ‘patient care’ reaches little further than their business plan – must never be repeated.”

The Office of the National Data Guardian (NDG) confirmed to Digital Health News the veracity of the letter, and a spokeswoman said that “while the ICO investigation is ongoing the NDG will provide any further assistance to the ICO as required, but will not be commenting further on the matter at this point”.

DeepMind Health provided Digital Health News with a statement, with a spokesperson saying that the data used to provide Streams “has never been used for commercial purposes or combined with Google products, services or ads – and never will be”.

The spokesperson said that safety testing is essential across the NHS, and welcomed Dame Fiona saying further guidance would be useful.

The company acknowledged: “We also recognise that there needs to be much more public engagement and discussion about new technology in the NHS”.

Since the original New Scientist report, DeepMind’s ventures in the NHS have been watched intently, with one group of academics saying “inexcusable” mistakes were made in the Royal Free deal.

This has not deterred other trusts from signing up to use its services: University College London Hospitals NHS Foundation Trust, Moorfields Eye Hospital NHS Foundation Trust, and Imperial College Healthcare NHS Trust all have agreements in place.

DeepMind has attempted to allay fears about health data transparency by creating a new data audit infrastructure and appointing a panel of independent reviewers to examine its work. The panel has yet to publish any findings.

This story was updated to include DeepMind’s statement – 16.05.17


5 Comments

  • I am sorry but they got it wrong. As an expert in the field of research, confidentiality and data protection, I have no doubt that the law is clear enough about this.

    The NHS Trust and DeepMind thought that if they took the risk then they would be able to convince us (after the fact) that what they were doing was legal.

    It wasn’t. They should take heed… the public expect their data to be handled with the utmost confidentiality and to be kept informed of how their data is being used – once your personal data is out there, you’ve pretty much lost control of it.

    I agree though, it’s about time we all had access to see who’s been dipping in and out of our records – ultimately giving people this option is the best way to ensure appropriate use.

  • In my personal and honest opinion … the NHS are the guardians of UK citizens’ health DATA, and the fact that in 2017 the NHS, which belongs to and is paid for by UK citizens, has failed to provide all UK citizens with access to their own health DATA is very bad. The NHS provide great clinical care. The fact that in 2017 the NHS has failed to provide all UK citizens with access to their health DATA cannot be blamed on the IT suppliers or the politicians, all of whom support the NHS; the accountability lies within the NHS, specifically senior leadership. Real health DATA can provide great insight into things such as the early diagnosis of problems and spotting financial trends. So much time and resource can be used on enforcing information governance, and that’s fine, but … the bottom line is that in 2017 UK citizens should have access to and control of their own health DATA, which would lead to increased efficiency. When IT comes to DATA I trust Google, who in my personal opinion are extremely powerful (but whose business is not power), not only to do a good job, but to do IT efficiently. The role of the National Health Service is to provide great clinical care, and they do that brilliantly.

  • I think the point is that trusts have to get explicit consent for research (as the law currently stands).
    It says it was identifiable…
    Maybe the law should be changed, but as it stands…

  • I’m not sure, if testing was the purpose, why anonymised data would not have been sufficient… otherwise I think it’s a bit of a gotcha for them, reluctant as I am to say that…

    • Er, it was anonymised?
      I think the point is not that it was testing. The point is that it was used in the development process. It’s different, but could, if you wanted to be really fussy, be called research. I think this is a poor judgement, by bad bad judges [please note the irony]
