Google’s artificial intelligence arm, DeepMind, received NHS patient data on an inappropriate legal basis, according to the National Data Guardian.

Dame Fiona Caldicott raised her concerns in a letter sent in February about Royal Free London NHS Foundation Trust sharing data with DeepMind to test an application. Streams, an acute kidney injury alert app, was deployed at the trust in January this year.

First revealed by Sky News on Monday evening, the letter was sent to Stephen Powis, medical director at the Royal Free.

Dame Fiona says in the letter that she informed Royal Free and DeepMind in December that she “did not believe that when the patient data was shared with Google DeepMind, implied consent for direct care was an appropriate legal basis”.

“My considered opinion therefore remains that it would not have been within the reasonable expectations of patients that their records would have been shared for this purpose.”

The trust’s agreement with DeepMind hit the headlines in April 2016 when New Scientist reported that the AI firm had been given access to five years’ worth of data, covering 1.6 million patients, most of whom had not had acute kidney injury.

The scale of the data transfer has been criticised by privacy groups and has led to an Information Commissioner’s Office (ICO) inquiry. On Tuesday, an ICO spokesperson said the investigation is “close to conclusion”.

The crux of the issue is the definition of “direct care”, for which it is accepted that data can be shared without seeking express consent each time. DeepMind argued that in developing the app it was providing such care to patients, and so had implied consent to use the data.

Dame Fiona’s letter makes clear she disagrees. “My view is that when work is taking place to develop new technology this cannot be regarded as direct care, even if the intended end result when the technology is deployed is to provide direct care.”

The letter continued: “Implied consent is only an appropriate legal basis for the disclosure of identifiable data for the purposes of direct care if it aligns with people’s reasonable expectations, i.e. in a legitimate relationship.”

She did not dispute that the app works successfully, and said she recognised that further guidance would be useful for companies working with new technologies and patient data.

Nicola Perrin, head of Understanding Patient Data, said such technologies offer potential to provide better patient care, “but there must be appropriate governance so that everyone can have confidence that patient data is being used responsibly”.

Perrin said Dame Fiona “raises an important question about the legal basis for using patient data to test a new technology, to ensure it is safe before it is introduced in clinical practice. Such testing is essential, but there must be clarity about the regulatory framework and transparency for patients.”

A spokesperson for the Royal Free said Streams is working successfully, and that the trust was proud of its work with DeepMind.

“We took a safety-first approach in testing Streams using real data. This was to check that the app was presenting patient information accurately and safely before being deployed in a live patient setting.”

“Real patient data is routinely used in the NHS to check new systems are working properly before turning them fully live. No responsible hospital would ever deploy a system that hadn’t been thoroughly tested. The NHS remained in full control of all patient data throughout.”

The spokesperson added that the trust “take seriously the conclusions of the NDG, and are pleased that they have asked the Department of Health to look closely at the regulatory framework and guidance provided to organisations taking forward this type of innovation, which is essential to the future of the NHS”.

On Monday, DeepMind said on its website that the company should have publicised its intentions before starting work with the Royal Free.

“We should also have done more to engage with patients and the public at that time, including by proactively providing information about how patient data would be processed, and the safeguards around it.”

Privacy campaign group medConfidential has been critical of DeepMind’s involvement with NHS trusts.

medConfidential’s Phil Booth said in a statement on Dame Fiona’s letter: “Every flow of patient data in and around the NHS must be safe, consensual and transparent. Patients should know how their data is used, including for possible improvements to care using new digital tools.”

“Such gross disregard of medical ethics by commercial interests – whose vision of ‘patient care’ reaches little further than their business plan – must never be repeated.”

The Office of the National Data Guardian (NDG) confirmed to Digital Health News the veracity of the letter, and a spokeswoman said that “while the ICO investigation is ongoing the NDG will provide any further assistance to the ICO as required, but will not be commenting further on the matter at this point”.

DeepMind Health provided Digital Health News with a statement, with a spokesperson saying that the data used to provide Streams “has never been used for commercial purposes or combined with Google products, services or ads – and never will be”.

The spokesperson said that safety testing is essential across the NHS, and welcomed Dame Fiona saying further guidance would be useful.

The company also acknowledged that “there needs to be much more public engagement and discussion about new technology in the NHS”.

Since the original New Scientist report, DeepMind’s ventures in the NHS have been watched intently, with one group of academics saying the Royal Free deal made “inexcusable” mistakes.

This has not deterred other trusts from signing up to use its services, with University College London Hospitals NHS Foundation Trust, Moorfields Eye Hospital NHS Foundation Trust, and Imperial College Healthcare NHS Trust all having agreements in place.

DeepMind has attempted to allay health data transparency fears by creating new data audit infrastructure and appointing a panel of independent reviewers to examine its work. The panel has yet to publish any findings.

This story was updated to include DeepMind’s statement – 16.05.17