DeepMind Health have moved to counter data privacy concerns by introducing an audit trail for the company’s access to NHS patient data.

Google’s artificial intelligence offshoot announced on Thursday that it will develop a digital ledger called ‘verifiable data audit’, which will give trusts the ability to see how patient data is being processed in real time.

Mustafa Suleyman, DeepMind’s co-founder, told Digital Health News that the technology “should bring a level of transparency and oversight to the use and processing of health data that will improve the level of trust and accountability”.

The mechanism, which will be developed this year, would allow trusts to see not only when data is accessed, but also how and why it is used.

In the blog post announcing the technology, Suleyman and Ben Laurie, DeepMind’s head of security and transparency, wrote: “we want to make that verifiable and auditable, in real-time, for the first time”.

DeepMind is forming an expanding list of partnerships with NHS trusts, both through its clinical alerting app, Streams, and through its artificial intelligence research.

However, the company’s involvement in the NHS has been criticised by privacy advocates, concerned about both the scope of DeepMind’s access to NHS patient data and the transparency around it.

A New Scientist investigation in May last year reported that the agreement between the Royal Free London NHS Foundation Trust and DeepMind covered information on 1.6 million patients over five years.

This led to national media attention and a still ongoing Information Commissioner’s Office investigation.

Notwithstanding these concerns, the Royal Free extended the partnership in November last year, signing a new five-year deal, and Imperial College Healthcare NHS Trust signed up for the app in December.

DeepMind’s blog post compared the forthcoming audit mechanism to a blockchain, as the ledger would be append-only and would allow third parties to verify that no one has tampered with the entries.
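The comparison rests on a simple idea: if each entry in the log cryptographically commits to everything recorded before it, an outside party can detect any retroactive alteration. Below is a minimal Python sketch of that general construction; the entry fields (accessor, action, reason) and the class itself are hypothetical illustrations, not DeepMind’s actual design.

import hashlib
import json
import time


def entry_hash(entry: dict, prev_hash: str) -> str:
    """Hash an entry together with the previous entry's hash,
    chaining records so any later alteration is detectable."""
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()


class AppendOnlyAuditLog:
    """A toy append-only ledger: each record commits to all records before it."""

    GENESIS = "0" * 64

    def __init__(self):
        self._entries = []  # list of (entry, hash) pairs

    def append(self, accessor: str, action: str, reason: str) -> str:
        prev = self._entries[-1][1] if self._entries else self.GENESIS
        entry = {
            "ts": time.time(),
            "accessor": accessor,   # who touched the data
            "action": action,       # what was done with it
            "reason": reason,       # why it was accessed
        }
        h = entry_hash(entry, prev)
        self._entries.append((entry, h))
        return h

    def verify(self) -> bool:
        """Recompute the chain: a third party holding the entries can
        confirm no earlier record was altered or removed. Guarding
        against truncation of the tail additionally requires the
        latest head hash to be published externally."""
        prev = self.GENESIS
        for entry, h in self._entries:
            if entry_hash(entry, prev) != h:
                return False
            prev = h
        return True

A production system would likely use a tree-like structure rather than this flat chain, so that individual entries can be checked efficiently without replaying the whole log; the sketch above is simply the most basic tamper-evident construction.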

The three technical challenges identified in creating the technology were ensuring the log has no blind spots, making it usable by groups with different needs, and keeping it complete even though the underlying data is stored across different systems.

On the latter, Suleyman and Laurie said: “this doesn’t mean that a data processor like DeepMind should see data or audit logs from other systems”.

“Logs should remain decentralised, just like the data itself. Audit interoperability would simply provide additional reassurance that this data can’t be tampered with as it travels between systems.”

Suleyman told Digital Health News that he saw the technology as a “step change in the way we store and process large data sets”.

Jim Killock, executive director of Open Rights Group, said in a statement that it “seems like a very interesting attempt to improve auditing of the way the data is stored, copied and used”.

DeepMind Health is also overseen by a panel of Independent Reviewers, who meet four times a year but have yet to publish their first annual report on the company.

George Danezis, professor of security and privacy engineering at UCL, said in a statement that “enhancing such audit logs with high-integrity cryptographic controls, inspired by blockchains, provides a higher level of assurance that mistakes or violations of policy will be found, and unauthorized parties cannot hide their trails”.

DeepMind is also working with University College London Hospitals NHS Foundation Trust on a research project into head and neck cancer, which was announced in September.

DeepMind is a London-based AI company that Google bought for £360 million in 2014.