Hospitals need to think far beyond electronic records and must surpass narrow approaches to AI, says Neil Perry. The director of digital transformation at Dartford and Gravesham NHS Trust details his mission to do things differently.

Dartford and Gravesham NHS Trust is in many ways a ‘normal’ NHS trust. If you look at our size, our footprint, the services we provide, and our resource for areas like digital, we are comparable with around 80% of hospital trusts across the health service.

But our plan for advancing digital maturity is not the norm, and our approach could mark out a new trail for others in the NHS to follow.

A strategy for leapfrogging

In 2017, we reset our digital strategy. The mission: to build on existing technology, to leapfrog other trusts and to become a self-made digital exemplar.

There was a long-standing feeling amongst our clinicians that IT systems were for the bean-counters. We wanted to change that and give clinicians the tools they would find most useful.

With a good patient administration system already in place, we created a blueprint that would focus on delivering clinical functionality fast.

We became the first trust in the country to invest in a technology called Miya Precision from our health tech partner Alcidion.

This platform will give us the tools to complete a large part of our immediate digital jigsaw, and will help us to realise our longer-term strategy as we move forward from traditional thinking.

What we are doing immediately 

Right now, we are readying the platform to push and pull atomic-level data from all of our IT systems and from further afield – for example, GP systems – so we can use all of that data for sophisticated clinical decision support at the point clinicians need it.

The natural language processing and e-noting capabilities that wowed clinicians when they chose to procure this technology will make free text a clinical asset.

We will be able to codify free text in a way that many other hospitals don't, and push it through rules engines that will then suggest appropriate actions and pathways for clinicians – automating order sets and prescribing decisions based on symptoms and the clinical narrative.
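
To make that concrete, here is a minimal sketch of the kind of rule-matching step a rules engine might perform on codified free text. It is an illustration only – the concept codes are standard SNOMED CT examples, but the rules and order sets are invented and are not the trust's or Alcidion's actual logic:

```python
# Illustrative sketch only: the rules and order sets below are simplified examples
# and do not reflect the trust's or Alcidion's actual decision-support logic.

# Coded concepts extracted from a free-text note (e.g. by NLP), keyed by concept code.
extracted_concepts = {
    "267036007": "Shortness of breath",   # SNOMED CT: dyspnoea
    "13645005": "COPD",                   # SNOMED CT: chronic obstructive pulmonary disease
}

# Clinician-authored rules: if every required concept is present, suggest an order set.
RULES = [
    {
        "name": "Acute breathlessness work-up",
        "requires": {"267036007"},
        "order_set": ["Chest X-ray", "Arterial blood gas", "ECG"],
    },
    {
        "name": "COPD exacerbation pathway",
        "requires": {"267036007", "13645005"},
        "order_set": ["Nebulised bronchodilators", "Corticosteroid review"],
    },
]

def suggest_order_sets(concept_codes: set[str]) -> list[dict]:
    """Return every rule whose required concepts all appear in the coded note."""
    return [rule for rule in RULES if rule["requires"] <= concept_codes]

for rule in suggest_order_sets(set(extracted_concepts)):
    print(f"Suggested: {rule['name']} -> {rule['order_set']}")
```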

At the technical end, standards like FHIR, and the ability to convert all of our data to them, are the means by which we will get information and insights to where they need to be. But clinicians don't really care about that – they just want something that works well.
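
For readers less familiar with FHIR, the sketch below shows the general shape of that conversion: a respiratory-rate reading from a local system mapped into a minimal FHIR R4 Observation resource, using standard LOINC and UCUM codes. The local record layout and patient identifier are invented for illustration:

```python
import json

# A reading as it might sit in a local system (field names and values invented for illustration).
local_reading = {
    "patient_id": "example-123",
    "metric": "resp_rate",
    "value": 22,
    "taken_at": "2021-03-01T09:30:00Z",
}

# Map it into a minimal FHIR R4 Observation so any FHIR-aware system can consume it.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "9279-1",                      # LOINC: respiratory rate
            "display": "Respiratory rate",
        }]
    },
    "subject": {"reference": f"Patient/{local_reading['patient_id']}"},
    "effectiveDateTime": local_reading["taken_at"],
    "valueQuantity": {
        "value": local_reading["value"],
        "unit": "breaths/minute",
        "system": "http://unitsofmeasure.org",     # UCUM units
        "code": "/min",
    },
}

print(json.dumps(observation, indent=2))
```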

That is what we are delivering – information that translates into active, actionable insights rather than static data, flowing in new ways across our existing and future systems. This will help to automate organisation-wide actions and reduce the cognitive and administrative burden on our clinicians and the wider teams that support them.

Thinking beyond EPRs to fully connected intelligence

When you look at big electronic patient record (EPR) implementations – often menu-driven systems – clinicians don't always fully engage with their IT.

We want to deliver something that our clinicians do value. Natural language processing is the start of that: it will mean we can extract value from the narrative of what the clinician has seen and the hands-on investigations they have done. We can then start to expose entire volumes of insight on different conditions and permutations, from their clinical knowledge to rules engines, algorithms and artificial intelligence.

In most hospitals, this type of untapped intelligence still resides in paper-based systems. But with Miya Precision, we can take that clinical knowledge and channel it, using the platform as a central neural network. Data that was previously unstructured will be transformed into usable, structured information, and patterns of clinical insight will emerge that can be accessed and shared across hospital teams in a new and practical way.

We want our technology to be able to intelligently advise a junior or locum doctor in A&E on the things they need to do for a patient who is presenting with shortness of breath and has a number of comorbidities, for example. Once certain tests and imaging have been completed, our technological neural network will collate the findings – whether from clinicians or from AI algorithms assessing those examinations – and advise healthcare staff of the things that need to be monitored or carried out.

This will follow logic that senior clinicians have programmed in, or that machine learning has derived from the data of many thousands of other patients on similar pathways, identifying what was done that led to an improved outcome.

It will mean we can create risk scores that move us on from traditional early warning scores like NEWS2. And we can start to build new clinical calculators off the back of that data that can predict, segment and signpost people to the correct, fastest path for the best treatment.
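
As a deliberately simplified illustration of what such a calculator could look like, the sketch below combines a few observations and comorbidity flags into a score and an escalation suggestion. The weightings, thresholds and bands are invented for the example – this is not NEWS2 and not a validated clinical tool:

```python
# Hypothetical, simplified risk calculator for illustration only.
# The weightings, thresholds and escalation bands are invented; a real
# calculator would be clinically validated before use.

def breathlessness_risk_score(resp_rate: int, spo2: int,
                              has_copd: bool, has_heart_failure: bool) -> int:
    """Combine a few observations and comorbidity flags into a single score."""
    score = 0
    if resp_rate >= 25:
        score += 3
    elif resp_rate >= 21:
        score += 2
    if spo2 < 92:
        score += 3
    elif spo2 < 96:
        score += 1
    if has_copd:
        score += 1
    if has_heart_failure:
        score += 1
    return score

def escalation_advice(score: int) -> str:
    """Map the score onto an invented escalation band."""
    if score >= 6:
        return "Urgent senior review and continuous monitoring"
    if score >= 3:
        return "Increase observation frequency; consider early review"
    return "Routine monitoring"

score = breathlessness_risk_score(resp_rate=26, spo2=93, has_copd=True, has_heart_failure=False)
print(score, "->", escalation_advice(score))
```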

From digital neural networks to clinical artificial general intelligence

The age-old debate in healthcare IT – whether hospitals should pursue a best-of-breed approach to digital or a monolithic EPR strategy – is now moving on to something very different.

EPRs are just the first step. Models like HIMSS EMRAM – used to monitor electronic medical record adoption worldwide – may need to evolve to encapsulate the adoption of more modern technologies and the creation of digital neural networks, or central intelligence hubs, within hospitals.

These hubs need to be able to embrace data from everywhere, whether that's a traditional record system or directly from a device.

AI algorithms are being developed that can detect whether a patient has asthma or COPD directly from the audio of their breathing patterns. Sleep apnoea can be detected from smartphones rather than bringing patients in for overnight studies. There is potential to capture data through sensors, audio and imaging, all brought into a central intelligence hub with the in-built rules to speed up pathways. This could be life changing for patients if we can diagnose their cancer and have them on an MDT list in a matter of hours or days, rather than the weeks or months it can take today.

This also requires a rethink of current approaches to AI, beyond the very narrow and specific purposes that algorithms have in healthcare today. Joining up and connecting different AI algorithms through intelligence hubs could pave the way for a clinical artificial general intelligence – potentially a situation where AI can look for as many things as a clinician can when exposed to patient data and images.

Doing all of this requires a co-ordinated, joined-up but agile approach. A single monolithic approach would take years, or even decades – and our clinicians and patients need change much more quickly than that. Working with innovative partners that are pushing boundaries, we hope to be the first to do things that deliver for our users' needs and are replicable for others across health and care.