The idea of measuring digital maturity has something of a chequered history in the NHS.

Back in the days when we were all supposed to be getting comprehensive digital systems off the back of the National Programme for IT, the idea didn’t seem all that relevant.

We would all receive a comprehensive estate of systems; and that would be that. At least, that was the plan. Following the failure of NPfIT, we need to think again.

As I’ve noted before in this column, while there are more than 30 hospitals in Europe that have achieved HIMSS level 6 or 7, none of them are in the UK. By inference, we may take it that we are relatively immature in our e-health adoption.

However, HIMSS is a historic, US-originated model that requires systems to be implemented in a pre-determined sequence. In practice, there are significant differences between the healthcare systems of the US and the UK, and between individual hospitals, when it comes to the adoption of electronic workflows.

There is now another metric available that takes some account of this. All acute trusts in England are ranked on the Clinical Digital Maturity Index, which was launched by EHI last year, with the backing of NHS England.

I was recently invited by EHI to attend the inaugural meeting of the CDMI steering group, along with colleagues from around the country – chief information officers, chief clinical information officers, and other healthcare IT leaders from both trusts and industry.

At that meeting, I put myself forward to chair the group and had the honour of being accepted. The other good news is that a fellow clinician, Mike Fisher, the chief clinical information officer of Royal Liverpool and Broadgreen University Hospitals NHS Trust, will be my deputy chair.

It is so important that clinicians take a lead role in driving the healthcare IT agenda. From the above, you could reasonably infer that I think that measurement is worthwhile, but a note of caution first…

What do IQ tests measure?

IQ tests are some of the best-known tests going. However, they do not measure your intelligence: instead, they measure your ability to correctly answer IQ tests.

The aspiration is that there will be a strong correlation with your true intelligence, but there are numerous pitfalls. IQ tests have previously been used as “evidence” of the inferior intelligence of other (non-white) races, and similar pernicious nonsense.

But the flaw in such arguments is obvious. An important factor in how well you do in an IQ test is how familiar you are with the type of questions they contain.

You need to be educated in some mathematical concepts and to be familiar with the abstract representation of concepts that are used in the tests. Education and cultural background are therefore major determinants of ability to score well in IQ tests.

How mature are we?

With that health warning in mind, I think having a nationally recognised and comprehensive metric for measuring our “digital maturity” is of great value.

Being able to benchmark your trust against other organisations and understand the deficiencies in your own system implementation is an important driver to make things better.

There is much work to do out there. One statistic that I recently came across is that while 70% of trusts have some form of electronic prescribing, only 13% have comprehensive inpatient e-prescribing.

Implementing e-prescribing solutions isn’t easy, and the risks involved are obvious. At Liverpool Heart and Chest Hospital NHS Foundation Trust we are in the unusual (and I believe unique in the UK) situation of switching e-prescribing solutions. In so doing, we removed paper processes for injectables, infusions and drugs that are challenging to prescribe, such as warfarin.

How mature is the tool?

Our situation demonstrates why the CDMI needs developing. At present it is a survey of some of the main administrative and clinical systems that you’d expect a trust to have in place; but what actually matters is delivered functionality.

Our new e-prescribing solution has more functionality than our old one, in that it covers processes that were previously partly completed on paper.

It also has a better user interface and the benefit of being an integral part of our electronic patient record for workflow and clinical decision support functionality. However, as things stand, we don’t get more points in the CDMI for our new system than for our old one.

Another stark example for us at Liverpool Heart and Chest Hospital is scheduling systems. Our catheter labs use a system that was developed in-house. It’s a great, real-time workflow solution for planning cases and it acts as a communication tool on the day of the procedure.

In the operating theatres, my surgical colleagues are still using an old, simple, web-based database that has a poor user interface and none of the workflow benefits of the catheter lab system.

Both applications “tick the box” as a scheduling solution, and yet one is widely recognised internally as vastly superior to the other.

This challenge of measuring delivered functionality has led an organisation called KLAS to take a radically different approach, focusing on the satisfaction of healthcare providers with their suppliers and the perceived delivered benefits of healthcare technology.

Obviously, this leads to a rating of suppliers and of their technology, rather than to a measure of the digital maturity of provider organisations; but the juxtaposition with the current CDMI approach is illuminating.

Outcome improvements are the true goal

So, functionality matters, and the presence of systems is an imperfect surrogate for that. But we actually need to dig even deeper. What really matters is improving outcomes.

Patients don’t care whether we use HL7 or XML for messaging, nor do they have the slightest interest in any of our other technical discussions.

What interests them is that their data is available to the clinician caring for them, and that there are powerful clinical decision support systems to flag potential problems early. They enjoy the engagement created by being able to access their own records.

From an organisational perspective, clinical systems can help drive down length of stay because those same decision support systems facilitate the timely referral of patients, and because the clinical decision support engine can help to prevent avoidable harm. They can also reduce administration, audit and clerical overheads.

Still a good start

It might sound as though I feel that the CDMI is a long way off the mark in what it is measuring, but that isn’t the case and that wouldn’t be a fair criticism.

The CDMI as it stands today has two huge distinctions (and these obviously pre-date my own involvement). First, it exists as a metric of UK healthcare IT implementation, and second, it is a comprehensive survey of the UK’s acute trusts. In both respects, there is nothing else out there that is comparable.

I am keen to build on what I regard as foundations and to develop something that begins to re-focus on measuring delivered functionality and outcome improvements. But Rome wasn’t built in a day…

The Clinical Digital Maturity Index has been developed by EHI Intelligence. Trusts can check their CDMI data and access new dashboard tools for free. Anybody interested in the steering group and the further development of the CDMI should contact Karl Grundy.

 

Dr Johan Waktare

Dr Johan Waktare is a consultant cardiac electrophysiologist at Liverpool Heart and Chest Hospital, specialising in interventional procedures for heart rhythm disorders. He is the clinical lead on the trust’s electronic patient record project, as well as being a clinical lead for IT and the trust’s Caldicott Guardian.

A self-confessed IT geek, Dr Waktare has always been interested in computer hardware and software. His status was cemented when, several years ago, the IT helpdesk agreed to replace a user’s PC rather than look at it – after hearing that he had failed to repair it.