Adrian Byrne, chair of the Digital Health CIO Network, muses on why things are simply not joined up enough in healthcare.

The events of 2020 have accelerated the appetite for data in all quarters. At least, that is how it feels from the viewpoint of someone who is largely asked to make provision for it: enhanced situation reporting changing on a daily basis, local and national drivers, and many people working on modelling.

The endless parade of charts and models in daily briefings has shown that when there is a need to follow the science, we also need to follow the data – next slide please!

However, the required pace of deployment, coupled with the use of what I would call consumer-based tools, means that large data collections can be struck up in an organisation without the CIO getting to know about them. Information flows risk becoming fragmented and duplicated as people respond to many masters filling out web forms and spreadsheets, often with poorly defined data items. Add to that the ability for just about anyone to produce a pie chart and present it in PowerPoint, and you have the recipe for a meltdown in any kind of data flow or data structure.

Defining a data point requires a certain pedantry that is often not present when a request is made, for example, for a predicted bed state.

Swamping the system

Wondering whether it was ever any different, I thought back to the old Data Set Change Notices (DSCNs). They are still there, in the form of Information Standards Notices, but the range of data collected that bypasses all of this seems to swamp the whole system. Be it maternity, cancer, situation reporting or Covid, we are in a spiral that can sometimes make you feel like you're in a washing machine – but is it real or perceived?

Certainly when you mention it to operational managers or other CIOs you get a bit of a groan. It is somewhat inevitable that the rather slow consultative approval processes and mechanisms for national data collection will be bypassed when the ask is changing on such a frequent basis, but we need to find a happy medium.

There used to be a simple mantra associated with data for reporting: it should always be a by-product of operational activity. I think that principle has gone. Now it seems it is so easy to throw up a web form with a simple user interface and drop-down menus that those who desire the data seem to disregard the number of combined hours it will take to collect it.

The trend does not just apply to data set collection. I remember a couple of years ago commenting on a referral process for extra CT/MR capacity that expected clinicians to log in separately and re-enter all of the criteria, with no provision to submit any electronic order. Without any integration back to reference data, the lack of data quality must call the validity into question anyway. The double whammy here is that if the data is required locally, then it has to be entered twice.

Giving information value

I know in theory we can carry out some kind of Robotic Process Automation (RPA) for transferring data, but really there is a right way and a wrong way, and new systems should not fall back on this mechanism. Nothing like this should be unleashed onto operational staff without the capability of accepting a proper interface connection from those who are able to provide one. Instead there is a temptation to provide a lowest common denominator for all, without sufficient consideration given to the value of information.

There is a chapter on this in Douglas W Hubbard's book 'How to Measure Anything' if you're interested, but the gist is quite easy to pick up: give information a value and think about what you're willing to pay to obtain it. The use of the information is often far removed from the "front line", and this lack of perception of operational issues can lead to a tendency to trivialise, or not even realise, that an extra task is required to create data. We hear a lot about the value of data for research, but very little about the cost of collection when a separate process is required to do so. Really, each request for information should come with a short piece of analysis of how much it cost to obtain, and this might help to moderate some behaviours.
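To make that point concrete, here is a rough back-of-the-envelope sketch of the kind of analysis I mean. Every figure below – the minutes per return, the number of wards, the hourly rate, and the value placed on the information – is a hypothetical illustration, not a real costing:

```python
# Illustrative sketch: estimate the hidden staff cost of a data
# collection run as a separate task, and compare it with the value
# the requester places on the information. All figures are made up.

def collection_cost(minutes_per_return: float,
                    returns_per_day: int,
                    staff_hourly_rate: float,
                    days: int) -> float:
    """Total staff cost of completing a return as an extra task."""
    hours = (minutes_per_return / 60) * returns_per_day * days
    return hours * staff_hourly_rate

# e.g. a daily bed-state web form: 15 minutes per ward, 40 wards,
# collected every day for a year
cost = collection_cost(minutes_per_return=15,
                       returns_per_day=40,
                       staff_hourly_rate=25.0,
                       days=365)

value_of_information = 50_000  # what the requester would pay for it

print(f"Annual collection cost: £{cost:,.0f}")
print("Costs more to collect than it is worth:", cost > value_of_information)
```

Even a crude calculation like this, attached to each request, would make the trade-off visible to the person asking for the data.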

Baked-in interoperability

Interoperability certainly helps, and can go a long way to eradicating the pain of data collection, but the concept of "baked-in interoperability" from Bob Wachter's review and report, Making IT Work: Harnessing the Power of Health Information Technology to Improve Care in England, seems to have been lost in the mists of time. Without the right design principles, of course, you can't bolt things onto systems that easily.

The reason you have to bake it in is that it is part of the ingredients of the cake, not the icing. Leaving it until the end means it never gets done, hence we have seemingly jumped to "interoperability as an afterthought".

Doing things right

Maybe I'm just in a moaning mood, but this tendency not to do things right seems to go right across the spectrum. I was in conversation with an auditor the other week, and I was on one of my pet hates – documents written as PowerPoint slides. I asked why they did this. The response was that they disliked doing them, but it was the so-called template and that's that. I'm tempted, once again, to refer you all to Dr Harvey's Road to Abilene from 1981, a parable about the consequences of going along with something, without questioning why, simply because it feels like everyone is already on the way (do I owe Joe McDonald £5?).

Therefore, it seems to me that whilst ready access to technology of all kinds is a great asset, it can also be a recipe for problems reminiscent of what happened once users got their hands on Microsoft Access a couple of decades ago. We never learn. Be it a "low code" environment, a spreadsheet, or even PowerPoint, we need to take more care and give more instruction on how things are used, because with the versatility now on offer, where you can make a silk purse you can also make a sow's ear.

It remains to be seen whether the response to the Topol review, Preparing the Healthcare Workforce to deliver the digital future, and digital literacy in general, will get us there.

Keeping control over Teams

Microsoft Teams is the latest threat in this regard. If we do not keep control of how teams and channels are created, what they are called and what they are for, we will have a chaotic environment that includes personally identifiable data in completely unstructured spaghetti, with no way of running any kind of retention policy.
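The kind of control I have in mind can be as simple as enforcing a naming convention when a team is provisioned. The sketch below is purely illustrative – the convention itself (department code, purpose, year) is a hypothetical example, not a recommendation from Microsoft or anyone else:

```python
import re

# Hypothetical naming convention for new teams: DEPT-Purpose-YYYY,
# e.g. "ICU-SitRep-2020". A check like this, applied at creation
# time, keeps teams findable and makes it possible to apply a
# retention policy later.
TEAM_NAME_PATTERN = re.compile(r"^[A-Z]{2,5}-[A-Za-z0-9]+-\d{4}$")

def is_valid_team_name(name: str) -> bool:
    """Return True if the proposed team name follows the convention."""
    return bool(TEAM_NAME_PATTERN.match(name))

print(is_valid_team_name("ICU-SitRep-2020"))  # follows the convention
print(is_valid_team_name("bobs team"))        # does not
```

The point is not the particular pattern, but that the rule exists, is written down, and is enforced before the data starts to accumulate.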

If we now start to include clinical information flows in Teams, this will exacerbate the problem: it will bring in subject access requests in addition to freedom of information requests, plus parts of a record that cannot easily be summarised into a view. Maybe we should stick to using Teams as the mechanism of transport and delivery, but not the store?

At Toyota they have a useful practice, part of the lean thinking methodology, called genchi genbutsu – literally "real location, real thing", but translated into English as "go see for yourself". Anyone doing anything in digital should always be made to go and see for themselves. Too often, assumptions are made, sometimes based on outdated practices, and observed in one place if at all. I've seen data sets that are not data sets, and data items that are undefined or ill-defined. I've seen requests for data items, and reports on data points, that are not collected. These can come from a local or national source. Sometimes there is a tendency to work on the basis that if you can think it, then you must be able to report on it.

I can't change or rectify any of this by writing such an article, I know that, but I have a simple request. When you are doing something [that involves information], think about the end-to-end process. Think about the extra time it may take. Think about interoperability and design it in, and don't release anything until it is truly baked in. Think about the toolset you're using – is it the right one?

There are plenty of memes showing the easy wrong choices versus the difficult right ones. Strategic solutions will usually cost more in the short term but this should not deter us from pursuing them. Let us not be driven by simple solutions using poor techniques but let’s try to do the sometimes difficult but right thing.