A year before leading the Conservative party to (partial) election success in 2010, David Cameron spoke to the faithful at conference and said "we will not make it if we pull in different directions, follow our own interests, take care of only ourselves."
This became known as the ‘we're all in this together’ speech (although just how genuine the desire to share the pain really was is highly debatable), and there's a message here for healthcare IT.
Unless those employed at the coalface of care delivery and the deliverers of secure products and processes can agree that we're all in IT together, the future of a secure NHS looks about as assured as a smooth break with Brussels.
Logged-in culture
The truth is that when it comes to data security, healthcare is unlike almost any other sector. Conventional thinking, in which security has to be treated as a business priority, gets kicked to the kerb.
When your business is saving lives, patient care – quite rightly – rises to the top and nothing must interfere with that. Unfortunately, this conflict can and does hurt both data security and patient care.
The insecure practice of leaving a computer terminal up and running and logged in, because it saves a few seconds over inserting a smartcard, is one good example.
This is probably the single most common insecure practice, and one that we've all (health workers and patients alike) seen in action on hospital wards, in clinics and at GP surgeries.
The scale of the issue is laid bare by a study (of US healthcare providers, but that doesn't really change anything in hands-on terms) called ‘Workarounds to Computer Access in Healthcare Organizations: You Want My Password or a Dead Patient?’, which highlights just why.
What is really interesting about the study is that it shows this practice has some other, unintended consequences.
For instance, the case study described a patient being prescribed the wrong medication as the doctor concerned didn't realise he was looking at a different case file when he went back to the computer.
Was that doctor to blame for a clinical mistake that should never have happened? Sure. If you want to apportion blame. However, I'd say that's less than helpful.
You could just as easily 'blame' whoever designed a system that left doctors feeling the log-in process was less a patient data safety mechanism and more an annoying encumbrance to clinical workflow.
One nurse in the study spoke of spending around 90 minutes of a 14-hour day logging into assorted systems. No wonder the study talks of a logged-in culture being regarded as a professional courtesy to whoever needs to use the tech next.
Minutes add up to hours
Back in the UK, in an online tech user forum discussing this very report, a doctor chipped in to comment that the fastest switch they had seen from one user to another on an NHS computer was about 45 seconds.
The trouble was that the longest switch was three minutes. In an emergency department with half a dozen computers shared by ten doctors, as many nurses again, and speciality teams passing through, that's "an awful lot of time being wasted switching users."
Unsurprising, then, that during an emergency a healthcare professional will sometimes use someone else's account because there isn't time to mess around.
Further down the same thread, itself a highly interesting example of how health and IT professionals approach the same problem from contradictory perspectives, the same clinician user shares more helpful insight.
They note that it isn’t just a case of logging into a workstation, but logging into anything from three to fifteen different clinical information systems depending on your particular speciality.
Many of them, in the user's experience, are web-based and often tied to an ancient version of Internet Explorer complete with obsolete ActiveX controls, and they all require different passwords. Not just different passwords, but different password management requirements, lengths and expiry dates.
In another forum thread, a user with experience on the IT management side of a hospital recalls how one audit showed that many PCs in public spaces, including wards and the A&E department, were routinely left logged in with doctors' credentials.
As a result, he proposed that the computers should automatically lock down after a brief period of inactivity to protect patient data privacy and system security. Senior management responded that the additional time to log back in would compromise patient safety, so the idea was not approved.
Not that it would have mattered if the US study is anything to go by: doctors there put plastic cups over sensors to beat the movement detectors that locked terminals down when not in use, and one ward tasked the most junior doctor on duty with tapping the spacebar every five minutes to bypass the system.
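For the technically curious, the mechanism that IT manager proposed, and that the plastic cups and spacebar-tapping defeated, is conceptually simple: a timer that resets on user activity and locks the session once it expires. Here is a minimal, purely illustrative sketch in Python; the class name and the flag-based "lock" are my own inventions (a real deployment would hook into the operating system's lock screen and its input events, not an application-level flag).

```python
import time
import threading

class IdleLock:
    """Illustrative inactivity lock: flips a 'locked' flag after a
    period with no recorded activity. Not a real screen-lock -- a
    genuine implementation would trigger the OS lock screen."""

    def __init__(self, timeout_seconds: float):
        self.timeout = timeout_seconds
        self.locked = False
        self._last_activity = time.monotonic()  # monotonic clock: immune to wall-clock changes
        self._mutex = threading.Lock()

    def record_activity(self):
        # Called on every keypress, mouse move or smartcard tap.
        with self._mutex:
            self._last_activity = time.monotonic()
            self.locked = False

    def check(self):
        # Called periodically (e.g. once a second) by a watchdog thread.
        with self._mutex:
            idle = time.monotonic() - self._last_activity
            if not self.locked and idle >= self.timeout:
                self.locked = True  # real code: invoke the lock screen here
            return self.locked

# Demonstration with a fraction-of-a-second timeout (a ward would use minutes).
session = IdleLock(timeout_seconds=0.2)
assert session.check() is False   # just created, still active
time.sleep(0.3)
assert session.check() is True    # idle past the timeout: locked
session.record_activity()         # the junior doctor taps the spacebar...
assert session.check() is False   # ...and the lock never engages
```

The final three lines show exactly why the spacebar workaround in the US study works: any recorded activity, however meaningless, resets the clock.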
Bending the curve, learning from David Cameron
Security exists on a curve with convenience at one end and defence at the other. The trick is finding a way to bend that curve so that one end touches the other.
In order to achieve anything close to that, the IT people have to understand the needs of the clinicians who, in turn, must appreciate the difficulties facing the techies. But when it comes to the scenarios described above, that appears to be your problem right there: they are not all in this together.
The study authors get it right in their abstract when they say "understanding workarounds to healthcare workers' computer access requires not only analyses of computer rules, but also interviews and observations with clinicians."
The IT guys have to understand not only the clinical workflow but that lives actually are put on the line at the end of the day.
Equally, clinicians must embrace the realisation that a culture of security is required to protect patients, staff and establishment alike; lives are increasingly being put at risk by the insecure use of technology. Management, as the final part of this medical ménage à trois, is the oil that can smooth this mechanism into place.
As long as doctors treat IT as idiots, and vice versa, the only winners are going to be hackers and lawyers…
About the author: Davey Winder is a three time Information Security Journalist of the Year award winner, and regularly contributes to The Times as well as being Managing Analyst at IT Security Thing. @happygeek