Doctors texting clinical patient data from their smartphones is just wrong. Surely everyone can see that, so why is mobile security proving so stubbornly problematic in the NHS?
It’s not as if there are no alternative approaches. Just the other week, Digital Health Intelligence had news of an award-winning clinical imaging app developed in conjunction with University Hospitals Birmingham NHS Foundation Trust.
The Secure Clinical Image Transfer app has been designed from the ground up to be secure: security sat at the very heart of the design and development process.
As Jane Tovey, the medical illustration services manager at University Hospitals Birmingham puts it: "The encrypted data package is monitored to ensure it is delivered to the correct patient notes and all parts of the pathway are safe and auditable."
They know it’s a problem. They do it anyway
A BMJ Innovations study published last year, 'The ownership and clinical use of smartphones by doctors and nurses in the UK: a multicentre survey', reveals why this should excite you as much as it does a little old security geek like me.
According to that particular research, 92.6% of doctors and 53.2% of nurses said their smartphone was ‘very useful’ or ‘useful’ in helping perform clinical duties. Some 89.6% of doctors and 67.1% of nurses who owned medical apps were using them within their clinical practice.
When it came to SMS, 64.7% of doctors were sending patient-related clinical information to colleagues this way. A further 46% were using picture messaging, and 33.1% app-based messaging.
The statistic of note, however, and the one that rings very loud alarm bells, was that 71.6% of doctors and 37.2% of nurses "wanted a secure means of sending such information."
Say what? On the face of it, these health professionals knew that they were sending clinical data using an insecure method, but continued to do so regardless. I've said it before, and it bears saying again: "Instead of seeing security as vital to service delivery, patient care is understandably the top priority."
SCIT would appear to have delivered on the timely patient care priority without compromising the security of the app. It shows that in this emerging age of mobile healthcare, security doesn't have to take a back seat. All it takes is the will to take security seriously.
And the money.
Wouldn't it be a shame if the desire to be secure were dampened by the reality of funding it?
Encrypt and survive
When enterprise security vendor Sophos commissioned research into the IT security levels within the NHS at the end of last year, it discovered that 84% of NHS-employed chief information officers, IT managers and the like that it questioned believed that encryption was "becoming a necessity."
The same study, sadly, revealed that this realisation and actual delivery were wildly disparate things.
Only 10% of those asked actually had encryption "well established within their organisation." Drilling down into the numbers revealed 59% had email encryption, 49% file share encryption and 34% cloud data storage encryption.
Jonathan Lee, the UK healthcare sector manager at Sophos UK and Ireland, blamed budget cuts as one of the reasons that encryption tended to stop at laptops and USB sticks.
Yet, as that BMJ Innovations report I mentioned earlier shows, the drive towards a mobile healthcare delivery system is taking a detour via many insecure routes.
The Sophos survey tells us what we already know: that those on the tech side of healthcare understand (42% of them, anyway) that the uptake of mobile working is forcing a change in IT security requirements.
Yet when cloud company Accellion published details of a Freedom of Information request, it suggested nearly three quarters of NHS trusts didn't include mobile devices in any cyber security training programmes.
The same FOI request also revealed 80% of trusts had given staff mobile devices, from which many were accessing patient records. Yet only around half of trusts were providing 'secure applications' to share that data.
No wonder that healthcare was responsible for more data breaches than any other sector last year, according to the Information Commissioner's Office.
In the final quarter of 2015 the health sector had 184 breaches reported to the ICO; compare that to the second placed local government sector on just 43 breaches during the same period.
(And yes, I do know lots of those don’t relate to IT but instead to misplaced files and faxes – but that just underlines my point; the NHS needs secure ways of sharing information).
Time for a culture shift
I'm not saying that encryption is a magic bullet for securing electronic systems; in isolation it cannot be expected to work security miracles. What it can do, however, is strip stolen data of its value and bolster trust in services, such as healthcare, that move sensitive data around.
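The point is simple enough to sketch in a few lines of Python. This is purely illustrative, not production cryptography (a real app such as SCIT would use a vetted, authenticated scheme like AES-GCM with proper key management); it uses a one-time pad to show that an intercepted ciphertext is worthless to a thief who lacks the key:

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad for illustration only: XOR each byte of the
    # message with a random key of the same length.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR with the same key reverses the operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"Patient 1234: suspected fracture, left radius"
key, ciphertext = encrypt(message)

# A stolen ciphertext reveals nothing without the key...
assert ciphertext != message
# ...while the legitimate key holder recovers the record exactly.
assert decrypt(ciphertext, key) == message
```

Lose an unencrypted phone or intercept an SMS and the clinical data is readable by anyone; lose the ciphertext alone and the breach, while still reportable, no longer hands patient records to the attacker.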
You wouldn't accept a financial app that moved transactional data around without encryption as a prerequisite for use, would you? So why are apps that don't secure patient data thought to be OK, as some kind of trade-off between care delivery and privacy?
It's time for a change in the culture of mobile care delivery, and encryption has to be right there in the mix. SCIT has not only shown that it can be done; it should be held up as a prime example of how it should be done.
About the author: Davey Winder is a three-time Information Security Journalist of the Year award winner, and regularly contributes to The Times as well as being Managing Analyst at IT Security Thing. Follow him @happygeek