NHS IT infrastructure is “not fit for AI” and a “large push” towards standardisation is required if the benefits of the technology are to be realised, a new report has concluded.
‘Accelerating Artificial Intelligence in health and care: results from a state of the nation survey’ was published last week during the NHS Health and Care Innovation Expo in Manchester.
Compiled by the Department of Health, NHS England and The AHSN Network, the report assesses the state of artificial intelligence in the UK healthcare sector, its potential impacts and the barriers to greater adoption.
It concludes that while AI is anticipated to have a transformative impact on healthcare, a lot of groundwork will be needed in the interim.
“Data readiness” – the process of getting data to a consistent standard that can be read by AI algorithms – was a key concern among the 106 survey respondents whose answers formed the basis of the report.
These respondents – described as “CEOs, senior managers and others working across the AI ecosystem in England” – shared the opinion that NHS IT lacks the necessary standards to support data sharing and drive value from it.
“The view emerged that the underlying data infrastructure is not fit for purpose for AI and requires standards to facilitate data sharing and the development of appropriate commercial models to leverage the value of public/NHS data,” the report read.
It also highlighted problematic agreements between NHS organisations and companies hired to process data, saying that datasets often ended up “in proprietary format or in difficult to access repositories”.
The report also reviewed the types of AI solution currently in use, concluding that most relied on “lowest complexity” statistical techniques. These were classified as systems using “single specific reasoning methods”, including neural networks and pattern recognition algorithms.
Machine translation systems and chatbots made up the high end of the complexity scale. Such methods were used by just 8% of the AI solutions surveyed.
“Whilst AI solutions are increasing in their complexity, most now delivering impact are on the low end of the complexity spectrum,” the report read.
In terms of where these systems are being put into action, 75% of the solutions were claimed to unlock value in data and analytics. This was followed by condition recognition (60%), organisational processes such as automating routine clinical and admin tasks (50%), leveraging skills and capacity (43%) and ‘other’ (24%).
Open sesame or closed doors?
The findings revealed that 35% of solutions currently employed in health and care settings were proprietary or ‘closed source’ systems, while only 9% used open-source licences, allowing them to run on any platform.
The report suggested this presented a barrier to widespread data-sharing across healthcare organisations, which could limit the effectiveness of AI initiatives.
“Liberating both data and applications and making them portable and interoperable… facilitates innovation and competition, and forces vendors to compete on quality, value and service,” it said.
Meanwhile, fewer than a fifth (18%) of solutions had secured regulatory approval in the UK, EU or abroad, while 23% were cited as “in the process” of gaining such approval.
Individuals who were surveyed for the report suggested there were shortcomings in the ability to effectively regulate AI using “legacy” frameworks, arguing this could limit widespread adoption in healthcare.
Perceived “gaps in regulating AI-enabled products and services” were highlighted, and “a clear need for a new regulatory framework to keep up with advances in AI” identified.
Last week, the UK government published its new code of conduct on artificial intelligence and data-driven technologies in healthcare, which establishes guidelines to which those working in this field should adhere.