The Health and Social Care Information Centre is creating a repository and library for quality indicators to reduce duplication and improve standards.
From 1 April, the new HSCIC, now an executive non-departmental public body, has a statutory role to collect, analyse and present national data on health and social care.
Director of information services John Varlow told HC2013 last week that the centre is establishing a database consisting of a repository and a library of quality indicators.
“It’s a repository, which has all audit details of indicators: who has asked us to create them, what the decisions have been and how we have assured them. Having that detail is to ensure that people don’t duplicate this,” said Varlow.
“We also have a library, which consists of things which have gone through an assurance process and have a quality stamp on them, to say they are good quality indicators.”
Varlow said that too many quality indicators are in use in the NHS and that this has created a big problem.
“As an example, we’ve got over six different ways of doing re-admission in care. This is really problematic.
“You can’t have one indicator that does everything, but the aim should be to have as few indicators as possible to do as many things as possible,” he said.
“I would argue that there are very few consistent methods in terms of producing indicators.
“We’ve got indicators coming out of our ears; we’ve got lots of indicators in the systems that don’t have good quality data beneath them.
“Ultimately that creates an extra burden. It’s time that we get something going that makes a little bit more sense.”
He added that defining each quality indicator, and why and how it is to be used, is crucial.
“One of the key things is the purpose of the indicators. We need to know why we are defining the indicator. You can’t just pick one and use it for everything you feel like,” he said.
“If you’re doing it for financial purposes you are going to define that indicator differently than if you were doing it in terms of performance.
“That doesn’t matter, as long as it’s clearly defined in the library that it’s created for financial purposes only.
“Simple things like: Is it accurate? Is it complete? Is it timely? If it doesn’t meet the criteria, it won’t make an indicator.”
Every new indicator from now on will be subject to both external and internal peer review, as well as a governance board review before gaining a ‘stamp of approval’.
Varlow said that anyone could request an indicator, but that the creation of a new indicator would be subject to a rigorous process.
“What we’ll first of all do is say: Is there something in our repository already? If there is, can you use that? Why do you need to do it differently? Or is it something that’s already in our library? Then why don’t you use that?” he said.
“We are working closely with NICE in terms of their quality standards and in terms of guidelines. Part of this process is that we are not going to create indicators where they don’t have a place.
“If somebody is doing something which is clinical I would expect to see it underpinned by a NICE guideline.
“I wouldn’t want to have someone create an indicator that doesn’t marry up. It’s about making sure as much as possible in the system is tied together.”
He added: “The information centre is there to get data in and do stuff with it.”