Earning the public’s trust on health data 

13 March 2023
Doctor and Guardian columnist Ben Goldacre has co-developed an online prescribing data tool.

Ahead of the Digital Health Rewired conference, Professor Ben Goldacre, director of the Bennett Institute for Applied Data Science, explains why information governance is not enough when it comes to protecting privacy.

The NHS holds an incredibly rich dataset, containing the medical records of more than 65 million patients and going back decades. These detailed records have the potential to help us understand the benefits and hazards of different treatments, monitor and improve clinical services, and drive life sciences innovation.

Yet this poses an ethical problem, which Goldacre was commissioned to investigate. “The same data that could do good work also contains the most private secrets of every citizen in the country,” says Goldacre. “The challenge is this: how do we get lots of analysts doing lifesaving work on the data, whilst also managing the risks when lots of people have access to confidential information?”

This dilemma was illustrated by the General Practice Data for Planning and Research (GPDPR) scheme in 2021, which was intended to gather patient data held by GP surgeries in England and feed it into a central NHS database. Promises that the data would be anonymised were not enough to reassure the public about the privacy risks, and millions opted out of the scheme, leading to it being paused.

“We lost public trust, and so many people opted out that we lost a lot of data,” explains Goldacre. “The reality is that when you remove names and addresses from people’s records, it does something to prevent their data being misused, but it isn’t enough to ensure their privacy.”
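
To see why stripping names and addresses falls short, consider the toy Python sketch below. The records, field names and “attacker” knowledge are all invented for illustration; the point is simply that quasi-identifiers such as date of birth, postcode district and sex can make a pseudonymised record unique enough to link back to a named individual.

```python
# Illustrative only: a tiny made-up dataset showing re-identification by linkage.
# Removing names does not stop someone joining these records to another source
# that shares quasi-identifiers (date of birth, postcode district, sex).

pseudonymised_records = [
    {"patient_id": "a91f", "dob": "1958-03-14", "postcode": "LS6", "sex": "F",
     "diagnosis": "type 2 diabetes"},
    {"patient_id": "c044", "dob": "1990-07-02", "postcode": "M14", "sex": "M",
     "diagnosis": "asthma"},
]

# Auxiliary knowledge an attacker might hold about one person.
known_person = {"name": "Jane Example", "dob": "1958-03-14", "postcode": "LS6", "sex": "F"}

matches = [
    r for r in pseudonymised_records
    if (r["dob"], r["postcode"], r["sex"])
    == (known_person["dob"], known_person["postcode"], known_person["sex"])
]

if len(matches) == 1:
    # A unique match re-identifies the record despite the missing name.
    print(f"{known_person['name']} is likely record {matches[0]['patient_id']}: "
          f"{matches[0]['diagnosis']}")
```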

To resolve this issue, Goldacre was asked by the secretary of state in 2021 to conduct a major review into the safe use of health data. The result was the report Better, broader, safer: using health data for research and analysis, known as the Goldacre Review.

Reassuring the public

“It’s not just enough to tell people to trust the good work you’re doing with data. You have to take practical steps and prove to them what systems and platforms you’ve got to protect their privacy. You need to show rather than claim,” asserts Goldacre.

Historically, there has been heavy reliance on information governance to protect health data, but according to Goldacre this is not sufficient.

“We have lots of rules for people who want to access data, so we can assess whether they’re trustworthy. That was the only way in the past that the system could think of allowing the privacy risks of letting people download the full history of millions of patients’ medical records onto their own machines,” he says.

“Those rules create huge delays and obstructions, which have caused tremendous frustration for analysts and researchers.”

Instead, the Goldacre Review recommended the use of trusted research environments (TREs), which keep all patient data on a central machine. This would allow NHS analysts and researchers to work within that secure environment, rather than downloading data onto their own machines.

“TREs are commonly used in other sectors, and absolutely perfect for the problem we face, because they allow lots of people to work on the data without having to worry so much about it being misused,” continues Goldacre.

TREs also reduce some of the duplication and inefficiency that arise when many people work with the same data.

“When everybody is working on the same data in thousands of different places, they’re all duplicating that data curation work,” adds Goldacre. “That’s a spectacular waste of money, but also a risk because those hundreds and thousands of people doing the same job behind closed doors can’t learn from the great work people have done on data curation in other settings or see it and evaluate its quality.”
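
As a rough sketch of the pattern the review describes, the hypothetical Python below shows the key inversion: the analyst’s code runs where the data lives, and only aggregate results that pass a disclosure check (here, simple small-number suppression) are released. None of these names belong to a real NHS or OpenSAFELY API; everything is invented for illustration.

```python
from collections import Counter

# Hypothetical sketch of a trusted research environment (TRE) workflow.
# Record-level data never leaves the central platform; analysts submit code,
# and only disclosure-checked aggregates come back out.

SMALL_NUMBER_THRESHOLD = 5  # counts below this are suppressed before release


def count_by_field(records, field):
    """Analyst-supplied aggregation that runs *inside* the TRE."""
    return Counter(r[field] for r in records)


def release_outputs(aggregates, threshold=SMALL_NUMBER_THRESHOLD):
    """Output check applied by the TRE before results leave the environment."""
    return {key: count for key, count in aggregates.items() if count >= threshold}


# Inside the TRE: record-level data stays put.
patient_records = [{"region": "North West"}] * 120 + [{"region": "Isles of Scilly"}] * 2

raw_counts = count_by_field(patient_records, "region")  # the analyst's job runs here
released = release_outputs(raw_counts)                  # only safe aggregates exit

print(released)  # {'North West': 120} (the small count has been suppressed)
```

Real platforms layer much more on top of this basic shape (code review, dummy data for development, audit logs of every job), but the underlying idea is the same: analysts bring code to the data rather than taking data to their own machines.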

Data saves lives

The Government’s Data Saves Lives strategy, published last year, adopted TREs as the normal way of working with health and social care data. There is a firm ministerial commitment to reactivate the GP patient data collection only when a TRE built on best practice is in place.

One example of best practice cited in the report is OpenSAFELY, a secure analytics platform for NHS electronic health records (EHRs) that Goldacre’s team has run on 58 million patients’ GP data for the past three years. The other is the Office for National Statistics (ONS) TRE, which has been running on census and other data for the past two decades.

Next on the horizon is implementing the Goldacre Review’s recommendations to create standard training, job descriptions and clear career progression for data analysts in the NHS.

“The NHS analyst community has outstanding pockets of excellence but suffers from being dispersed across hundreds of trusts and local areas,” says Goldacre.

“If you want excellence to spread you need to have clear lines of communication and good quality training so different people in different places are speaking the same language.”

Ultimately, Goldacre’s vision is to see data used more efficiently to provide better, safer patient care.

“I don’t think the big gains in quality of life and life expectancy are going to come from single new pills. I think they’ll come from improving the efficiency and logistics of delivery of care in the health service. And that’s supported by the better use of data,” he concludes.


6 Comments

  • I’d agree the TRE approach is positive – but isn’t the problem really who/what would control it?
    The repeated efforts – over many years – of successive secretaries of state for health to establish that they have the right to own patients’ identifiable data (medical records) – & to use it as they saw fit – have almost become a folk memory!
    The mechanics of the TRE are good (AFAIAA used in QResearch) – but there does need to be Trust in the people controlling the TRE.
    After care.data & GPDPR – & the subsuming of NHS Digital/HSCIC/NHSIC into NHS England – how can I have confidence that the current – AND ALL FUTURE – organisations managing my confidential data (repeatedly stated to be *financially* valuable) can be trusted in intent – & competence?

  • First – people intuitively recognised the claims of anonymity were bogus – because they always were – as this article says, the data was pseudonymised. Any half-competent data analyst will tell you that data about a person, especially rich data like a medical record, can NEVER be anonymised.
    2. This TRE concept is a pretend shutting of the door after the horse has already bolted – because the NHS data set was already shared with Microsoft over 3 years ago (and probably others), as many Trust executives know – they had to facilitate it.
    3. Consent is not hard to do despite claims otherwise – gaining trust is – and should be! The consequences of a misstep here are too dreadful to contemplate and the NHS does not have a positive track record on this subject – just like most enterprises, tbh.
    4. If you want my trust do this:
    a) get my consent
    b) clearly convey a limited well defined purpose for its use
    c) communicate to me what the consequential valuable outcomes of using my data were – because I might just feel more like consenting to the next limited purpose request then!

  • Yawn. Why don’t we just unleash the data by getting consent? https://www.digitalhealth.net/2016/04/joes-view-of-consent-to-share/

    • Think how much good could have been done, and how many lives could have been saved, if care.data had been done consensually ten years ago.

  • People don’t trust opt-out systems. People expect to be asked for their consent. That’s what the problem is. You can dress things up as anonymous or whatever, people still won’t trust them.

  • I think the TRE approach is very positive, compelling and makes a lot of sense, but I guess the question I now have is: ‘can the citizen still opt out, or is their data going to be used no matter what their point of view?’ as that seems to be the direction of travel.
