Difficult issues around consent in AI and data cannot be addressed through legal processes or by focusing on patients’ rights and autonomy, suggested an expert panel in the final session of Digital Health’s AI and Data event on Tuesday.

Will Navaie, head of ethics operations at Genomics England, said there was often an assumption that consent is about respecting people’s autonomy. “No, it’s not. People in a vulnerable situation are seeking help because they cannot fix themselves. There needs to be a relationship of trust with their healthcare providers.

“I think trust in the system has been eroded. I’m interested in how we can repair it, so we don’t need to rely on legal processes.”

Navaie said there was a need to build a “safe environment” for innovation, where people can “create products that work and have value”. He used a “farm to plate” analogy, pointing out that sausages produced from well-cared-for animals are “less likely to make you sick” than cheap sausages. “We should think about data in the same way.”

He argued that the need for consent was being used as “a sticking plaster for broken systems”.

The panel agreed that consent should not be viewed as a one-off event, which can then be taken for granted. “You need to talk to people on a real, continuous basis. There is no one moment of consent – this is a relationship,” said Martin Farrier, chief clinical information officer at Wrightington, Wigan and Leigh NHS Foundation Trust.

Farrier said his organisation’s “huge pile of data”, the largest in the NHS, had made it possible to prove the effectiveness of CPAP (continuous positive airway pressure) as a treatment for covid-19, early in the pandemic.

“We were able to publish that data ahead of anywhere else in the world. It changed the way care was given for covid. [That was] hugely powerful,” he said, adding later in the discussion: “We didn’t ask for consent [to use the CPAP data] – the data existed. Consent was implied.”

Ownership of data a heavy responsibility

Farrier stressed that a huge database, though helpful in directing decision-making, is also a heavy responsibility. “What scares me is that I have a responsibility to do something useful with it, without causing people huge offence. Consent is implied – but there are a lot of uncomfortable and unanswered questions.”

He suggested that a way forward is for people to be continuously managing their data in a way that builds trust in the process.

However, such an approach could bring “a risk of complacency”, said Rafiah Patel, chief digital ethics and assurance officer at Surrey and Borders Partnership NHS Trust. “What about people who can’t consent or are giving consent that is not informed?” She called for “checkpoints” in the system, where the use of data would have to be justified.

The potential for patients’ data to be commercially exploited was another concern for the panel. “These things are becoming uncomfortable,” said Farrier. “I don’t know where the line is.”

He suggested changing the ownership of data, via the NHS app, so that clinicians “have to ask you for permission to use your data”.

“That’s great for privacy, but it’s not great for innovation,” said Navaie. He added that changing the system so people could withdraw permission to use their data risked exacerbating healthcare inequalities, as groups who already distrust health services excluded themselves.

The panellists emphasised the importance of engagement with the general public – but agreed this will take ongoing commitment. “Engagement is hard and messy. We have to invest in it,” said Patel.

In his closing remarks, Navaie said it was a mistake to get “hung up” on formal or legal consent. “Consent, even when explicit, doesn’t need to be written down. Let’s get hung up [instead] on transparency. If we don’t know what’s happening, the option to opt out is meaningless.”

Earlier at the event, Ming Tang had told the AI and Data audience that the procurement process for the federated data platform is awaiting full business case approval.