Following reports that US healthcare provider Ascension, the largest non-profit health system in the US, was giving Google access to 50 million private medical records, Finn Raben, director general at ESOMAR, a membership organisation representing the interests of the data and research industry, explores why trust needs to be at the heart of data-sharing agreements in healthcare.
This month the Wall Street Journal reported on how US healthcare provider Ascension, the largest non-profit health system in the US, was providing Google with access to 50 million private medical records. The report revealed that Google's aim was to create productivity tools for doctors that sift through complete medical records for pertinent data and make recommendations, or, to quote the report directly: "Google is using the patient data to tune artificial-intelligence software that may help improve patient care".
According to Google, nothing about this activity was illegal in the US, but many media outlets reported that neither patients nor doctors were notified, and cited estimates of 150 Google employees having access to the data, which included lab results, diagnoses, and hospital records, all of which provide detailed information on people's health histories.
Google courted further controversy over medical data when it acquired the health division of London-based AI firm DeepMind.
Addressing public mistrust
As you will have read a multitude of times, technology, whether that's social media, AI, big data analytics, or hardware, has a vital part to play in improving healthcare provision and diagnosis.
However, big tech, particularly its approach to data collection and storage, is viewed with significant mistrust by the public – and there is a growing body of evidence to suggest people have very good reason to be so distrustful.
Our own data shows that 74% of the UK public are concerned about sharing their personal data, while only 28% agree that current laws and regulations ensure that no misuse of personal data occurs.
Clearly, consumer trust is of vital importance in healthcare, yet according to a recent survey by the Open Data Institute, only 59% of people in the UK trust the NHS to be ethical in its use of their data. On the flip side, the NHS did perform better than national and central government, universities, and banks, although I'm sure we can all agree that figure should be higher.
While public knowledge of the extent to which their personal data is being monetised by big tech is quite low, there is an inherent and growing consumer trust problem with many technology businesses, driven by month-on-month stories about data breaches.
What we will see more of is a knock-on effect in industries that align themselves with the tech industry. This is inevitable across sectors, so what can you do to ensure you're treating public data ethically and building patient and consumer trust?
Importance of data protection officers
Internally, it's important to have a data protection officer (DPO) who knows what they are doing. The GDPR introduces a duty to appoint a DPO; however, it prescribes no specific qualifications or skills for the role.
What you must do is ensure they have the right training and skills to do the job effectively, and that they are empowered to perform the task without the risk of penalisation. DPOs advise; they serve as your most important warning bell, but in the end it is the organisation that bears ultimate responsibility.
What to look for in a tech partner
It's external partners and suppliers, however, over which you have less control and potentially more of a challenge. Unfortunately, a common assumption in technology firms has been that consumers will only share their personal data if they don't know it's happening, although our data suggests that three out of four Brits would be happy to share their data if the data collector is trusted and reputable, and two-thirds of consumers would likely share more data if they were kept informed. So, when it comes to tech partners, what do you need to look out for?
Firstly, GDPR legislation sets a much higher bar for handling personal data. It does offer flexibility through a range of legal bases for the collection and use of personal data, each with its own requirements, limitations and benefits.
Regardless of the legal basis, a great deal of focus is placed on transparency, accountability, and consumer trust and control. If you're relying on consent, it must be freely given, specific, informed, unambiguous and affirmative. Check this with your partner: pre-ticked boxes and lengthy terms-and-conditions forms are certainly not good enough.
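To make those consent criteria concrete, here is a rough sketch of how a consent record might be audited in code. This is purely illustrative: the record structure and field names are invented for this example, not part of any standard schema or real compliance tool.

```python
# Illustrative sketch only: a hypothetical consent record and the kind of
# checks the GDPR consent criteria imply. Field names are assumptions.

def consent_is_valid(record: dict) -> bool:
    """Return True only if the record reflects freely given, specific,
    informed, unambiguous and affirmative consent."""
    return (
        record.get("freely_given", False)            # no bundling or coercion
        and bool(record.get("purposes"))             # specific, named purposes
        and record.get("informed", False)            # subject told how data is used
        and record.get("affirmative_action", False)  # an actual opt-in action
        and not record.get("pre_ticked", True)       # pre-ticked boxes don't count
    )

# A record relying on a pre-ticked box fails the check:
bad = {"freely_given": True, "purposes": ["research"],
       "informed": True, "affirmative_action": False, "pre_ticked": True}
print(consent_is_valid(bad))   # False

good = {"freely_given": True, "purposes": ["research"],
        "informed": True, "affirmative_action": True, "pre_ticked": False}
print(consent_is_valid(good))  # True
```

The point of the sketch is simply that each criterion is a separate, checkable condition, and that a missing affirmative action or a pre-ticked default should fail the whole test.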
This should be viewed as a positive thing for consumer trust, but it does mean that many tech partners must provide data subjects with information on how their data is stored and what it will be used for. This becomes vitally important if the initial use of the data changes, especially when that change wasn't foreseen at the outset; at that point, consent becomes first among equals.
One extremely important thing to look for in data-collecting or tech partners is whether they have voluntarily signed up to a set of ethical codes and guidelines.
There are a number of data associations, both local and international, such as the Market Research Society (MRS) in the UK and ESOMAR globally, that require members to sign up to a set of ethical codes and guidelines, almost all of which go further than the law requires.
They are expressions of good practice that regulators worldwide frequently recommend, and they offer a great platform for future-proofing your activities.
Data protection is a hot-button issue at the moment across the public and commercial spheres, but do not mistake it for a passing fad.
With increased scrutiny from the public, press, and lawmakers, it’s vital that we are all going beyond the law to encourage trust and secure a future where personal data can still be used to improve patients’ lives.