As the NHS announces it has partnered with Amazon to bring verified health information into people’s homes via the Alexa voice assistant, our cybersecurity columnist Davey Winder delves beyond the media headlines on data privacy.

While my osteopath was in the middle of trying to pull my head off, or at least that’s what it felt like, he asked me: “Who gave Alexa permission to access my NHS medical records?”

Given the painful position I was in at the time, I could barely mumble a “nobody, because it can’t” response.

His question was typical of much of the social media coverage, and sadly some of the mainstream media as well, regarding the news that the NHS is partnering with Amazon to offer health advice by way of the Alexa voice assistant.

It isn’t misguided to be concerned about the privacy of our medical records; rather, both the public concern over access to those records and the NHS defence of the partnership miss the real privacy pain point by a country mile.

Part of the problem lies with the original announcement of the Amazon Alexa partnership, which stated only that “Amazon’s algorithm uses information from the NHS website to provide answers to voice questions.”

I don’t expect this type of media-friendly announcement, aimed at the general public, to go into the technical minutiae.

However, privacy was the elephant in the room. By not making the privacy position clear from the get-go, the announcement allowed the subject to dominate the conversation rather than the positives of easy access to qualified health information.

This has left NHSX appearing somewhat too defensive about the project, in my opinion, without addressing the actual potential for privacy problems.

Quelling the media storm

As Digital Health reported recently, Tara Donnelly, the chief digital officer of NHSX, tried to quell privacy concerns by confirming no patient data will be shared with Amazon.

“The NHS is not paying for this service,” Donnelly said. “This is a collaborative initiative that draws on information that is already freely available.”

Indeed, NHS data already feeds various other services, which hook into NHS websites using freely available application programming interfaces (APIs).
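
To make that concrete, here is a minimal sketch, in Python, of how a third-party service might pull condition information from the NHS website over such an API. The endpoint URL, the authentication header and the response field names are my assumptions for illustration only; the real NHS website API may well differ.

    import requests

    # Assumed base URL for NHS website condition content; illustrative only.
    BASE_URL = "https://api.nhs.uk/conditions"

    def fetch_condition_summary(condition: str, api_key: str) -> str:
        """Fetch the NHS page for a condition and return a short summary."""
        response = requests.get(
            f"{BASE_URL}/{condition}",
            headers={"subscription-key": api_key},  # assumed auth header
            timeout=10,
        )
        response.raise_for_status()
        data = response.json()
        # The field name below is an assumption; real payloads may differ.
        return data.get("description", "No summary available.")

    print(fetch_condition_summary("migraine", api_key="YOUR_KEY_HERE"))

The point is that the content itself is public: any service with a key can fetch it, and nothing about that step touches patient records.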

Who is listening?

It isn’t the trawling of NHS sites in order to throw a qualified answer back from the data available that concerns me, nor should it concern my osteopath or you, dear reader. What should concern all of us, however, is how the questions themselves are handled, how they are stored and who has access to them.

Think I’m being a little dramatic about such things? Well, what if I were to tell you that last year an Amazon Echo voice assistant recorded a private conversation between a couple and sent it to one of the husband’s employees?

It was a mistake, of course. Amazon said at the time that the Echo had woken up after mistakenly hearing “Alexa”, then mistakenly heard “send message”, asked who to send it to and, as soon as it picked out the name of someone in the contacts list from the ongoing conversation, sent the recording.

I mention this because it shows that things can go very wrong even with the best intentions in the world. And not everyone thinks Amazon’s intentions are the best, or at least not purely philanthropic, when it comes to a partnership over which, the NHS has said, there is no financial agreement.

Keeping data safe

Amazon is in the data business, and the making-money business. Sure, when asked about privacy and the NHS partnership, an Amazon spokeswoman said, as the New York Times reports, that “customer trust is of utmost importance” and Amazon takes privacy seriously; the company is not building health profiles, no health information will be used to sell merchandise or make recommendations, and no information will be shared with third parties.

None of which means that the data from NHS queries will not be stored. It will be. As soon as the Alexa wake word is heard, your Amazon device begins recording audio and transmitting it to the Amazon cloud.

Here’s what Amazon has to say about that in the official Alexa FAQ:

“Alexa uses your voice recordings and other information, including from third-party services, to answer your questions, fulfil your requests, and improve your experience and our services.”

What’s more, Amazon states that “we associate your requests with your Amazon account to allow you to review your voice recordings, access other Amazon services, and to provide you with a more personalized experience.”
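
To put that plumbing in plainer terms, here is a rough, illustrative sketch of the wake-word pattern such devices follow. The names are mine, text stands in for raw audio, and none of this is Amazon’s actual code; it simply shows why everything said after the wake word ends up captured.

    WAKE_WORD = "alexa"

    def gate_on_wake_word(chunks):
        """Capture everything heard after the wake word (text stands in for audio)."""
        captured = []
        listening = False
        for chunk in chunks:
            if not listening and WAKE_WORD in chunk.lower():
                listening = True  # wake word matched: start capturing
                continue
            if listening:
                captured.append(chunk)
        # On a real device, the captured audio is uploaded to the cloud
        # and stored against the owner's Amazon account.
        return captured

    # A health question spoken at home becomes stored, account-linked data:
    print(gate_on_wake_word(["alexa", "what are the symptoms of chlamydia"]))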

Spanner in the privacy works

Still not too concerned that health questions will be stored?

Sure, you can delete voice recordings associated with your account from the Settings > Alexa Privacy options in the Alexa app, but I very much doubt the average Jo(e) will know that.

And that is before you even start to consider that the account holder may well not be the person asking the question, yet has the ability to listen to whatever questions have been asked.

A family dynamic in which a parent can listen to the sexual health questions asked by their children, for example, throws another spanner into the privacy works.

There is no default privacy mode with Alexa, beyond that ability to delete recordings, assuming you know that you can and where to do it.

Talking of which, you can ask Alexa to “delete what I just said” and it will. Maybe share that one with family and friends who use the thing.

Training Alexa

Oh, and Amazon also uses voice recordings to “train our speech recognition and natural language understanding systems” according to that FAQ.

As Bloomberg reported earlier in the year, Amazon “employs thousands of people around the world to help improve the Alexa digital assistant powering its line of Echo speakers. The team listens to voice recordings captured in Echo owners’ homes and offices.”

I’m not against better access to health information, and I’m not anti-Alexa or any other voice assistant. I’m perfectly happy for anyone to use these things all day long, as long as they understand the privacy implications of the devices they are using.

What I’m not happy about is jumping into the voice-assistant ecosystem without fully taking into consideration all the privacy implications that come with doing so.

There needs to be more debate, there need to be more guarantees, and there needs to be a stepping back from this partnership, even if only temporarily, until these implications are properly explored and mitigated…