The NHS “cannot afford” not to use a contact-tracing app, but special measures need to be taken to limit the risk that a user could be identified, according to a new paper.

Imperial College London has published a white paper outlining the eight questions governments, public health authorities and developers should consider when developing contact-tracing apps.

Such apps could prove useful in avoiding long-term confinement measures, the college said, but because they collect sensitive information such as location data, Bluetooth-based proximity records and whether individuals are infected, caution needs to be exercised to protect privacy.

Dr Yves-Alexandre de Montjoye, of Imperial’s department of computing and author of the paper, said: “We need to do everything we can to help slow the outbreak. Contact tracing requires handling very sensitive data at scale, and solid and proven techniques exist to help us do it while protecting our fundamental right to privacy. We cannot afford to not use them.

“Our questions are intended for governments and citizens to help evaluate the privacy of apps. They could also help app developers when planning and evaluating their work.”

Dr de Montjoye answered the questions below:

1. How do you limit personal data gathered by the app developers?

“Large-scale collection of personal data can quickly lead to mass surveillance. We should ask how much data the app gathers – like the whole disease trajectory and real-life social network of infected users.”

2. How do you protect the anonymity of every user?

“Special measures should be put in place to limit the risk that users can be re-identified by app developers, other users, or external parties. Because location traces are unique, they might easily be linked back to a person.”

3. Does the app reveal to its developers the identity of users who are at risk?

“The goal of contact tracing is to warn people who are at risk, so there’s no need for app developers to know who these people are.”

4. Could the app be used by users to learn who is infected or at risk, even in their social circle?

“Personal health data is very sensitive. Digital contact tracing should warn those who are at risk without revealing who might have infected them.”

5. Does the app allow users to learn any personal information about other users?

“Having access to small amounts of information could help users identify who is infected, so apps shouldn’t disclose information on a user’s location or social networks to other users.”

6. Could external parties exploit the app to track users or find out who’s infected?

“Apps should consider the risk of external adversaries, including well-resourced ones. External entities could install Bluetooth trackers to cover a city, or install malicious code on phones, and record the identifiers that they observe in specific locations. This can be avoided by regularly changing and re-anonymising identifiers like location data.”

7. Do you put in place additional measures to protect the personal data of infected and at-risk users?

“The app design may require revealing more personal information about users who are infected or exposed, but these are often the people who are more vulnerable and at risk. It’s important to consider what additional measures can be taken to protect their information.”

8. How can users verify that the system does what it says?

“Large-scale contact tracing is too sensitive an issue to rely on blind trust. Technical measures should be used to guarantee public scrutiny on the functioning of the app. Transparency of the system (app code, protocol, what is being broadcast, etc) is fundamental to guarantee privacy.

“This requires that the app be open source and app versions distributed on mobile app stores be verifiable, enabling developers to confirm that they’re running the public, auditable code.”
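Question 6 refers to regularly changing the identifiers a phone broadcasts, so that observations made at different times and places cannot be linked together. As a purely illustrative aid, the sketch below (in Python) shows one common way such rotation is done in decentralised proximity-tracing designs: short-lived identifiers are derived from a daily secret key and rotated every few minutes. The function names, rotation interval and identifier length here are assumptions for illustration, not details of the NHSX app or of any specific protocol.

import hmac
import hashlib
import os

ROTATION_MINUTES = 15                        # assumed rotation interval, for illustration
SLOTS_PER_DAY = (24 * 60) // ROTATION_MINUTES

def new_daily_key() -> bytes:
    """Generate a fresh random secret for the day (kept on the phone)."""
    return os.urandom(32)

def ephemeral_ids_for_day(daily_key: bytes) -> list:
    """Derive the day's sequence of short-lived broadcast identifiers."""
    ids = []
    for slot in range(SLOTS_PER_DAY):
        digest = hmac.new(daily_key, f"ephid-{slot}".encode(), hashlib.sha256).digest()
        ids.append(digest[:16])              # 16 bytes fits a Bluetooth advertisement payload
    return ids

# The phone broadcasts ids[slot] during time slot `slot`; because each identifier
# is a keyed hash, an observer cannot link two of them without the daily key.
daily_key = new_daily_key()
ids = ephemeral_ids_for_day(daily_key)
print(len(ids), ids[0].hex())

In designs of this kind, the daily keys are only shared if a user tests positive, letting other phones recompute the identifiers they overheard and check for matches locally, so that, as question 3 asks, the developers never need to learn who is at risk.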

Privacy and effectiveness

The paper comes at a time when the technology is attracting questions over both privacy and effectiveness.

An open letter from hundreds of academics from 26 countries urged governments and public health authorities to evaluate the potential dangers of developing contact-tracing apps, which could “catastrophically hamper trust” if they become a tool for “large scale data collection on the population”.

A similar tone was struck in an Ada Lovelace Institute rapid review of the technical, social and public health evidence for contact-tracing apps, which found an “absence of evidence” to support their deployment.

Privacy group medConfidential has also called on NHSX to be “upfront” about its plans for a contact-tracing app.

The app is understood to use Bluetooth to trace users, allowing people to input their own symptoms and alerting anyone they have come into contact with that they may have been exposed to the virus. NHSX has not provided further information.

Recent research from Oxford University, which is advising NHSX on its development of an app, found an app could help stop the pandemic but only if 60% of the population used it.

The team simulated coronavirus in a model city of one million people and found a “digital contact tracing app, if carefully implemented alongside other measures, has the potential to substantially reduce the number of new coronavirus cases, hospitalisations and ICU admissions”.

The same team has previously suggested current contact-tracing methods are too slow to keep up with Covid-19.

But Ross Anderson, a professor at Cambridge University, has suggested such apps could prove unreliable, because they require a large proportion of the population to use them and to input their symptoms correctly.

“Anyone who’s worked on abuse will instantly realise that a voluntary app operated by anonymous actors is wide open to trolling,” he wrote.