Privacy in the balance in the battle against COVID-19

As the UK begins trialling the NHS’s Coronavirus contact-tracing app, fresh privacy concerns have been raised about the app’s operation.

We have previously commented on the technical innovations that states such as South Korea and China have implemented in their fight against COVID-19. In Europe, mobile phone technology is also being deployed to facilitate contact-tracing: that is, to identify potentially risky contact that a phone-holder may have had with others.

In the UK, a mobile app developed by NHSX, the digital arm of the National Health Service, will send and detect short-range Bluetooth signals between mobile phones to record when people are in close proximity to one another for a set period of time. Should one of those users then test positive for COVID-19, the system will send alerts to the handsets of the others recorded as having been in contact with them, allowing those contacts to take steps to minimise possible onward infection by taking a COVID-19 test themselves or self-isolating.
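
The precise NHSX protocol is not set out above, so the following is only a minimal sketch of the contact-recording step, written in Python under stated assumptions: the rotating identifiers, the signal-strength threshold and the minimum contact duration are all hypothetical illustrations rather than the app’s actual implementation.

    # Illustrative sketch only: hypothetical proximity-event recording.
    # The identifier format, distance proxy (signal strength) and thresholds
    # are assumptions, not the NHSX app's actual implementation.
    import time
    from dataclasses import dataclass

    PROXIMITY_RSSI_THRESHOLD = -65   # assumed signal-strength cut-off (dBm)
    MIN_CONTACT_SECONDS = 15 * 60    # assumed minimum duration of "close contact"

    @dataclass
    class ContactEvent:
        other_id: str       # rotating anonymous identifier broadcast by the other phone
        started_at: float
        last_seen_at: float

    class ContactRecorder:
        def __init__(self) -> None:
            self._open: dict[str, ContactEvent] = {}
            self.recorded: list[ContactEvent] = []

        def on_bluetooth_sighting(self, other_id: str, rssi: int) -> None:
            """Called whenever a nearby handset's Bluetooth beacon is heard."""
            if rssi < PROXIMITY_RSSI_THRESHOLD:
                return  # too weak a signal to count as close contact
            now = time.time()
            event = self._open.setdefault(
                other_id, ContactEvent(other_id, started_at=now, last_seen_at=now))
            event.last_seen_at = now
            # Only contacts sustained beyond the minimum duration are recorded.
            if (event.last_seen_at - event.started_at >= MIN_CONTACT_SECONDS
                    and event not in self.recorded):
                self.recorded.append(event)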

The NHSX app will not reveal to other users the identity of the person diagnosed as positive. Nonetheless, the app’s design has raised legal and privacy concerns. While the Information Commissioner’s Office has declared that “as a general rule, a decentralised approach” would better follow its principle that organisations should minimise the amount of personal data they collect, the NHSX app adopts a centralised system. This means that data about a user’s interactions with others can be uploaded from the app to a centrally-held computer system for the purposes of risk identification, rather than that processing taking place on individuals’ handsets. This puts the government in control of the data it collates, with the attendant risks that the data could be lost or stolen, or retained beyond the term of the pandemic as a centralised index of the movements of the entire population.
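
For illustration, the centralised approach described above can be reduced to a matching step that runs on a centrally controlled server holding every user’s uploaded contact log. The Python sketch below uses hypothetical names and data structures; it is a simplification for the purpose of the discussion, not the NHSX system itself.

    # Simplified sketch of *centralised* matching: contact logs are uploaded to,
    # and matched on, a central server. All names are hypothetical.
    class CentralRiskService:
        def __init__(self) -> None:
            # The server retains every user's uploaded contact log - the concern
            # discussed above: a single, centrally held index of interactions.
            self._contact_logs: dict[str, set[str]] = {}   # uploader id -> ids seen

        def upload_contacts(self, uploader_id: str, seen_ids: set[str]) -> None:
            self._contact_logs.setdefault(uploader_id, set()).update(seen_ids)

        def users_to_alert(self, patient_id: str) -> list[str]:
            """Return every user whose uploaded log records the patient's identifier."""
            return [uploader for uploader, seen in self._contact_logs.items()
                    if patient_id in seen]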

While NHSX states that “the app does not collect personally identifiable data from users” and that “users will always remain anonymous”, even anonymous data can prove problematic if the nature or volume of the data is such that it allows individuals to be identified, perhaps through the recognition of patterns in the data or by a process of elimination. For example, if several people in a group receive an alert (say, work colleagues) then, through discussion, they may be able to narrow down the options and identify the person who tested positive.

These observations should not detract from the benefits of early warning systems, such as the NHSX app. However, they should make clear the importance of asking questions about the nature of the technology to ensure that the amount of data collated and disclosed (whether anonymous or otherwise) is minimised. Moreover, serious consideration should be given to the appropriateness of a centralised system and the risks inherent therein.

As for the UK’s collection of health, movement and other personal data via its contact-tracing app, even if none of the lawful bases for processing under Article 6 of the General Data Protection Regulation (GDPR) applies to the app’s design (for example, consent given through a user’s acceptance of the app’s terms of use), GDPR Recital 16 notes that the Regulation does not apply to activities concerning ‘national security’. If national security is construed in the more conventional sense of (say) military threats and therefore does not apply, GDPR Article 23 provides that states may restrict the obligations imposed by the Regulation in order to safeguard (inter alia) public health. As COVID-19 would almost certainly be considered a public health issue by the courts, this Article could be used to overcome the constraints the Regulation would otherwise impose, including the requirement under Article 5(1)(c) that any personal data collected be “…relevant and limited to what is necessary in relation to the purposes for which they are processed”.

It is notable that other countries (such as Germany, Italy, Ireland, Austria and Switzerland) have opted for decentralised systems instead, and that Apple and Google are collaborating on the development of a framework for decentralised contact-tracing systems. These will go some way to addressing privacy concerns: for example, by moving contact-recording and risk-identification functions out of centrally-controlled computer systems and onto the mobile phones of the individual users concerned. They may also improve the utility of the technology across borders by enabling interoperability where contact occurs while someone is travelling abroad.[1]
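
By way of contrast, a decentralised design keeps the matching step on the handset: the server publishes only the identifiers associated with diagnosed users, and each phone compares them against its own locally stored contact records. The Python sketch below is a simplification in the same hypothetical terms as the earlier examples; the Apple/Google framework in fact derives rotating identifiers cryptographically from daily keys, a detail omitted here.

    # Simplified sketch of *decentralised* matching: raw contact data never
    # leaves the phone; only diagnosis identifiers are published for download.
    # All identifiers and names are hypothetical.
    def check_exposure_on_device(local_contact_ids: set[str],
                                 published_diagnosis_ids: set[str]) -> bool:
        """Runs on the handset against its locally stored contact records."""
        return not local_contact_ids.isdisjoint(published_diagnosis_ids)

    # Example: the phone downloads the published identifiers and matches locally.
    if check_exposure_on_device({"a91f", "7c2e"}, {"7c2e", "d044"}):
        print("Possible exposure: advise a COVID-19 test or self-isolation")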

COVID-19 apps that the private sector has brought to market, such as symptom-reporting apps touted by their developers as being of help to researchers fighting the disease, cannot avoid the processing requirements imposed by the GDPR. Given the potentially sensitive nature of the data these apps may be processing (data concerning health is a ‘special category’ of personal data under GDPR Article 9), the private sector needs to consider carefully the nature and design of its apps to ensure full compliance with data protection obligations and responsibilities.

[1] European Union Member States have since agreed an interoperability solution for COVID mobile tracing and warning apps based on a decentralised architecture.

Update 18 June 2020

The UK government has announced that it is to redesign its contact-tracing app.

Following technical problems with its proposed solution and criticism from privacy groups, the redesigned NHSX contact-tracing app will be based on the decentralised framework provided by Apple and Google, rather than the centralised model originally discussed in this article.

Paul Schwartfeger on 4 May 2020