Tech in times of crisis

As nations around the world battle the Coronavirus, news reports are emerging of the technological innovations some states are deploying in their fight against COVID-19. However, in the rush to bring the virus under control, we are left wondering what the privacy implications and legal consequences of these solutions may be.

In South Korea, health authorities have begun publicly sharing details of the movements of COVID-19 patients via text message alerts and online, to warn others in the area that they may have come into contact with an infected person and could be at risk of spreading the virus. Patient movement profiles are reportedly built from interviews, mobile phone signals, surveillance camera footage and credit card transaction data, allowing a patient’s route to be recreated for the day before their symptoms showed. Yet while the alerts do not name the patients, they may disclose the affected individual’s gender and age, as well as the names of the locations they visited. Having closely analysed these data, some online users have begun speculating as to the identities of the persons concerned.
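
To make that re-identification risk concrete, the sketch below shows how a handful of supposedly anonymous attributes, once cross-referenced with locally known information, can narrow a crowd down to a single person. It is illustrative only: the records, field names and matching rule are invented and do not describe the actual Korean alert system.

    # Illustrative only: invented records showing how coarse, "anonymous"
    # alert attributes (age band, gender, places visited) can single someone
    # out when combined with local knowledge.

    alert = {"age_band": "30s", "gender": "F",
             "visited": {"GymA", "CafeB", "PharmacyC"}}

    # What neighbours, colleagues or online sleuths might already know about
    # people in the area (entirely made up).
    local_knowledge = [
        {"name": "Resident 1", "age_band": "30s", "gender": "F",
         "seen_at": {"GymA", "CafeB"}},
        {"name": "Resident 2", "age_band": "30s", "gender": "F",
         "seen_at": {"CafeB"}},
        {"name": "Resident 3", "age_band": "40s", "gender": "M",
         "seen_at": {"GymA", "PharmacyC"}},
    ]

    def candidates(alert, people, min_overlap=2):
        """Return people who match the alert's demographics and at least
        `min_overlap` of the disclosed locations."""
        return [
            p["name"]
            for p in people
            if p["age_band"] == alert["age_band"]
            and p["gender"] == alert["gender"]
            and len(p["seen_at"] & alert["visited"]) >= min_overlap
        ]

    print(candidates(alert, local_knowledge))  # ['Resident 1'] -- one match

The point is not the code itself but how little auxiliary information is needed before "anonymous" stops meaning unidentifiable.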

The prospect of patients being publicly identified in this way raises concerns over their safety and well-being, quite apart from the obvious privacy infringements. Public identification could expose a patient to the risk of violence from others, while the associated stigma may lead them to self-harm. Rumours of a person’s illness might also result in them being denied access to essential local services, such as home delivery of their groceries or medicines.

In China, it was recently reported that health officials have instructed Ant Financial, the privately held Alibaba affiliate that operates the mobile and online payment platform Alipay, to modify the app in light of the Coronavirus outbreak.

Alipay allows users to pay for goods in-store by means of a QR code displayed on their mobile phone screen, which shops scan to take payment. Following the state-mandated modifications to the app, however, users can now also obtain colour-coded ‘health’ QR codes via the Alipay platform. These codes are coloured red, yellow or green to indicate the holder’s health status and whether they pose a contagion risk, although how the state determines which colour a person is assigned has not been disclosed.
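
Since neither the code format nor the assignment logic has been disclosed, the following is purely a hypothetical sketch of what a colour-coded status credential could look like at its simplest: a signed payload binding a user identifier to a status, rendered as a QR code and checked at the point of scanning. Every name, field and key below is an assumption, not a description of the actual Alipay scheme.

    # Hypothetical sketch of a colour-coded "health status" payload.
    # The real scheme's format and assignment rules are undisclosed;
    # nothing here reflects the actual implementation.
    import hashlib
    import hmac
    import json
    from enum import Enum

    class Status(Enum):
        GREEN = "green"    # free movement
        YELLOW = "yellow"  # partial restriction
        RED = "red"        # quarantine

    SECRET_KEY = b"issuer-secret"  # placeholder signing key

    def issue_code(user_id: str, status: Status) -> str:
        """Build a signed JSON payload that could be rendered as a QR code."""
        body = json.dumps({"user": user_id, "status": status.value})
        sig = hmac.new(SECRET_KEY, body.encode(), hashlib.sha256).hexdigest()
        return json.dumps({"body": body, "sig": sig})

    def verify_code(payload: str) -> dict:
        """What a checkpoint scanner might do: verify the signature, read the status."""
        wrapper = json.loads(payload)
        expected = hmac.new(SECRET_KEY, wrapper["body"].encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, wrapper["sig"]):
            raise ValueError("invalid signature")
        return json.loads(wrapper["body"])

    code = issue_code("user-123", Status.GREEN)
    print(verify_code(code))  # {'user': 'user-123', 'status': 'green'}

Even in this minimal form, the credential necessarily ties an identifier to a health status, and whoever controls the issuing and scanning infrastructure can see both, which is part of what makes the privacy and security questions that follow so pressing.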

Residents of the Chinese city of Hangzhou are already being asked to show their code for scanning in public places such as schools, building complexes and supermarkets, and even on the street, although it is presently unclear if and how the scanning of a code may restrict an individual’s movements. Analysis of the app’s code by the New York Times, however, suggests that it also shares the user’s location data with the police.

Whether other nations around the world intend to adopt similar contact-tracing and monitoring technologies in their fight against the Coronavirus remains to be seen. While there are arguably benefits to early-warning systems that alert individuals to potentially risky contacts they may have had with others, surveillance of this nature brings with it questions of personal privacy and security.

The recent UK facial recognition case of R (Bridges) v The Chief Constable of South Wales Police [2019] EWHC 2341 (Admin) noted the importance of striking a sensible balance between the protection of private rights, on the one hand, and the public interest in harnessing new technologies on the other. (In Bridges, the public interest lay in the detection and prevention of crime.) In these unusual times, heightened surveillance of citizens through the monitoring of their mobile phones and other personal data may be deemed necessary to protect their rights and the rights of others. Even so, we must still consider the proportionality of any such interference, giving proper thought to what data is captured, how long it is stored, who may use it and how. The prospect of Coronavirus patients being identifiable by non-state actors, whether through the sheer detail of purportedly anonymous data, as appears possible in South Korea, or by design, as in China, should ring alarm bells for privacy advocates, given the potential for individual stigma and harm.

Paul Schwartfeger on 2 March 2020