
The Dangers of Expansive Public Health Surveillance

Charlotte A. Tschider
Wednesday, August 24, 2022, 8:01 AM

A review of David Lyon, “Pandemic Surveillance” (Polity, 2022).

German coronavirus contact tracing app (Marco Verch, https://flic.kr/p/2jcjxmf; CC BY 2.0, https://creativecommons.org/licenses/by/2.0/).

***

For much of the general public, the first two years of the coronavirus pandemic, although exceptional and marked by a cataclysmic loss of human life, represented an event that would eventually end. The measures adopted during the pandemic, such as mask mandates and proof-of-vaccination requirements, have largely been lifted. But what of surveillance and the capitalistic engine that keeps it running? David Lyon, in his excellent new book, “Pandemic Surveillance,” might say that public health surveillance, despite its temporary pandemic utility, is here to stay.

In “Pandemic Surveillance,” Lyon somehow makes fresh the recent raft of academic pandemic publications, reframing the seemingly philanthropic involvement of big tech companies in the pandemic as an exploitative enterprise. On the heels of Shoshana Zuboff’s pre-pandemic “The Age of Surveillance Capitalism,” this framing doesn’t seem too groundbreaking. Yet the issues that Lyon layers from scores of international thought leaders defy the imagination of the Apple phone user who simply clicked “yes” for contact-tracing alerts.

From the title, one might think this book focuses exclusively on contact-tracing apps, but it explores much thornier issues. Lyon examines pandemic profiteering by technology companies and coercive technologies that will persist and expand beyond the scope, duration, function, and mission of the actual pandemic response. Such profiteering is buoyed in part by human beings’ willingness to be exploited for data.

The themes developed over the course of this relatively fast 160-page read gesture at important areas for future work. For example, the farce of for-profit “altruism”—or free technology offered for the “public benefit” by private companies without disclosing potential commercial benefit—is not lost on Lyon. As Lyon observes, pandemics are profitable for those who surveil, and surveillance is of us, not for us, even during a pandemic. Public benefit gives way to maximizing profit margins, data sales, and targeted advertising. When a pandemic is threatening your life, do you really care what the privacy notice says?

This is not to say that all types of public health surveillance are riddled with surreptitious motives. Contact tracing, though never before performed digitally, at this scale, or during a pandemic this large, has historically been successful, improving the speed of treatment and reducing transmission vectors. While it is nearly impossible to demonstrate what would have happened without these technologies (an increased transmission rate, for example), it is reasonable to assume that most pandemic surveillance technologies provided at least some public health benefit.

However, even when pandemic surveillance technologies can do good, they can also exacerbate inequity and extend well beyond their initial scope and design. Lyon’s global surveillance vignettes, sprinkled throughout, illustrate just how much humans have given away to public and private entities under the guise of public health. Because many governments lacked the technical capacity to create these technologies, pandemic surveillance involved countless public-private partnerships. Ghana introduced a centralized registry of all mobile equipment, matching identities to phone numbers. In Brazil, public health activities were outsourced to private companies, and significantly more biometric registration methods were introduced. In Mexico, a mandatory program used QR code scanning for contact tracing. Indeed, we are just beginning to see how government and private entities can benefit or profit from pandemic data collection.

Many of these technologies have not been sunset. Rather, they have been repurposed for other surveillance tasks, and the data they collected has been retained. In Israel, contact-tracing technologies were used to track and monitor protesters. Drones and ultraviolet cameras used to monitor quarantine on the Thailand-Myanmar border now monitor illegal border crossings. CCTV and facial recognition scans introduced for the pandemic are still used in Russia. In Australia, apps checking quarantine conditions are still being maintained. Even the United Kingdom has stated that it will retain personal data from the pandemic for 20 years—with no right of erasure. These examples, among so many others, invite the question: Why aren’t people more concerned?

The most surprising aspect of Lyon’s sampling of surveillance technologies is that they do not belong only to autocratic governments like the People’s Republic of China, where surveillance has long been part of everyday life. Rather, even countries with stated commitments to data protection have perpetuated surveillance expansion.

The unanticipated consequence of pandemic technology, then, is the continued use of these technologies and the public’s seeming acceptance of even more surveillance. In the United States, the same public that rallied post-Snowden to reduce broad-scale surveillance under the Patriot Act gladly traded vaccination and contact details for a good steak, at least in states where contact tracing was a mandatory condition of restaurant reopening.

How is this more concerning than trading personal information for a free subscription to Pokémon Go? Well, purely commercial activities, even when their data practices are communicated surreptitiously, do not pretend to be legitimate and official. They simply benefit from a public willing to trade personal information for entertainment or convenience. Pandemic surveillance, however, capitalizes on fear while seeming official.

As Lyon explains it, the specter of power convinces the public that such technology is actually for us. If we believe that activities involving our most sensitive data are primarily benefiting us, we are more likely to participate, especially when we fear the alternative. Indeed, during a pandemic, it is nearly impossible to protect lives, ease social isolation, and protect privacy and civil liberties simultaneously. However, it is the expectation of health privacy and the necessity of trust in public health decision-makers during a pandemic that makes broader, pervasive, and opaque surveillance so concerning. 

In U.S. and international public health law, most health-related activities are restricted by laws creating duties of loyalty and confidentiality, or by laws that mandate or bar other activities. For example, physicians owe fiduciary duties to their patients, which means that they must act in their patients’ interests and in accordance with their wishes. The Health Insurance Portability and Accountability Act (HIPAA) restricts health care providers, health plans, and other covered entities (and their business associates) to specifically communicated data collection purposes and practices. Internationally, most privacy laws, such as the EU’s General Data Protection Regulation, require limited data use and restrict secondary processing, especially for special categories of data like health data. The existence of these laws in typical health care scenarios creates a public expectation that data related to health will receive some special protection.

However, public interest is a known exception to restrictions on data processing, and many government entities leveraged this justification to process identifiable health data without individual consent. Even where consent is required, disproportionate bargaining power between technology companies and the individual means that individuals are faced with a “take it or leave it” proposition. When a person’s life is potentially at stake, few individuals are willing to pass up a potentially beneficial service simply because they are unsure how their data will be used at a later time.

Later use, then, raises a central question: When what begins as contagion tracking morphs into interaction and behavior modification, then feeds policing and broader public-private exchanges of resident data, is surveillance still consistent with the goals of “public health”? The technology once positioned as commercial or public now benefits from its blended use. And it is doubtful that the average person can tease out these entanglements to make a true choice. Indeed, in countries where use of these technologies was mandatory, people had no choice at all.

As Lyon explains, ubiquitous surveillance under the guise of public health has other ancillary, though critical, consequences. Technology will render some humans hypervisible and others invisible, and those who are hypervisible may benefit from this visibility while also being subject to greater exploitation. The invisible may be captured only in context-specific surveillance, which is known to disproportionately surveil certain populations, for example, surveillance in “higher-crime areas” or surveillance tied to immigration status. Although surveillance may be justified under some circumstances, adopting pandemic technology for new uses that disproportionately target certain communities while wholly leaving out others should make us pause and consider what utility these technologies provide.

Even capturing seemingly innocuous health data can reveal sensitive information about individuals when location data is used for these purposes. Where people gather, recreate, and find entertainment, and with whom they associate, can support inferences about their gender, sexuality, religious beliefs, race or ethnicity, and income level, as well as a whole host of other health data. Lyon observes the intersection of such inferences with the rapid adoption of technological advancements, including advanced analytics and artificial intelligence, the backbone of which is large volumes of data. With more data, especially the type collected during the pandemic—such as information about an individual’s movements, location, and contacts—much can be inferred, with significant confidence, about an individual’s sensitive characteristics.

Our humanity and freedom are at stake in the push against ever-expanding surveillance. If “Pandemic Surveillance” is left wanting in any way, it is in the reader’s desire for more solutions. But the complexity of public-private surveillance makes easy solutions difficult to imagine.

One immediate response, however, is advocacy from those who can unravel these entanglements to challenge those in a position of power. Although Lyon describes limited advocacy efforts in a variety of countries, in particular Taiwan, one can only imagine that any failure to advocate against pandemic surveillance in other countries had less to do with endorsing surveillance and more to do with (a) not wanting to be depicted as a coronavirus denier and (b) searching for some solution, any solution, to stop the spread. Responding to surveillance in fear, however, may be exactly what technology companies and, to some extent, governments are banking on. 

What is needed, as Lyon explains, is a reformulation of how we think about surveillance. First, we must resist the “normalization of surveillance” so that surveillance technology scope creep can be thwarted. All technologies must have an end date or end event and be fit for purpose. We must double down on promoting trust, specifically by considering ways to make technology use more transparent and less secretive. This means that data protection agencies, so long as they can legally regulate government and private entities, will become increasingly important.

Finally, we must humanize technology rather than considering public health from a, well, “public,” utilitarian, and ultimately reductionist perspective. As I see it, the challenge is in seeing the individual rather than the cumulative whole, which is a tall order for those designing technology based on digital approximations of humans rather than the humans themselves. This “datafication,” or seeing human beings as data, creates a distancing effect that makes it far easier to overcollect, overuse, and ultimately exploit human beings. By restricting data collection and use to only what is needed, and by consulting the surveilled in surveillance technology design, especially individuals from a variety of communities and backgrounds, it is more likely that technology will be designed for us, rather than of us for some other purpose. As Lyon points out, the goal of human flourishing is accomplished by limiting the creep of surveillance while also balancing the degree and manner by which individuals are surveilled (or left out).

Although “Pandemic Surveillance” leaves the reader wanting more detail about how to fix the nuanced, complex, multifaceted problem addressed throughout the book, Lyon offers us the intellectual scaffolding to consider how fear can lead to the erosion of important human interests.


Charlotte Tschider is an associate professor at the Loyola University Chicago School of Law. Professor Tschider is the author of International Cybersecurity and Privacy Law in Practice, 2d ed. (Wolters Kluwer 2018, 2023), Cybersecurity Law: An Interdisciplinary Problem (with David Thaw, Gus Hurwitz, and Derek Bambauer, West 2021), and Cyborg Health (with Dr. Krista Kennedy, forthcoming Cambridge University Press 2025). Professor Tschider regularly advises global private and public organizations on data protection, cybersecurity, and artificial intelligence policy matters.
