
The Importance of Equity in Contact Tracing

Susan Landau, Christy Lopez, Laura Moy
Friday, May 1, 2020, 3:15 PM

Critics focus on the privacy cost of contact tracing. But it’s important to examine the disparate privacy implications for the most vulnerable communities.


Published by The Lawfare Institute in Cooperation With Brookings

Commentators and policymakers appear to have slowly rallied around “contact tracing” as a means to ease society out of restrictive social-distancing measures. Unlike in past pandemics, some observers assume that this iteration of contact tracing should involve a technological supplement to manual contact tracing undertaken by legions of public health workers. But this technology-aided contact tracing presents serious challenges, including concerns about equity that many commentators have so far overlooked.

During the current pandemic, public health experts have made the persuasive case that dramatically increasing our capacity to implement contact tracing is one of several steps necessary to allow us all to get back to work, school and socializing. Some are suggesting that we augment this human-based contact tracing with mobile apps that use our phones to track whether we have been in contact with an infected person. It’s understandable that policymakers and others are looking to technology to help us alleviate the substantial hardships of our current stay-at-home situation as quickly as possible. Yet precisely because of this pressure, it’s worth taking a moment to think about the implications of using mobile apps for contact tracing—not just the efficacy of the idea, which has been written about elsewhere, but also its costs. Analysts have already begun litigating one potential cost: loss of privacy. But it’s important to examine the disparate privacy implications for people already hardest hit during this pandemic—people living in poverty, African Americans, and Latinx, among others.

The regional outbreak of the novel coronavirus grew into a worldwide pandemic at breathtaking speed. At the same breathtaking speed, technologists have bandied about ideas—from GPS data-tracking to Bluetooth apps—to assist public health efforts with location tracking. In recent weeks, attention has coalesced around the possible use of mobile apps to assist with more traditional contact tracing. Calls for contact-tracing technology are growing as some states begin to relax stay-at-home orders and to reopen nonessential businesses, increasing the likelihood that large numbers of people will be exposed to the virus in coming weeks.

A number of groups have seized on this suggestion, with teams at MIT, in Europe, and elsewhere now working to develop privacy-protective Bluetooth contact-tracing apps, enabling people to find out that they have been exposed to known coronavirus carriers without revealing when or through whom the exposure occurred.

Perhaps most directly in the spotlight is the effort by Apple and Google to create an interoperable system based on Bluetooth that will work on iOS and Android mobile devices alike. The proposed system is designed to be privacy protective; if Jane’s phone comes in contact with yours—that is, is within a specified distance of yours for a predetermined threshold of seconds or minutes—then the two devices will exchange nameless identifiers, which will be stored on the devices, not in a centralized database. If Jane later discovers she has COVID-19, the respiratory disease caused by the coronavirus, then (and only then) the identifiers her phone used over the previous days will be published in a central database. Everyone’s phone periodically checks this database, looking for identifiers of those who have tested positive. If your phone finds a match between the identifiers in the database and one of the identifiers your phone has been in contact with recently (including Jane’s), you will be alerted that you may have been exposed to the disease.
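The decentralized matching logic described above can be sketched in a few lines of Python. This is a simplified illustration, not the actual Apple-Google specification: the real protocol derives rotating identifiers cryptographically from device keys, while the `Phone` class, `new_identifier` helper, and method names here are stand-ins for exposition.

```python
import secrets

def new_identifier() -> bytes:
    """A random, nameless identifier (a stand-in for the rotating
    identifiers the real protocol derives from device keys)."""
    return secrets.token_bytes(16)

class Phone:
    def __init__(self):
        self.own_identifiers = []  # identifiers this phone has broadcast
        self.observed = []         # identifiers received from nearby phones

    def broadcast(self) -> bytes:
        ident = new_identifier()
        self.own_identifiers.append(ident)
        return ident

    def encounter(self, other: "Phone") -> None:
        # When two phones are in proximity long enough, they exchange
        # identifiers, stored locally on each device, not centrally.
        self.observed.append(other.broadcast())
        other.observed.append(self.broadcast())

    def check_exposure(self, published: set) -> bool:
        # Each phone compares the published identifiers of diagnosed
        # users against its own locally stored observations.
        return any(ident in published for ident in self.observed)

# Jane tests positive and (only then) publishes her recent identifiers.
jane, you = Phone(), Phone()
you.encounter(jane)
central_database = set(jane.own_identifiers)
print(you.check_exposure(central_database))  # True: your phone saw Jane's identifier
```

Note that the central database never learns who was near Jane; the match happens entirely on each user's own device.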

So far Apple, Google and the researchers designing privacy-protective app specifications have done an admirable job of getting technical privacy protections right. In addition to developing underlying infrastructure that minimizes the amount of individual users’ data shared with a central database, Apple and Google have said they will impose strict rules for whatever apps eventually implement this interoperable system. At least in the first phase of the effort, apps will be run by public health authorities. For security’s sake, such apps will be the only apps using the system that will allow users to report that they are positive for the disease. In other words, no one—not a hacker, not a nation-state—will be able to use the system to pretend to be someone who has tested positive, thus causing many to falsely believe they have been exposed and potentially infected.

While the central database will keep track of the nameless identifiers of people who are reported ill, the database will not be able to track who has been exposed. That’s thanks to a design feature that changes these nameless identifiers frequently. The Apple and Google system does not collect location information, which means the contact-tracing apps can’t either. The system also will not track which people are spending time together.

Contact-Tracing Technology May Not Work Very Well

Although thus far they seem to be doing well on privacy, Apple and Google have not solved—nor can they—the more fundamental question of efficacy. Will a contact-tracing system actually work? Will it help public health authorities diminish the spread of COVID-19? This is something that policymakers will need to address before deciding to deploy contact-tracing apps. As noted above, many have written about why contact tracing may not be very effective, and this is not the focus of this piece. Nevertheless, a few efficacy problems are worth highlighting.

Adoption rate will be critical to an app’s impact: without widespread use, the probability is low that both an infected person and the person they come into contact with will have the contact-tracing app installed and running. Consider, for example, what happens if one-third of the population adopts an offered contact-tracing app. When an infected person comes into contact with another person, the likelihood that both have the app in use is only one-third times one-third, or about 11 percent.
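The arithmetic generalizes: if a fraction p of the population uses the app, the chance that both parties to a given contact are users is roughly p squared. This assumes adoption is independent of contact patterns, which is a simplifying assumption. A quick sketch:

```python
def both_have_app(adoption_rate: float) -> float:
    """Probability that both people in a random contact run the app,
    assuming adoption is independent across individuals."""
    return adoption_rate ** 2

# One-third adoption covers only ~11 percent of contacts;
# Singapore's roughly one-in-five adoption covers ~4 percent.
for rate in (0.2, 1 / 3, 0.5, 0.75):
    print(f"adoption {rate:.0%} -> contacts covered {both_have_app(rate):.0%}")
```

This is why officials in Singapore spoke of needing three-quarters or more of the population: even 75 percent adoption covers only a little over half of all contacts.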

Indeed, achieving a high adoption rate will be challenging. Singapore has been pushing a contact-tracing app for months, but as of a week ago, only one in five had downloaded the app—not enough of the population for the app to function effectively. One government official there told the press, “In order for [the app] to be effective, we need something like three-quarters—if not everyone—of the population to have it.” There’s reason to believe that the U.S. won’t get the voluntary adoption rates needed to make such a contact-tracing app effective.

In addition, a contact-tracing app likely will suffer from a high rate of false positives—falsely informing users that they may have been exposed to someone with COVID-19 following situations in which transmission would be highly improbable. As Ross Anderson and others have noted, just because Bluetooth registers two people within six feet of each other does not mean that the two phone users have experienced viral exposure to one another. The two people could be walking past each other outside, where exposure is brief and transmission is unlikely. They could be inside, within six feet but in different apartments or office suites (Bluetooth signals go through walls).

Contact tracing false positives could lead to other problems. For example, if a high volume of contact-tracing app users receive notice they may be infected, it could lead to a stark increase in the demand for testing. This, in turn, could intensify the test shortage problem. False positives also could lead to rapid loss of trust in the system. After a few false positives, people will likely remove the contact-tracing app from their phones.

Contact-tracing apps also are likely to suffer from false negatives. As one of us wrote earlier, that is likely to be the more serious problem. We don’t know the rate of asymptomatic carriers of the coronavirus—early numbers from Wuhan are increasingly suspect—but recent studies from Iceland indicate it may be as high as 50 percent of those infected. Infected children appear less likely than adults to exhibit serious symptoms—but children also are less likely to have smartphones with any contact-tracing app installed. This means that there will be a large cohort of asymptomatic transmitters who are unlikely to be tested (and unlikely to have a smartphone) and thus will remain outside the contact-tracing app’s ecosystem; because of that, the app will never be able to determine definitively whether or not a given user has been exposed to the disease. Even if the app is widely used, a user who has been in close contact with an asymptomatic, untested carrier would never be notified.

Contact-tracing technology also threatens to create a false sense of security. None of the proposed systems is designed to protect individual users from contracting the disease, but only to notify people who may have been exposed after the fact. Proponents of these systems have not claimed otherwise, and notifying people who may have been exposed could of course still have important epidemiological benefits. But some users who adopt a contact-tracing app may nevertheless feel that having such a tool on their phones means it’s safe to prematurely relax or abandon the precautions they take to avoid contracting the disease. This is especially likely if government officials promote contact-tracing technology while simultaneously relaxing stay-at-home orders.

Perhaps the biggest hurdle to the app’s efficacy is simply that the virus remains largely an unknown. Even five months into the pandemic, we are learning new facts that are critical for halting coronavirus transmission. For example, we recently discovered that the virus was circulating in numerous U.S. cities months before it was detected. We’ve learned that the six-foot distancing rule won’t necessarily work in all conditions; we’ve learned, for example, that air conditioning can carry the virus to people at a greater distance. Until we can accurately model how the virus spreads, the value of contact-tracing apps will remain unmeasurable and unclear. This is because a contact-tracing app might be designed to capture likely transmissions when two users are in close proximity but not to capture transmissions that occur in other conditions. What if the wind is blowing from one runner to another 10 feet behind? Or a few people are sitting well-distanced from each other in a restaurant, but the air conditioning is turned on high?

Contact-Tracing Technology May Not Benefit All People Equally

Any system designed to keep track of every significant interpersonal interaction that every person has is likely to have privacy implications. The Apple and Google system is decentralized; your phone and Jane’s know you have been in contact—but no one else does. But some governments, including France, have pushed for centralized systems, which would allow health care authorities—and not just the device owners—to learn who had been exposed to someone who tested positive for the coronavirus. Apple and Google’s set of tools has been deliberately designed to prevent this type of access, which could be used for surveillance of many forms, not simply coronavirus exposure.

Even if privacy concerns are mitigated through clever design and responsible governance, there is another ethical and legal obligation to address—and that is the question of equity. Contact-tracing apps may well be less effective for—and may actually exacerbate the harm the coronavirus is causing to—the demographic groups already hardest hit by the pandemic.

For a variety of reasons, people with lower incomes and people of color are experiencing the brunt of the pandemic. People in lower socioeconomic status groups are more likely to be on the front lines as essential workers, in jobs ranging from grocery store clerks and nonprofessional health care workers to food production workers and garbage truck drivers. They have less access to health care, are more likely to live in close, crowded quarters, are more likely to rely on crowded public transport, and are more likely to have underlying health conditions than people in wealthier, more educated demographics. African Americans and Latinx people are overrepresented in lower paid, high-contact industries and are more likely to have a lower socioeconomic status. They are also more likely to live in areas that experience “outsized disruption” when COVID-19 hits. American Indians and Alaskan Natives similarly disproportionately live in poverty and have the highest rates of health conditions thought to increase the lethality of COVID-19. The overlay of structural and individual race, ethnicity, language, and class bias in health care and beyond further stacks the deck against people living in poverty and communities of color.

Given these dynamics and this history, it is not surprising that a growing body of data is showing that poorer and African American, Latinx, and American Indian and Alaskan Native communities have borne the brunt of not only COVID-19 illness and deaths but also the economic fallout of stay-at-home directives and similar government responses. Further still, at the intersection of all of these groups are immigrants and non-English speakers, who appear to be similarly disproportionately hit by COVID-19.

People in these demographic groups thus have the greatest need for effective strategies to stop the spread of the virus and allow people to safely return to work and school. And our society has an ethical and, in some contexts, legal duty to ensure that strategies we develop and deploy to fight this virus help, rather than disproportionately hurt, individuals in these groups. It is also in our own self-interest to do so—as this pandemic has underscored, when it comes to public health, we must take care of everyone in order to take care of ourselves. Yet, contact-tracing apps are likely to be less effective for demographic groups in which we most need to “stop the spread.” Worse yet, these groups will likely bear a disproportionate burden of the harms entailed by the apps. There are several reasons for this.

First, low-income people and people of color live and work in spaces in which contact-tracing apps may be more likely to generate false positives. As noted above, one of the limitations of app-based contact tracing is that Bluetooth signals can travel through walls and floors. This means that people living in apartment complexes or multi-family housing units may be more likely to encounter false-positive notifications (because an individual in a neighboring unit whom they had not actually come into contact with tests positive for COVID-19) compared to residents living in single-family housing units. And low-income people and people of color are far more likely than their demographic counterparts to live in apartment complexes or multi-family housing units.

Similarly, contact-tracing apps may be more likely to generate false positives for workers in high-contact industries, such as nonprofessional health care workers and grocery store clerks. These workers may be in proximity to virus-positive people but wear protective gear (or operate behind a barrier) during work, providing them relative safety against transmission despite the notifications they will almost inevitably receive on their phone apps.

Bluetooth-based apps will thus be more likely to generate false positives for people living in poverty and people of color. A false-positive notification can cause psychological harm and financial hardship to anyone, but it can be particularly harmful to people in these groups. An exposure notification means that those most economically vulnerable will have to stop working while they seek testing, risking paychecks and even their jobs, which in turn can spiral into losing housing and causing social and educational disruption for children. A notification also may come with an official quarantine order. Again, while quarantine is a hardship for anyone, being ordered to quarantine is likely to bring a greater risk of harm to people living in poverty and people of color, given this nation’s consistent history of over-policing and disproportionately subjecting people in these groups to civil and criminal penalties for legal violations.

Second, there is reason to be concerned that adoption and use rates for any contact-tracing app may be lower in vulnerable communities. This not only impacts the potential for overall effectiveness of app-based contact tracing but also means it may be least useful in the communities where it is needed most.

Initial adoption and sustained use of a contact-tracing app within some groups may be influenced by concerns about whether and how it will be used by law enforcement. Non-English speakers and people with less education may have a more difficult time becoming educated on what the app does and doesn’t do. Further, even if initially adopted, if false positives disparately impact these groups, or the app becomes a tool for misinformation, stigmatization, or (likely disparate) enforcement of quarantine or isolation orders, people in these groups may simply stop using the app.

In addition, contact-tracing apps rely on technology that people in lower socioeconomic strata and communities of color are less likely to have, that is, Bluetooth-capable smartphones and high-quality internet access. Internationally, up to 2 billion older phones—of the 3.5 billion phones in use—will not be able to use these contact-tracing apps, which rely on Bluetooth LE (low energy) and the latest operating systems. A Pew Research Center survey conducted a year ago found that some 19 percent of U.S. adults do not have smartphones—a proportion that rises to 29 percent among those in rural areas, 34 percent among people who did not graduate from high school, and a staggering 47 percent among people over the age of 65. African Americans and Latinx own smartphones at about the same rates as white people (80 percent and 79 percent, compared to 82 percent, respectively), but people with less education and those making lower incomes are significantly less likely to own smartphones. The rate at which American Indians/Native Alaskans own smartphones is unclear, although a 2015 survey indicated that 11 percent do not own a cellphone of any kind, and we do not know what portion of the 89 percent are smartphones. Some particularly vulnerable demographics, such as people who are homeless, are also unlikely to own smartphones. While some groups have recommended making Bluetooth-capable cellphones available via Medicare and Medicaid, this is a speculative solution. Even if implemented, it would be certain to systematically exclude some segments of these populations, including some immigrants and people with criminal histories.

A final reason to be concerned about the equity impact of contact-tracing apps is the fact that, once in place, apps created even for beneficial purposes are sometimes retooled for purposes far less benign. Location data gathered by weather apps and games, for example, has been sold to Immigration and Customs Enforcement. History tells us that increased surveillance and punitiveness will disproportionately harm people already struggling with poverty, as well as African Americans, Latinx, immigrants and non-English speakers. We must be intentional—more so than we have been before—if we are to avoid repeating this cyclical history. Our pattern to date has been to pay lip service to the importance of not exacerbating harms to vulnerable groups and then to forge ahead in the name of necessity, with insufficient consideration of whether there is even short-term effectiveness and no real consideration of long-term harm to those groups.

Contact-Tracing Technology Must Be Designed With Direct Attention to Equity Challenges

Many others have already offered numerous recommendations and even complete frameworks to help guide policymakers who are considering supporting broad implementation of contact-tracing technology. Public health experts see contact-tracing apps as a supplement to traditional contact-tracing methods and not as a substitute for them. As the engineer who led development of the Singapore contact-tracing app explained, “We use [the app] TraceTogether to supplement contact tracing—not replace it.” Advocates have also offered important recommendations to protect privacy, enhance transparency, and ensure that contact-tracing systems are designed with ground-level input from public health experts. These are all indisputably important points.

Missing from the public discourse, however, are recommendations on how to address equity challenges likely to arise with the adoption of any contact-tracing technology. We offer a few such recommendations here.

First, even with the strongest privacy protections and with broad support from policymakers, any contact-tracing app must operate only on an opt-in basis.

Second, to ensure that opting out is a true option regardless of socioeconomic status, use of a contact-tracing app cannot be a condition of access to public benefits or spaces, or to commercial, work, or educational spaces. Examples of public benefits and spaces include access to public transportation, recreation spaces, and schools; entry to grocery stores; and eligibility for public benefits.

Third, given the primacy of public trust, data associated with any contact-tracing technology must be completely off-limits for law enforcement use. Thus, for example, while statutes and regulations restricting access to health care information often include an exception for law enforcement purposes, directives related to digital contact tracing should not include such an exception—and instead should make clear that law enforcement will not have access to this information.

But fourth—and most important—any contact-tracing technology must be developed through a process designed to identify and address potential demographic disparities early and continuously. Such a process should include:

  • Early and close consultation with people from the most affected communities. Just as an effective contact-tracing system will rely on input from public health experts, an equitable contact-tracing system must rely on input from the most affected communities.
  • Disparity or equity analysis of every component of the system. Such an analysis would ask, for instance, what tool(s) (for example, a smartphone) must a user have to participate in the contact-tracing system? Are certain demographic groups disproportionately more or less likely to have the requisite tool(s)? Under what circumstances are false positives or false negatives likely to occur? Are any demographic groups disproportionately more or less likely to be in those circumstances? (One of us has developed a similar analysis tool for use in the criminal legal context.)
  • A carefully designed pilot program that precedes widespread implementation. Such a pilot program should include collection of sufficient data to conduct a comparative evaluation of the program’s success across demographic groups.

Conclusion

As policymakers coalesce around contact tracing as a means to stabilize the coronavirus outbreak while loosening harsh movement restrictions, the main worry has been about privacy qua privacy. But privacy is not the only, or even primary, concern. We must first think carefully about whether a contact-tracing app can be effective in the United States. And if such a technology is to be developed, it must be built on a foundation of fairness. We cannot ethically accept any solution that will systematically work less well for, or disproportionately harm, some communities—especially the groups that are already the hardest hit by this pandemic.


Susan Landau is Bridge Professor in The Fletcher School and Tufts School of Engineering, Department of Computer Science, Tufts University, and is founding director of Tufts’ MS program in Cybersecurity and Public Policy. Landau has testified before Congress and briefed U.S. and European policymakers on encryption, surveillance, and cybersecurity issues.
Christy Lopez is a Professor from Practice at Georgetown Law School, where she teaches courses on criminal procedure and policing. From 2010 to 2017, she served as a Deputy Chief in the Special Litigation Section of the Civil Rights Division of the U.S. Department of Justice.
Laura Moy is an Associate Professor of Law at Georgetown Law School and the Director of Georgetown Law's Communications & Technology Law Clinic. She is also Associate Director of the Center on Privacy & Technology.
