The Essential Link between Privacy and Security: Optimizing for Both

David Hoffman
Tuesday, May 3, 2016, 9:48 AM

As we explore how best to use data analytics to provide value for important social functions like healthcare, education, transportation and law enforcement, many people believe that the use of the data will necessarily erode privacy. I believe not merely that we can preserve privacy, but that data analytics can particularly serve privacy interests when we use data to increase security.

As cybersecurity threats become more harmful and perpetrators more difficult to detect, governments and companies must work together to deploy a distributed, effective cybersecurity response. The use of data—in ways that are both robust and protected—will be critical to the success of these efforts and to optimizing both security and privacy. Using the Fair Information Practice Principles as the guide, we can avoid false tradeoffs between security and privacy and instead realize both of these important goals.

The Dramatic Increase in Cybersecurity Threats

The volume, variety, velocity, and virulence of cybersecurity attacks continue to increase dramatically. Ten years ago, Intel’s labs identified 25 new threats per day. Today we track about 500,000 new threats every 24 hours. The numbers are only part of the story: attacks are now customized to target personal data, changing the equation for cybersecurity experts. The asymmetry between the volume of these attacks and our ability to address them is markedly greater than it was just a year or two ago.

We also see a cross-over from virtual to physical threats. At first, cybersecurity protected devices and networks. The increase in data breaches then turned attention to protecting personal data. Now, as hackers attack connected cars and insulin pumps, cybersecurity can have critical, life-changing, even life-threatening consequences.

Intel Security recently published a cloud computing cybersecurity report that shows that cybercriminals learn about the latest malware from a broad community of hackers who develop, use, and then boastfully share their technologies. This criminal community allows attackers to quickly innovate on the latest threat technologies. Protecting against these threats requires mechanisms that allow cybersecurity companies to innovate at the same pace.

It Takes Data to Protect Data

Protecting devices, networks and personal data from attackers requires analysis of the information flowing over and through those technologies. Cybersecurity works best when it relies on global threat intelligence from billions of sensors worldwide. Cloud-based threat intelligence provides greater protection against known threats when it draws on the collective knowledge and experiences of thousands of other organizations. Predictive analytics then provides organizations the ability to better detect, “connect the dots,” and respond to attacks in progress based on the tactical cyber-attack experiences of other organizations. The results of those analytics then need to be shared with others to better enable the global infrastructure to isolate the malicious code. In this way, it takes data to protect data.

Isolating and sharing cybersecurity threat indicators implicates privacy. For example, these indicators will often include IP addresses, which may, when combined with other information, relate to identifiable individuals. But critically, not isolating and sharing cybersecurity threat indicators also implicates privacy, by potentially exposing users to more risk from an ever-proliferating array of actors.
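To make this tension concrete, here is a minimal Python sketch, purely illustrative and not drawn from any company's actual practice, of one way an organization might minimize the personal-data element of an indicator before sharing it: the malware fingerprint travels intact so other defenders can block the threat, while the source IP address is replaced with a salted hash. The record layout and field names are assumptions made for this example.

```python
import hashlib
import ipaddress
from dataclasses import asdict, dataclass

@dataclass
class ThreatIndicator:
    """Hypothetical indicator record; the field names are illustrative assumptions."""
    malware_hash: str   # fingerprint of the malicious file that was observed
    source_ip: str      # address the attack came from (potentially personal data)
    first_seen: str     # ISO-8601 timestamp of the first observation

def prepare_for_sharing(indicator: ThreatIndicator, salt: bytes) -> dict:
    """Copy an indicator and pseudonymize its IP address before it is shared.

    The technical detail other defenders need (the malware hash) is kept
    intact, while the IP address is replaced with a salted hash so reports
    can still be correlated without circulating the raw address.
    """
    record = asdict(indicator)
    packed_ip = ipaddress.ip_address(indicator.source_ip).packed
    record["source_ip"] = hashlib.sha256(salt + packed_ip).hexdigest()
    return record

# Example: share the fingerprint as-is, pseudonymize the address.
observed = ThreatIndicator(
    malware_hash="9f86d081884c7d659a2feaa0c55ad015",
    source_ip="203.0.113.7",
    first_seen="2016-05-03T09:48:00Z",
)
print(prepare_for_sharing(observed, salt=b"per-sharing-community-secret"))
```

The design choice being illustrated is simply that data minimization and useful threat sharing are not mutually exclusive; what actually must be stripped or retained will depend on the indicator and the legal context.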

It is not enough, however, to say that users should give responsible companies their data so we can protect them against malevolent hackers. If we are to use this data to improve both individual and collective cybersecurity, especially in an environment in which the risks are increasing and the threat actors are more difficult to locate, it is also fundamental to analyze how companies can best protect the privacy of the individuals to whom that data relates.

Rethinking Can Provide Privacy and Security

The company I work for, Intel, has a long record of encouraging the ethical and innovative use of data to accomplish important social goals, such as education, healthcare, urban planning, law enforcement and national security. Intel has worked for some time to define how to promote privacy while also empowering organizations to pursue the innovative use of data. We call this effort “Rethink Privacy” and ground our recommendations in the Fair Information Practice Principles (FIPPs) as articulated in the OECD Privacy Guidelines. The FIPPs have served as the basis for law, regulation and industry best practices globally. Regulators use the FIPPs as an important tool to apply laws in different contexts. Intel’s Paula Bruening has written an excellent blog post describing how the OECD FIPPs are “the common language of privacy.” The OECD FIPPs are foundational and do not need to be changed. They do, however, need to be implemented in new ways to properly adjust to an environment of the internet of things, cloud computing, and advanced data analytics.

Below are the OECD FIPPs, with an explanation of how each can be interpreted to accomplish both privacy and security in a cybersecurity context.

Collection Limitation—The Collection Limitation principle provides that organizations should obtain only the data that is necessary. However, the often unexplored potential of advanced data analytics for a great variety of purposes argues that much more data may be considered “useful” or “necessary.” While some experts believe this principle should be abandoned in favor of a focus on use restrictions, the concept is still useful, especially when analyzing which entity should store data. The metaphor of searching for a needle in a haystack is useful to describe the exercise of looking for leads on terrorists. However, government agencies do not need to hold the entire “haystack” to find these needles. The private sector keeps data for its business purposes. Instead of demanding all of that data, government can provide the private sector with algorithms to isolate specific information it wishes to access. Supplying the algorithms and requiring the provision of the data identified by those algorithms could be subject to court oversight using reasonable due process. Both the USA Freedom Act and the recently passed French intelligence law approach collection limitation in this way. Mieke Eoyang recently wrote a paper describing how this approach could work for data collection under Section 702 of the Foreign Intelligence Surveillance Act.
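As a rough, hypothetical sketch of this “bring the algorithm to the data” model (the names and record layout below are invented for illustration and are not drawn from either statute), the data holder keeps the haystack, runs a court-approved selector over it, and produces only the matching records together with a simple audit trail for later oversight.

```python
from typing import Callable, Iterable, List

Record = dict
Selector = Callable[[Record], bool]

def run_approved_selector(records: Iterable[Record],
                          selector: Selector,
                          court_order_id: str) -> List[Record]:
    """Apply a court-approved selector to data the private holder retains.

    Only matching records are produced; the full data set never leaves the
    holder. The order identifier ties the production to the authorization
    so an oversight body can later audit it.
    """
    scanned = 0
    matches: List[Record] = []
    for record in records:
        scanned += 1
        if selector(record):
            matches.append(record)
    # A minimal audit trail: what was scanned and produced, under which order.
    print(f"order={court_order_id} scanned={scanned} produced={len(matches)}")
    return matches

# Example: the government supplies the selector; the provider holds the data.
held_records = [
    {"account": "a1", "contacted_domain": "benign.example"},
    {"account": "a2", "contacted_domain": "known-c2.example"},
]

def is_suspect(record: Record) -> bool:
    return record["contacted_domain"] == "known-c2.example"

produced = run_approved_selector(held_records, is_suspect, court_order_id="order-2016-042")
```

The point of the sketch is the direction of movement: the selector travels to the data and only the matches travel back, which is what keeps the government from holding the haystack.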

Data Quality—The idea that data should be relevant and of a quality commensurate with the purpose for which it is to be used still has great value in a world of data analytics and cybersecurity. It may require analysis not just of the quality of the data, but also of the analytical algorithms and the inferences drawn from those algorithms.

Purpose Specification—This principle requires organizations to limit their use of data to the purposes that were previously described or to “not incompatible” purposes. Robust cybersecurity should be considered compatible (and by extension “not incompatible”) with all legitimate purposes for using data. Therefore, it should be permissible to use data collected for other purposes to increase cybersecurity.

Use Limitation—If data is provided to governments to protect against terrorism or cyberattacks, agencies should not use it for other purposes. Use of personal data to prevent terrorism or cyberattacks has been the subject of considerable discussion around cybersecurity information sharing legislation. It is important to limit law enforcement and surveillance agency use of personal data to only the most important concerns. While governments will naturally be tempted to use the data for additional legitimate purposes, such scope creep will erode support, particularly in other countries.

Security Safeguards—Cybersecurity is not only the intended purpose of the processing, but also necessary to protect the data being analyzed. Companies and governments need to invest more in cybersecurity. Chris Young, Senior Vice President and General Manager of Intel Security, has noted that the world has not invested sufficiently in cybersecurity. This lack of investment has created a “cyber debt” that must be paid down with greater investment now in education and training of the future cybersecurity workforce. The unfortunate stream of high-profile data breaches illustrates how much work we have yet to do to properly protect data.

Openness—Companies and government agencies both need to improve mechanisms that advance transparency about the ways they process information for cybersecurity purposes. Detailed privacy policies have limited effectiveness. Requiring individuals to read these long documents imposes too much of a burden; at best they will simply click through without reading them. However, detailed privacy policies do allow regulators, civil society and other overseers to understand how data is processed. In representative democracies, some information may be transparent only to government oversight bodies rather than to the general public. Pushing for as much information as possible to be made transparent to the general public will be a critical mechanism for providing privacy protection while also allowing data to be used for cybersecurity purposes. When information cannot be made transparent to the public without increasing security risks, disclosure to trusted intermediaries like Congressional oversight committees should be required.

Individual Participation—Much of the widely available data that relates to individuals is not knowingly provided by them. This includes information shared on social media by others, data collected and held by governments, and observational data (cameras, sensors, location data). It is increasingly important that individuals have the ability to understand what data relating to them is available, and the ability to obscure that information if it would have a disproportionate impact on them. Paula Bruening and I have written a law review article highlighting a practical mechanism to provide this type of obscurity: a centralized “obscurity center” that would allow individuals to contest content on the internet that relates to them but may be incorrect, irrelevant or disproportionate in its impact. If the obscurity center approves a request, it could propagate that request to all search engines and data brokers. An obscurity mechanism like this would provide much more privacy, even if information on the internet is used to promote cybersecurity.
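Purely as a thought experiment, and not a design taken from that article, the propagation step might look like a small fan-out from the center to registered search engines and data brokers once a request is approved; every name below is an invented placeholder.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ObscurityRequest:
    """Hypothetical request to obscure content that relates to an individual."""
    requester: str
    url: str
    reason: str          # e.g. "incorrect", "irrelevant", "disproportionate impact"
    approved: bool = False

@dataclass
class ObscurityCenter:
    """Sketch of a central clearinghouse that reviews and propagates requests."""
    subscribers: List[Callable[[ObscurityRequest], None]] = field(default_factory=list)

    def register(self, notify: Callable[[ObscurityRequest], None]) -> None:
        """A search engine or data broker registers a callback for approved requests."""
        self.subscribers.append(notify)

    def review(self, request: ObscurityRequest, approve: bool) -> None:
        """Record the review decision and, if approved, fan the request out."""
        request.approved = approve
        if approve:
            for notify in self.subscribers:
                notify(request)

# Example: one registered search engine receives an approved request.
center = ObscurityCenter()
center.register(lambda req: print(f"obscuring {req.url} ({req.reason})"))
center.review(ObscurityRequest("individual-123", "https://example.com/old-post",
                               "disproportionate impact"), approve=True)
```

The value of centralizing the review is that an individual makes one request and one decision propagates everywhere, rather than having to petition each search engine and data broker separately.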

Accountability—All companies and government organizations should put in place an adequately resourced privacy officer. Whether privacy is provided for individuals depends on the government and private sector putting in place the resources, policies and processes to make certain personal data is managed responsibly. The Information Accountability Foundation has articulated the elements of accountability with respect to privacy. One simple test is to ask an organization who its Privacy Officer is and to whom that person reports. If an agency or company cannot identify one person in charge of privacy, or if that person is buried deep in the organization, it is a sure sign privacy is not a priority. Accountability also requires appropriate oversight entities. For the private sector, this has meant Data Protection Authorities outside the U.S. and, primarily, the Federal Trade Commission within the U.S. These regulators need adequate staffing and the legal authority to do their jobs. We have long called for comprehensive U.S. privacy legislation that would provide the Federal Trade Commission with the necessary tools to protect individuals. Such legislation ideally would encourage companies to pay off the cyber debt by investing adequately in cybersecurity.

Government agencies also require effective oversight. The U.S. arguably has the most robust governance system over its federal law enforcement and national security agencies. More work should be done to describe what other countries can do to meet the same standards, while the U.S. should also examine what can be done to improve its current system of checks and balances. Providing sufficient resources and authorities to the Privacy and Civil Liberties Oversight Board is one area for consideration. Congress should also consider how it can enhance the effectiveness of the House and Senate Intelligence Oversight Committees. Focusing on these oversight bodies may create more trust in the general public to allow for the type of cybersecurity threat information and data sharing between industry and government necessary to address the increasing risks.

* * *

Editor’s note and disclosure: Intel is a generous financial supporter of Lawfare. This article, as with all articles, underwent Lawfare’s normal editorial process and review.


David Hoffman is Associate General Counsel and Global Privacy Officer at Intel, in which capacity he oversees Intel’s privacy activities and security policy engagements. Mr. Hoffman has served on the FTC Online Access and Security Advisory Committee and the DHS Data Privacy and Integrity Advisory Committee. From 2005–2009, Mr. Hoffman served on the Board of Directors for the International Association of Privacy Professionals and he is currently a member of the Advisory Board for the Future of Privacy Forum. He has lectured on privacy and security law at schools in the US, Europe, Japan and China and is a Senior Lecturing Fellow at his alma mater, Duke University School of Law.
