
GDPR Derogations, ePrivacy, and the Evolving European Privacy Landscape

Ali Cooper-Ponte
Friday, May 25, 2018, 7:00 AM

On May 25, the European General Data Protection Regulation takes effect in all EU member states, repealing and replacing the 1995 EU Data Protection Directive. The GDPR aims to harmonize data-protection standards for digital personal data across Europe. However, while companies and regulators are scrambling to comply with the regulation by this date, this week is hardly the finish line.

Published by The Lawfare Institute
in Cooperation With
Brookings

This is, in part, because the GDPR also permits member states to enact derogations, or supplemental legislation modifying provisions of the GDPR in accordance with national needs. Within the GDPR, there are dozens of provisions allowing these derogations by member states in key data privacy areas—including the age of child consent (Article 8), the processing of sensitive data such as genetic and biometric data (Article 9), and the processing of national identification numbers (Article 87).

The derogations introduce additional complexity for companies striving to comply with the regulations. In addition to complying with the terms of the GDPR, corporate counsel also have to track and comply with supplemental GDPR legislation in different EU member states. To date, four countries—Austria, Germany, Belgium and Slovakia—have passed supplemental data-privacy laws, while sixteen others have draft legislation. Those with draft bills include France, Ireland, Spain and the United Kingdom. Many countries with draft legislation have not yet made these bills public.

And, if the derogations weren’t complicated enough, the EU is negotiating a new, EU-wide ePrivacy regulation, which will “particularize” the GDPR “by translating its principles into specific rules.” The new ePrivacy regulation will also “complement” the GDPR “by setting forth rules regarding subject matters that are not within” its scope.

This post aims to provide a summary of how the GDPR derogations are being applied throughout the EU, highlighting a few potential state-to-state variations and implications for privacy and security. It also provides a brief explanation of the proposed ePrivacy regulation and its relationship to the GDPR.

The GDPR Derogations

Age of Child Consent

Article 8 of the GDPR sets the age of consent for the processing of children's personal data at sixteen, making it unlawful for providers of “information society services” to process the data of younger children unless consent is given or authorized by the holder of parental responsibility. However, Article 8 also allows EU member states to pass their own laws lowering the age of consent to as young as thirteen, and several countries have done so. Austria has set the age of consent for data processing at fourteen, and draft legislation in the UK would lower it to the minimum threshold of thirteen. Meanwhile, Germany has kept the GDPR default of sixteen.

In response to the state-to-state variation in the age of consent, WhatsApp announced plans last month to raise the minimum age for its users in the “European region” to sixteen. WhatsApp did not indicate any plans to vary the age from country to country within Europe. That said, the minimum age to use WhatsApp remains thirteen in the U.S. (per the Children’s Online Privacy Protection Rule) and elsewhere.

The Right to be Forgotten

The right to be forgotten (RTBF) came to prominence following the Court of Justice of the European Union ruling in Google Spain v. González in 2014. There, the court held that Google’s indexing, storing and serving of personal data in its search engine constituted the processing of personal data under the Data Protection Directive, and that Google was consequently a data controller. Accordingly, Google can be required to remove information from its search results where those results are inaccurate or where they are “inadequate ... no longer relevant or excessive in light of the time that has elapsed” in order to safeguard the “fundamental rights and freedoms” of the data subject, such as the rights to privacy and the protection of personal data.

The GDPR codifies an even broader RTBF than that provided for in Google Spain. Article 17 of the GDPR allows data subjects to request the erasure of their personal data in a variety of circumstances, including situations where the data is no longer necessary to achieve the purpose for which it was processed and situations where the data subject withdraws their consent. Relatedly, other provisions in the GDPR mandate a right to access one’s data and a right to request the correction of erroneous personal data.

However, there are exceptions to the GDPR right to be forgotten. For example, data controllers may choose to anonymize data instead of deleting it. Article 17 also permits member states to derogate based on their own freedom of expression laws. Article 17(3)(a) states that the RTBF shall not apply where the relevant data processing is necessary for the exercise of the right of free expression and information. This provision is to be read in conjunction with Article 85 of the GDPR, which requires member states to affirmatively strike a legislative balance between their freedom of expression regimes and data privacy. Thus far, national legislatures have largely enacted broad provisions that mirror Article 85, leaving it to their courts to further define the boundaries of the freedom of expression exception to the RTBF. As this case law evolves, it will likely further complicate the GDPR’s right-to-be-forgotten regime.

Processing of Sensitive Personal Data

The GDPR also provides a general framework for how sensitive personal data, such as health or biometric data, may be processed. Under Article 9, the processing of special categories of personal data, such as data revealing a person’s racial or ethnic origin or sexual orientation, trade-union membership, and genetic and health data, is prohibited by default. However, Article 9 also allows data subjects to consent to the processing of their sensitive data for specified purposes and allows member states to adopt further rules governing how such data may be used. This has led to variation in member-state requirements for the processing of this category of data.

For example, Ireland’s proposed data-privacy legislation creates a carve-out for the use of sensitive personal data in the insurance industry. Meanwhile, German law allows for the processing of sensitive data without consent for scientific or historical research, as long as such processing is necessary and the data controller’s interest in processing the data outweighs the data subjects’ privacy interests. German law also requires that researchers deploy appropriate measures to protect the privacy of the data subjects, such as anonymizing the data. Additionally, the German legislation creates an exception for the processing of sensitive data in the administration of social security.

The Public Interest Derogation

The GDPR maintains existing restrictions on the transfer of personal data from the EU to third countries or international organizations. These restrictions are aimed at ensuring that the GDPR’s provisions cannot be circumvented by transferring personal data from the EU to a non-EU country with less restrictive data-privacy laws. Pursuant to Article 45, such transfers may generally be made only to countries that the European Commission has determined provide adequate data protection. However, Article 49 of the GDPR also gives member states flexibility to allow the transfer of personal data to third countries absent an adequacy determination if such transfer is “necessary for important reasons of public interest”—for example, if there is a need to transfer health data to a third country in order to deal with an international public-health emergency.

In addition, it is possible that this “public interest” exception could weigh on the transfer of data from the EU to the United States for the purposes of complying with search warrants issued under the Stored Communications Act (SCA). In the U.S., the newly enacted CLOUD Act raises possible conflict-of-laws issues with respect to the GDPR, because it allows U.S. law enforcement to use SCA warrants to demand data stored overseas, while the GDPR allows the transfer of such data out of the EU only under certain conditions—such as a request through a Mutual Legal Assistance Treaty (MLAT) under Article 48.

Member states might also invoke the Article 49 “public interest” derogation to facilitate data transfers in response to U.S. law-enforcement demands. Article 49 also provides an exception for transfers necessary for the “compelling legitimate interests” of the controller, which could be construed to allow companies to transfer data in order to comply with a search warrant and avoid a contempt order.

However, a group of EU data-protection and privacy scholars has rebutted this idea, arguing that the Article 49 derogations cannot be applied to data transfers made in response to SCA warrants. The European Commission has stated that the Article 49 derogations are to be interpreted strictly, and the Article 29 Working Party has made clear that law-enforcement access to personal data implicates the rights protected by the GDPR. Moreover, the “public interest” exception is meant to reflect the public interests of the EU and its member states, not those of a third country. Accordingly, applying the Article 49 derogations to these law-enforcement demands for data is likely incongruous with the GDPR.

Instead, the EU and the U.S. may enter into an agreement regarding law-enforcement access to digital evidence. As Jennifer Daskal and Peter Swire described earlier this week, Attorney General Jeff Sessions met with senior European law-enforcement officials on May 22 to discuss a path forward. Such an agreement could provide American and European authorities a way around the broken MLAT process that complies with both the CLOUD Act and the GDPR.

That being said, it is still far from certain how EU member states and regulators will interpret the vague “public interest” exception. As with the other GDPR derogations, there is much ambiguity and potential for litigation as the contours of the exception are sketched out further by member state legislatures and courts.

Meanwhile, the EU has clarified its rules for European law-enforcement demands: a new Law Enforcement Data Protection Directive takes effect alongside the GDPR, and the European Commission has proposed an “e-evidence” regulation that aims to make it “easier and faster for law enforcement and judicial authorities to obtain the electronic evidence they need to investigate and eventually prosecute criminals and terrorists” from service providers, including those based in a different EU member state. The proposed regulation would create new judicial orders requiring service providers throughout the EU to produce or preserve electronic evidence. Still, the European Commission has made clear that these powers are constrained by the GDPR: “Personal data covered by this proposal is protected and may only be processed in accordance with the General Data Protection Regulation.”

The ePrivacy Regulation

The EU is also negotiating a replacement for its rules on privacy and electronic communications, which date from the 2002 ePrivacy Directive and were last amended in 2009. The proposed “ePrivacy” Regulation responds to new communications technologies that have rendered the 2002 directive obsolete. For example, the new rules aim to cover new services, such as Signal, that replicate the functionality of traditional communications systems, like landline telephones (so-called “over-the-top” services).

In its most recent proposal, the Council of the European Union clarified the relationship between the ePrivacy Regulation and the GDPR. The new ePrivacy Regulation “particularize[s] and complement[s]” the GDPR by translating its provisions into specific rules or by creating rules in areas outside the scope of the GDPR. While the GDPR focuses on data privacy, the ePrivacy rules seek to protect electronic communications and also regulate the use of tracking technology (like cookies) and new marketing techniques. There is considerable overlap between these areas. For example, electronic communications may include personal data as defined by the GDPR. Consequently, as it further defines EU privacy rules in certain areas, the ePrivacy Regulation may introduce new considerations for companies in addition to those generated by the GDPR.

Initially, the ePrivacy updates were meant to take effect alongside the GDPR, but they are now unlikely to be implemented before late 2019 at the earliest. The European Parliament and the Council still need to approve the new regulation. Until then, uncertainty remains about how the new ePrivacy Regulation will shape the European privacy landscape.


Ali Cooper-Ponte is a rising third-year law student at Yale Law School, where she is an articles editor for the Yale Law Journal and was co-president of the National Security Group. Prior to law school, Ali spent two years working for Google on matters related to government surveillance and privacy. She graduated magna cum laude from the University of Pennsylvania with a B.A. in Political Science and a minor in International Development.
