
Reinventing Cambridge Analytica One Good Intention at a Time

Jane Bambauer
Wednesday, June 8, 2022, 8:01 AM

Europe’s new Digital Markets Act is in tension with the General Data Protection Regulation, and the practical impact may be bad for privacy and competition.

Europe GDPR Data Privacy. (TheDigitalArtist, https://tinyurl.com/mw5fc7px; Pixabay Free for commercial use)

Published by The Lawfare Institute
in Cooperation With
Brookings

The European Commission is gearing up to begin enforcement of its landmark Digital Markets Act. This act—alongside Europe’s privacy law (the General Data Protection Regulation, or GDPR) and its other new laws in development, the Data Act and the Digital Services Act—will reshape the digital economy in Europe. The jury is still out on whether all of these laws can mesh together to form a coherent philosophy of responsible innovation or whether the laws just create a crazy quilt of regulation. But there are already signs that the Digital Markets Act may have unintended consequences for privacy, competition or both. U.S. lawmakers are considering their own federal bills that replicate some of the rules contained in the Digital Markets Act, so it’s worth taking a look at the experience in the EU. 

The goal of the Digital Markets Act (DMA) is to ensure that large "gatekeeping" platforms—such as Google, Apple, Meta, Amazon and the like—do not use their position as a core platform to restrict innovation and growth among the companies and apps that rely on them. Specifically, the DMA regulates platforms that have had at least 45 million monthly active end users in the EU and at least 10,000 yearly active business users (for example, app developers, advertisers, small retailers) in each of the past three years. The DMA allows third-party business users (say, apps) to use their own payment systems, create their own app stores, offer their own software or ancillary services, and create side channels that cut the gatekeeping platform out of the transactions.

Moreover, the gatekeeping platforms must make their platforms fully interoperable with third-party companies, meaning that Apple must give every app the same access to its iPhone operating system that its own apps use. They must also enable continuous, real-time access to the data generated by the third party's customers. This access right is asymmetrical (that is, the gatekeeping platforms cannot use the third-party business's data for their own purposes), which ensures that an innovative new app gets access to some of the data that it helped co-create rather than allowing the gatekeeping platform to hoard it. The act also requires several new forms of transparency. Advertisers and publishers must be given information about how much of an ad placement price is transmitted to the publisher (in other words, advertisers using Google's ad placement service would know how much of their payments Google transmits to the websites where the ads are placed and how much Google keeps for itself), and competing search engines must receive anonymized information about the ranking, query, click and view data collected by the gatekeeping platform.
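The quantitative side of the designation test can be sketched in a few lines of code. This is a simplification for illustration only: the class name, field names, and function are my own, the actual DMA test also includes turnover and market-capitalization criteria, and the Commission retains discretion beyond these numbers.

```python
# Hypothetical sketch of the DMA's quantitative "gatekeeper" user
# thresholds. Names are illustrative; the real designation process
# also weighs turnover/market cap and Commission judgment.

from dataclasses import dataclass

@dataclass
class PlatformMetrics:
    """Usage figures for one candidate platform, one entry per year."""
    monthly_active_end_users_eu: list[int]
    yearly_active_business_users_eu: list[int]

END_USER_THRESHOLD = 45_000_000
BUSINESS_USER_THRESHOLD = 10_000

def meets_user_thresholds(m: PlatformMetrics) -> bool:
    """True if both user thresholds were met in each of the last 3 years."""
    last3_end = m.monthly_active_end_users_eu[-3:]
    last3_biz = m.yearly_active_business_users_eu[-3:]
    return (len(last3_end) == 3 and len(last3_biz) == 3
            and all(u >= END_USER_THRESHOLD for u in last3_end)
            and all(b >= BUSINESS_USER_THRESHOLD for b in last3_biz))

# Example: a platform comfortably over both bars in all three years
big = PlatformMetrics([50_000_000, 52_000_000, 55_000_000],
                      [12_000, 13_000, 15_000])
print(meets_user_thresholds(big))  # True
```

The point of the "each of the last three years" condition is to capture entrenched positions rather than platforms that briefly spike above the user thresholds.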

On the one hand, I am intrigued by the DMA and sympathetic to its approach. Having switched in the past from an iPhone to Android and having spent way too much time and energy getting myself out of the iPhone’s “walled garden,” I understand the impulse to metaphorically blow it up and force interoperability on Apple, Google and the other gatekeepers. I am also predisposed to want regulation to help startups access data that has lots of untapped potential and that too often stays siloed at a large company. The DMA, alongside the EU’s new Data Act, at least partially fills this brief. But the DMA may create more problems than it solves. It’s on the express lane to regulatory headaches and industry confusion. It will be very difficult to reconcile the obligations of the DMA with other legal mandates and practical necessities related to privacy and security.

Let’s start with the DMA’s impact on data security, which, so far, has received the most attention in the U.S. media. The DMA applies to messaging platforms like Facebook Messenger, WhatsApp and Apple’s iMessage and requires these platforms to allow interoperability with other messaging services. This means that Apple must facilitate message exchanges for an iMessage user who wants to message a friend who uses Fly-by-Night Chat (not a real service). Moreover, the DMA explicitly forbids Apple and other gatekeeping platforms from requiring business users to use an identification or authentication service that the gatekeeper provides, so Apple cannot even help its iMessage users verify that the person they are trying to chat with is really their friend. The interoperability requirement here seems aspirational and potentially harmful. Many security experts doubt that the end-to-end encryption used on WhatsApp, Signal, iMessage and Android Messages can be made interoperable without undermining security.

Another concern with the law is that platforms and app stores will be forced to allow users to install applications that have lower standards of performance and security than the platform has traditionally allowed. Typically, platforms and app stores have rules about the efficiency, security and functional performance of apps that want to be hosted on the platform. But the DMA does not give platforms discretion to create and enforce their own standards. Although the DMA allows gatekeeping platforms to downgrade or refuse service to a third-party business that strains the user’s hardware or operating system (Article 6), this will be a murky line that platforms may be reluctant to approach. For example, if an app is poorly coded and causes the battery of the user’s device to drain more quickly than should be necessary, can Apple or Google remove the app from their app stores? 

Similar problems await platforms and regulators with respect to privacy. Suppose a friend that I chat with over iMessage decides to switch to Fly-by-Night Chat (still not a real service) and initiates the procedures to ensure that Apple interoperates with her new messaging service. When I next message my friend, will Apple or the new chat service alert me that my communications are about to be collected by another company? Probably yes. Under the GDPR, the collection of my communications constitutes data processing and in this case would be subject to my affirmative authorization. But this means the goals of one legal regime (the GDPR) clearly frustrate the other (the DMA) by placing a hurdle and even a consumer-held veto power over the interoperation process. The DMA is supposed to make the experience of loading and using a new chat application completely seamless, but Apple’s (and other gatekeepers’) legal duties under the GDPR all but guarantee that the consumer who is downloading the new app, and all of her chat contacts, will first have to navigate through a consent screen and affirmatively opt in to data collection. If any of the contacts don’t opt in, the functionality on the new app will not match that on the legacy app that was supposed to be interoperable. The GDPR basically ensures that some degree of network effects are preserved because, by design, it prevents personal data from being transmitted without consent.
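The friction described above can be made concrete with a small sketch. Everything here is hypothetical (the function, the participant names, the consent set); it simply shows why a GDPR-style consent requirement reintroduces network effects into a DMA-mandated interoperation: the conversation only works if every participant has opted in.

```python
# Illustrative sketch (all names hypothetical) of why per-person
# GDPR consent gates DMA-style messaging interoperability: before a
# gatekeeper hands a conversation to a third-party service, every
# participant whose data would flow must have opted in.

def can_interoperate(participants: list[str],
                     consented: set[str]) -> bool:
    """True only if every chat participant has opted in to having
    their messages processed by the third-party service."""
    return all(p in consented for p in participants)

chat = ["alice", "bob", "carol"]
opted_in = {"alice", "bob"}          # carol never tapped "I agree"

print(can_interoperate(chat, opted_in))  # False: one missing consent
opted_in.add("carol")
print(can_interoperate(chat, opted_in))  # True
```

A single holdout degrades the experience for everyone on the new service, which is exactly the network effect the DMA set out to dissolve.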

Consider another scenario: If a third-party app developer that made a popular game (the Fly-by-Night Game) requests and receives permission from a Facebook user to download their entire history of use on Facebook and to track all future Facebook interactions, do EU regulators really want Facebook to make all of that data easily available and transferable? This matches one of the two core problems in the Cambridge Analytica scandal. Cambridge Analytica developed a survey app that asked the Facebook users who interacted with the app to answer several questions related to political preferences and to provide access to their Facebook data history, including data about their friends. At the time, Facebook had a permissive application programming interface (API) that allowed third-party app developers to access this data as long as the Facebook user gave permission. Cambridge Analytica then used the survey app users’ Facebook data along with their answers to the survey questions to create psychometric profiles. Knowing which Facebook “likes” correlate with which political attitudes, Cambridge Analytica could then make inferences about political preferences of everybody else whose data was collected. A lot of the public criticism focused on the fact that the friends of survey users never agreed to have any data collected by Cambridge Analytica. But regulators and the public were also upset on behalf of the 250,000 or so individuals who used the survey app and did agree to allow Cambridge Analytica to harvest their data because the purpose of the data collection was obscured, both to Facebook and to the end users who agreed to the data collection. (Indeed, the Federal Trade Commission complaint against Facebook cited only a single violation, and it was based on possible deception of the individuals who actually interacted with Cambridge Analytica, not the 80 million data subjects who didn’t.)

The DMA will force Facebook and other gatekeeping companies to revert to the permissive data-sharing procedures and architectures that were in place when Cambridge Analytica collected data. In fact, the DMA will require even more: Even Cambridge Analytica was limited by Facebook to collecting only certain types of Facebook data—basic profile information and “likes” of public Facebook profiles. The DMA would presumably require all Facebook data to be interoperable (and therefore collectable) as long as the third-party company secures GDPR-compliant consent. This could include private messages, “likes” of nonpublic profiles, and even web tracking data.
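The shift can be sketched as the difference between a platform-set allow-list of API fields and a regime where consent alone determines scope. The field names and functions below are hypothetical, not Facebook's actual API, and the "DMA regime" branch reflects the reading of the act argued for above, not settled law.

```python
# Sketch (hypothetical field names) contrasting a scoped API of the
# Cambridge Analytica era -- a platform-set allow-list of fields --
# with the broader access the DMA's interoperability duty could
# require once a third party holds GDPR-compliant consent.

PROFILE = {
    "basic_profile": {...},      # name, age range, locale
    "public_likes": [...],
    "private_messages": [...],
    "web_tracking": [...],
}

LEGACY_SCOPE = {"basic_profile", "public_likes"}   # platform-set limit

def accessible_fields(has_consent: bool, dma_regime: bool) -> set[str]:
    """Which profile fields a third-party app may pull."""
    if not has_consent:
        return set()
    # Under the DMA reading sketched above, consent unlocks everything;
    # under the legacy regime, the platform's allow-list still applies.
    return set(PROFILE) if dma_regime else LEGACY_SCOPE

print(sorted(accessible_fields(True, dma_regime=False)))
# ['basic_profile', 'public_likes']
print(sorted(accessible_fields(True, dma_regime=True)))
# ['basic_profile', 'private_messages', 'public_likes', 'web_tracking']
```

In the legacy model the platform is a second checkpoint after user consent; in the sketched DMA model, consent is the only checkpoint left.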

One could argue that the regulatory response to Cambridge Analytica in the U.S. and abroad was an overreaction, but I am quite certain the EU, with its emphasis on user control, data minimization, and purpose restrictions on data processing, does not share this view. 

The DMA gets around this problem superficially by relying on the substance of the GDPR and a chain of responsibility that keeps gatekeeping platforms on the hook. Article 7 of the act requires gatekeepers to ensure that data access and interoperability measures are implemented in compliance with the GDPR. But this puts the gatekeeping platforms in the precarious position of having a duty to interoperate all the way up to the GDPR line and then having a duty to stop. Facebook and Google will have to interpret where the outer limits of GDPR compliance are: If a third-party company’s consent and processing practices are good enough, data must flow.

Given all of these interlocking goals, I see three possible scenarios for large platforms operating in Europe:

Scenario 1: Privacy and Security Subterfuge. Large platforms will exploit exceptions to the DMA based on very expansive interpretations of the GDPR and data security, so that they still effectively have leeway to engage in anticompetitive behavior. For example, Apple could deploy the mirror image of what privacy practitioners call “dark patterns” by designing its user interface to nudge users toward declining consent to data collection by business users. (Mark Zuckerberg would argue that Apple is already using privacy as cover for anticompetitive behavior.)

Scenario 2: Cambridge Analytica 2.0. Large platforms interoperate with so many companies, large and small, that ensuring compliance with the GDPR and privacy principles will be impossible. Facebook will see that the contracts it has in place with a new version of Cambridge Analytica are consistent with industry practice, but it will not be able to audit and discover every company that uses consumer data in a way that violates the GDPR. EU lawmakers should be aware that the DMA is dramatically increasing the risk that data will be mishandled. Nevertheless, even though a new scandal arising from the DMA’s data interoperability requirement is entirely predictable, I suspect EU regulators will evade public criticism and claim that the gatekeeping platforms are morally and financially responsible.

Scenario 3: Public Utility. Given the tensions among privacy, data security, and interoperability, European regulators will be so involved in the day-to-day management of the gatekeeping platforms that they will essentially become privately owned public utilities. 

All of these scenarios cause another concern to surface: Even if the tensions among privacy, security, and opportunities for third-party services are optimized well enough, the DMA combined with other European laws essentially abandons any hope for competition at the level of the gatekeeping platforms. All of these gatekeepers create and distribute their platforms for free in order to improve their position in some other market—hardware, payment systems, advertising, and the like. Why would a smaller company develop the complex mix of code and marketing required to create a well-functioning mega-platform if it knows it will have to share all of its infrastructure with competitors, and also insure against the law-breaking of third-party businesses that use its infrastructure? If Target had to share its floor space with a smaller retail store any time it built a new retail structure, how many new buildings would it build? 

I am not at all confident that the U.S. has found the best policy position that balances all of the competing consumer needs. I suspect U.S. law can do a better job giving tools—legal or otherwise—for regulators and consumer class actions to discover and prove that a large platform has engaged in conduct that violates the antitrust rules the U.S. already has. But the trouble with the DMA is that compliance is so difficult and expensive that dominant gatekeeping platforms will be shielded from any meaningful competition. No small platform can afford the lawyers and staff needed to stay in compliance with the law.

My own experience switching from Apple to Android illustrates the most striking flaw in the DMA. I much prefer the iPhone for aesthetics, but my choice was driven by a preference for Google’s cross-app functionality. I prefer Gmail, Google Calendar, Google Maps and other Google products not despite, but because of, their intensive tracking of users and anticipation of users’ needs. And so far, I have found that the Play Store does a good enough job protecting me from my own ignorant decisions to download apps with poor data-security hygiene. But most of my friends are on Apple out of principle. They like the strict privacy and data security standards that Apple is currently able to enforce through its house rules. Apple and Google, in other words, are competing using differentiated strategies.

What this means is that despite the switching costs, I will move to whichever gatekeeping platform best satisfies my preferences, and Apple and Google will have an incentive to compete with each other. Although it seems hard to fathom right now, some other platform may eventually grow into a smartphone operating system to rival Apple’s or Google’s and start to eat their lunch. (I’d put my money on a gaming company.) The DMA, by design, assumes that the market for gatekeeping platforms is static. All platforms above a certain number of users or with large enough global revenue are presumed to be gatekeepers, and these are platforms that the EU believes have an “entrenched and durable position.” Although there is a procedure in place to remove the gatekeeper designation, it’s hard to imagine how a new rival could emerge when the DMA and the GDPR have such high compliance burdens. The EU is giving up on dynamism among the mega-platforms. I am not that cynical yet.


Jane Bambauer is a professor of law at the University of Arizona, where she teaches and researches about free speech, privacy and technology policy.
