When Data Is Currency: Thoughts in Response to Jack and Evgeny Morozov
Published by The Lawfare Institute
On Friday Jack offered some thoughts on Evgeny Morozov's piece in the Financial Times about the broader implications of the Snowden revelations, overlooked by those intent on fitting each new leak into a one-dimensional picture of NSA overreach. Morozov writes, "[W]e might be living through a transformation in how capitalism works, with personal data emerging as an alternative payment regime . . . . [T]o remain relevant and have some political teeth, the surveillance debate must be linked to debates about capitalism---or risk obscurity in the highly legalistic ghetto of the privacy debate."
Jack agrees with Morozov on two points: personal data is replacing cash as the currency for many services (and, I would add, products), and data-mining and state surveillance are enabled by individual decisions to voluntarily give up personal data possibly to the detriment of the collective interest. But Jack also points out that there may be no way to halt what Morozov describes as the “disturbing trend whereby our personal information---rather than money---becomes the chief way in which we pay for services."
I think Jack's right. We probably have to accept the trend for what it is and figure out how to deal with it. This is different from the position staked out by Morozov, who a) largely dismisses the value of measures that fall short of actually countering the trend and b) asserts that "[n]o laws and tools will protect citizens who, inspired by the empowerment fairy tales of Silicon Valley, are rushing to become data entrepreneurs, always on the lookout for new, quicker, more profitable ways to monetise their own data."
I disagree on both fronts. It doesn't make sense to expect users to stop using their data as currency; people will continue to engage in what essentially amounts to privately contracting with companies (data for services) for the foreseeable future. But this need not herald digital doom. One small example of what we can do to protect the citizenry, legally speaking: change how we construe such contracts. Present construction favors not consumers but companies, which wield almost all the power when it comes to creating and changing the terms of their privacy policies and license agreements (think Facebook) and tend to retain the upper hand even when they harm consumers by mishandling or misusing data. This power imbalance should not be overlooked when discussing the problems with a data-as-currency world.
I wrote briefly on how the law figures into this dynamic back in October as part of my series on how to hold software makers liable for insecure code:
[C]ourts will not hold software providers liable for harms brought about by products or services for which users did not offer some form of payment---or what lawyers call “consideration.” This is the basic rule underlying Bruce Schneier’s observation that “[f]ree software wouldn’t fall under a liability regime because the writer and the user have no business relationship; they are not seller and buyer.” Schneier is correct---as long as we’re talking about a private ordering regime. A different legal framework, however, might make for a different rule. For example, providers of free software generate revenue not by extracting money from users, but by extracting data that they are then able to monetize. A statute that creates a duty for software providers to institute safeguards to secure this data or restrict its use might allow users to bring suit in the event of a security breach under tort theories of negligence or misrepresentation. But in the absence of such a statute, the fact that much software and many Internet services are free will remain a sticking point for users seeking compensation for security-related injuries.

Last year the social networking service LinkedIn was hit with a high-profile class action suit after hackers breached the company’s servers and posted 6.5 million hashes corresponding to LinkedIn accounts on a forum. Sixty percent of these hashes were later cracked. The plaintiffs alleged that LinkedIn had failed to utilize industry-standard protocols and technology to protect its customers’ personally identifiable information, in violation of its own User Agreement and Privacy Policy. A federal court in California threw out the case this spring in part on the grounds that the policy was the same for users of the free and premium versions of the service.
Specifically, the court found that the complaint “fails to sufficiently allege that Plaintiffs actually provided consideration for the security services which they claim were not provided.”

As Morozov observes, personal data has become part of an alternative payment regime in practice. But as the above excerpt suggests, it is critical to recognize that our legal institutions have not yet evolved to acknowledge this shift. As far as the courts are concerned, consumers are using Facebook and (basic) LinkedIn for free. If we want to be serious about privacy and the broader repercussions of digital capitalism moving forward, we will need to change this cramped legal understanding of what consumers are giving up in exchange for "freebies"---and of what consumers are owed when that exchange turns out to be unconscionable at a macro level.

Tweaking the terms under which private companies get to collect, store and use our data will of course have implications for government access to our data. But there are other ways to begin resolving what Morozov describes as a new and foundational tension between capitalism and democracy. As others have already observed, the time is ripe to reexamine the third-party doctrine, which allows the government to obtain personal data from private companies without triggering Fourth Amendment protections. The problem with the third-party doctrine is that it makes for nefarious bedfellows: the government should have a vested interest in protecting me from predatory big-data companies, not in giving those companies unbridled access to my information.

Morozov closes with a question that strikes me as a bit of a non sequitur: "Should we not be more critical of the rationale, advanced by the NSA and other agencies, that they need this data to engage in pre-emptive problem-solving?"
At least in the context of the particular problems created by data-driven commerce, it seems to me that the Snowden leaks do not logically lead us to question the NSA's rationale for sifting through personal data. The revelations do, however, compel us to revisit the legal and (as Morozov puts it) "unbearably technical" pathways by which the NSA and other government entities are able to access that data.
Jane Chong is former deputy managing editor of Lawfare. She served as a law clerk on the U.S. Court of Appeals for the Third Circuit and is a graduate of Yale Law School and Duke University.