
Normalizing Surveillance

Susan Landau
Monday, August 30, 2021, 10:20 AM

In developing a system for preventing the spread of child sexual abuse material that involves scanning the material of all those using certain apps, Apple is acclimating users to the idea of bulk surveillance.

A person holds an iPhone. (Aaron Yoo, https://flic.kr/p/2iZpHFJ, CC BY-ND 2.0, https://creativecommons.org/licenses/by-nd/2.0/)


Apple has repeatedly supported user privacy through past technical implementations and policy initiatives. However, Apple took a large and very public step in the opposite direction on Aug. 5. The company announced a technical solution that it believes would continue to protect the privacy of people’s communications and devices while allowing the company to track those who share child sexual abuse material (CSAM). But in a conflict of two social goods—providing privacy and security of communications and stored data, and preventing harm to children—Apple has made a gamble that normalizes surveillance.

The company’s action arises from a decades-old fight over encryption. Law enforcement continues to fight against the public’s use of strong encryption despite the national security community essentially acceding to it long ago. While former FBI Director James Comey chose not to raise the CSAM issue, current FBI Director Christopher Wray is pressing this incendiary argument full force. The FBI’s approach takes the encryption discussion in a dangerous direction (which is likely why Comey and others avoided it). The added sensitivity around such a hot-button topic only compounds the difficulty of reaching agreement on the appropriate public policy response to CSAM.

The CSAM problem is also exacerbated when all evidence of the crime is encrypted. So Apple has decided to address the problem by scanning images in two critical applications before the images are encrypted.

The first involves images sent by children via the Messages app. When children who are part of a Family Sharing plan send or receive sexually explicit photos over the app, the photo will initially be blurred, and the child will be warned about the content and asked whether they want to view it. Children under 13 will also be told that their parents will be notified if they choose to view the photo (those 13 and over receive only the initial notice). Apple will use machine learning technology to recognize the sexually explicit photos. Parents of children under 13 learn only that a flagged photo was viewed (they do not see the image); parents of children 13 and over are not informed at all.
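To make that flow concrete, here is a minimal sketch of the decision logic in Python. It is a hypothetical illustration of the behavior described above, not Apple’s implementation: the explicit-content flag stands in for the on-device machine learning classifier, and the data types, message strings, and function names are invented for this example.

```python
# A minimal, self-contained sketch of the notification flow described
# above, for illustration only. The explicit-content flag stands in for
# Apple's on-device machine learning classifier, and the data types,
# message strings, and function names are hypothetical.
from dataclasses import dataclass

@dataclass
class Photo:
    flagged_explicit: bool = False   # would come from an on-device classifier
    blurred: bool = False

def handle_received_photo(photo: Photo, child_age: int, wants_to_view: bool) -> list[str]:
    """Return the warnings and notifications generated for one received photo."""
    events: list[str] = []
    if not photo.flagged_explicit:
        return events                        # ordinary photos are displayed as usual
    photo.blurred = True                     # flagged photos start out blurred
    events.append("warn child: this photo may be sensitive")
    if child_age < 13:
        events.append("warn child: your parents will be notified if you view it")
    if wants_to_view:
        photo.blurred = False
        if child_age < 13:
            events.append("notify parents (the image itself is not shared)")
    return events

# Example: a 12-year-old chooses to view a flagged photo.
print(handle_received_photo(Photo(flagged_explicit=True), child_age=12, wants_to_view=True))
```

In this sketch, a 12-year-old who views a flagged photo generates both warnings and a parental notification; for a child 13 or over, only the initial warning is produced.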

The second application seeks to prevent the uploading of CSAM images to iCloud Photo. This measure involves scanning photos on a user’s device when the user uploads them to iCloud Photo. The scan relies on perceptual hashes: functions that compress a large image file into a much smaller fingerprint. The hash changes only slightly when the input image changes slightly, such as through a shift in background shadows or cropping. Apple has built various technical protections into the system, including a threshold of 30 images that appear to be CSAM before a human reviewer is called in to check.
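To see what perceptual hashing and threshold matching look like in practice, here is a minimal sketch in Python. It uses a simple “difference hash” rather than Apple’s NeuralHash, and the known-hash database, distance cutoff, and file names are all hypothetical. In Apple’s actual design, the comparison is additionally wrapped in cryptographic protocols (private set intersection and threshold secret sharing) so that match results are not exposed on the device and are revealed to Apple only after the threshold is crossed; this sketch shows only the bare matching logic.

```python
# A minimal sketch of perceptual-hash matching with a review threshold,
# for illustration only. This is a simple "difference hash" (dHash), not
# Apple's NeuralHash; the known-hash database, distance cutoff, and file
# names are hypothetical. Assumes the Pillow imaging library is installed.
from PIL import Image

def dhash(path: str, hash_size: int = 8) -> int:
    """Return a 64-bit perceptual hash of the image at `path`."""
    # Shrinking and grayscaling keeps only coarse structure, so small
    # changes (cropping, shadow shifts) barely move the hash.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits on which two 64-bit hashes differ."""
    return bin(a ^ b).count("1")

KNOWN_HASHES = {0x9F3B2C41D0E87A65}   # hypothetical database of known-image hashes
MATCH_DISTANCE = 10                   # hypothetical "close enough" cutoff, in bits
REVIEW_THRESHOLD = 30                 # apparent matches required before human review

def count_matches(photo_paths: list[str]) -> int:
    """Count photos whose hash lands near any hash in the database."""
    return sum(
        1
        for path in photo_paths
        if any(hamming(dhash(path), known) <= MATCH_DISTANCE for known in KNOWN_HASHES)
    )

if count_matches(["upload1.jpg", "upload2.jpg"]) >= REVIEW_THRESHOLD:
    print("flag account for human review")
```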

Apple has designed both systems to prevent information leakage. In the case of the Messages app, privacy protections include that all machine learning will occur on the device and that Apple will not receive copies of the images. The privacy-protective measures of the iCloud Photo CSAM detection system include a second check from Apple servers on whether the image is CSAM before people review it; reviewers will examine only low-resolution versions of photos to protect the user’s privacy.

At first, this approach appears to be a good compromise—but digging deeper uncovers some disturbing issues.

First, for some children, use of the technology will be profoundly distressing. LGBTQ children often begin to question their sexual orientation or identity before they are willing to share such information with others, including parents. Sometimes these children, seeking community, explore their identity online. Yes, they are at risk from sexual predators. Many are at serious risk of suicide. Outing them, even to their parents, greatly increases peril to their mental and physical safety. Yet this is likely to be an effect of the Apple technology.

Second, surveillance of children’s communications carries a terrible lesson for young people. It would be inappropriate for parents to turn to spying technology without informing their children that someone—or something—is observing the messages they send and receive. This puts parents in the unenviable position of explaining that surveilling communications—albeit done by a program—is acceptable. That’s not a set of values I would want to impart to my children. Nor would many of us.

Third, Apple’s solution for its messaging app works through a redefinition of end-to-end encryption (E2E), under which a communication is end-to-end encrypted only until it reaches the recipient’s phone. Previously, an iPhone (or iPad) user could use the Messages app to send a message to another iPhone or iPad user and it would be E2E encrypted via iMessage, Apple’s E2E encrypted messaging service. But under Apple’s new definition, Apple’s tools could have access to the decrypted contents on the device.

Such a redefinition of E2E encryption is hard to understand. The public is well aware of how much communications and devices are under attack. E2E encryption has always signaled that a communication is secured between sender and receiver; securing communications against intruders means securing communications against other apps on the phone. By creating a tool that enables surveillance of the images of all users, rather than a targeted capability, Apple is starting down the road of bulk surveillance. It is as if the Supreme Court had ruled in Katz v. United States that people have a constitutionally protected expectation of privacy in communications but only after their communications have been analyzed and determined to be free of criminal activities.

Finally, the technologies, especially for iCloud Photo, are highly complex. In making its announcement, Apple published three analyses of the behind-the-scenes cryptography by well-known and well-respected scientists. I trust their analyses that the cryptography works correctly. But the Apple CSAM detection system is not a cryptosystem; it is a complex security scheme for detecting targeted CSAM content. And that complexity is the weak spot. Scratch any computer security researcher, and they’ll tell you that complexity is the bane of security. So it is not just privacy that Apple is endangering with its two client-side scanning systems; it’s security. A surveillance system that undermines end-to-end encryption for a messaging app and scans data on the device—yes, even only images destined for iCloud Photo—creates serious risk. Apple is making its devices less secure.

All of these concerns pale, however, beside the surveillance mechanism Apple will have in the Messages app and iCloud Photo. Apple’s software will conduct surveillance of images in the Messages app while the data is on the user’s device. And unlike the searches that law enforcement conducts, which are required by law to be targeted to individuals under a court order, these searches will cover all images sent or received through the Messages app by children enrolled in a Family Sharing plan and all images to be uploaded to iCloud Photo. That’s not users suspected of sharing CSAM; that’s all users. I call that bulk surveillance. Currently, both technology and policy prevent mobile devices from being ubiquitous surveillance machines. Apple has now provided technology for enabling such surveillance.

Apple says that the CSAM detection system has been designed to prevent its misuse and that it would refuse government demands to repurpose the system. But as the Electronic Frontier Foundation and others have observed, all it will take to force Apple to accede is a law requiring scanning for other types of forbidden material, whether it be leaked government documents or a list of students in an LGBTQ organization. Princeton University computer science professor Jonathan Mayer and his graduate student Anunay Kulshrestha have discussed how easily an Apple-like system can be repurposed for censorship and surveillance. They should know; they built a system for CSAM detection very similar to Apple’s—and tested it.

Apple’s technology will teach children that it’s reasonable for someone to snoop on their communications as long as the surveillance is done by a program, not a person. Apple will similarly be telling the world that it’s okay for someone to snoop on your data as long as the surveillance is done by a program, not a person. These are terrible lessons.

Apple took a courageous stance in the San Bernardino case. Here it appears that Apple was seduced into thinking that beautiful cryptography could build a privacy-protective CSAM detector. But it’s not just the security complexity that will likely prevent the system from achieving its goals; a more fundamental issue is that the system will normalize ubiquitous surveillance.

Mobile phones are the radios in people’s pockets that reveal where they’ve been and, often, whom they’ve been with. The client-side scanning that Apple is building will be the snoopers in people’s pockets that reveal all the minutiae of their lives. Maybe the effect of the Apple changes will be that people trust their devices less and go back to talking honestly only face-to-face and paying in cash, but that seems unlikely. The more likely outcome is that people keep their devices of convenience and suffer a major loss of privacy. And Apple, the company that says, “Privacy is a fundamental human right ... and one of our core values,” will have been the cause.


Susan Landau is Professor of Cyber Security and Policy in Computer Science at Tufts University. Previously, as Bridge Professor of Cyber Security and Policy at The Fletcher School and the School of Engineering, Department of Computer Science, Landau established an innovative MS degree in Cybersecurity and Public Policy offered jointly between the schools. She has been a senior staff privacy analyst at Google, a distinguished engineer at Sun Microsystems, and a faculty member at Worcester Polytechnic Institute, the University of Massachusetts Amherst, and Wesleyan University. She has served on various boards at the National Academies of Sciences, Engineering, and Medicine and for several government agencies. She is the author or co-author of four books and numerous research papers. She has received the USENIX Lifetime Achievement Award, shared with Steven Bellovin and Matt Blaze, and the American Mathematical Society's Bertrand Russell Prize.
