Criminal Justice & the Rule of Law

Opening a File Whose Hash Matched Known Child Pornography Is Not a ‘Search,’ Fifth Circuit Rules

Orin Kerr
Saturday, August 18, 2018, 10:09 AM

The Fifth Circuit has handed down a fascinating computer search case in United States v. Reddick. Here's the question: If a private company runs a hash of a file and compares the hash to those of known images of child pornography, and it finds a match to a known image and forwards on the file to the government, is it a “search” for the government to then open the file to confirm it is child pornography? Held, per Judge James Ho: No, it is not a search under the private search reconstruction doctrine.

First, some background. The private search reconstruction doctrine lets the government recreate a private search as long as it doesn’t exceed the private search. The idea is that the private search already frustrated any reasonable expectation of privacy. Merely recreating what the private party did is within the private search and is not a new government search. But in the case of computers, that raises difficult issues: What is merely a recreation of a prior private search, and what exceeds the search?

In Reddick, the Fifth Circuit held that actually opening a file that had matched to a known image of child pornography was not a search because “the government effectively learned nothing from [the agent's] viewing of the files that it had not already learned from the private search.” Here's the analysis:

When Reddick uploaded files to SkyDrive, Microsoft's PhotoDNA program automatically reviewed the hash values of those files and compared them against an existing database of known child pornography hash values. In other words, his “package” (that is, his set of computer files) was inspected and deemed suspicious by a private actor. Accordingly, whatever expectation of privacy Reddick might have had in the hash values of his files was frustrated by Microsoft’s private search.

When Detective Ilse first received Reddick's files, he already knew that their hash values matched the hash values of child pornography images known to NCMEC. As our court has previously noted, hash value comparison “allows law enforcement to identify child pornography with almost absolute certainty,” since hash values are “specific to the makeup of a particular image's data.” United States v. Larman, 547 F. App'x 475, 477 (5th Cir. 2013) (unpublished). See also United States v. Sosa-Pintor, 2018 WL 3409657, at *1 (5th Cir. July 11, 2018) (unpublished) (describing a file's hash value as its “unique digital fingerprint”).

Accordingly, when Detective Ilse opened the files, there was no “significant expansion of the search that had been conducted previously by a private party” sufficient to constitute “a separate search.” Walter v. United States, 447 U.S. 649, 657 (1980). His visual review of the suspect images—a step which merely dispelled any residual doubt about the contents of the files—was akin to the government agents’ decision to conduct chemical tests on the white powder in Jacobsen. “A chemical test that merely discloses whether or not a particular substance is cocaine does not compromise any legitimate interest in privacy.” 466 U.S. at 123. This principle readily applies here—opening the file merely confirmed that the flagged file was indeed child pornography, as suspected. As in Jacobsen, “the suspicious nature of the material made it virtually certain that the substance tested was in fact contraband.” Id. at 125.

Significantly, there is no allegation that Detective Ilse conducted a search of any of Mr. Reddick’s files other than those flagged as child pornography. Contrast a Tenth Circuit decision authored by then-Judge Gorsuch. See United States v. Ackerman, 831 F.3d 1292 (10th Cir. 2016). In Ackerman, an investigator conducted a search of an email and three attachments whose hash values did not correspond to known child pornography images. 831 F.3d at 1306. The Tenth Circuit reversed the district court's denial of a motion to suppress accordingly. Id. at 1309. Here, by contrast, Detective Ilse reviewed only those files whose hash values corresponded to the hash values of known child pornography images, as ascertained by the PhotoDNA program. So his review did not sweep in any “(presumptively) private correspondence that could have contained much besides potential contraband.” Id. at 1307.
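
Before turning to commentary, a note on the mechanics for readers who don't work with hashes. Here is a minimal sketch of the kind of hash comparison the opinion describes, written in Python under simplifying assumptions: it uses SHA-256 and a placeholder hash set, whereas Microsoft's PhotoDNA computes a proprietary perceptual hash designed to survive resizing and re-encoding rather than a cryptographic digest of the raw bytes. The comparison logic, though, has the same shape.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-image hash values (hex digests). The real
# database the opinion describes is maintained by NCMEC; this entry is
# a placeholder, not a real digest.
KNOWN_HASHES = {"0" * 64}

def file_digest(path: Path) -> str:
    """Compute a file's SHA-256 hex digest, reading in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def is_flagged(path: Path) -> bool:
    """True if the file's digest matches a catalogued hash value."""
    return file_digest(path) in KNOWN_HASHES
```

The property doing the legal work is that a match against a catalogued value identifies the file's contents, in the court's words, with “almost absolute certainty,” before anyone opens the file.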

Interesting case.

It seems to me that there are two different questions potentially at work here. One question is whether opening a file after a private party has run a hash on the file exceeds the scope of the private-party search for any kind of file. A second question is whether there are special rules for opening images of child pornography under the contraband-search cases of Jacobsen and Illinois v. Caballes. On my initial read, I see Reddick as more about the second question than the first.

With that said, I have to think more about whether Reddick is a persuasive application of those cases. Here’s why I’m not sure. The key to the contraband-search cases of Jacobsen and Caballes is that the field-testing and dog-sniffing revealed nothing other than the presence or absence of contraband. The drug field test in Jacobsen returned either positive or negative. The well-trained drug-sniffing dog in Caballes either alerted to the presence of drugs or didn’t. It was a binary situation in which the only information learned was the presence or absence of contraband.

When a government agent opens a file, though, does the agent learn more than whether the image is child pornography? I gather the opener of the file sees the full image and then, after seeing the image, makes a judgment about whether the file is child pornography. The ultimate goal is to confirm that the image is child pornography. But more is learned than that; it’s arguably less like using a drug-sniffing dog to alert for drugs than like actually opening the trunk of the car and seeing the drugs. That latter act would be a search, even if the goal is just to confirm that a dog’s alert for drugs was correct and to actually find the contraband.
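
To make the information point concrete, here is a loose sketch of the contrast I have in mind, with hypothetical types and function names of my own invention. A Caballes-style test returns a single bit; opening the container returns everything, and the contraband judgment is made afterward by the human observer.

```python
from dataclasses import dataclass

@dataclass
class Trunk:
    contents: bytes  # whatever is actually inside

def looks_like_contraband(data: bytes) -> bool:
    # Stand-in for whatever binary test is applied (field test, dog alert).
    return b"contraband" in data

def dog_sniff(trunk: Trunk) -> bool:
    # Reveals exactly one bit: contraband present or not.
    return looks_like_contraband(trunk.contents)

def open_trunk(trunk: Trunk) -> bytes:
    # Reveals the full contents; the binary conclusion is drawn later.
    return trunk.contents
```

On this picture, opening a flagged file looks more like open_trunk than dog_sniff, which is why I’m unsure the Jacobsen analogy is a perfect fit.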

I suppose this hinges on what the baseline knowledge should be for opening a file. It’s an interesting question. If it is known that a particular hash value corresponds to a particular known image, how do you model what is learned by opening a file that matched that hash? Do you say that the opener of the file already has the knowledge of what that particular image looks like, and that opening the file to see that it is that image really just confirms that it’s a match and doesn’t tell the agent anything else? Or do you model the agent’s knowledge as just being that a file matched with some known image, and that opening the file thus gives the opener more information about what the file looks like? And in trying to answer that, do you consider just the individual opener’s knowledge, or do you impose some sort of collective-knowledge doctrine under which you consider the knowledge set of some broader group? I’m not sure.

It occurs to me that a related (but perhaps stronger) way for the court to have reached the same result would have been to rely on what some have called the single-purpose container doctrine. This doctrine goes back to a footnote in Arkansas v. Sanders, in which the Supreme Court stated that “some containers (for example a kit of burglar tools or a gun case), by their very nature, cannot support any reasonable expectation of privacy because their contents can be inferred from their outward appearance.” In Robbins v. California, the Court explained that for this doctrine to apply, “a container must so clearly announce its contents, whether by its distinctive configuration, its transparency, or otherwise, that its contents are obvious to an observer.”

It seems at least plausible that this could apply to opening a file with a known hash. If you know that a particular image has a particular hash, and you then have a file with that hash, then the information you have before you open the file “clearly announce[s] its contents ... by its distinctive configuration” so that “its contents are obvious to an observer.” The contents “can be inferred from [the file’s] outward appearance,” at least if you take “appearance” to include the hash value of the file. Notably, though, this approach would be broader than just child pornography. It would apply to opening any files with known hashes.
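
One way to make that concrete: if an index from known hash values to image identities exists, then the hash alone announces what the file contains before it is opened. A minimal sketch, with a hypothetical index and placeholder entries:

```python
# Hypothetical index from hash digest to the identity of a known image.
KNOWN_IMAGE_INDEX = {
    "0" * 64: "catalogued image (placeholder entry)",
}

def announced_contents(digest: str) -> str | None:
    """What the 'outward appearance' (here, the hash) tells an observer
    before the file is opened: a specific known image, or nothing."""
    return KNOWN_IMAGE_INDEX.get(digest)
```

Notice how this maps onto the two knowledge baselines above: if the observer’s index pairs each digest with a specific image, opening the file adds nothing; if the index records only that the digest matched something, opening the file does add information.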

Finally, I gather that Reddick does not implicate the existing circuit split on how the private search reconstruction doctrine applies to computer searches. The existing split concerns how to measure how much is “searched” when a private party accesses a computer: Does the private party’s access search the entire computer, or just the file, or the folder, or only what was actually observed? In this case, however, there was apparently just one file at issue.

Anyway, it's a fascinating case. And it was a very well-written opinion from Judge Ho, I thought, at least once you set aside the extraneous citations to legal scholarship.


Orin Kerr is a Professor at the University of California, Berkeley School of Law. He is a nationally recognized scholar of criminal procedure and computer crime law. Before becoming a law professor, Kerr was a trial attorney in the Computer Crime and Intellectual Property Section at the Department of Justice and a Special Assistant U.S. Attorney in the Eastern District of Virginia. He is a former law clerk for Justice Anthony M. Kennedy of the U.S. Supreme Court and Judge Leonard I. Garth of the U.S. Court of Appeals for the Third Circuit.
