The FTC, 1Health.io, and Genetic Data Privacy and Security
A genetic testing company publicly stored consumers’ genetic data with no encryption. The FTC stepped in.
Published by The Lawfare Institute
The Federal Trade Commission (FTC) has finalized an order with 1Health.io (formerly Vitagene), a genetic testing company that was the subject of a June 2023 FTC complaint. 1Health.io, to quote the FTC’s recent press release, “left sensitive genetic and health data unsecured, deceived consumers about their ability to get their data deleted, and changed its privacy policy retroactively without adequately notifying consumers and obtaining their consent.”
Genetic data privacy has been receiving more attention in the U.S.—including from intelligence organizations worried about national security risks; medical professionals evaluating discrimination threats and gaps in current law; advocacy groups concerned about law enforcement abuses and unchecked state surveillance; and members of the public with mixed opinions on companies sharing genetic data with police. While the particulars of these concerns vary, there is a clear, underlying recognition of the importance of genetic data and its protection.
The FTC’s finalized order with 1Health.io is the first time the commission has pursued an enforcement action focused on both genetic privacy and security. Amid the concerns above, it is likely the first of many such actions to come—as the genetic testing and genetic data analysis markets continue to grow. The FTC’s action also underscores the importance of continued privacy enforcement and the gaps in current U.S. laws and regulations regarding genetic data privacy and the prevention of associated harms.
This article examines the FTC’s initial June 2023 complaint against 1Health.io and its core concerns about the company’s practices. It then examines the FTC’s more recent, finalized order with 1Health.io, which includes a penalty paid to the commission and a number of prohibitions and other requirements to improve the company’s data-handling procedures and hopefully prevent similar incidents in the future. The article concludes by analyzing how the FTC’s action fits into the broader U.S. privacy law and regulation landscape—and argues that the U.S. needs stronger controls focused on avoiding egregious privacy harms in the first place.
The FTC’s June 2023 Complaint
When the FTC filed a complaint against 1Health.io in June 2023, its enforcement concerns fell into three buckets: deceptive privacy and security promises, failing to notify consumers of changes to genetic data sharing with third parties, and publicly exposing consumers’ health and genetic information on the internet. It argued that the first and third buckets of practices were deceptive and that the second was unfair.
Padlock icons and privacy-protective language abound on 1Health.io/Vitagene’s website, according to the FTC complaint, with the company making such statements as “we use the latest technology and exceed industry-standard security practices to protect your privacy.” Numerous other statements on the website’s main pages, in “Frequently Asked Questions,” and elsewhere expressed a company commitment to data privacy and security. 1Health.io also stated that (a) DNA samples collected through consumer testing kits and (b) the results of the DNA tests “are stored without your name or any other common identifying information.” The company promised that consumers can delete their information from all of the company’s servers “at any time.” And in similar form, 1Health.io/Vitagene said on multiple webpages that it destroyed consumers’ DNA saliva samples after analysis.
Little of this was true. Although the company claimed to have industry-exceeding data security measures, it in fact “did not exceed industry-standard security practices to protect the privacy of consumers’ sensitive personal information.” Instead, the company stored DNA results alongside consumers’ names and “other common identifying information” (and on a public server with no encryption, discussed further below). Consumers may have been able to reach out to request that their data be deleted, but 1Health.io “did not have an inventory of consumers’ information” and lacked the ability to delete the data of all consumers who requested it. It’s not entirely clear based on the complaint what happened when consumers filed deletion requests, other than the fact that the company could not actually process them. And when it came to saliva samples, the FTC complaint said, 1Health.io was similarly misleading consumers; its contract with the genotyping laboratory did not mandate DNA saliva sample destruction after results were produced.
Drawing on its authority under Section 5 of the FTC Act, the FTC alleged that these statements constituted deceptive trade practices because they were “false or misleading” when compared against the reality of 1Health.io’s data practices.
The second and third buckets of the FTC’s complaints—about third-party data sharing and public information exposure—follow in similar form, with the company not clearly informing consumers about the facts and departing from the most basic privacy and security measures.
With respect to the data sharing, 1Health.io’s privacy policy until April 2020 stated that it would share consumers’ personal information only under certain conditions, including with physicians and medical professionals under a consumer’s direction; with the company’s own business partners and service providers, such as credit card processors and genotyping labs; “only as necessary to” improve company services; as required by law; with any third party with a consumer’s consent; or in the process of transferring its business to another entity. That changed when the company revised its policy in April and again in December 2020. Following the revisions, 1Health.io stated, in the FTC’s wording, that it:
shares personal information with third parties such as pharmacies, supermarket chains, nutrition and supplement manufacturers, and other providers and retailers so they can promote and offer their products and services to [1Health.io’s] customers; with third parties for their own services and marketing purposes unless a consumer opts out of such sharing; and with partners, third parties, or affiliates, including for those third parties’ own purposes.
The company did not take any steps to notify consumers who provided their information prior to the policy’s 2020 revision. Here, the FTC argued that the third-party data-sharing changes were an unfair act or practice under Section 5 of the FTC Act, as the company’s “retroactive application of its revised privacy policies caused or is likely to cause substantial injury to consumers that is not outweighed by countervailing benefits to consumers or competition and is not reasonably avoidable by consumers.” This language echoes the FTC’s three-pronged unfairness test: An injury must be substantial, not outweighed by any countervailing benefits to consumers or competition produced by the practice, and not reasonably avoidable by consumers. Here, individuals who already gave the company their information years prior were likely unaware the company could now share it with supermarket chains, supplement manufacturers, marketing firms, and others.
Third, 1Health.io stored data using Amazon Web Services (the Amazon cloud), including health reports about consumers and consumers’ raw genotype data. The company made some of this highly sensitive information accessible to the public. Around 2016, it created a publicly accessible bucket of data with health reports for at least 2,383 consumers and a publicly accessible bucket with raw genetic data—sometimes with individuals’ first names—for at least 227 consumers. It “did not use any access controls to restrict access to this sensitive data, encrypt it, log or monitor access to it, or inventory it to help ensure ongoing security.”
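For context, the baseline controls the complaint says were absent—access restrictions, encryption, and access logging—map onto standard cloud storage settings. A minimal sketch using the AWS CLI shows what applying them to a storage bucket looks like (the bucket names here are hypothetical, and some of these specific commands postdate the earliest events described):

```shell
# Block all forms of public access to the bucket.
aws s3api put-public-access-block \
  --bucket example-genetic-data \
  --public-access-block-configuration \
  "BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"

# Encrypt stored objects at rest by default (server-side AES-256).
aws s3api put-bucket-encryption \
  --bucket example-genetic-data \
  --server-side-encryption-configuration \
  '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'

# Record every access in a separate audit-log bucket.
aws s3api put-bucket-logging \
  --bucket example-genetic-data \
  --bucket-logging-status \
  '{"LoggingEnabled":{"TargetBucket":"example-audit-logs","TargetPrefix":"access/"}}'
```

The access-logging step matters later in this story: without such logs, a company cannot reconstruct who viewed or downloaded exposed data.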
Making matters worse, Amazon Web Services alerted the company about this public exposure in July 2017; a security testing company (unnamed in the FTC’s complaint) conducted a test for 1Health.io in November 2018 and notified it of the lack of access controls; and in June 2019, a security researcher emailed the company about the problem, stating they were also able to verify the data corresponded to real people. The researcher then reported it to the media, and it was covered in early July 2019. Genealogy reports made public, according to Bloomberg, “included customers’ full names alongside dates of birth and gene-based health information, such as their likelihood of developing certain medical conditions.”
When 1Health.io/Vitagene finally commenced an investigation in July 2019, its lack of access logging rendered it “unable to determine exactly when the Buckets had been created or whether anyone other than the security researcher had accessed, downloaded, or transferred any of the sensitive health, genetic, and personal information they contained.” Notified consumers had plenty to say. Emails back to the company included:
- “I am horrified that my dna is out there for anyone to use.”
- “This is worse than credit card and financial info because it’s related to my health.”
- “Shame on Vitagene for not having its consumers in their best interest.”
It is not clear why the company took so long to investigate reports of consumers’ health and genetic information being stored in a publicly available cloud bucket. For this third bucket of concerns, the FTC said, the lack of an inventory of this and other data rendered 1Health.io/Vitagene unable to delete the data of all customers who requested it—making its representations “false or misleading.”
The FTC’s September 2023 Order With 1Health.io
This month, the FTC finalized an order with the company, following up on the commission’s June 2023 complaint. The finalized order requires that 1Health.io pay the FTC a $75,000 fine and provide customer information to the commission so that the FTC can redress affected consumers. Perhaps more significant are the prohibitions and other requirements in the order.
1Health.io has to instruct any laboratory it has worked with to destroy physical DNA saliva samples retained for more than 180 days after the company received the lab’s analysis, and the company must provide a written statement to the FTC, under penalty of perjury, that it has done so. The company must also establish and implement, within 60 days of the order, a comprehensive information security program. This includes designating a qualified employee or employees to run the program, documenting the program in writing, assessing risks to company-held data at least once every 12 months, and implementing specific technical measures, among other steps. For instance, the FTC requires the company to implement encryption or “at least equivalent protection” for all health information “reasonably linkable to an individual consumer, computer, or device, including in transit and at rest.” Given the company’s prior lack of even the most basic security measures—such as encrypting stored genetic data and keeping it off the public internet—the security program requirements are well suited to preventing similar issues in the future.
1Health.io also must undergo third-party information security assessments, where the company proposes an assessor to the associate director for enforcement for the FTC’s Bureau of Consumer Protection, and the associate director can approve or veto the assessor. The initial assessment must cover the first 180 days after the order, and subsequent assessments must follow every two years thereafter for 20 years. Each time an assessment is completed, the company has 10 days to submit the complete, unredacted version to the FTC as well as a “proposed redacted copy suitable for public disclosure.” Additionally, within 10 days of notifying any U.S. government organization (including state or local agencies) about a covered incident, 1Health.io must notify the FTC. (Because the order does not stipulate when the company must notify government agencies of a covered incident, this presumably means that a company already reporting an incident to the government must essentially add the FTC to its notification list.) Within 10 business days of discovering that a consumer’s individually identified health information was publicly exposed without authorization, the company must also submit a report to the FTC about it.
The list of requirements continues, such as stipulating that 1Health.io cannot misrepresent in any manner how it matches up against industry-standard security and privacy practices, how it does or does not store health information alongside other data, its deletion of DNA samples and personal information upon request, and more.
Privacy, Genetic Data Sharing, and Existing Laws and Regulations
This case stands out for at least a couple of reasons. First, this is the FTC’s first case focused specifically on genetic data privacy and genetic data security. Given the rapid growth of the direct-to-consumer genetic testing market (by one estimate, worth over $465 million in the U.S. in 2020), and the lack of robust security across genetic systems, this is very likely to be a continued area of focus for enforcement actions. Second, the FTC is signaling its focus on companies that store consumers’ data and then modify their privacy policies without sufficiently informing those consumers. Indeed, Samuel Levine, director of the FTC’s Bureau of Consumer Protection, said when the complaint was first announced in June that “the FTC Act prohibits companies from unilaterally applying material privacy policy changes to previously collected data.”
Within the broader data-gathering and data-sharing landscape, though, the case underscores the persistent gaps in U.S. law around particularly sensitive kinds of data. There are interesting, long-standing, and consequential debates in the privacy community about whether data regulations should be based on types of data, based on use cases for data, based on harms for data (including collective harms), based on the ability to combine data together, and so on. In the U.S., many federal data privacy laws are directed at specific industries, such as the Health Insurance Portability and Accountability Act (HIPAA) for certain covered health entities or the Family Educational Rights and Privacy Act (FERPA) for certain covered educational institutions. (These two, and other similar laws, do not fully cover entire industries and all their data activities.) But the nature of genetic data—unique to individuals, linkable to relatives, and, unlike an email address, unalterable—necessitates consideration of specific, additional data controls, privacy protections, and security measures.
Despite the FTC’s determination that 1Health.io’s practices were illegal, this case sheds light on how little federal privacy law does to prevent harms by companies handling genetic data in the first place. It was not illegal at the federal level for 1Health.io to share consumers’ genetic data with third parties per se, only for the company to do so deceptively. This echoes the FTC’s February 2023 enforcement action against prescription drug provider GoodRx: It was not the sharing of health data with third parties that was prohibited (or heavily constrained) by law, but merely sharing that data in a manner deceptive to consumers.
Hence, a pattern persists, wherein the current U.S. regulatory structure tackles (some) harmful uses of data after they occur—rather than trying to minimize their occurrence in the first place, including by placing more federal limits on initial data collection and sharing. With information as sensitive as genetic data, the question for legislators is how best to empower the FTC to continue pursuing these enforcement actions while creating a strong baseline of protections for particularly sensitive types of data, particularly vulnerable populations, and particularly acute kinds of harm.
Despite the lack of these federal rules, though, the FTC took an important enforcement action in the 1Health.io case. It fined the company for unfair and deceptive trade practices and went a step further, finalizing an order designed to better prevent future harm. Requiring 1Health.io to implement an information security program, for example, should help avoid future instances of customers’ genetic data being made publicly available online. Within the broader scheme of U.S. privacy law and regulation, the FTC took a productive action for consumers’ privacy and did what it could within the scope of its legal authority—while continuing to punch well above its weight given its privacy resource limits.
But a company was able to collect consumers’ genetic data and store it without having the most basic security measures in place. Because the company was not even logging who accessed the buckets, there is no telling whether any malicious individuals or entities accessed the data and, if so, how many. The most significant harms to consumers may not yet be realized.