
Governing Platforms Through Apple’s App Store in the U.S. and China

Farzaneh Badiei
Thursday, April 28, 2022, 8:01 AM

The tools governments use to regulate behavior online are very similar, even in countries as seemingly dissimilar as the United States and China, but what differs is the incentive structures they create.

"Glass cube" Apple store in New York. (atmtx, https://flic.kr/p/athJFn; CC BY-NC-ND 2.0, https://creativecommons.org/licenses/by-nc-nd/2.0/)

Published by The Lawfare Institute
in Cooperation With
Brookings

The American and Chinese approaches to online platform governance are seemingly vastly different. One is regarded as a democratic utopia, and the other as an undemocratic dystopia. But how is Apple’s App Store governed in the two nation-states? This post will illustrate that the U.S. and China, regardless of their claimed political values, use very similar policies and regulatory frameworks to govern technology. 

However, what does vary between states is how private actors can respond to those policies: the extent to which private actors and members of civil society can object to and change their government’s course of action. Without the participation of civil society, every government would follow the same playbook: regulating, removing, and disclosing user data. 

This post will use Apple’s App Store as a case study. The App Store is an important internet platform, both globally and locally. It is paradigmatic of online platforms because app stores not only regulate the apps on their platforms but can also regulate the users of those apps. So if an online platform hosted on the App Store has governance values that do not align with Apple’s interests, the platform in effect cannot uphold those values on the App Store.  

The discussion is divided into three parts: The first part describes how app stores function and explores the parts of the digital ecosystem of an app store over which their providers have digital governance responsibilities. Then the post discusses the regulation of Apple’s App Store in the United States and China. The third part of this piece conceptualizes Apple’s relationship to China and the U.S., applying the theory of outsourcing governance to argue that China is more able to successfully outsource governance of apps to Apple because there are fewer ways to challenge Chinese regulatory and legislative decisions. 

Apple’s Governance of the App Store 

App stores’ role in the governance of apps can be divided into two parts: They act as enforcers of local laws, removing apps from the platform if the apps do not comply; and they play a quasi-regulatory role by imposing their own app policies. The enforcement measures they typically take are rejection of the app, removal of the app, suspension of the developer’s account, limited visibility of the app, and account termination. 

Apple has had a large effect on user-generated-content apps, especially in three aspects: content governance, security governance and privacy governance.

Content Governance 

Apple provides a “curated” app store. This means that it has reviewers and editorial teams that monitor the content of an app for approval. It has lengthy guidelines about what should be allowed on the app and what should not be.

Apple in effect has become a quasi-regulatory body for the apps. In the legal section of the guidelines, Apple obliges app developers to comply with local laws wherever the app provides a service: “Apps must comply with all legal requirements in any location where you make them available.” App stores (in this case, Google’s) also provide the mechanism for app developers to restrict access to their apps on a geographic basis.

Apple has a large and effective role in content governance by regulating app developers’ policies and behavior. For example, Apple can oblige apps to have a content governance mechanism in place. Because of requests by Apple and Google, Telegram added its first terms and conditions about content, requiring: “No calls for violence, no porn and no copyright infringement on public broadcast channels.” In 2018, Apple first threatened and then removed Telegram from the App Store due to inappropriate content. The same year, Apple removed Tumblr from the App Store because it had not adequately filtered child abuse materials. And in January 2021, in the wake of the riot at the U.S. Capitol, Apple removed Parler from the App Store for not complying with Apple’s terms of service.

Apple even asks app developers to follow specific content moderation measures. App Store review guidelines (see paragraph 1.2, “User-Generated Content”) require the developers of apps with user-generated content to provide a method for filtering objectionable material from being posted to the app, a mechanism to report offensive content and timely responses to concerns, the ability to block abusive users from the service, and contact information so that users can easily reach the app. 

The scope of inappropriate and objectionable content covered by Apple’s policies has been expanded over time. A brief comparison of the App Store’s review guidelines in 2021 to its previous guidelines shows that, for example, Apple defined objectionable content early on as “excessively objectionable or crude content … primarily designed to upset or disgust users[.]” Apple’s guidelines indicated that such content would be rejected.

This brief section became much more elaborate, and, by 2021, objectionable content had six categories, including (1) defamatory, discriminatory, or mean-spirited content; (2) realistic portrayals of people or animals being killed, maimed, tortured, or abused; (3) depictions that encourage illegal or reckless use of weapons and dangerous objects, or facilitate the purchase of firearms or ammunition; (4) overtly sexual or pornographic material; (5) inflammatory religious commentary or inaccurate or misleading quotations of religious texts; and (6) false information and features.

Enforcement of these policies can take place through direct content governance. For example, Apple can identify certain channels on an app and ask developers to block users’ access to the content of those channels. This happened in Belarus in 2020 when Apple asked Telegram to both block access to and delete the posts within a channel that was being used to organize protests. Apple stated that the channel contained personal information and deleted the posts because the content violated its privacy policies.

Security Governance

The App Store can also provide security governance of listed apps, which enhances the apps’ security. For example, app store providers can take down insecure or known malicious apps, or can impose a uniform and strict security requirement on apps, increasing overall app security. Apple imposes security requirements through its Apple Developer Program License Agreement. It prohibits apps from containing malware (malicious or harmful code or programs) or other internal components (such as computer viruses, trojan horses or “backdoors”). The agreement also requires security checks before apps are approved and reserves for Apple the right to undertake security checks, for example, for malware or other harmful or suspicious code, after an app is on the App Store. For example, it prohibits harmful adware that can create artificial click-through of ads that generate traffic.  

However, the process is not perfect. On some occasions, Apple did not allow developers to deploy necessary security updates, in one case for Telegram. This inability to effectively patch security vulnerabilities can make users even more vulnerable to cyberattacks from malicious actors (including identity theft, password exfiltration, unauthorized financial transactions, data theft and location monitoring). 

Privacy Governance

Apple can reject or remove apps from its store if they violate their users’ privacy. Apple requires developers to adhere to a variety of privacy-conscious measures, such as data minimization, obtaining user consent, and agreeing not to broaden the legitimate purpose of data collection without prior consent. Reasons for rejection or removal include developers collecting more data than they need or mishandling the data they collect. Apple’s app review team rejected approximately 215,000 apps for privacy violations in 2020.

Apple has also become more proactive. Instead of relying on apps to obtain informed consent, Apple has acted on its own: Most recently, when a consumer downloads an app, Apple gives them a choice about whether the app may track them.

Regulation of the App Store in the U.S. 

In the U.S., the regulation of the App Store takes place through a combination of legislative, adjudicative and executive processes. 

Legislation

In the U.S., one of the first pieces of legislation that addressed content regulation on digital platforms and transient networks was the Digital Millennium Copyright Act (DMCA) of 1998. The DMCA specifically focused on copyright-infringing materials and required internet platforms to take action against such content. Section 512 of the DMCA established limitations on liability relating to online material hosted by digital platforms. According to this section, digital platforms are not liable for copyright-infringing materials, as long as they take action against them when discovered. While Section 230 of the Communications Decency Act (CDA) immunizes platforms regarding liability, Section 512 of the DMCA requires them to act. 

The scope of the content regulation is clear and limited, however. Over the years, numerous cases have provided jurisprudence on how the DMCA should be applied. The App Store makes it possible to submit disputes about apps that infringe on intellectual property rights. According to Apple’s review guidelines, the company goes beyond copyright infringement and requires that apps with user-generated content also attend to objectionable materials. 

Beyond the DMCA and the CDA, other laws in the U.S. require digital platforms to engage in content moderation, including the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) and the Stop Enabling Sex Traffickers Act (SESTA). These laws carved back the immunity protections granted to platform providers by Section 230. 

FOSTA amended Section 230 of the CDA to clarify that it does not prohibit the enforcement of federal and state criminal and civil laws relating to the sexual exploitation of children, sex trafficking, and related conduct. 

Although the law was intended to facilitate the application of state and federal law to certain types of online content, it also led platforms to take sweeping action against content and services related to sex workers. 

Adjudication

Adjudication involves third parties and Apple’s users, who generally sue Apple for hosting apps on the App Store that are deemed illegal. In Nelson et al. v. Apple, the plaintiffs, all private citizens, sued Apple because it benefited from hosting an illegal gambling app. The lawsuit claimed that, by hosting illegal social casino apps, Apple facilitated and directly assisted in creating an unregulated market for virtual casino games. The argument was that Apple enabled the apps to access a new market of consumers. California’s Business and Professions Code, Section 17200 (Unfair Competition Law and Unlawful Business Practices), was also invoked against Apple in this case. 

Another recent case involved the removal of Telegram from the App Store. Shortly after the Jan. 6, 2021, riot at the Capitol, Marc Ginsberg (former U.S. ambassador to Morocco) and the Coalition for a Safer Web filed a lawsuit (Ginsberg et al. v. Apple Inc.) against Apple for not removing Telegram from the App Store.

The plaintiffs argued in the complaint that Apple knew that Telegram was being used to intimidate, threaten, and coerce members of the public, but that it had not taken action against the app. The lawsuit alleged that Telegram facilitated anti-Semitic and anti-Black extremism and violence during Black Lives Matter protests in 2020, and was an important tool facilitating terrorist activity, including the attack on the Capitol on Jan. 6, 2021. 

The causes of action were negligent infliction of emotional distress and violation of the “Unfair and Unlawful” section of California’s Business and Professions Code. The plaintiffs argued that Apple did not follow its own terms and conditions when it declined to take Telegram down from the App Store. Apple apparently found Telegram’s content moderation sufficient both during and after the Capitol riot, as it did not issue any public warning to Telegram, as it had before. In this lawsuit, and in similar lawsuits generally, the complainants usually ask for compensation for damages and for removal of the app through a court ruling. 

Executive

The U.S. can attempt, and on occasion has succeeded in, regulating social media platforms and apps through executive order. In 2020, the Trump administration issued Executive Order 13942 and a companion order (Executive Order 13943) that forbade app stores from hosting TikTok and WeChat, respectively. The rationale behind the orders was based on censorship, data protection, and privacy concerns. The orders also argued that, through the spread of Chinese-developed applications, the government of China threatens the United States’ national security, foreign policy, and economy. 

The executive order did not directly require app stores to take down TikTok, but Section 1(a) of the order prohibited any transaction by any person, or with respect to any property subject to U.S. jurisdiction, with ByteDance (the owner of TikTok). The executive order was challenged in court, and federal judges issued multiple injunctions against it. In effect, it never became applicable in the United States.

Law enforcement agencies can utilize various laws to request information from online service providers. While the U.S. government has not requested removal of apps from the App Store, it has requested Apple customer data by invoking national security. Usually the FBI, in accordance with 18 U.S.C. § 2709 (part of the Electronic Communications Privacy Act of 1986), directs Apple to produce the name, address, and length of service for all services and accounts provided to the individuals specified in the FBI’s request letter. 

There is no court order involved at this stage of disclosure; however, Apple has the right to challenge the request in a district court. Most of the unsealed National Security Letters are not about the metadata that Apple has about user activities on their apps, but the possibility exists for the FBI under the Foreign Intelligence Surveillance Act to request noncontent and content information via these letters. 

Regulation of the App Store in China

In this section, the focus shifts to considering similar rubrics for how China governs Apple’s App Store. Differences in the political and legal systems of the United States and China mean the rubrics will not be exactly the same, and in some cases issues relevant in the U.S. may be irrelevant in a Chinese context. But broadly similar to the U.S., in China the App Store is regulated through legislation and executive branch processes. While adjudication against the App Store also takes place in China, the cases are more focused on Apple’s conduct regarding its fee structure and patents. 

Legislation and Decrees

As early as 2000, China promulgated regulations about online content. These initial regulatory attempts included legislation and decrees. Bodies such as the National People’s Congress and the State Council enacted legislation, in addition to myriad decrees and regulations from other ministerial units. Chinese laws did not define boundaries as narrowly as U.S. laws did: They banned pornography, violence, anti-government content, and content that damages the reputation of the state and its leaders. 

Under the Regulation on Internet Information Services of the People’s Republic of China, internet information service providers need to ensure the legality of the content on their platforms. The Provisions for the Administration of Internet News and Information Services (Article 19) forbid various kinds of content on online platforms, such as threats to national security and interest, harmful content, disinformation, defamation, and illegal protests.

Recently, China has consolidated existing laws and added a range of new internet laws that can affect content regulation, privacy and security. These new measures range from regulations for algorithmic content control to laws for personal information protection. Internet intermediaries with a large number of users are now required to restrict access for providers of illegal content and to publish social responsibility reports.

Adjudication

In China, lawsuits against Apple are usually brought by private parties claiming patent infringement. 

More recently, individual consumers in China have been allowed to sue Apple over the high App Store fee. However, private individuals have not brought lawsuits against Apple concerning harmful apps on the App Store, nor are there any class-action lawsuits related to the App Store. 

This is a notable distinction between the American and Chinese approaches to platform governance. The processes enabling civil society to challenge Apple or certain laws may be weaker in China than in the U.S., but this claim needs further research, as it is anecdotal and based on desk research regarding the nature of the lawsuits. Further, the lawsuits filed by claimants in China may not be activism but, instead, simple commercial disputes. 

Executive

Internet platforms in China conduct self-regulation, as they do in the U.S. But as conceptualized by Ian Weber and Lu Jia, such self-regulation is directed mostly by the wishes of the government. Self-regulation in China is closer to co-regulation: Even if platforms are formally free to make their own policy decisions, the government tries to provide them with a structure. Weber and Jia argue that the “Public Pledge on Self-Discipline for China’s Internet Industry,” issued in 2002, was such a structure.  

More recently, in 2021, Chinese content platforms such as Weibo and Tencent have agreed to enforce their self-regulation with more discipline to help preserve a “clean” cyberspace. This was in reaction to the Chinese regulatory crackdown on Chinese celebrity online fan culture. The authorities decided that fan groups should be regulated, and some of the platforms pledged that they would encourage users to actively report illegal content.

Apple also receives requests from the Chinese government to remove apps because of illegal content, and it complies. Apple’s 2020 Transparency Report revealed that it receives more removal requests from China than from any other country and that the “[r]equests predominantly related to apps with pornography or other illegal content.”

Apple and Hong Kong

During the Hong Kong uprising that began in 2019, Apple, on at least two known occasions, changed its platform governance as a direct result of this movement: Apple removed apps that protesters used to organize themselves during the uprising, and Apple did not join the coalition of tech platforms that announced they would not disclose data to the Hong Kong authorities.

Apple has been reportedly cooperating with the Chinese government on many fronts, especially in removing “problematic” apps from the App Store available in China. Apple complied with  of the requests for app removal it received from China in the first quarter of 2020. 

During the Hong Kong uprising, Apple removed HKmap.live, an app that protesters used to track the police. Apple did not initially approve the app, and even after approving it, Apple removed it a short while later. Apple allegedly argued that it had to take down the app because Hong Kong police claimed that protesters used it to locate and attack police. 

In 2020, China imposed a new national security law that stripped legal protections enjoyed by Hong Kong residents, including the provisions that required authorities to obtain a court order before demanding data from internet companies. During the uprising, some tech corporations announced they would pause or stop processing data requests from Hong Kong authorities. Apple, however, did not join them, saying instead that it would be assessing the new law.

Apple’s refusal to join other tech corporations weakened its attempts to prevent potential human rights violations. For example, Telegram, an important app that protesters used to organize themselves, is still accessible via the App Store in Hong Kong as of this writing. But the availability of Telegram in Hong Kong does not free it from Apple’s control. Because Apple did not pause disclosing data to the authorities, it could still disclose the metadata of App Store users if Chinese (or Hong Kong) authorities request it. That metadata includes important information about Telegram users: the location of the user, potentially the identity of the user, and when and how many times the user downloaded the app. Such a disclosure would not necessarily reveal a user’s identity within Telegram, but it would reveal that a given person had installed Telegram or other apps. 

The Conceptual Framework

Setting aside granularity and the more democratic nature of the U.S. approach to platform governance, the tools that China and the U.S. use are similar. What differs is how those regulatory and governance tools can be challenged. For example, the executive orders that the U.S. president issues can be challenged in an independent court, as can law enforcement requests for data disclosure. The difference lies in Apple’s use of remedial processes to stand up against a law or to challenge certain decisions: In China, Apple rarely attempts to challenge decrees or law enforcement decisions, and it has few avenues to do so.

Apple’s different approaches might be explained by a combination of different incentive structures and different political systems. Apple has a greater incentive to comply with government orders in China than it has in the U.S. On many occasions, Apple has given in to government demands for censorship, both through removal of content from the App Store and by undermining the privacy of Chinese users. Apple’s incentives and approach to customer data in China have changed radically since 2014. After the Edward Snowden revelations that year, Apple announced that it had never worked with the U.S. National Security Agency (NSA) to create a backdoor. Apple also explicitly said that it would not work with any government agency to access its servers: “Apple has never worked with any government agency from any country to create a backdoor in any of our products or services. We have also never allowed access to our servers. And we never will. It’s something we feel very strongly about.” Yet some years later, Apple set up a data server in China that is run by a state-owned firm.

One important reason for this evolution in Apple’s incentives may be that almost all of Apple’s products are assembled in China. Losing access to China’s labor market could paralyze the company’s manufacturing operations. Another reason is that Chinese legal processes do not leave Apple with many avenues to object, resulting in—as in the case of Hong Kong—Apple not joining other actors in refusing to comply with certain laws in collective protest. 

This analysis illustrates that Apple’s approaches to self-governance in these two countries differ widely. These differences can be conceptualized using the incentive analysis of multinational corporations (MNCs) and their effect on the MNC’s governance structure. 

Scholars such as John Ruggie argue that foreign firms, including Apple, invest in countries with authoritarian regimes to secure the commitment of those governments by creating local economic embeddedness, by creating supply chains inside a country, and by having the government as one of their stakeholders. But these firms are usually not able to participate in local political processes and cannot influence local politics. The authoritarian regimes, however, can outsource governance to these MNCs—as China does with Apple. Outsourcing governance can also happen in countries without authoritarian regimes. The difference is that Apple, in many instances, has resisted such attempts in the United States. 

Outsourcing Governance

Outsourcing governance consists of a corporation governing its users’ behavior in accordance with both formal and informal requests of the government. It is in line with a shift seen in China from entrance restriction to operational intervention. Instead of restrictions on how MNCs can enter the market, or restrictions on the ownership of the business or within the industry, the Chinese government promulgates regulations that restrict the operation of MNCs. 

Joseph Nye Jr. calls such regulated corporations “instruments of influence”: Governments use corporations to influence trade, policy, and security. The use of corporations as instruments of influence is not unique to China. Co-regulation, or outsourced governance, can be seen in both Europe and the U.S. On occasion, legislatures outsource such governance through laws, such as the Digital Millennium Copyright Act, which outsources copyright enforcement to platform intermediaries. 

Outsourcing governance is feasible when one of the parties holds power over the other. In more democratic countries, corporations can influence political processes and resist government requests that would go against their values. Due to market incentives, however, this may not be the case for Apple in China. 

China’s approach to Apple and the App Store can be framed as “institutional governance outsourcing” whereby the government asks Apple to take regulatory actions. The question is, why has China been successful in outsourcing governance to Apple, even though many of its core values are contrary to Apple’s? There are three possible reasons: commercial incentives, economic embeddedness and differing values. 

Commercial Incentives

Apple has more commercial incentives to be compliant with government orders in China than it has in the U.S. Market interest might also have a role in shaping how Apple interacts with and reacts to Chinese government requests. As mentioned earlier, losing access to China’s labor market could paralyze its operation. Some news reports even argue that no multinational corporation has as much stake in China’s market as Apple has, since China assembles nearly all of Apple’s products and Apple’s performance in China heavily affects its stock prices.

Economic Embeddedness

Apple might not participate directly in China’s local political processes or have as much influence as it can have in the U.S., as multinational corporations and foreign firms usually cannot participate directly in these processes. But a lack of direct participation does not mean no participation or influence at all. And even though traditional democratic processes such as congressional hearings, legislative and judicial testimony, and a neutral adjudicatory process are less available in China than they are in the U.S., Apple can still influence Chinese political processes. 

In the absence of those traditional democratic processes, Apple must look for other avenues to influence the internal political processes of the countries in which it operates. Embedding the government as a quasi-stakeholder in the corporation can provide Apple with business privileges, such as more lenient and privileged adjudicatory processes. Apple’s approach was to create a strong monopsony, becoming the single buyer in several markets, including for specialized semiconductor chips. One reason Apple could become a monopsonist was that the chips Chinese firms produced were compatible only with Apple products, leaving Apple as the only buyer. Apple uses this power informally to affect political processes for its commercial interest, not necessarily to protect the values it claims to have. 

Apple’s Differing Values

Formal institutions, governments, regulatory bodies, firms and nongovernmental organizations can influence firms’ decisions about their norms and values. The difference between Apple’s approach to governance in the U.S. and in China is that—when it comes to App Store regulation—Apple is more influenced by those formal institutions in China than it is in the U.S., where it has a certain autonomy and establishes its own values and norms. In China, that autonomy is weaker. 

Conclusion 

The tools governments use to regulate people’s behavior online are similar, even in countries as seemingly dissimilar as the United States and China. What differs is that incentive structures in the U.S. and other democratic countries give more autonomy to private actors to challenge government decisions and influence them, while private actors in China have fewer avenues to challenge and influence government regulations.

All in all, governments have their usual tools such as regulation, national security laws, executive orders, and decrees to regulate the internet in the U.S. and China. The major difference between the two countries is not government approaches but the routes through which government decisions and Apple’s decisions can be objected to by private actors and members of civil society.


Dr. Farzaneh Badiei is the founder of Digital Medusa, an initiative that focuses on protecting the core values of our global digital space through sound governance. For the past decade, Dr. Badiei has directed and led projects about internet and social media governance. She has undertaken research at Yale Law School, the Georgia Institute of Technology, and the Humboldt Institute for Internet and Society in Berlin. She holds a PhD in law from Hamburg University, Germany.
