Robust Consumer Protection Can Support a Rights-Respecting Digital Future

Paul Nelson
Monday, April 3, 2023, 9:05 AM

Safeguards that influence the global marketplace can protect human rights, but in many countries such protections are weak or nonexistent.

Users at an internet cafe in Bali, Indonesia. (Jesse Wagstaff, https://flic.kr/p/kcYWVL; CC BY 2.0, https://creativecommons.org/licenses/by/2.0/)

Published by The Lawfare Institute
in Cooperation With
Brookings

The Carnegie Endowment for International Peace released a report on March 13 on cyber vulnerabilities across Africa and the potential harm these can cause for underserved communities. It shared the experience of a woman in Nigeria whose reliance on an insecure merchant payment system led to funds being siphoned from her account. One thousand miles away in East Africa, an article in May 2022 reported widespread reliance on virtual private networks (VPNs). During the lead-up to elections in the region in 2020 and 2022, VPN use spiked when certain governments disrupted or monitored internet activity. As a lifeline for information, not to mention continued economic activity, these VPNs were indispensable. The VPNs were typically products developed and marketed toward consumers, who adopted them based on claims of a certain level of safety.

Consumer protection and privacy safeguards, which protect against unfair, deceptive, and abusive practices in the economy, can instill confidence in digital products and services in emerging markets and developing economies (EMDEs). By influencing the marketplace to treat people fairly, these safeguards also protect human rights. Yet in many of these countries, safeguards are weak, nonexistent, or poorly implemented. 

The Consequences of Weak Safeguards Can Be Profound

Inadequate Industry Practices Can Lead to Preventable Consumer Harm

Consumer data has been used in discriminatory and abusive ways across EMDEs. At least 12 billion online accounts have been the subject of a data breach. Many breaches are the result of provider practices, not an individual’s degree of digital literacy. Likewise, certain industries, such as digital lending and online advertising, use business models that incentivize the collection and use of consumer data. This data environment makes consumers vulnerable to losing money or being harassed or surveilled. The rise of algorithmic decision-making and artificial intelligence can also amplify gender divides, privacy abuses, and economic inequality.

Cyber vulnerabilities are prevalent across EMDEs, partly due to limited capacity. A series of studies found that providers of mobile app-based finance across EMDEs exposed end users to a wide variety of cyber threats due to poor provider-level cyber practices and limited capacity, particularly in Africa. The studies found that weak provider practices were compounded by potentially unfair terms and conditions that disclaimed any responsibility for provider conduct leading to data breaches or loss of funds. 

Deceptive design patterns (also referred to as “dark patterns”) are frequently used by online sellers to manipulate and trick consumers. A sweep conducted earlier this year by two dozen authorities found that, of 399 online retailers reviewed, 148 used dark patterns, a set of practices designed to trick consumers into decisions against their interests. These practices, mapped out by the Organization for Economic Cooperation and Development and explained by the Mozilla Foundation, include hidden terms, ads masquerading as independent content, hard-to-cancel subscription fees, and deceptive data policies. In 2019, a Princeton University study of 11,000 shopping websites and over 53,000 product pages found over 1,800 instances of deceptive design patterns, costing consumers an estimated $1.9 billion per year. It also identified close to two dozen firms advertising their ability to help sites employ these practices. 

Fraud and Scams Pose Risks to Marginalized Communities

Counterfeit or harmful goods have proliferated in online commerce. Reports by the Office of the U.S. Trade Representative, consumer groups, and multilateral institutions describe a sophisticated ecosystem of actors that use online platforms to sell fake, stolen, or unsafe goods to unsuspecting consumers globally, including in many EMDEs. This forces legitimate sellers to compete with fraudsters. In EMDEs, social media platforms are widely used for informal commerce. These platforms typically have weaker systems to prevent economic fraud, which can harm consumers as well as legitimate sellers, many of whom are women.

Fraudulent financial technology (fintech) apps have preyed upon consumers across many EMDEs. Fraudulent or predatory fintech applications, which unwitting consumers download from an app store, spiked over the course of the coronavirus pandemic. These fraudulent applications phish sensitive data from users and can lead to a loss of funds. During the pandemic, the download rates of these apps sometimes outpaced legitimate apps. Reports suggest similarly widespread exposure among consumers across EMDEs to fraudulent, sometimes malware-infected, e-commerce websites. Echoing this dynamic is the rise of get-rich-quick online scams, which have targeted many African youth who have fewer employment opportunities.

Widespread Industry Practices Combined With Other Threats Undermine Trust and Confidence 

Digital transformation accelerated during the pandemic, contributing to a steep rise in consumer complaints. In the Philippines, the government reported a 600 percent increase in complaints from 2019 to 2021, largely related to e-commerce activity. As of 2022, the country’s Department of Trade and Industry reported that nearly half of all consumer complaints related to e-commerce, up from 37 percent in 2020. In Kenya, 56 percent of digital finance consumers reported being the victim of a phishing attempt, and 28 percent had at least one customer care issue (for example, not being able to access a help line). These dynamics reflect a pattern in many countries where growth, sometimes driven by necessity rather than trust, has outpaced the capacity of safeguards.

No wonder the digital economy has a trust deficit. These practices can lead to devastating consequences for anyone participating in the digital economy, including having their savings and identities stolen, falling victim to abusive debt collection practices, being coerced through social media, having their data on private platforms unlawfully surveilled by state actors, seeing false online reviews harm their small businesses, suffering injury from unwitting use of toxic products bought online, and being hit by ransomware attacks. If not addressed, the cumulative impact of these harms can erode confidence in democratic institutions tasked with upholding the rule of law and protecting the public. But it doesn’t have to be this way.

How Can Consumer Protection Help Improve Safety in the Digital Economy? 

Consumer protection and privacy laws typically address unfair, deceptive, and abusive practices (UDAPs) in the economy. Countries with UDAP laws in place might use slightly different definitions or legal standards by which conduct is assessed. For example, whether a practice is considered a UDAP in the United States depends on a few factors and well-established standards. Some conduct might be considered unfair, but not deceptive, or vice versa. Certain laws are specific to data privacy issues, like compelling providers to treat customer data or personally identifiable information (PII) responsibly and use it only for the purpose for which it was collected.

Common threads for how EMDEs think about UDAPs are materiality and harm. Did a practice actually influence the consumer to do something? Did a practice actually result in harm? What might this look like for a consumer in EMDEs? For example, an unfair practice might involve using a customer’s PII to do something totally unrelated to the purpose of the customer-provider relationship. Similarly, a deceptive practice might involve creating fictitious online reviews for a product that results in consumers buying products that they otherwise wouldn’t have purchased. Lastly, an abusive practice might involve threatening to contact friends and family on social media if a digital loan is not paid off.

Below are a few examples of how consumer protection and data privacy safeguards can help.

First, safeguards can ensure that industry truthfully advertises data and cybersecurity practices to the public. Consumers use services based on what they are led to believe. Let’s say a firm tells the public that its VPN or banking application uses industry-standard cybersecurity practices. If the firm in fact does not, then consumers who rely on the firm’s promises are exposed to a range of cyber threats, from both state and non-state actors. Laws against false advertising or abusive practices offer some level of protection. 

Second, safeguards can strengthen information integrity. In the private sphere, laws against false advertising provide an incentive for firms to treat consumers fairly and to provide reliable, credible information. This is, in a sense, a question of information integrity. For consumers and small businesses, this could involve online advertising and fraud. For civil society groups, this could involve state actor-supported disinformation efforts. Regardless of the vantage point, the result is the same: less trust in online interactions.

Finally, safeguards can stimulate development and adoption of more “rights-respecting” business practices or business models. A diverse movement of technologists, entrepreneurs, researchers, and consumer advocacy groups has spurred interest in approaches that are commercially sustainable yet also safer for consumers and providers alike. This movement encompasses a range of business and technological approaches. 

Foreign Assistance Can Support Concrete Steps to Improve These Safeguards

Each of the illustrative actions discussed below builds on or reinforces the others. 

Support Governmental Stakeholders, Including Consumer Protection and Data Authorities

Countries should ensure that foundational legal and regulatory frameworks for consumer protection and privacy are in place and fit for purpose. Consumer protection laws protect society from fraud, scams, and other abusive practices in the economy. Many countries boast sector-agnostic laws that apply as much to conduct in the digital economy as the offline economy. Certain countries, however, still lack a consumer protection law. The majority of these are in Africa and the Asia-Pacific region. Others have standalone frameworks for data protection and privacy, which may present coordination issues with consumer protection laws. Technical assistance, peer reviews, and legal diagnostics can help ensure that these frameworks are in place and align with applicable principles, standards, and guidelines (whether these are sector-agnostic or specific to e-commerce or product safety). 

Further, it is necessary for countries to support sound implementation and enforcement. Laws do not implement themselves. Many consumer protection authorities lack the institutional, human capital, financial, and technological resources necessary to fulfill their statutory mandates to protect consumers. These authorities might struggle to conduct investigations related to online business models or consumer data issues. Many authorities lack adequate mechanisms to resolve consumer complaints or publish data for external analysis. This data can inspire awareness campaigns and give advocacy groups useful insights.

Finally, countries need to facilitate international coordination and information sharing. The digital economy encompasses an extraordinary amount of cross-border activity, with service providers (and fraudsters) often operating in different countries than consumers. This underscores the value of having arrangements among jurisdictions for collaborating on complaint handling, research, investigations, and coordination on enforcement actions. 

Support Nongovernmental Stakeholders, Including Researchers and Watchdog Groups

It is essential to build core knowledge on digital consumer protection, data, and cyber risks. As well documented as these risks are in developed economies, many countries across Africa, Southeast Asia, and Latin America are only beginning to gather information on the scope and nature of harms. If data breaches occur in West Africa, are they related strictly to economic theft, or is there a nexus with human rights activity? Is there a gendered dimension to harm? Many stakeholders can help find answers, including consumer advocacy groups, international organizations, universities, and industry actors.

Strengthening the capacity of watchdog groups to hold industry accountable and represent consumer interests with the government will also help further this goal. These groups promote incentives for responsible market conduct yet often have limited resources. Technical assistance and capacity building can help watchdog groups deploy limited resources effectively. Approaches that might be supported include developing industry scorecards for consumer-friendly online practices; mystery shopping to identify unfair treatment of consumers; setting up ad hoc or formal dialogues with public authorities; helping consumers report complaints to authorities for resolution; ensuring that private recourse mechanisms are equitable; and improving the effectiveness of complaints-handling systems. Working with these nongovernmental groups can fill an important gap when governmental actors themselves do not consistently respect due process or abide by the rule of law in exercising their enforcement authorities.

Support Firms, Investors, Independent Researchers, and Industry Groups

In furtherance of this goal, governments can facilitate the adoption of customer-centric product design practices and business models that mitigate end-user data or safety risks. For example, privacy-by-design business models can reduce the amount of data collected and thus reduce the harm from a data breach or misuse of sensitive data. Customer-centric user interfaces, developed with end users, can reduce the likelihood of errors or mistakes. Better online disclosure practices can help consumers avoid harm, and equitable, efficient consumer complaint-handling mechanisms can help secure redress if harm occurs. Firms can mitigate risks, such as algorithmic bias or data leakage, by conducting a human rights or data-privacy impact assessment.

Strengthening skills within industry to employ rights-respecting digital business models and sound cybersecurity practices is also essential. Many EMDEs boast a vibrant entrepreneurial ecosystem. Yet entrepreneurs, universities, and the information and communication technology sector itself all lack resources to apply best practices for cybersecurity or artificial intelligence systems. These efforts might also lead certain firms to explore different corporate structures that attempt to better align interests between them and their customers.

Conclusion

The Declaration for the Future of the Internet (DFI) affirms the integral role that consumer protection plays in realizing and sustaining a positive future. Under the “Trust in the Digital Ecosystem” principle, the DFI lists the need to “promote the protection of consumers, in particular vulnerable consumers, from online scams and other unfair practices online.” The actions mentioned above would be taken alongside actions already familiar to Lawfare readers, including promoting trustworthy network infrastructure suppliers. 

Everyone is a consumer in EMDEs. Dissidents are consumers. Human rights activists are consumers. The staff of civil society organizations are consumers. Small business owners are consumers. One way or another, most people access services or engage in online activity by relying on organizations or firms subject to laws designed to protect consumers. 

These safeguards are not panaceas, but they can help. If consumer protection safeguards are paired with complementary efforts like those outlined by the DFI, the foreign assistance community can set the stage for better outcomes for all.

Disclaimer: The opinions expressed herein are the author’s and do not necessarily reflect the views of USAID.


Paul Nelson is a Senior Advisor and the Acting Team Lead for Digital Finance at USAID. Alongside work on gender and Africa, he leads the Digital Ecosystem Fund and Trust and Competition in Digital Economies (an effort of USAID and the FTC). He was a State Department fellow on AI ethics. He holds a J.D. from the University of Wisconsin Law School and degrees in architecture and economics from the University of Minnesota.