Opt-out Privacy Policy Worsens Algorithmic Price Discrimination: The Case of the American Privacy Rights Act

By: Cole Edick*

 

In a data-driven world, data governance has serious implications for social inequality—from civil rights to consumer protection.[1] But the current approach to data governance in the United States may overemphasize individual privacy choice at the cost of collective approaches to inequality that better mitigate the harms inherent in the datafication of an already-unequal society.[2] This blog post examines that tension, critiquing the exceptions and opt-out mechanisms present in the recently announced American Privacy Rights Act of 2024 (APRA) and arguing that such provisions fail to adequately address inequality—in particular, the potential exacerbation of algorithm-powered price discrimination through APRA’s loyalty program exception.

The American Privacy Rights Act

On April 7, 2024, U.S. Representative Cathy McMorris Rodgers and U.S. Senator Maria Cantwell announced a draft of APRA, designed to set “clear, national data privacy rights and protections for Americans.”[3] APRA is the latest major attempt at a comprehensive federal consumer data governance standard, the last being the American Data Privacy and Protection Act, which was introduced in 2022 but never came to a vote.[4]

APRA seeks to accomplish many things, including: 1) defining what consumer personal data can be protected; 2) regulating protected data’s treatment by social media companies, data brokers, and other “covered entities” (notably exempting small businesses and government); and 3) setting “data minimization” standards that prevent the collection, processing, retention, and transfer of data “beyond what is necessary, proportionate, and limited to” the provision of products and services requested with consumer consent.[5] Notably, APRA directs the Federal Trade Commission to “establish requirements and technical specifications” for an opt-out mechanism that consumers would use to signal that they do not consent to the transfer, sale, and processing of their data by covered entities.[6] In doing so, APRA’s authors purport that the law would “[put] people in control of their own personal data.”[7]

APRA’s goal to “put people in control” is admirable, and many of its provisions would no doubt empower consumers across the 34 states currently lacking statewide data privacy laws.[8] APRA’s data minimization standard—in particular—holds potential to limit the unmitigated collection of consumer information that feeds big data algorithms, a churn that time and again has been shown to further social inequality, be it through disparate impact on low-income consumers[9] or discrimination against marginalized groups.[10] But even with data minimization, APRA contains numerous exceptions to its otherwise broad protections and takes an opt-out-first approach, leaving room for loopholes that will continue to exacerbate algorithmic inequality.[11]

The Danger of Opt-Out Mechanisms and the Loyalty Program Exception

Indeed, studies show that opt-out mechanisms may not prevent negative externalities that subject vulnerable or marginalized consumers to disproportionate targeting by algorithmic processes.[12] When it is up to individuals in a group to opt out, there is no guarantee that other group members are protected: so long as data processors can aggregate data from enough similarly situated consumers whose apathy or acquiescence keeps their data flowing, the group as a whole can still be effectively targeted.[13] Such is often the case given that it may be “costly for vulnerable consumers to opt out of data sharing because the benefits of sharing data” outweigh individual privacy concerns.[14] Consider, for example, the ten to twenty percent of consumers who hand personal data over to a car insurer to save on premiums.[15]

APRA’s loyalty program exception, as written, is likely to uphold this dynamic. APRA’s Section 8(a), on its face, prohibits companies from offering discriminatory prices or quality to consumers who choose to opt out of data sharing.[16] But Section 8(b)(1) outlines an exception to this prohibition for any covered entity that is not a social media company or data broker and that operates a voluntary loyalty program that collects data (with loyalty programs running the range from rewards programs to mere discounts).[17] The 8(b)(1) exception generates what some observers have called a “pay-for-privacy scheme” that encourages consumers to hand over privacy rights, threatening a “society of privacy ‘haves’ and ‘have-nots.’”[18]

Privacy Disparities May Worsen Algorithm-Powered Price Discrimination

That possibility should concern anyone who thinks privacy is a right rather than a luxury good. Incentivizing privacy disparities—as 8(b)(1) does—is also bound to exacerbate pre-existing inequalities as data-hungry algorithms encroach on the economics of everyday consumption. Businesses already deploy algorithms that draw on consumer data to set prices—in March, Wendy’s was widely panned for suggesting that it would deploy real-time pricing algorithms that consumers feared would lead to Uber-like “surge pricing” for fast food.[19]

Such “dynamic pricing” itself is not revolutionary.[20] When Wendy’s faced backlash, some observers pointed out how such strategies are already quite common for high-value goods and services like air travel, rideshares, and concert tickets.[21] What is new is the power, prevalence, and access to consumer data that algorithms need to bring dynamic pricing to sectors that provide essential or low-cost goods and services—including supermarkets,[22] insurance,[23] and public transportation.[24]

Even without encroachment on these sectors, algorithm-powered dynamic pricing effectively results in race- or income-based price discrimination, exacerbating inequality. The Princeton Review used such technology, charging as much as $1,200 more for test preparation services in zip codes with large Asian populations.[25] Similarly, neighborhoods with larger non-white populations, higher poverty levels, and younger residents are associated with higher fares from ride-hailing algorithms.[26] Dynamic pricing algorithms disproportionately harm low-income and racialized groups not only in how they operate, but also in consumers’ differing abilities to respond, given asymmetries in information and resources.[27] For example, dynamic pricing at supermarkets is particularly harmful to communities living in food deserts “that do not have time to price match or commute to a discount grocery store.”[28] Furthermore, dynamic pricing algorithms are ripe for predation and can be gamed to create ripple effects in the market that lead to higher prices for consumers across the board, especially when used by large players.[29]

The asymmetries already inherent in algorithm-powered dynamic pricing are only bound to worsen with loopholes like APRA’s Section 8(b)(1). This demonstrates a key flaw in APRA-style data governance—even as it pursues data minimization, it still emphasizes individual choice and control through its opt-out mechanisms and exceptions. Legislators would be wise to limit exceptions and reconsider an opt-in approach that mitigates datafication that reproduces social inequality.[30]

*Cole Edick is the Lead Managing Editor for Volume 43 of the Minnesota Journal of Law & Inequality.

[1] See, e.g., Sabina Leonelli, Without Urgent Action Big and Open Data May Widen Existing Inequalities and Social Divides, London Sch. of Econ. & Pol. Sci.: LSE Impact Blog (Feb. 14, 2018), https://blogs.lse.ac.uk/impactofsocialsciences/2018/02/14/without-urgent-action-big-and-open-data-may-widen-existing-inequalities-and-social-divides/.

[2] Cf. María P. Angel & Ryan Calo, Distinguishing Privacy Law: A Critique of Privacy as Social Taxonomy, 124 Colum. L. Rev. 507, 551 (2024) (“Privacy disempowers marginalized populations, for example, by rendering their oppression less visible, but empowers these populations to resist unequal treatments that result from the misuse of their information . . . [h]ow do we balance privacy against privacy?”).

[3] Committee Chairs Rodgers, Cantwell Unveil Historic Draft Comprehensive Data Privacy Legislation, House Comm. on Energy & Commerce (Apr. 7, 2024) [hereinafter APRA Release], https://energycommerce.house.gov/posts/committee-chairs-rodgers-cantwell-unveil-historic-draft-comprehensive-data-privacy-legislation.

[4] H.R. 8152, 117th Cong. (2022).

[5] American Privacy Rights Act of 2024 Discussion Draft, House Comm. on Energy & Commerce (Apr. 7, 2024) [hereinafter APRA Draft], https://d1dth6e84htgma.cloudfront.net/American_Privacy_Rights_Act_of_2024_Discussion_Draft_0ec8168a66.pdf.

[6] Id.

[7] APRA Release, supra note 3.

[8] Andrew Folks, US State Privacy Legislation Tracker, Int’l Ass’n of Privacy Pros., https://iapp.org/resources/article/us-state-privacy-legislation-tracker/ (last updated Apr. 22, 2024).

[9] See, e.g., Solon Barocas & Andrew D. Selbst, Big Data’s Disparate Impact, 104 Calif. L. Rev. 671, 729 (2016).

[10] See, e.g., Safiya Umoja Noble, Algorithms of Oppression (2018).

[11] See The Difference Between Opt-In vs Opt-Out Principles In Data Privacy: What You Need To Know, Secure Privacy, https://secureprivacy.ai/blog/difference-beween-opt-in-and-opt-out (last updated Feb. 2024) (noting the contrast with the European Union’s General Data Protection Regulation’s predominantly opt-in consent methods for data processing).

[12] Zhuang Liu, Michael Sockin & Wei Xiong, Data Privacy and Algorithmic Inequality 34 (Nat’l Bureau of Econ. Rsch., Working Paper No. 31250, 2023).

[13] Id.

[14] Id.

[15] Paul Stenquist, Letting Your Insurer Ride Shotgun, for a Discounted Rate, N.Y. Times (July 16, 2020), https://www.nytimes.com/2020/07/16/business/car-insurance-app-discounts.html.

[16] APRA Draft, supra note 5, at 25 (“A covered entity may not retaliate against an individual for exercising any of the rights guaranteed by the Act, or any regulations promulgated under this Act, including denying products or services, charging different prices or rates for products or services, or providing a different level of quality of products or services”).

[17] Id. at 25–26.

[18] Mario Trujillo, Americans Deserve More Than the Current American Privacy Rights Act, Electronic Frontier Found. (Apr. 16, 2024), https://www.eff.org/deeplinks/2024/04/americans-deserve-more-current-american-privacy-rights-act.

[19] Justin Klawans, Wendy’s Dynamic Pricing Change Could Upend the Fast Food Industry, Week (Mar. 4, 2024), https://theweek.com/business/wendys-dynamic-pricing.

[20] Omar H. Fares, Wendy’s Won’t Be Introducing Surge Pricing, but It’s Nothing New to Many Industries, Conversation (Mar. 6, 2024), https://theconversation.com/wendys-wont-be-introducing-surge-pricing-but-its-nothing-new-to-many-industries-224910.

[21] Id.

[22] Planet Money, Is Dynamic Pricing Coming to a Supermarket near You?, Nat’l Pub. Radio (Mar. 6, 2024), https://www.npr.org/2024/03/06/1197958433/dynamic-pricing-grocery-supermarkets.

[23] Vlad Popovic, How Can You Use Your Data to Boost Your Sales in Insurance and Safely Cut off Unprofitable Clients?, Symfa (Apr. 19, 2024), https://symfa.com/blog/how-to-develop-dynamic-pricing-software-insurance.

[24] Marina Rabin, Tube Fares Could Get ‘Dynamic Pricing’–Here’s What That Actually Means for Travelling Around London, TimeOut (Jan. 15, 2024), https://www.timeout.com/london/news/tube-fares-could-get-dynamic-pricing-heres-what-that-actually-means-for-travelling-around-london-011524.

[25] Julia Angwin, Surya Mattu & Jeff Larson, The Tiger Mom Tax: Asians Are Nearly Twice as Likely to Get a Higher Price from Princeton Review, ProPublica (Sept. 1, 2015), https://www.propublica.org/article/asians-nearly-twice-as-likely-to-get-higher-price-from-princeton-review.

[26] Akshat Pandey & Aylin Caliskan, Disparate Impact of Artificial Intelligence Bias in Ridehailing Economy’s Price Discrimination Algorithms 9–10 (arXiv Working Paper No. 2006.04599, 2021), https://arxiv.org/abs/2006.04599.

[27] Anna Helhoski, Wendy’s Isn’t the First: Dynamic Pricing Is Everywhere, NerdWallet (Mar. 14, 2024), https://www.nerdwallet.com/article/finance/dynamic-pricing (“You end up with . . .  winners and losers,” because dynamic pricing mechanisms “typically benefit sophisticated consumers” with the time and resources to be “checking the price online and . . . saying, ‘OK, let’s go now because prices are lower.’”).

[28] Katelin Harmon, Target’s Dynamic Pricing Feeds the Food Desert Fire, Daily Tar Heel (Sept. 25, 2023), https://www.dailytarheel.com/article/2023/09/opinion-local-food-desert-dynamic-pricing-target-chapel-hill-carrboro.

[29] See Complaint at ¶¶ 309–11, FTC v. Amazon.com, Inc., No. 2:23-cv-01495 (W.D. Wash. filed Sept. 26, 2023).

[30] Cf. Salomé Viljoen, A Relational Theory of Data Governance, 131 Yale L.J. 573, 653 (2021) (“[D]atafication may be wrong not only because it manipulates people, but also because the social effects it produces or materializes violate standards of equality. As an economic process, datafication may lead to unfair wealth inequality that violates distributive ideals of justice. As a social process, datafication may reproduce and amplify forms of social hierarchy that violate relational standards of justice.”).