Paid by AI: Algorithmic Wage Discrimination in the Gig Economy

Artificial Intelligence (AI) has been heralded as the next revolutionary force across many areas of the law, including antitrust, privacy, education, and labor law. However, AI’s specific implementation in the gig economy, such as personalizing wages for individual workers like rideshare drivers, may undermine groundbreaking decisions and pose new challenges in the evolving landscape of labor law protections for gig workers. Soon, it may influence labor in its entirety. The use of recent technological developments to extract and process data raises concerns about the erosion of privacy in the workplace as well as continued discrimination against gig workers, contrary to major decisions by the National Labor Relations Board (NLRB). In The Atlanta Opera, Inc., for instance, the Board returned to the common-law test for employee status, making it easier for gig workers to be classified as covered employees rather than excluded independent contractors. [1] Despite this and similar rulings, the implementation of algorithmically personalized wages highlights the gaps in federal policy and law that fail to account for the nuanced and unconventional workplaces of the gig economy. Addressing this will require, in addition to ongoing efforts by workers and worker advocates, recognizing gig workers as covered employees, increasing legislation, and enforcing antitrust laws to curb employers’ use of algorithmic tools.

The influence of technological developments on the workplace and on consumer choice is not new. Over the last few decades, technological developments have driven increased workplace surveillance and monitoring. Technology has been used to record and quantify workers’ activities, feed datasets into machine learning systems for employment decisions, increase worker productivity, and, more recently, determine worker pay. [2] Consumers are familiar with targeted and personalized content or prices, as evidenced by college students frantically toggling between the Uber and Lyft apps. In the workplace, and especially in the gig economy, employer technologies and incentives have produced similarly opaque and unstable labor prices. Algorithmic wage discrimination creates a labor market in which workers perform the same services, with the same skill, for the same company, but receive different hourly wages. [3] Given the complexity of these structures and mechanisms, simply extending covered employee status to gig workers is not enough.

The NLRB’s June 2023 decision in The Atlanta Opera, Inc. reversed the Board’s 2019 decision in SuperShuttle DFW, Inc., which had narrowed the test for determining employee status and excluded many workers from the protections afforded by the National Labor Relations Act (NLRA). [4] The decision in The Atlanta Opera, Inc. revived the precedent set in FedEx Home Delivery, where the Board “restated and refined its approach to assessing whether workers are employees covered under Section 2(3) of the NLRA or, instead, are independent contractors, excluded from coverage.” [5] Hair and makeup stylists at the Atlanta Opera petitioned for union representation, arguing that they were employees entitled to the protections of the NLRA rather than independent contractors, and the Board sided with them. In doing so, the Board revised the standard, narrowed by SuperShuttle, for determining who counts as a covered employee under the NLRA; this has significant implications for the protection of other gig workers, including app-based ride-hail and delivery workers.

Technological developments in the workplace differ significantly between conventional employment settings and the gig economy. Employers in the gig economy strategically exploit the lack of structure, relations, and communication between workers themselves and between employers and workers to amplify the harms of algorithmic wage discrimination. For example, in late October 2024, the Department of Justice (DOJ) filed a lawsuit against Lyft to stop the company from deceiving drivers with misleading earnings claims. [6] The DOJ, acting on a referral from the FTC, proposed a settlement to which Lyft has since agreed. The settlement requires Lyft to base its earnings claims on typical driver income, provide evidence to support those claims, clearly outline the terms of its earnings guarantees, and pay a $2.1 million penalty for its ongoing violations. [7] In the lawsuit, DOJ attorneys first acknowledged that “Lyft classifies its Drivers as independent contractors rather than employees,” and alleged that as demand for rideshare services increased in 2021 and 2022, “Lyft widely disseminated inflated hourly earnings claims in web search ads, on social media, on internet job boards, and on Lyft’s website” in an effort to attract new drivers. [8] The lawsuit alleged that Lyft “advertised that drivers around the country could make specific hourly amounts,” failing to disclose that “these amounts did not represent the income an average driver could expect to earn, but instead were based on the earnings of the top one-fifth of drivers.” [9] The DOJ complaint further notes that these figures overstated the actual earnings achieved by most drivers by as much as 30%. [10] In its advertisements, Lyft also tried to entice drivers by touting “earnings guarantees,” promising that drivers would be paid a set amount if they completed a specific number of rides within a certain time. However, Lyft failed to adequately disclose that it would pay only the difference, if any, between what the driver actually earned and the promised amount. [11]
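To see why the undisclosed terms mattered, the following is a minimal sketch, with hypothetical dollar figures not drawn from the complaint, of how a shortfall-only guarantee differs from a bonus paid on top of fares:

```python
# Illustrative sketch only: a shortfall-only "earnings guarantee" tops up a driver's
# fares to the promised amount; it does not pay the promised amount in addition to fares.
# All dollar figures below are hypothetical.

def guarantee_top_up(guaranteed_amount: float, actual_earnings: float) -> float:
    """Return what the platform adds under a shortfall-only guarantee."""
    return max(guaranteed_amount - actual_earnings, 0.0)

guarantee = 1000.0   # hypothetical offer: "$1,000 for completing 100 rides this week"
earned = 920.0       # hypothetical fares the driver actually earned

top_up = guarantee_top_up(guarantee, earned)
print(f"Driver earned ${earned:.2f}; the platform adds only ${top_up:.2f}, "
      f"not the full ${guarantee:.2f} on top of fares.")
```

Under these assumed numbers, a driver who read the guarantee as a bonus would expect $1,920, but a shortfall-only guarantee yields only $1,000 in total.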

This action emerges alongside the NLRB’s victories as well as the FTC’s groundbreaking, ongoing efforts to protect workers in the gig economy. In its “Policy Statement on Enforcement Related to Gig Work,” the FTC explained the rationale for extending its authority to the gig economy, highlighting its 2021 settlement with Amazon, which returned more than $60 million to Amazon Flex drivers whose tips were illegally withheld, as well as its 2023 action against HomeAdvisor for misleading service providers, which similarly resulted in the payout of millions of dollars in redress. [12] However, the continued use of non-transparent algorithms to capture revenue from workers’ services may compound the already exploitative dynamics of gig companies, dictating core aspects of workers’ relationships with a given company’s platform. Gig workers are then left with both an invisible, inscrutable boss and invisible, unorganized colleagues.

The Atlanta Opera decision addressing protections for gig workers opened the door for the NLRB to consider whether ride-hailing and delivery workers are also entitled to union rights under federal labor law. Though the DOJ and FTC’s $2.1 million settlement with Lyft helped set a precedent against false and misleading earnings claims, there has not yet been a case at the intersection of the two. Even if ride-hailing and delivery workers are protected under federal labor law, and even if employers like Lyft are required to state clear terms for their earnings guarantee offers, the use of algorithmically personalized wages and the strategic amplification of existing gaps in the gig economy remain unaccounted for. These provisions fail to consider the ever-changing algorithmic mechanisms in the gig economy that make workers more susceptible to exploitation beyond the conventional workplace. Given the decentralized nature of workplaces, the potential absence of legal protections for organizing, high turnover rates, and the rise of personalized algorithmic wages, algorithmic processes further erode the bargaining power of gig workers, leaving them vulnerable to ongoing exploitation through unfair, deceptive, and anticompetitive practices created by these very conditions. Algorithmic implementations, including digital tracking and data collection at work, pose salient concerns not just about the privacy of employees but also about workers’ autonomy. On-demand workforces are primarily made up of immigrants and racial minorities, which racializes the impacts of wage discrimination on worker mobility, security, and collective action, all while benefiting monopolistic greed.

Anti-monopolistic approaches and the mobilization of workers can address algorithmically personalized wages. The use of non-productivity-related data to set pay can serve as evidence of illegal wage pricing, and price discrimination and differentiation were treated as monopolistic practices under the 1887 Interstate Commerce Act and the 1914 Clayton Act. [13] The argument for a complete ban on labor price discrimination, which itself serves as an indication of market dominance, becomes particularly compelling when large companies like Uber or Lyft control substantial parts of the rideshare labor market. In addition to governmental regulatory efforts, workers and their supporters have leveraged existing legal mechanisms and cooperative frameworks to challenge wage discrimination driven by algorithmic systems. When the California Civil Code in 2023 imposed limits on businesses’ collection of consumers’ personal information and required notice of the purposes behind data collection, drivers were positioned, much as the App Drivers & Couriers Union has been in the United Kingdom, to establish their worker status and lay claim to rights concerning the data and algorithms used to determine their pay. [14] Workers have also formed “data cooperatives” to assert their agency over their labor and data by collectively demanding transparency around their wage scales as well as the extraction of data from their labor.

Algorithmic wage discrimination may primarily harm gig workers now, but it represents a broader attack on the entire workforce. The experiences of gig workers with wage discrimination should serve as a lens for understanding how other employers may use similar tactics to undermine economic democracy in the workforce at large. The gig economy, in this new and threatening context, is a critical site of power and mobility.

Edited by Mohammad Hemeida 

[1] National Labor Relations Board. The Atlanta Opera, Inc., 372 NLRB No. 95 (2023).

[2] Dubal, Veena. "On Algorithmic Wage Discrimination." Columbia Law Review 123, no. 4 (2023): 789-834.

[3] Teachout, Zephyr. "Algorithmic Personalized Wages." Yale Law Journal 132, no. 6 (2023): 1245-1289.

[4] National Labor Relations Board. SuperShuttle DFW, Inc., 367 NLRB No. 75 (2019).

[5] National Labor Relations Board. The Atlanta Opera, Inc., 372 NLRB No. 95 (2023).

[6] United States v. Lyft, Inc., Case No. 24-cv-7443, United States District Court for the Northern District of California (2024).

[7] Federal Trade Commission. "FTC Takes Action to Stop Lyft from Deceiving Drivers with Misleading Earnings Claims." Press Release, October 10, 2024. https://www.ftc.gov/news-events/news/press-releases/2024/10/ftc-takes-action-stop-lyft-deceiving-drivers-misleading-earnings-claims.

[8] United States v. Lyft, Inc., Case No. 24-cv-7443, United States District Court for the Northern District of California (2024).

[9] United States v. Lyft, Inc.

[10] United States v. Lyft, Inc.

[11] United States v. Lyft, Inc.

[12] Federal Trade Commission. "FTC Policy Statement on Enforcement Related to Gig Work and Other Alternative Work Arrangements." Policy Statement, June 2023. https://www.ftc.gov/system/files/ftc_gov/pdf/Matter%20No.%20P227600%20Gig%20Policy%20Statement.pdf.

[13] United States, Interstate Commerce Act of 1887, 24 Stat. 379 (1887); United States, Clayton Antitrust Act, Section 2, Public Law 63-212, 38 Stat. 730 (1914).

[14] California, Civil Code § 1798.100 (2023).

Wena Teng