AI in the Workplace: The Dangers of Generative AI in Employment Decisions

The digital age fosters ingenious yet unprecedented developments, such as using generative AI to streamline typical human resource processes. More companies are using generative AI to source and select candidates, assess current employees, and determine layoffs. However, beneath the promise of efficiency and innovation lies an even greater ethical and legal problem: using AI technology to hire, assess, and fire employees creates a system that violates workplace anti-discrimination laws and disproportionately harms minority groups.

The cases Mobley v. Workday, Inc. and EEOC v. iTutorGroup demonstrate the complexities of navigating this uncharted legislative territory. [1] In examining these instances, it is clear that any company or organization relying heavily on AI technology for employment decisions violates crucial workplace anti-discrimination laws.

A typical artificial intelligence system uses a machine learning model to analyze and make predictions about the existing data or materials it was given. More recently, generative artificial intelligence systems have emerged. These systems rely on a large language model (LLM) to produce new data or materials based on those they were trained with. [2] Developers train an LLM by feeding it vast amounts of data in different forms, which generative AI technology then uses to interpret and generate text like a human. [3] The materials used to train these AI systems therefore contain all of our society’s long-standing, systemic biases, making it inevitable that AI’s generated responses will reflect those biases.

AI training methods make AI a reflection of humanity’s complicated past and present by “automating” long-existing systemic prejudices. For example, individuals with disabilities are twice as likely to be unemployed, so they are correspondingly underrepresented in the employment data that AI systems are trained on. [4] When hiring processes are automated with such opaque, convoluted technology, the structural inequities facing disabled and marginalized populations are amplified. [5]
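The mechanism can be sketched in a few lines of code. The groups, records, and scoring rule below are entirely hypothetical — a minimal illustration of how a model trained on skewed historical hiring data reproduces that skew, not a depiction of any vendor’s actual system.

```python
# Toy illustration (hypothetical data): a naive model that scores new
# candidates by how often similar past applicants were hired will
# reproduce whatever underrepresentation exists in its training data.

# Hypothetical historical hiring records: (group, was_hired)
history = ([("A", True)] * 80 + [("A", False)] * 20 +   # group A: hired often
           [("B", True)] * 40 + [("B", False)] * 60)    # group B: hired rarely

def hire_rate(group):
    """Fraction of past applicants from `group` who were hired."""
    outcomes = [hired for g, hired in history if g == group]
    return sum(outcomes) / len(outcomes)

# A frequency-based "model" simply projects past rates onto new applicants.
print(f"Predicted score, group A applicant: {hire_rate('A'):.2f}")  # 0.80
print(f"Predicted score, group B applicant: {hire_rate('B'):.2f}")  # 0.40
# The model "learns" to prefer group A purely from skewed history,
# with no information about any individual's qualifications.
```

Real hiring models are far more complex, but the failure mode is the same: historical disparity in, predicted disparity out.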

In 2022, the U.S. Equal Employment Opportunity Commission and the Department of Justice raised concerns about AI employment tools. While an increasing number of employers are implementing AI technology to “select new employees, track performance, and determine pay or promotions,” these agencies warned of the inescapable tension between making generative AI software effective and making it equitable. [6] In particular, iTutorGroup, Inc., an English language tutoring company based in China, hired U.S.-based tutors to provide remote services through AI-powered application software. The EEOC filed a lawsuit in 2022 alleging that iTutorGroup’s software automatically rejected female applicants aged 55 and older and male applicants aged 60 and older. [7] iTutorGroup’s actions show the grave impact AI hiring tools can have when AI’s inherent bias meets a human intention to discriminate.

These alleged actions violate the Age Discrimination in Employment Act (ADEA), which protects “certain applicants and employees 40 years of age and older from discrimination on the basis of age in hiring, promotion, discharge, compensation, or terms, conditions or privileges of employment.” [8] The legal issue arises not only from iTutorGroup’s application software itself but also from the company’s intentional programming of the AI to filter out applicants of a certain age. Because iTutorGroup’s hiring practices were systemic in nature, they created a discriminatory pattern affecting a large number of applicants. The EEOC therefore used iTutorGroup’s direct violations of the ADEA to set an example of how its rules on prohibited employment practices apply to AI-driven automated hiring. [9]

This lawsuit was settled in 2023, with iTutorGroup paying $365,000 to be split among the approximately 200 applicants victimized by their discriminatory hiring practices. Furthermore, iTutorGroup was required to train all employees involved in hiring and update anti-discrimination policies. They were also prohibited from requesting birth dates and filtering applicants based on age. The final stipulation was for iTutorGroup to interview the discriminatorily rejected applicants if they ever resumed hiring in the U.S.

This was the first settlement of an AI discrimination lawsuit by the EEOC, and it is pivotal in raising awareness of the AI-driven discrimination emerging from AI’s newest implementations within HR departments. In fact, EEOC Chair Charlotte Burrows states that over 80% of employers are using AI for work and hiring processes, and that number is only bound to increase. [10] While AI has proven useful to understaffed, overburdened administrators and eager applicants alike, it has also proven unable to replace crucial human judgment.

While this settlement does not establish legal precedent in the courts, it allowed the EEOC to publicize its condemnation of AI-driven discrimination in employment processes, and it can serve as an example of how AI discrimination cases might be resolved. Further, the case serves as a warning and deterrent to other companies currently using, or intending to use, AI in their hiring practices, pushing them to bring their procedures into compliance with federal anti-discrimination law.

Similarly, Mobley v. Workday, Inc. is a class action lawsuit being heard in the United States District Court for the Northern District of California. [11] Workday, Inc., a human capital management software vendor, developed an AI-based HR application to select candidates for hire and has sold it to hundreds of companies. In February 2023, Derek Mobley filed suit, claiming that Workday’s product actively and systemically discriminated against job applicants who were African American, above the age of 40, or had disabilities. Mobley contends that Workday is an employment agency and should be held liable for its product’s discriminatory selections. [12]

Mobley asserted that these alleged actions violated Title VII of the Civil Rights Act, the ADEA, and the Americans with Disabilities Act. [13] These laws protect current and potential employees from workplace discrimination on the basis of race, color, religion, gender, national origin, age, and disability status. Workday filed a motion to dismiss, which the EEOC countered with an amicus curiae brief. The EEOC’s brief stated that if Mobley’s accusations are true, Workday is subject to federal anti-discrimination laws as an “employment agency.” [14]

The court ultimately ruled that Mobley had not sufficiently shown that Workday was an employment agency, and that Workday therefore was not liable for its AI software’s discrimination under the claims as pleaded. The court invited Mobley to amend his complaint to frame Workday as an indirect employer or an agent of its client employers, theories under which the company could be liable for discrimination. He subsequently filed the amendment, and the case is currently being heard.

It is of utmost importance that Workday be found liable for its software’s discrimination. Since Workday’s AI software has prevented many applications from ever reaching employers’ desks, the company can be held accountable for the consequences of its algorithm and should have been more proactive in testing it. Relevant considerations include whether Workday designed and tested its algorithms to mitigate bias, provided sufficient transparency, and adhered to industry best practices and anti-discrimination laws. Just as iTutorGroup was responsible for discrimination carried out through its AI software, Workday should be liable for the discriminatory choices made by the AI hiring software it developed and programmed itself.

Workday made this software and is thus responsible for the unlawful hiring decisions made by proxy. While the AI made hiring decisions on behalf of companies that were Workday’s clients, not Workday itself, many applications were withheld from those companies by software that Workday programmed and trained. Further liability could be established if Mobley proves that Workday neglected to address known biases or failed to offer its clients adequate guidance on the technology’s legal and ethical use.

Using iTutorGroup and Mobley as examples, employers must implement protocols that mitigate AI hiring discrimination and ensure compliance with existing EEOC policy and case outcomes. The White House’s Blueprint for an AI Bill of Rights and the EEOC’s Artificial Intelligence and Algorithmic Fairness Initiative both aim to promote responsible uses of AI and educate the public about the associated risks. [15] The recommendations in each of these documents must serve as guiding principles for companies as they navigate the expanding scope of AI. Other states should also consider adopting a law similar to New York City’s Local Law 144, which regulates automated employment decision tools by mandating regular bias audits. Such laws could offset AI’s inevitable biases. [16]
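The bias audits that Local Law 144 requires center on a simple statistic, the impact ratio: each group’s selection rate divided by the rate of the most-selected group. A minimal sketch of that calculation follows; the numbers and group names are hypothetical, and real audits involve additional categories and reporting requirements.

```python
# Minimal sketch of the "impact ratio" at the core of a bias audit for an
# automated employment decision tool. All figures are hypothetical.

def selection_rate(selected, applicants):
    """Share of a group's applicants whom the tool selected."""
    return selected / applicants

def impact_ratios(groups):
    """Each group's selection rate divided by the highest group's rate."""
    rates = {g: selection_rate(s, n) for g, (s, n) in groups.items()}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Hypothetical audit data: group -> (selected, total applicants)
audit = {"group_1": (50, 100), "group_2": (20, 100)}

for group, ratio in impact_ratios(audit).items():
    # The "four-fifths" threshold is a common guideline, not a legal bright line.
    flag = "review for adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

Here group_2’s ratio of 0.40 falls well below the four-fifths guideline, the kind of disparity a mandated audit is designed to surface before the tool is used on real applicants.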

In addition, the courts should rule in accordance with recommendations from the EEOC and other government entities. These recommendations include, but are not limited to: training AI on diverse data sets, closely monitoring and updating AI systems, regularly training employees on AI, and publicly disclosing all AI uses. Given AI’s wide array of applications and potential for misuse, companies must be thoroughly prepared for the risks of using AI technology in their employment decisions.

Looking beyond the shadows cast by AI’s blinding prospects, it becomes evident that its unregulated use in critical employment processes violates workers’ rights and reinforces systems of oppression. We should embrace the integration of AI technology while remaining cautious: it should enhance efficiency without attempting to replace human jobs with generative technologies. Clearer precedents and more solid legislation are also needed to define the permissible uses of AI in HR tasks so that automated employment discrimination does not fly under the radar.

Edited by Noelle Shih

[1] “iTutorGroup to Pay $365,000 to Settle EEOC Discriminatory Hiring Suit.” US EEOC, September 11, 2023. https://www.eeoc.gov/newsroom/itutorgroup-pay-365000-settle-eeoc-discriminatory-hiring-suit.; Mobley v. Workday, Inc., 3:23-cv-00770. CourtListener. Accessed July 3, 2024. https://www.courtlistener.com/docket/66831340/mobley-v-workday-inc/.

[2] Zewe, Adam. “Explained: Generative AI.” MIT News | Massachusetts Institute of Technology, November 9, 2023. https://news.mit.edu/2023/explained-generative-ai-1109.

[3] “What Are Large Language Models (LLMs)?” IBM, November 2, 2023. https://www.ibm.com/topics/large-language-models.; “How ChatGPT and Our Language Models Are Developed.” OpenAI Help Center. Accessed July 3, 2024. https://help.openai.com/en/articles/7842364-how-chatgpt-and-our-language-models-are-developed.

[4] Nugent, Selin E, and Susan Scott-Parker. “Recruitment AI Has a Disability Problem: Anticipating and Mitigating Unfair Automated Hiring Decisions.” Essay. In Towards Trustworthy Artificial Intelligent Systems. Cham, Switzerland: Springer Nature Switzerland AG, 2022.

[5] Nugent, Selin E, and Susan Scott-Parker. “Recruitment AI Has a Disability Problem: Anticipating and Mitigating Unfair Automated Hiring Decisions.” Essay. In Towards Trustworthy Artificial Intelligent Systems. Cham, Switzerland: Springer Nature Switzerland AG, 2022.

[6] Heckman, Jory. “EEOC, DOJ ‘Sounding Alarm’ over AI Hiring Tools That Screen out Disabled Applicants.” Federal News Network, May 12, 2022. https://federalnewsnetwork.com/artificial-intelligence/2022/05/eeoc-doj-sounding-alarm-over-ai-hiring-tools-that-screen-out-disabled-applicants/?readmore=1.

[7] “iTutorGroup to Pay $365,000 to Settle EEOC Discriminatory Hiring Suit.” US EEOC, September 11, 2023. https://www.eeoc.gov/newsroom/itutorgroup-pay-365000-settle-eeoc-discriminatory-hiring-suit.

[8] “Age Discrimination.” DOL. Accessed July 3, 2024. https://www.dol.gov/general/topic/discrimination/agedisc#:~:text=The%20Age%20Discrimination%20in%20Employment,conditions%20or%20privileges%20of%20employment.; “The Age Discrimination in Employment Act of 1967.” US EEOC. Accessed July 3, 2024. https://www.eeoc.gov/statutes/age-discrimination-employment-act-1967.

[9]  “Prohibited Employment Policies/Practices.” US EEOC. Accessed July 3, 2024. https://www.eeoc.gov/prohibited-employment-policiespractices.

[10] Heckman, Jory. “EEOC, DOJ ‘Sounding Alarm’ over AI Hiring Tools That Screen out Disabled Applicants.” Federal News Network, May 12, 2022. https://federalnewsnetwork.com/artificial-intelligence/2022/05/eeoc-doj-sounding-alarm-over-ai-hiring-tools-that-screen-out-disabled-applicants/?readmore=1.

[11] Mobley v. Workday, Inc., 3:23-cv-00770. CourtListener. Accessed July 3, 2024. https://www.courtlistener.com/docket/66831340/mobley-v-workday-inc/.

[12] “Class Action Complaint; Jury Trial Demanded against Workday, Inc.” Court Listener, February 21, 2023. https://storage.courtlistener.com/recap/gov.uscourts.cand.408645/gov.uscourts.cand.408645.80.0.pdf.

[13] “Title VII of the Civil Rights Act of 1964.” US EEOC. Accessed July 3, 2024. https://www.eeoc.gov/statutes/title-vii-civil-rights-act-1964#:~:text=Title%20VII%20prohibits%20employment%20discrimination,several%20sections%20of%20Title%20VII.; Liu, Henry, and Staff in the Office of Technology. “Protections against Discrimination and Other Prohibited Practices.” Federal Trade Commission, November 24, 2021. https://www.ftc.gov/policy-notices/no-fear-act/protections-against-discrimination.; “Age Discrimination.” DOL. Accessed July 3, 2024. https://www.dol.gov/general/topic/discrimination/agedisc#:~:text=The%20Age%20Discrimination%20in%20Employment,conditions%20or%20privileges%20of%20employment.; “The Americans with Disabilities Act.” ADA.gov. Accessed July 3, 2024. https://www.ada.gov/.

[14] EEOC, April 9, 2024. https://www.eeoc.gov/sites/default/files/2024-04/Mobley%20v%20Workday%20NDCal%20am-brf%2004-24%20sjw.pdf.

[15] “Blueprint for an AI Bill of Rights.” The White House, November 22, 2023. https://www.whitehouse.gov/ostp/ai-bill-of-rights/.; “Artificial Intelligence and Algorithmic Fairness Initiative.” US EEOC, 2021. https://www.eeoc.gov/ai. 

[16] Automated Employment Decision Tools: Frequently asked ..., June 29, 2023. https://www.nyc.gov/assets/dca/downloads/pdf/about/DCWP-AEDT-FAQ.pdf.

Jasmine Lianalyn Rocha