AI Bias Is a Legal Risk: What the Workday Case Means for Employees and Students

Artificial intelligence is now woven into nearly every part of modern life — from hiring and grading systems to workplace and school evaluations. But when technology makes decisions that affect real people, the question becomes: who’s accountable when the algorithm gets it wrong?

A recent case, Mobley v. Workday, Inc. (N.D. Cal. 2024), is helping to answer that question. There, a federal judge allowed a proposed class action to proceed against Workday, whose AI-driven recruiting software allegedly screened out applicants based on race, age, and disability. The court rejected Workday’s defense that it was “just a vendor,” ruling that an AI vendor acting as an agent of employers may itself face liability under Title VII, the ADA, and the Age Discrimination in Employment Act (ADEA).

Why It Matters

This case marks a turning point. It confirms that AI tools are not above the law. If an automated system contributes to discrimination, the employer — and sometimes the software company — can still be held accountable.

For employees and job applicants, that means hiring bias or wrongful exclusion based on a “machine’s” decision can still violate civil rights laws.

For students, the same principle applies: automated grading, behavioral evaluations, or “predictive” analytics that penalize students with disabilities can raise serious IDEA and Section 504 concerns.

The Broader Message

Whether it’s an employer using an algorithm to “screen out” candidates or a school district using AI to assess student behavior, automation doesn’t erase responsibility. The same civil rights protections that apply to human decision-makers still apply to AI.

How Baikow Disability and Education Law Group Helps

At Baikow Disability and Education Law Group, we represent employees, students, and parents who are harmed by unfair or biased systems — including the use of automated tools that discriminate or deny accommodations. We investigate, advocate, and litigate under the ADA, IDEA, Section 504, Title IX, and related laws to ensure that both human and technological decisions comply with civil rights standards.

Technology may evolve, but accountability remains the same.
