Mobley v. Workday: Broader Consequences of the Case for Recruiters and HR Technology (Part IV)


The Mobley v. Workday lawsuit has far-reaching implications for recruiters, HR professionals, and the growing industry of AI-driven hiring tools. It is the first major class action to challenge an algorithmic screening system under employment discrimination laws, and it is likely to set precedents for how such cases are handled. Here are five key consequences and lessons emerging from this case:

Lesson #1: AI Vendors Can Face Liability

One immediate takeaway is that companies providing AI HR software (like Workday) may not be shielded from discrimination claims. In the past, if a biased hiring decision occurred, liability rested with the employer. Now, this case suggests that if an employer relies on a third-party algorithm that produces biased results, that third party could also be on the hook. This dramatically raises the stakes for HR tech vendors: they must ensure their products comply with anti-discrimination laws, because they might be treated as an “agent” of the employer in court. The case effectively warns that outsourcing a hiring function to AI does not outsource the legal risk; both the vendor and the employer can be held accountable if that AI introduces bias. We also discussed this in Parts II and III of this article series.

Lesson #2: Employers Using AI Are “Next in Line”

Although Workday is the defendant here, the reasoning extends to employers themselves. If an employer uses an AI tool (whether built in-house or from a vendor) that causes disparate impact, the employer can be held liable under Title VII, the ADA, and the ADEA. The judge’s willingness to see a common policy across different companies implies that courts might treat a widely used algorithm as a single practice, even across multiple employers. This could embolden more lawsuits directly against employers who deploy such tools. Employers can’t hide behind a vendor and say “it was the algorithm’s doing.” The law might treat bias in AI as bias by the company. Observers note this case is a “preview [of] how courts are likely to treat AI suits brought directly against employers,” suggesting that many more cases could follow, targeting any employer whose recruiting software systematically filters out protected groups.

Lesson #3: Disparate Impact Theory Remains Vital

The case highlights the continued importance of disparate impact liability in the age of AI. In algorithmic systems, bias is often unintentional: the machine isn’t explicitly told to prefer a demographic group, say, younger or White candidates, yet it may learn patterns that produce that effect. Title VII and related laws account for this through disparate impact claims, which don’t require discriminatory intent. The court’s reliance on disparate impact theory means that lack of intent is not a defense if an AI tool disproportionately disadvantages a protected group. Interestingly, this comes at a time of political debate over disparate impact: the Fisher Phillips report pointed out that as of 2025, federal enforcement of disparate impact was potentially under threat (citing an executive order directing the EEOC to scale back disparate impact cases). However, Mobley v. Workday shows that private litigation can still enforce disparate impact standards, regardless of regulatory changes. For recruiters and HR, this means due diligence on AI tools is essential even if those tools have no overtly biased intent. It’s the outcomes that count.

Lesson #4: Recruiting Practices May Be Treated as “Unified” Across Firms

A remarkable aspect of the judge’s ruling was treating Workday’s system as a single employment practice affecting many employers and applicants nationwide. In employment class actions, companies have typically defended themselves by pointing to differences, such as different managers, locations, or applicant qualifications, to argue that no common policy exists. But in this case, because one algorithm was used across the board, the court treated it as a single, uniform practice. This suggests that when many companies adopt the same AI hiring platform, plaintiffs might successfully tie together what happens at Company A and Company B as part of one collective problem (the vendor’s algorithm). For HR, this is a double-edged sword: if you are using a popular tool that is found to be biased, your company could get swept into a larger class action indirectly, even if no one has sued you individually yet. It reinforces that industry-standard tools are not risk-free. A flaw in one widely used AI system can spur industry-wide litigation.

Lesson #5: Huge Class Size Is Not a Deterrent to Courts

The court’s willingness to manage a class potentially comprising millions of job applicants shows that judges are prepared to tackle even massive, nationwide classes if the issue is pervasive. For organizations, this means a widespread discriminatory effect caused by technology can result in equally widespread legal exposure. The notion of sending notice via social media or the platform itself is novel, but it could set a precedent for reaching gig applicants or online users in future cases. In practical terms, recruiters and companies should realize that “too many affected people” is not a good argument. It’s actually a sign of a bigger compliance problem.

This wraps up the final part of our Mobley v. Workday article series. In summary, Mobley v. Workday rings a warning bell: HR technology must be handled with the same care for civil rights compliance as any human decision-maker. Employers must also be proactive, scrutinizing the tools they use for bias to ensure they don’t inadvertently disadvantage protected groups.

If you want something deeper than a list of unconscious biases with trendy names, The Equity Edge is worth your time. Here, I show where bias actually appears—at every step of the hiring process. You’ll see how bias quietly or overtly shapes decisions long before interviews even begin. The book guides you to ask better questions and examine your systems in the contexts that matter most, then offers solutions you can apply right away. It’s written for people serious about building more equitable workplaces.

Jennifer Tardy