Mobley v. Workday: You Don't Need Intent to Do Harm (Part I of IV)
Increasing diversity, equity, and inclusion often runs into this familiar refrain: We didn’t mean to. It’s usually said with surprise, sometimes even defensiveness, by folks who didn’t set out to cause harm—but did. That’s the trap of intent. And legally speaking, in many cases, intent doesn’t matter.
That’s the core of disparate impact law. Under Title VII of the Civil Rights Act, it’s illegal for employers to use hiring practices that disproportionately exclude people in protected groups, even if they had no discriminatory intent. The harm lies in the impact, not the motive. That’s what the Mobley v. Workday lawsuit is all about.
Derek Mobley is a Black military veteran over 40 who has managed anxiety and depression. He’s applied to more than 100 jobs since 2017, all through employers using Workday’s hiring software. He never got a callback. Some rejections landed within minutes. Many came before any human could’ve reasonably read his resume. An algorithm filtered him out before he ever got a shot.
Interestingly, Mobley’s lawsuit doesn’t accuse Workday of overt racism or explicit ageism. He claims that he was excluded by design, even if the designers didn’t know it.
That should make every hiring team pause.
Workday’s tools let employers use AI to sort, score, and sometimes auto-reject candidates. Mobley alleges that those tools, which are trained on historical data and shaped by past decisions, ended up screening out people like him. The algorithms, unlike humans, didn’t ask whether he was qualified. They made a fast decision based on patterns that favored younger, non-Black, and non-disabled candidates. And those patterns, Mobley argues, were baked in from the beginning.
Now, here’s where the law gets clear. Under the disparate impact doctrine, employers and their agents can’t use hiring practices or tools that disproportionately harm people based on race, age, or disability unless they can prove the practice is necessary for the job and no less discriminatory alternative exists.
In this case, Mobley didn’t need to prove Workday meant to discriminate. He only needed to show that its system did discriminate, systematically and measurably. That’s what makes disparate impact so powerful. It forces us to shift our attention from ‘what we intended’ to ‘what we created.’
And this is where equity work gets real.
Equity becomes essential because neutrality or good intentions alone aren’t enough. It pushes us to ask who the process was built to serve and who it quietly leaves out. Was Mobley’s rejection a personal slight? The courts say no. If anything, it was structural. He got filtered out by software that judged his résumé against historical hiring patterns, patterns shaped by years of bias and exclusion. That’s exactly the issue: when technology, or any tool, repeats the past, it repeats its mistakes too.
For many recruiters and organizations, “we didn’t mean to” still feels like a defense. But unintentional harm is still harm, and staying quiet when people are excluded is still a choice. Ignorance is no longer a legal shield, nor is it a moral one. So, the tools you use to sort people, especially at scale, need to be tested, monitored, and audited through an equity lens.
If your hiring process uses AI or automated systems, here’s the challenge: don’t wait for a lawsuit to ask whether your tools are fair. Don’t assume neutrality just because a machine is making the call. And definitely don’t get comfortable with “we didn’t mean to.”
Equity work demands more than good intentions. It demands awareness, action, and accountability. Regularly audit your AI hiring tools for adverse impact on protected groups, and clearly link every screening criterion to a job-relevant skill. Continuously monitor results, draw on the lived experience of people with diverse perspectives, and maintain human oversight to ensure equity at every step.
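What does "audit for adverse impact" look like in practice? One common starting point is the four-fifths (80%) rule from the EEOC's Uniform Guidelines: compare each group's selection rate at a screening step to the highest group's rate, and flag any ratio below 0.8 for review. The sketch below is a minimal, illustrative version of that check; the group labels and counts are hypothetical, and a real audit would use your actual applicant-flow data and involve legal and statistical review.

```python
# Minimal sketch of an adverse-impact check using the four-fifths (80%) rule.
# All counts below are hypothetical, for illustration only.

def selection_rate(selected, applicants):
    """Fraction of applicants who passed this screening step."""
    return selected / applicants

def impact_ratio(group_rate, reference_rate):
    """A group's selection rate divided by the highest group's rate.
    Ratios below 0.8 are a common red flag under the four-fifths rule."""
    return group_rate / reference_rate

# Hypothetical pass-through counts at an automated resume-screening step.
outcomes = {
    "group_a": {"applicants": 400, "selected": 120},  # rate 0.30
    "group_b": {"applicants": 300, "selected": 54},   # rate 0.18
}

rates = {g: selection_rate(v["selected"], v["applicants"])
         for g, v in outcomes.items()}
reference = max(rates.values())  # highest-selected group as the baseline

for group, rate in rates.items():
    ratio = impact_ratio(rate, reference)
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, impact ratio {ratio:.2f} -> {flag}")
```

Here group_b's ratio works out to 0.60, well under the 0.8 threshold, so the step would be flagged for review. Note that the four-fifths rule is a screening heuristic, not a legal verdict: a flagged ratio is a prompt to investigate, validate the criterion against job requirements, and look for less discriminatory alternatives.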
This dilemma, causing harm without meaning to, is a key theme in my book, The Equity Edge. I dive into how bias often hides in everyday default settings and business-as-usual processes rather than overt intent. Mobley’s case underscores exactly that: a system that didn’t mean any harm but learned to exclude anyway. If you’re serious about increasing equity and removing one more obstacle in your organization, The Equity Edge gives you practical tools to spot these hidden patterns and design systems that center equity. Learn more and get your copy here.