Why the Workday Discrimination Lawsuit Should Be a Wake-Up Call for HR Leaders
When the news broke about a class-action lawsuit accusing Workday of enabling discrimination through its AI hiring tools, it sent shockwaves through the HR and tech communities. The case is positioned to set a major precedent for AI in Human Capital Management.
As someone who has spent over twenty-five years in talent strategy, technology, and inclusion, I had two immediate reactions:
AI Doesn’t Create Bias. It Inherits It.
Let’s be clear: AI does not have a propensity for bias; humans do. That machines inherit bias from the people and data that shape them is a well-established finding in psychology and cognitive science.
Algorithms, especially those in talent platforms, are only as good—or as flawed—as the data we feed them.
In this case, what most people don’t realize is that tools like Workday rely heavily on inputs such as job descriptions, historical hiring decisions, and “success profiles” modeled on past hires.
So, when bias shows up in the outcomes, we shouldn’t just blame the system—we should ask: What patterns and priorities have we hard-coded into our hiring culture without even realizing it? AI is NOT the problem. And once you understand what IS the problem, AI can provide several solutions.
The Real Culprit: Bad, Biased Job Descriptions
In my 25 years in the Human Capital Management outsourcing industry, I’ve worked with well over a hundred customers of all sizes, across all of the major industries. One theme is constant: job descriptions are broken. I know; I’ve seen them.
They’re outdated, copied from past postings, and stuffed with credential and tenure requirements that describe the last person who held the job rather than the skills the next person needs.
If you’ve ever written a job description that says “must have a degree from a top university” or “minimum 10 years of experience in a startup” … congratulations, you’ve just unintentionally biased the algorithm.
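To make this concrete, a crude screen for these patterns can even be automated. The sketch below is my own illustration, not anything Workday or any vendor ships; the pattern list is a hypothetical starting point, not an exhaustive one:

```python
import re

# Phrases that often act as proxies for pedigree, age, or "fit"
# rather than actual skill. Illustrative only, not exhaustive.
PROXY_PATTERNS = {
    "elite-school pedigree": r"top(?:-tier)? university|ivy league",
    "tenure over skill": r"\d{2,}\+? years of experience",
    "culture-fit shorthand": r"culture fit|rockstar|ninja",
}

def flag_biased_language(job_description: str) -> list[str]:
    """Return the names of proxy patterns found in a job description."""
    text = job_description.lower()
    return [name for name, pattern in PROXY_PATTERNS.items()
            if re.search(pattern, text)]

flags = flag_biased_language(
    "Must have a degree from a top university and 10+ years of experience."
)
# Both the pedigree and the tenure patterns match this posting.
```

A scan like this won’t catch everything, but running it across your open requisitions is a cheap first look at what you’ve hard-coded into your hiring culture.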
I know a little something about outsourcing to companies like Workday, but that does not mean you can outsource responsibility to them.
Blaming Workday or any HCM tech vendor for algorithmic bias is like blaming a spreadsheet for a bad budget. These tools amplify what’s already in your system—they don’t invent it.
Shift the Focus: From Blame to Better Design
Here’s how companies can respond constructively:
Audit Your Hiring Data
Stop Using "Success Profiles" Based on Yesterday’s Leaders
Use Validated, Skills-Based Assessments in Hiring
Train Your Teams, Not Just Your Tools
The Workday Lawsuit Is a Wake-Up Call, Not a Warning Label Against AI
This moment should push us not to fear AI but to understand it better, and to get serious about the quality of what we feed into it.
These technologies provide so many solutions and they keep getting better. But until you understand how to work with your own data—hiring histories, job specs, implicit definitions of “fit”—you’ll keep getting the same results from different tools.
It’s not Workday’s fault. It’s your hiring data. Fix that—and the tools will work just fine.
Final Thought
Of course, I must caveat that we do not yet know all the details of the Workday lawsuit; more information may come to light. But that does not change anything said here. Having been in this business for as long as I have, I’ve seen fads come and go, I’ve seen other companies propose solutions, and I’ve seen how poor, outdated, and inaccurate data causes many of the issues and headaches in this industry.
Here at Ignis AI we are poised to deliver real solutions to existing workforce problems, including this one. If you’re looking for help in building a people-first, future-ready workforce, let’s talk.
Unlock potential. Transform careers. Elevate the future of work.