Edition 003: When People Don’t Count

Imagine that you have a deaf sister. You have spent your life feeling helpless as you watch her navigate a world that was not built for her. You watched her in classrooms where the teacher forgot to face the class when speaking, where group discussions became a blur of moving mouths she could not follow, where she learned — earlier than any child should have to learn this — that it was her job to close the distance the world insisted on keeping. You watched her develop a patience with people that you did not always feel yourself. Her grace in situations that would have made you furious. Her way of entering a room and finding, almost immediately, the one person who would take the time.

You watched her work twice as hard for half the recognition. You watched her smile at people who had just made her life harder. You felt, more times than you can count, the specific helplessness of loving someone the system keeps failing — the knowledge that no amount of your own frustration would change the fact that the world was designed for a different kind of person, and that she would have to find her way through it anyway.

And you watched her do exactly that. Never easily or without cost. But steadily, over years, through the accumulation of small dignities earned and larger indignities survived. A supervisor who finally saw what she was capable of. A colleague who became a friend. A performance review that reflected what she actually did, rather than what the system had expected from someone like her. You watched her build something real, piece by piece, in a world that kept putting obstacles in her path and calling them neutral.

You felt a specific, fierce pride in that. The kind of pride that only comes from watching someone the world underestimated refuse, quietly and persistently, to be what the world decided she was.

Now imagine watching her sit down in front of a screen. In front of a camera, alone, answering questions that will be evaluated not by a human being but by an algorithm — a system trained on data that did not include people like her, that hears her voice and does not recognize it, that measures her against a standard built for someone else. The system rejects her.

And then the system tells her, in the language of automated feedback, generated by the same algorithm that just decided she was not qualified, that she needs to work on her communication skills. It suggests to her that she should practice active listening.

You read that feedback and you feel something you do not immediately have a word for. It is not quite rage. It is colder than rage, and more frightening, arising from a dangerous place deep within. It is the recognition that the system did not just fail her. The system evaluated her, found her wanting, and then explained to her — in the voice of neutral, objective assessment — that the problem was hers.

***

That was not a hypothetical.

Her name, in the public record, is D.K. She is an Indigenous woman. She is Deaf. She communicates using American Sign Language and English with a deaf accent[1] — the accent that belongs to someone who has built a relationship with spoken language on their own terms, without the hearing that most people take for granted.

She has held seasonal roles at Intuit — the financial software company whose products include TurboTax and QuickBooks, used by millions of Americans every tax season — since 2019. Every year she has been there, she has received positive performance reviews. Every year, she has received a bonus. Her supervisors know what she can do. One of them, in 2024, encouraged her to apply for a promotion.[2] She is also, while doing all of this, pursuing a master’s degree in data science — the field that underlies the very technology that would be used to evaluate her.

Her story is in the public record because she has chosen to put it there. That choice was itself an act of courage. D.K. knew, when she agreed to have the American Civil Liberties Union, among others, file a complaint on her behalf against Intuit and HireVue, a vendor of human resources assessment software using AI, that her story would become part of a legal proceeding, that her employer would dispute it, and that the outcome was not guaranteed. In fact, D.K. had previously informed the Chair of Intuit's Accessibility Team that HireVue was inaccessible and harmful to deaf applicants[3], but according to the Complaint, Intuit continued to use the product and required D.K. to undergo a HireVue interview regardless. So D.K. chose to proceed. She said, in her own words, that her experience reflects the systemic discrimination built into AI-driven hiring tools that continue to exclude and disadvantage marginalized communities. She is not speaking only for herself. She filed her lawsuit because she understands that her story impacts us all.

The case is pending. She has put her name (her initials) on a legal proceeding and waited for the system that failed her to account for itself. The law has not yet answered.

***

Now pull back from D.K.'s story far enough that you can see not one screen, or one camera, or one woman sitting alone, being evaluated by an algorithm that cannot hear her. Pull back far enough to see the scale.

Derek Mobley is a graduate of Morehouse College, one of the most respected historically Black universities in America. He holds an honors degree from ITT Technical Institute. He is African American. He is over forty years old. He suffers from anxiety and depression.[4] And between 2017 and 2023, he applied for more than one hundred positions at companies that used Workday's AI-powered hiring platform to screen candidates. He was rejected every single time.

Again, he was rejected, not after interviews or human review of his qualifications, not after any conversations or assessments, and not after any process that required a person to look at his name and make a considered judgment. Before any of that. The rejections came automatically. They came quickly. At least one came in the middle of the night — when no human being in any human resources department was at a desk, when no one was awake and reading applications, when the only thing moving was the data being processed by an algorithm that had been designed to make decisions so that humans wouldn’t have to.

Think about what that means. While Derek Mobley slept, an algorithm was rejecting him. While he went about his day, just like you go about your day — eating breakfast, making coffee, doing whatever people do on ordinary mornings — an algorithm was processing his applications and discarding them. His applications weren’t considered. They were processed in the way a machine processes inputs: receiving the data, running it against the model’s criteria, and producing an output. The same output, over and over again, for more than one hundred applications, across seven years. Rejected.

And yet there was no human being sitting at a screen deciding that Derek Mobley was not worth interviewing: no moment of prejudice, no conscious act of discrimination, no person who looked at his race or his age or his disability and made a decision. There was just an algorithm doing what algorithms do — finding patterns in data, applying those patterns to new inputs, producing outputs at a speed and scale that no human process could match.

That is what makes it so hard to see. And that is what makes it so dangerous.

Mobley filed suit against Workday. Not against the companies that had used Workday's platform to reject him — there were too many of them, spread across too many industries, and none of them had made the decision in any meaningful sense. They had licensed a platform. The platform had made the decision. Mobley sued the platform.

Workday moved to dismiss the case. Its argument was straightforward and, under the existing legal framework, not unreasonable: Workday was not the employer. It was a vendor. It did not have a relationship with the applicant. It simply provided software that companies used to make hiring decisions about their applicants. The companies made the decisions. Workday just built the tool.

The court disagreed.

In a ruling that some say may prove to be one of the most consequential employment law decisions of the AI era, the judge found that drawing an artificial distinction between human decision-makers and software decision-makers — treating the algorithm as a neutral instrument rather than a decision-maker in its own right — would, in the court's own words, “potentially gut anti-discrimination laws in the modern era.”[5]

The case was preliminarily certified as a collective action on the age discrimination claim in May 2025.[6] It remains pending,[7] and is now one of the first large-scale legal challenges to AI-driven hiring tools — representing not just Derek Mobley but potentially millions of job applicants who were rejected not by humans making considered judgments but by software making snap calculations based on patterns that no one was required to explain, justify, or understand.

Workday disclosed that 1.1 billion applications were rejected through its platform during the relevant period; the collective could potentially include “hundreds of millions”[8] of people who sent applications in good faith, who prepared their resumes and wrote their cover letters and submitted their materials through the proper channels with hope, only to be processed and discarded. Processed and discarded. Processed and discarded. Repeatedly. By an algorithm applying criteria that no one was required to disclose, producing rejections that no one was required to explain, at a speed and scale that made the entire exercise invisible to the people it affected most.

D.K. sat in front of a camera and was told to practice active listening. Derek Mobley sent over a hundred applications and was rejected before any human being saw his name. These are not the same story. But they are happening within the same system.

***

The same system is also rejecting white, non-disabled applicants in the middle of the night. The process is identical. The person is processed rather than considered, discarded without explanation, and denied any meaningful opportunity to understand what happened or why. The harm is real. People are being hurt.

***

But the laws we have to address that harm were written for a different problem. Employment discrimination statutes assume a discriminator. They assume someone who looked at a person, or a name, or an address, and decided. The applicant rejected in the middle of the night was not turned away by anyone in that sense. That applicant was turned away by the output of an AI model trained on data assembled by a vendor and sold to an employer who never saw the person’s file. There is no person who is the decision-maker. There is a chain of decisions, distributed across companies and contracts and training data sets, and the law was not written for that. That is the gap addressed next.

***

[1] “Complaint Filed Against Intuit and HireVue.” American Civil Liberties Union, 19 Mar. 2025, www.aclu.org/press-releases/complaint-filed-against-intuit-and-hirevue-over-biased-ai-hiring-technology.

[2] “Complaint Filed Against Intuit and HireVue.” American Civil Liberties Union, 19 Mar. 2025, www.aclu.org/press-releases/complaint-filed-against-intuit-and-hirevue-over-biased-ai-hiring-technology.

[3] ACLU of Oregon, 14 Apr. 2025, www.aclu-or.org/en/news/i-should-not-have-fight-fair-treatment-workplace.

[4] Mobley v. Workday, Inc., First Amended Complaint, 740 F. Supp. 3d 796 (N.D. Cal. 2024).

[5] “Judge Allows AI Lawsuit Against Workday to Proceed.” TechTarget, 15 July 2024, www.techtarget.com/searchhrsoftware/news/366593351/Judge-allows-AI-lawsuit-against-Workday-to-proceed.

[6] “Federal Court Allows Collective Action Lawsuit Over Alleged AI Hiring Bias.” Holland & Knight, 27 May 2025, www.hklaw.com/en/insights/publications/2025/05/federal-court-allows-collective-action-lawsuit-over-alleged.

[7] Mobley v. Workday, Inc., 3:23-cv-00770. CourtListener, Free Law Project, 29 Apr. 2026, www.courtlistener.com/docket/66831340/mobley-v-workday-inc/.

[8] “AI Bias Lawsuit Against Workday Reaches Next Stage.” Law and the Workplace, 11 June 2025, www.lawandtheworkplace.com/2025/06/ai-bias-lawsuit-against-workday-reaches-next-stage-as-court-grants-conditional-certification-of-adea-claim/.
