Your Job Rejection May Have Been an A.I. Mistake—How to Spot and Fix It

In 2015, before anyone had ever uttered "ChatGPT," artificial intelligence was already affecting job seekers in a big way.

That year, Amazon realized that its machine-learning algorithms, meant to speed up its hiring process, were biased against women. Specifically, they accidentally weeded out resumes with the word "women's" — for instance, if a candidate had participated in a "women's chess club," according to a Reuters report.

The algorithms were trained to look for candidates similar to people Amazon had previously hired, replicating pre-existing gender inequalities. Other cases of AI hiring discrimination abound, so much so that the Equal Employment Opportunity Commission offers guidance for employers to ensure AI doesn't jeopardize fair hiring processes.

"The lesson of the Amazon case is that you can't just delegate hiring to AI and think everything is OK," says Ifeoma Ajunwa, a law professor at the University of North Carolina, Chapel Hill who authored the book, "The Quantified Worker: Law and Technology in the Modern Workplace."

Hiring bias isn't a new phenomenon: Some groups have long been unjustifiably overlooked in the job pool. AI replicates this discrimination "at scale," Ajunwa tells CNBC Make It.

Legislators in some areas — like New York City — are creating policies to regulate the technology's blind spots. But "technology is always way ahead of legislators," says Lindsay Greene, a labor issues attorney who's part of the online legal network LegalShield.

In the meantime, job seekers are caught in the crossfire of AI's hiring mistakes. Here are some errors an AI might make, and what you can do about them.

Why AI can make hiring discrimination worse

AI is only as unbiased as the people who build it and the data it's trained on. There are several points in the hiring process where an AI's encoded biases could interfere with a candidate's chances of securing the job:

  • When AI misunderstands a resume's time gaps. AI filters may wrongly exclude candidates with gaps in employment on their resumes, Ajunwa says. The filters are less likely to automatically disqualify resumes that account for every month, even if that means providing a brief explanation of time off.
  • When an AI interviewer isn't built for all faces or speech patterns. AI isn't always smart enough to read every face: darker skin tones aren't always correctly interpreted, Ajunwa says. AI may also misinterpret the body language of a neurodivergent person who doesn't make much eye contact with the computer, says TopResume career expert Amanda Augustine. A heavy accent or a speech impediment could likewise unfairly hurt a candidate. If you aren't comfortable with an AI interview for any of these reasons, Greene suggests reaching out to the human recruiter directly and asking for an accommodation.
  • When scanning social media. Employers use automated technology to scan the internet for a candidate's "social media footprint," Greene says. There's only one sure-fire safeguard against facing professional consequences for your social media activity: Keep your personal accounts private and remove any content that might give a hiring manager the wrong impression.

How to fix AI's mistakes

If you suspect that AI had something to do with a discriminatory job rejection, get in touch with the company directly, experts say.

Many companies source their AI hiring filters from third-party vendors and may not fully understand how the technology works. "A lot of times, employers don't even know what the AI system is doing," says Ajunwa.

The good news: Getting ghosted by a recruiter doesn't necessarily mean rejection. If you haven't heard back and suspect you've been wrongly screened out, send a follow-up email to a human recruiter with your resume attached, Ajunwa advises.

She has seen that strategy land someone a job even after an AI filter weeded out their resume.

Another tactic: Make follow-up emails part of your standard practice. "Maintaining that human connection, whether it be a follow-up phone call or cover letter, is a way to separate yourself out and to distinguish yourself in this process," says Greene.

If contacting a human directly doesn't help, you may be short on options for now. The U.S. legal system is still figuring out how to litigate the issue, Greene says. Consulting an attorney might be useful in some respects, but she notes that legal advice on AI discrimination shouldn't be taken at face value while the law remains unsettled.

"We are still literally at ground zero of trying to figure out how this new world of AI is going to intersect with our federal and state discrimination laws," says Greene.
