Why You Should Use Bias-Detecting Tech in Your Hiring Process

Organizations that implement blind resume practices are not simply concealing candidate information — they may be obscuring their own awareness of the biases that persist among recruiters and hiring managers.

Written by Cliff Jurkiewicz
Published on Feb. 12, 2025

It sounds innocent enough, right? Remove job candidates’ race, gender and life experiences to focus solely on skills, competencies and accomplishments. Voila! Equitable, non-biased hiring.

Wrong. “Blinding” — deleting the information that makes a person unique (i.e., ethnicity, religion, sorority membership, veteran status) so a recruiter or hiring manager sees only the essentials on a resume — does not guarantee equitable outcomes. In reality, it just snowplows potential bias further down the hiring process.

Some people see this misguided practice as a workable response to recent anti-DEI efforts. It’s not. The value of honoring the whole self is about fostering organizational culture. Denzel Washington explained the essence of culture when asked why Fences needed a Black director (hint: it’s not about race).

“It’s not color, it’s culture,” he said. “Steven Spielberg did Schindler’s List. Martin Scorsese did Goodfellas. Steven Spielberg could direct Goodfellas. Martin Scorsese probably could have done a good job with Schindler’s List. But there are cultural differences.”

Get his point?

There’s something nuanced in understanding a cultural point of view that can only be expressed by a person who has that lived experience. And it pertains to job candidates as much as movie directors.

Lived Experiences Hold Tremendous Value

That lived experience is an enormous part of the value someone can add to an organization. Why would an employer want to take that away? The altruistic belief is that blinding places every candidate on equal footing, but organizations aren’t made up of anonymized skill lists; they’re made up of whole people.

What’s going to happen when potential hires show up on a screening video call? At some point they will be seen and heard. What then? If an organization hasn’t learned to embrace cultural differences, then its inherent bias will play out later in the process. That is an absolute certainty. So why not teach organizations to embrace those cultural differences? They add value not only to the individual, but through that individual to the organization.

True diversity and inclusion happen when people are free to live and demonstrate the very values an employer expects of them.

Every single individual needs and deserves to be wholly represented. Removing someone’s personal attributes, the things that make them unique and interesting, so that you can mask bias in the hiring process is just wrong. It simply means that an organization has not evolved. And it fosters a perception that the hiring process is biased against resumes that have non-white-sounding names.

So, what’s an organization to do?

Employ Bias-Detecting Technology

Technology already on the market automatically blinds data that could be used to create bias.

Take someone’s location. Say there’s an engineer from a country outside of the United States who’s competing with an engineer from Silicon Valley. Some technologies would actually downgrade the non-U.S. candidate’s skills based solely on physical location. That’s not right.

Tech that automatically blinds where someone lives does so for a reason: it keeps the algorithm from consuming that data as part of its calculation of position fit. Blinding a candidate’s location can still produce an unbiased result, so long as it is combined with bias detection.

The system itself isn’t using that data, and it’s still presenting that person holistically.
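
As a rough illustration, here is a minimal sketch of how attribute blinding can sit in front of a scoring step. The field names and the score_fit() function are hypothetical stand-ins, not any vendor’s actual implementation; the point is simply that the algorithm only ever receives the blinded record.

```python
# Minimal sketch of attribute blinding (hypothetical fields and scorer):
# bias-prone fields are stripped before the record reaches scoring.

BLINDED_FIELDS = {"name", "location", "age", "gender", "ethnicity",
                  "religion", "veteran_status", "affiliations"}

def blind(candidate: dict) -> dict:
    """Return a copy of the record with bias-prone fields removed."""
    return {k: v for k, v in candidate.items() if k not in BLINDED_FIELDS}

def score_fit(candidate: dict, role_skills: set[str]) -> float:
    """Toy fit score: fraction of required skills the candidate has."""
    skills = set(candidate.get("skills", []))
    return len(skills & role_skills) / len(role_skills) if role_skills else 0.0

candidate = {
    "name": "A. Engineer",
    "location": "Outside the U.S.",
    "skills": ["python", "distributed systems", "kubernetes"],
    "years_experience": 7,
}

# Only the blinded record is passed to the algorithm, so location can
# never be consumed in the calculation of position fit.
print(score_fit(blind(candidate), {"python", "kubernetes", "terraform"}))
```

Because location never reaches the scorer, it can’t influence the fit calculation, yet the full record remains available for a human to see the candidate holistically.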

What Is Bias Detection?

In the context of HR, bias detection exists to check whether an organization hires a diverse group of people even when demographic factors such as age, race, gender and disability status are withheld from the evaluation itself.

The goal is to detect an anomaly that could be bias so that humans can examine the data. Bias detection is meant to surface a potential issue, not to determine whether that issue is actually bias. That’s up to humans to decide.

Sometimes what looks like bias actually isn’t. That is why demographic data should never be used algorithmically: feeding it into the calculation would create bias rather than detect it.
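
To make that division of labor concrete, here is a minimal sketch of bias detection as an audit step, under the assumption that demographic group labels are stored alongside hiring outcomes but never passed to the scoring algorithm. The data, group labels and the 0.2 tolerance are all illustrative.

```python
# Minimal sketch: demographic data is used only to audit outcomes after
# the fact, never inside the fit calculation. All data is illustrative.
from collections import defaultdict

def selection_rates(outcomes):
    """outcomes: (group, hired) pairs recorded outside the scoring step."""
    counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
    for group, hired in outcomes:
        counts[group][0] += int(hired)
        counts[group][1] += 1
    return {g: hired / total for g, (hired, total) in counts.items()}

def flag_anomalies(rates, tolerance=0.2):
    """Flag groups trailing the top selection rate by more than `tolerance`.
    A flag is only a prompt for human review, not a verdict of bias."""
    top = max(rates.values())
    return [g for g, r in rates.items() if top - r > tolerance]

outcomes = [("A", True), ("A", True), ("A", False),
            ("B", True), ("B", False), ("B", False), ("B", False)]
rates = selection_rates(outcomes)
print(rates, "review:", flag_anomalies(rates))
```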

If demographic factors aren’t in the system, then how does one know a person’s unique characteristics? Easy: the recruiter or hiring manager sees and hears the candidate. That’s why we say the system doesn’t “see” a person’s characteristics, but a human does.

Without technology, one would have to examine every person a hiring manager ever hired, along with all the data and criteria used in those decisions, to judge whether the hires were made without bias.

That’s the importance of the bias detection component: the personal attribute data never makes its way into the algorithmic calculation of whether someone is a good fit. That calculation is based solely on skills, competencies and experience.

Adverse impact reporting is the other critical component, because once someone’s personal attributes are displayed (on social media profiles, in videos or face-to-face), the question becomes whether the human being is biased. We need technology to detect that, too.

Adverse impact reporting checks whether your hiring process, accidentally or purposefully, discriminates against certain groups of people. It looks at the data for patterns: Are you hiring far fewer people from one group (women, minorities or older workers, say) compared to others?

To be clear, it doesn’t determine whether it’s accidental or purposeful. It’s just reporting on the patterns.
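
For a concrete sense of what such a report can look like, here is a minimal sketch built on the widely used four-fifths (80 percent) rule: a group is flagged when its selection rate falls below 80 percent of the highest group’s rate. The rates and group names are illustrative, and the report flags patterns only; it says nothing about intent.

```python
# Minimal sketch of adverse impact reporting via the four-fifths rule.
# Selection rates (hired / applied) per group are illustrative.

def adverse_impact_report(selection_rates: dict[str, float],
                          threshold: float = 0.8) -> dict[str, dict]:
    top_rate = max(selection_rates.values())
    report = {}
    for group, rate in selection_rates.items():
        ratio = rate / top_rate if top_rate else 0.0
        # Flagged means "pattern worth reviewing," not "bias confirmed."
        report[group] = {"rate": rate, "impact_ratio": round(ratio, 2),
                         "flagged": ratio < threshold}
    return report

rates = {"group_1": 0.30, "group_2": 0.18, "group_3": 0.28}
for group, row in adverse_impact_report(rates).items():
    print(group, row)
```

Here group_2’s impact ratio is 0.6, well under the 0.8 threshold, so it gets flagged for human review; whether that disparity is accidental or purposeful is left to people to determine.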
