Gender, Race, and Intersectional Bias in Resume Screening via Language Model Retrieval

In this article, researchers used large language model (LLM) artificial intelligence (AI) hiring tools to screen over 500 resumes and assess whether biases within LLMs produce gender and racial bias in resume screening. They found that these AI screening tools are biased, significantly favoring White-associated names and male-associated names.