AI Hiring: The Hidden Discrimination and How to Counteract It
- A recent study shows that AI resume screening tools often favor White and male candidates, indicating a potential for inadvertent discriminatory hiring practices.
- The researchers propose five practical steps to avoid AI bias in the workplace. These include regular audits of AI tools, data transparency, avoiding overreliance on automation, adopting inclusive job descriptions, and implementing data-driven safeguards.
In a significant study presented in October at the Association for the Advancement of Artificial Intelligence / Association for Computing Machinery (AAAI/ACM) Conference on AI, Ethics, and Society, researchers Kyra Wilson and Aylin Caliskan uncovered unsettling biases in several leading open-source AI resume-screening models.
The study employed 554 resumes and 571 job descriptions, with over 3 million combinations analyzed across different names and roles. The researchers swapped the names on the resumes, drawing from 120 first names generally associated with male, female, Black, or White individuals, and submitted the resumes for positions ranging from chief executive to sales worker.
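For employers who want to run a similar check against their own screening pipeline, the minimal Python sketch below mimics the study's name-substitution design at a much smaller scale. The score_resume function, the name lists, and the sample resume and job text are illustrative assumptions for this example, not the researchers' actual models or data.

```python
# Hypothetical sketch of a name-substitution audit, loosely modeled on the
# study's design: the same resume text is scored under names associated with
# different demographic groups, and the "wins" per group are compared.
import itertools
import random
from collections import defaultdict

random.seed(0)

# Illustrative placeholder name lists, not the study's actual name set.
NAME_GROUPS = {
    "White male": ["Greg", "Brad"],
    "White female": ["Emily", "Anne"],
    "Black male": ["Jamal", "Darnell"],
    "Black female": ["Lakisha", "Keisha"],
}

def score_resume(resume_text: str, job_description: str) -> float:
    """Stand-in for the screening model under audit; a real audit would call
    the actual model. Random scores keep this example self-contained."""
    return random.random()

def audit(resumes, job_descriptions):
    """Swap names into otherwise identical resumes and count how often each
    group's variant receives the top score for a given job."""
    wins = defaultdict(int)
    trials = 0
    for resume, job in itertools.product(resumes, job_descriptions):
        scored = []
        for group, names in NAME_GROUPS.items():
            variant = f"Name: {random.choice(names)}\n{resume}"
            scored.append((group, score_resume(variant, job)))
        top_group = max(scored, key=lambda pair: pair[1])[0]
        wins[top_group] += 1
        trials += 1
    return {group: wins[group] / trials for group in NAME_GROUPS}

if __name__ == "__main__":
    resumes = ["10 years of hotel operations experience ..."]
    jobs = ["General Manager, full-service hotel ..."]
    print(audit(resumes, jobs))
```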
The results were disconcerting. Resumes with White-associated names were selected for the next hiring step 85% of the time, while resumes with Black-associated names were preferred only 9% of the time. Resumes with male-associated names were chosen 52% of the time, even for roles with traditionally high representation of women. Most worryingly, Black men faced the greatest disadvantage: their resumes were passed over 100% of the time in favor of other candidates.
The researchers ascribe these biased outcomes to the data used to train the AI models. AI systems inherently mirror the patterns present in their training datasets. If those datasets are drawn from sources with historical or societal inequities, the AI system is likely to replicate or even amplify them, leading to biased decision-making. This dynamic, often summed up as "garbage in, garbage out," means that AI tools trained without sufficient attention to diversity and equity risk becoming automated gatekeepers of discrimination, systematically disadvantaging qualified candidates from underrepresented groups.
Employers adopting AI in hiring need to be aware of the legal, ethical, and reputational risks associated with potentially biased outcomes. Discriminatory hiring practices can lead to costly lawsuits or government investigations, and even unintentional discrimination can result in an adverse finding, since lack of intent is not a complete defense. The government has indicated that employers cannot deflect responsibility by blaming AI vendors if the technology discriminates against applicants or workers. Furthermore, by screening out candidates who do not fit the White male profile, organizations lose diverse, qualified talent that could strengthen their foundation. Accusations of AI bias can also generate damaging publicity, hampering recruitment and retention efforts and tarnishing the organization's reputation among clients and customers.
To counteract AI bias in hiring, the researchers suggest five best practices. Conduct regular audits of AI tools to detect racial, gender, and intersectional biases. Ensure that the data used to train models is balanced and representative, and prioritize models with built-in transparency. Avoid over-reliance on automation by integrating human oversight into AI-assisted decisions. Keep job descriptions and selection criteria neutral and inclusive, removing unnecessary requirements that may unfairly disadvantage certain candidates. Finally, implement data-driven safeguards by regularly analyzing the outcomes of AI screening tools and comparing demographic results to identify and address any biases, as illustrated in the sketch below.
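As a concrete illustration of that last safeguard, the hypothetical Python sketch below computes selection rates by demographic group from a screening log and flags groups whose rate falls below 80% of the highest group's rate, following the EEOC's four-fifths rule of thumb for adverse impact. The field names and sample records are assumptions made for the example, not data from the study.

```python
# Minimal sketch of a data-driven safeguard: compare the AI screener's
# selection rates across demographic groups and flag possible adverse impact
# using the "four-fifths" rule of thumb.
from collections import defaultdict

def selection_rates(records):
    """records: iterable of dicts with 'group' and 'advanced' (bool) keys."""
    advanced = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["group"]] += 1
        advanced[r["group"]] += int(r["advanced"])
    return {g: advanced[g] / total[g] for g in total}

def adverse_impact(rates, threshold=0.8):
    """Return impact ratios for groups whose selection rate falls below
    `threshold` times the highest group's rate."""
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items() if rate < threshold * best}

if __name__ == "__main__":
    # Hypothetical screening log: which applicants advanced to the next step.
    screening_log = [
        {"group": "Group A", "advanced": True},
        {"group": "Group A", "advanced": True},
        {"group": "Group B", "advanced": True},
        {"group": "Group B", "advanced": False},
        {"group": "Group B", "advanced": False},
    ]
    rates = selection_rates(screening_log)
    print("Selection rates:", rates)
    print("Flagged (impact ratio):", adverse_impact(rates))
```

In practice, a check like this would run on the screener's real output logs at regular intervals, with any flagged disparity triggering human review of the tool and its training data.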
While AI technology promises efficiency, it is essential to be aware of its potential to inadvertently perpetuate biases and to implement measures that ensure fairness in hiring practices.
Discover more at HospitalityLawyer.com.