Feds warn employers against discriminatory recruiting algorithms

    Companies' growing use of AI in their recruiting processes has continued to sound the alarm for lawyers, advocates, and researchers. Algorithms have been found to automatically assign different scores to applicants based on arbitrary criteria, such as whether they are wearing glasses or a headscarf or have a bookshelf in the background. Hiring algorithms can penalize applicants for having a Black-sounding name, listing a women's college, or even submitting their resume in certain file types. They can disadvantage people who stutter or have a physical disability that limits their ability to use a keyboard.

    All of this has gone largely unchecked. But now the US Department of Justice and the Equal Employment Opportunity Commission have issued guidance on what businesses and government agencies must do to ensure their use of AI in hiring complies with the Americans with Disabilities Act.

    “We cannot allow these tools to become a high-tech path to discrimination,” EEOC Chair Charlotte Burrows said in a briefing with reporters on Thursday. The EEOC instructs employers to disclose to applicants not only when algorithmic tools are used to evaluate them, but also what traits those algorithms assess.

    “Today we are sounding the alarm about the dangers tied to blind reliance on AI and other technologies that we are seeing employers use more and more,” Assistant Attorney General for Civil Rights Kristen Clarke told reporters in the same news conference. “Today we make clear that we need to do more to break down the barriers faced by people with disabilities, and no doubt, the use of AI is exacerbating the long-standing discrimination that job seekers with disabilities face.”

    The Federal Trade Commission issued broad guidance on how companies can use algorithms in 2020 and again in 2021, and a White House office is working on an AI Bill of Rights, but this new guidance sets out how the two agencies will handle violations of federal civil rights law that involve algorithms. It also brings with it the credible threat of enforcement: the Department of Justice can file lawsuits against companies, and the EEOC receives discrimination complaints from job seekers and employees that can lead to fines or lawsuits.

    According to data from the US Bureau of Labor Statistics, people with disabilities are out of work at a rate twice the national average. People with intellectual disabilities also face high rates of unemployment, and Burrows says employers should take steps to vet the software they use to make sure people with disabilities aren't excluded from the job market.

    A number of actions approved Thursday by the EEOC and DOJ were previously suggested in a 2020 report by the Center for Democracy and Technology on the ways in which hiring software can discriminate against people with disabilities. They include eliminating the automatic screening-out of people with disabilities and providing “reasonable accommodation” for people who might otherwise have trouble with the software or hardware involved in the hiring process. The CDT report also calls for audits of hiring algorithms before and after they go live — a step not included in the EEOC guidance — and refers to online hiring bias against people with disabilities as an “invisible injustice.”

    As a teenager, Lydia X. Z. Brown found taking personality tests when applying for jobs a fun, if strange, game. They can't prove it, but they now suspect they experienced discrimination when applying for a job at the mall near where they grew up in Massachusetts. Brown, a co-author of the 2020 CDT report on hiring discrimination, called Thursday's guidance a major victory, after years of advocacy, for people with disabilities like them.