
New EEOC Guidance on the Use of Software, Algorithms, and AI in Talent Assessment Practices

Read our recommendations on how employers can comply with the new EEOC guidance to ensure fair employment practices, especially when using AI for hiring.

The U.S. Equal Employment Opportunity Commission (EEOC) recently released new guidance about how employers’ use of artificial intelligence (AI) and other software tools to make employment decisions may result in unlawful discrimination under the Americans with Disabilities Act (ADA).

This guidance stems from the concern that algorithmic decision-making tools, when used to hire, monitor performance, determine pay, or establish other terms and conditions of employment, may discriminate against people with disabilities. The EEOC's guidance places responsibility on the employer administering a test to communicate transparently with candidates before an assessment is administered, so that candidates can request accommodations if needed.

Several factors led the EEOC to address this topic. First, workers with disabilities face an ongoing unemployment challenge: the Bureau of Labor Statistics’ September 2022 data showed an unemployment rate of 3.1% for individuals with no disability, compared with 7.3% for those with a disability. Second, the use of AI-based decision-making tools in employment practice is on the rise and predicted to keep growing, hence the need for legal guidance surrounding it.

At SHL, we are committed to helping organizations create a diverse, agile, and innovative workforce through the use of inclusive assessments. To that end, we summarize the EEOC's guidance below and offer our own recommendations for how employers can follow these best practices as they create their talent assessment programs.

What are some kinds of tools or assessments covered by the guidance?
  • Resume screening tools
  • Virtual asynchronous interviews that screen candidate responses
  • Computerized tests that measure abilities, personality, traits, or characteristics, including the use of games
  • Video interviewing that evaluates facial expressions or speech patterns


According to the new guidelines, how can algorithmic decision-making violate the ADA?
  1. Reasonable accommodations are not provided by the employer that would allow the applicant or employee to be rated fairly and accurately by the technology used.
  2. The technology used by an employer to administer a test is intentionally or unintentionally screening out candidates that would be able to perform the job successfully with reasonable accommodation. The selection tool itself is acting as a barrier to employment by not providing an accurate representation of candidate potential.
  3. The technology is being used to make “disability-related inquiries” or seek information that qualifies as a “medical examination”. An assessment includes “disability-related inquiries” if it asks questions that are likely to uncover information about a disability or directly asks an individual to disclose a disability. An algorithmic decision-making tool that could be used to identify an applicant’s medical conditions would violate these restrictions if it were administered prior to a conditional offer of employment.

All of this raises the question of what employers should do when using decision-making software to make selection decisions. The EEOC provides a non-exhaustive list of promising practices, and based on the new guidance we have summarized our top three recommendations for ensuring the responsible use of software, algorithms, and artificial intelligence in talent assessment programs:

Encourage requests for accommodations
  • Use inclusive language to inform candidates that reasonable accommodations are available. We know from our own research that individuals are hesitant to disclose a disability, even in a low-stakes practice assessment context. Inclusive language can encourage candidates to feel accepted and comfortable to request accommodations when needed.
  • Provide clear information about the accommodation process, making sure candidates know how to request accommodations and what accommodations are available.
  • Provide documentation that outlines the traits being assessed, how they are being assessed, and any factors and/or disabilities that could potentially impact candidates’ ability to perform on the assessment so that candidates are informed about when they may require a reasonable accommodation.
  • Ensure that staff are knowledgeable about offering appropriate alternatives when a reasonable accommodation is requested, and that they respond promptly to provide a positive and inclusive experience for the candidate.
  • Follow up with candidates after implementing an accommodation to ensure it met their needs.
Design with accessibility in mind
  • Consider using screening tools developed with universal design. This reduces the possibility that the assessment itself will pose a barrier to demonstrating job-related skills, abilities, and behaviors. When tools are designed with inclusion in mind, fewer accommodations may be needed, which helps remove the burden of disclosure from the candidate.
  • Test the assessment for accessibility by using assistive technology to identify any potential barriers or issues.
  • Make collecting feedback a priority to ensure the tool continues to provide a fair and inclusive experience for all candidates.
Know what you are measuring
  • Ensure that the tool only measures abilities or qualifications that are essential for the job through job analysis.
  • Determine if someone could be successful on the job demonstrating some, but not all, of the skills, abilities, or behaviors being assessed.
  • Confirm that the tool measures only the essential skills, abilities, or behaviors of the job. Avoid tools that assess characteristics that may be nice to have, or related to performance, but are not essential for it.

We are committed to creating and maintaining an inclusive approach to talent assessment where everyone has the same access to opportunities. It is our duty to highlight candidates’ full potential, helping your business create a diverse, inclusive, and innovative workplace through accessible talent solutions. We will continue to demonstrate our commitment to diversity, equity, and inclusion by partnering with others, engaging in research, and upskilling our teams to integrate best practices into talent acquisition and talent management processes.

SHL has been helping organizations around the globe reduce biases in hiring through its scientific approach. Download our report on our commitment to diversity, equity, and inclusion.



McKenzie Specht, M.A.

McKenzie Specht is a Scientist at SHL and has been with the organization since May 2022. McKenzie is an active contributor to SHL’s Neurodiversity Research Program, which is dedicated to researching how the personnel selection process may be uniquely different for a neurodivergent candidate than that of someone who is neurotypical. This research aims to inform best practices for employee selection to create a more fair and inclusive experience. McKenzie received her M.A. in IO Psychology from Minnesota State University, Mankato.

Explore SHL’s Wide Range of Solutions

With our platform of pre-configured talent acquisition and talent management solutions, maximize the potential of your company’s greatest asset—your people.

See Our Solutions