
EEOC Issues Guidance on Use of Artificial Intelligence in Hiring

Posted on August 5, 2022

The U.S. Equal Employment Opportunity Commission (EEOC) issued new guidance this past May on the use of software, algorithms, and artificial intelligence in assessing job applicants and employees. The guidance should give employers pause.

The guidance says that employer use of tools that rely on algorithmic decision-making may violate the Americans with Disabilities Act (ADA).

When HR Uses AI

The EEOC guidance includes definitions of "software," "algorithm," and "artificial intelligence." Examples of software that may cause problems for the employer include:

  • Resumé scanning programs that prioritize applications using certain keywords;
  • Monitoring software that rates employees based on their keystrokes or other factors;
  • Video interviewing software that evaluates candidates based on facial expressions and speech patterns; and
  • Testing software that provides “job fit” scores regarding personality, aptitude, or cognitive skills.

If an employer administers a pre-employment test, the employer may be liable under the ADA if the test discriminates against individuals with disabilities, even when the test was developed by an outside vendor. In addition, employers may be held responsible for the actions of their agents, which may include entities such as software vendors, if the employer has given them authority to act on its behalf.

For example, algorithms and AI may violate the ADA when the employer uses:

  • “gamified” tests in which video games are played to measure abilities, personality traits, and other qualities. If an employer requires a certain score on a gamified memory assessment, a blind applicant wouldn’t be able to see the screen to play the games. The applicant may have a good memory and may be perfectly capable of performing the essential functions of a job requiring good memory.
  • an algorithmic decision-making tool that "screens out" an individual based on a disability even though the person could perform the job with a reasonable accommodation. Screening out occurs when a qualified applicant or employee loses a job opportunity because a disability prevents them from meeting, or lowers their performance on, a selection criterion.
  • video interviewing software that analyzes applicants' speech patterns to assess problem-solving ability, which wouldn't fairly score an individual whose speech impediment causes significant differences in speech patterns. If the applicant were rejected because of a low score caused by the speech impediment, the person may have been improperly screened out.
  • a form of testing that doesn’t provide a reasonable accommodation that is necessary for a job applicant or employee to be rated fairly and accurately by the algorithm. An applicant may have difficulty taking a knowledge test that requires using a keyboard or trackpad. Therefore, that type of test wouldn’t accurately measure the applicant’s knowledge. According to the agency, the employer would need to provide an accessible version of the test as a reasonable accommodation, such as a test that allows oral responses, unless it would cause an undue hardship.
  • an algorithmic decision-making tool that constitutes a disability-related inquiry or medical examination before giving a candidate a conditional offer of employment, even if an individual doesn’t have a disability. According to the guidance, an assessment includes “disability-related inquiries” if it asks job applicants or employees questions likely to elicit information about a disability or directly asks whether they have a disability. An assessment qualifies as a “medical examination” if it seeks information about an individual’s physical or mental impairments or health.

Promising practices that help avoid trouble

The guidance also identifies promising practices employers may adopt to comply with the ADA.

Promising practices to help employers provide a reasonable accommodation include:

  • Training staff to recognize and process requests for reasonable accommodation as quickly as possible, including requests to retake a test in an alternative format, or to be assessed in an alternative way, after the individual has already received poor results.
  • Training staff to develop or obtain alternative means of rating job applicants and employees when the current evaluation process is inaccessible or otherwise unfairly disadvantages someone who has requested a reasonable accommodation because of a disability.
  • If the algorithmic decision-making tool is administered by a third party, such as a testing company, asking that third party to forward all accommodation requests promptly to be processed by the employer in accordance with ADA requirements. Alternatively, the employer could seek to enter into an agreement with the third party requiring it to provide reasonable accommodations on the employer’s behalf.

Promising practices to minimize the chances that algorithmic decision-making tools will disadvantage individuals with disabilities, either intentionally or unintentionally, include:

  • Using algorithmic decision-making tools that have been designed to be accessible to individuals with as many kinds of disabilities as possible, thereby minimizing the chances that individuals with different kinds of disabilities will be unfairly disadvantaged.
  • Informing all job applicants and employees who are being rated that reasonable accommodations are available for individuals with disabilities and providing clear and accessible instructions for requesting such accommodations.
  • Describing, in plain language and in accessible formats, the traits that the algorithm is designed to assess, the method by which those traits are assessed, and the variables or factors that may affect the rating.

Promising practices to minimize the chances that algorithmic decision-making tools will assign poor ratings to individuals who are able to perform the essential functions of the job, with a reasonable accommodation if legally required, include:

  • Ensuring that the algorithmic decision-making tools only measure abilities or qualifications that are truly necessary for the job—even for people who are entitled to an on-the-job reasonable accommodation.
  • Ensuring that necessary abilities or qualifications are measured directly, rather than by way of characteristics or scores that are correlated with those abilities or qualifications.

Before purchasing an algorithmic decision-making tool, an employer should ask the vendor to confirm that the tool does not ask job applicants or employees questions that are likely to elicit information about a disability or seek information about an individual’s physical or mental impairments or health, unless such inquiries are related to a request for reasonable accommodation.

The ADA permits an employer to request reasonable medical documentation in support of a request for reasonable accommodation that is received prior to a conditional offer of employment, when necessary, if the requested accommodation is needed to help the individual complete the job application process.
Final thoughts
Employers need to take care in using new technologies to ensure they do not inadvertently violate laws such as the ADA. Companies considering using these technologies, whether on their own or through a third party, should carefully review the guidance before making any employment decision.

AIM members with questions about this or other human resource issues may contact the AIM Employer Hotline at 800-470-6277.