Many employers are using artificial intelligence (AI) and other software to evaluate job candidates, monitor employee performance, and even determine whom to promote or discharge. A recent Accenture and Harvard Business School study shows that 99% of Fortune 500 companies use applicant tracking systems. And among organizations with 1,000 or more employees, close to 70% also use recruitment management systems.

Automated decision-making tools (ADMTs), applicant tracking systems and recruitment management systems share a common goal: to make it easier to source and vet qualified candidates and identify the best matches. The decisions these tools inform are critical ones for hiring and career growth.

But while software-based testing and screening tools can ease the burden on your human resources team, they can also inadvertently disqualify candidates who are well suited for the work, including individuals protected under the Americans with Disabilities Act (ADA) because of a physical disability; a temporary, chronic or long-term illness; a mental health condition; or a developmental or neurological condition.

Federal compliance guidance just released

The Equal Employment Opportunity Commission (EEOC) recognizes that AI-based algorithms, machine learning and other ADMTs could lead to legal missteps. On May 12, 2022, the EEOC and Department of Justice (DOJ) released new guidance to help employers avoid disability discrimination claims.

Areas of concern

The DOJ’s guidance gives state and local government employers practical pointers on how to avoid disability discrimination claims stemming from the use of algorithms and AI in hiring. And the EEOC’s technical assistance document (TAD) offers similar advice for private employers.

The TAD advises employers to pay attention to three areas of concern:

  1. Reasonable accommodations that may be required when using decision-making algorithms
  2. Practices that could unlawfully screen out qualified workers under the ADA from being considered for jobs or promotions
  3. Whether AI software and algorithms require applicants and employees to provide information about disabilities or medical conditions (Such practices could constitute unlawful disability-related inquiries or lead to prohibited medical examinations.)

While neither the EEOC’s nor the DOJ’s newly released guidance has the force or effect of law, both documents give employers parameters for using ADMTs without violating federal disability law.

Background

In 2021, the EEOC launched an Artificial Intelligence and Algorithmic Fairness Initiative to ensure that the use of AI, machine learning and emerging technologies in hiring and other employment practices complies with federal equal employment opportunity (EEO) laws. The release of the TAD and the DOJ’s guidance signifies a key step in setting the stage for future EEO enforcement.

A recent audit of employment algorithms by the Brookings Institution found that natural language analyzers for resumes and interview transcripts showed bias against people with disabilities. It also found that commercial facial recognition tools revealed “clear disparities” that were “highly concerning” for that group. Brookings added that Black, female and older workers were also negatively impacted, and it warned that “small biases in individual algorithms” could lead to larger, systemic issues for employers.

This issue will likely remain a source of concern for employers, and rightly so. EEOC Chair Charlotte Burrows has said that new technologies shouldn’t become pathways to discrimination.

Practical pointers for avoiding disability discrimination claims

Given that both the EEOC and DOJ have issued guidance on this issue, now is a good time to evaluate your AI tools and software. Specifically, consider whether:

  • The selection and performance criteria are job-related and consistent with business necessity.
  • Certain scanning algorithms might introduce bias.

While these are the main criteria for remaining in compliance with the ADA, the EEOC’s guidance offers some specific tips on how to avoid disability discrimination claims.

Grant reasonable accommodations

First, evaluate whether employment screening tests are timed or require certain equipment. The EEOC explains that a timed test requiring the use of a keyboard might unfairly disadvantage a candidate who has a condition that precludes them from typing quickly. Similarly, a visual memory test could unfairly disadvantage someone with a visual impairment.

In cases like these, be mindful of the types of reasonable accommodations you may need to make to accurately evaluate an individual’s aptitudes.

Here are some common impairments that may require a reasonable accommodation:

  • Visual and hearing impairments
  • Manual dexterity impairments
  • Cognitive impairments

Encourage applicants and employees to speak up if they need an accommodation. If someone raises a concern, ask what they think would help them perform the task in question. Remember that you may ask for reasonable documentation or additional information, the EEOC notes.

The EEOC’s guidance includes several examples of reasonable accommodations, including:

  • Specialized equipment
  • Alternative tests and testing formats
  • Permission to work in a quiet place
  • Other exceptions to workplace policies

When communicating about reasonable accommodations, employers should:

  • Be open and upfront. If an evaluation is part of the application process, the EEOC recommends communicating the manner of evaluation upfront so the applicant has time to request an accommodation if one is needed. The EEOC calls this a “promising practice” for avoiding ADA claims.
  • Engage in an interactive process by promptly addressing requests for reasonable accommodation. If an employer flat-out refuses a request, applicants and employees are advised to reach out to the EEOC about next steps.

Make sure your software isn’t screening out protected individuals

The main thing to watch out for is how certain technologies may screen out individuals with disabilities.

Examples of applications that could lead to legal issues include:

  • Chatbots
  • Problem-solving ability tests
  • Video interviewing
  • Personality assessments
  • Facial recognition software

Here’s a closer look at how such applications could be problematic:

A chatbot’s binary format could be discriminatory if it screens someone out solely because they answered “yes” or “no” to a question.

For example, imagine a chatbot disqualifies a job candidate because they answered “no” to a question about being able to stand for extended periods. The applicant might have been unlawfully disqualified since a person with an ADA-protected disability who can perform the job’s essential functions is entitled to a reasonable accommodation, such as a chair.
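To make the logic concrete, here is a minimal sketch in Python of the two patterns described above. The question text, function names and outcomes are purely hypothetical and do not come from the EEOC guidance itself; the sketch simply contrasts a rigid binary screen-out with a workflow that routes a “no” answer to a human reviewer so the interactive accommodation process can begin.

```python
# Hypothetical chatbot screening logic; the question and labels are
# illustrative only, not drawn from any real screening product.

SCREENING_QUESTION = "Can you stand for extended periods?"

def rigid_screen(answer: str) -> str:
    """The problematic pattern: a 'no' answer immediately disqualifies
    the candidate, with no chance to discuss accommodations."""
    return "rejected" if answer.lower() == "no" else "advanced"

def accommodation_aware_screen(answer: str) -> str:
    """A safer pattern: a 'no' answer is routed to a human reviewer,
    who can start the interactive accommodation process (e.g., a chair)."""
    if answer.lower() == "no":
        return "flagged_for_human_review"
    return "advanced"

print(rigid_screen("no"))                # rejected -> potential ADA exposure
print(accommodation_aware_screen("no"))  # flagged_for_human_review
```

The point of the second pattern is that the answer alone never ends the candidacy: a person decides whether an accommodation, such as the chair in the example above, would let the candidate perform the job’s essential functions.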

The EEOC also gives an example of an employer using a computer program that analyzes speech patterns to test an employee’s problem-solving skills. The worker stutters, and as a result the algorithm disqualifies them from a promotion. The employee could have a viable ADA claim.

Facial recognition software and personality assessments might also unfairly disadvantage people with certain conditions or traits. For example, such tools might disparately impact workers with major depressive disorder (MDD) or post-traumatic stress disorder.

The EEOC gives an example of a job applicant who has MDD and is required to take a personality test that asks questions about optimism. If that person is screened out based on their answers rather than their ability to perform the essential job functions, that could violate the ADA.

To prevent ADA discrimination claims, audit any technology that scores applications or tests. Consider whether it could lead to a discrimination claim under the ADA.

In 2021, New York City became the first municipality in the nation to enact legislation requiring employers to examine their ADMTs. That law (Local Law 144 of 2021, introduced as Int. 1894-2020) took effect January 1, 2023, and could indicate how other jurisdictions will address the issue.

For instance, covered employers operating in New York City must publish a summary of bias audit results on their websites, disclose the type of data their ADMTs collect, notify candidates before using such tools, and give candidates the opportunity to request an accommodation. Since other jurisdictions are likely to follow suit, it’s a good idea to budget for algorithmic audits; several audit firms are already carving out a niche in this growing industry.

Algorithmic audits can also help you determine whether your evaluations are having a disparate impact on a protected class, such as those covered under Title VII of the Civil Rights Act. You can assess whether an ADMT is resulting in disparate impact discrimination by comparing average results among different demographic groups, the EEOC explains.

“If the average results for one demographic group are less favorable than those of another (for example, if the average results for individuals of a particular race are less favorable than the average results for individuals of a different race), the tool may be modified to reduce or eliminate the difference,” it adds.
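As a rough illustration of the comparison the EEOC describes, the sketch below averages hypothetical ADMT scores by demographic group and flags any group whose average falls well below the most favorable group’s. The data, group labels and the 0.80 flagging threshold are all made up for illustration; this is no substitute for a formal bias audit.

```python
# Illustrative disparate-impact check: compare average ADMT scores across
# demographic groups. All data and the flagging cutoff are hypothetical.
from statistics import mean

scores_by_group = {
    "group_a": [82, 74, 90, 68, 77],
    "group_b": [55, 62, 58, 60, 63],
}

averages = {group: mean(vals) for group, vals in scores_by_group.items()}
best = max(averages.values())

for group, avg in averages.items():
    ratio = avg / best
    # The 0.80 cutoff is an arbitrary illustrative threshold, not a legal rule.
    status = "review for possible disparate impact" if ratio < 0.80 else "ok"
    print(f"{group}: avg={avg:.1f}, ratio={ratio:.2f} -> {status}")
```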

But this type of evaluation differs from the analysis that would be required to determine if disability bias has occurred. “If an employer or vendor were to try to reduce disability bias in the way described above, doing so would not mean that the algorithmic decision-making tool could never screen out an individual with a disability.”

Since each disability is unique and can have different impacts (or no impact at all) on performance, the EEOC advises employers to “take different steps beyond the steps taken to address other forms of discrimination.”

If you are using an external provider to administer an ADMT on your behalf, instruct them to send any requests for accommodation to you as soon as possible, says the EEOC. Alternatively, you could enter into an agreement with the provider to administer reasonable accommodations on your behalf, as long as they comply with the ADA.

Be aware that using an ADMT developed by an outside vendor generally doesn't shield an employer from liability. The EEOC’s guidance states that employers can be held liable for their agents, which may include entities like software providers, if the employer gives that entity the authority to act on its behalf.

Before you purchase an ADMT, ask the vendor to confirm that the tool doesn’t ask questions likely to elicit information about disabilities or seek information about physical or mental impairments or health. (Such inquiries must be directly related to a request for reasonable accommodation.)

Also think about whether the algorithm could disadvantage protected individuals by screening out traits or characteristics that are correlated with certain disabilities.

Do not retaliate

Finally, do not retaliate against anyone who raises concerns or complains that your ADMTs are violating the law. Under the ADA, complaints of discrimination are protected activities, as are requests for reasonable accommodation.

Use AI judiciously

ADMTs can be an effective way to ease the administrative burdens of recruiting and managing talent. But you need to use them judiciously. Make sure you’re measuring necessary qualifications and abilities directly, rather than excluding individuals based on characteristics or scores that are merely correlated with them.

If you have any doubt about whether your selection criteria or testing methods might unjustly disqualify a worker protected under the ADA, err on the side of caution. And be ready to adopt protocols to meet reasonable accommodation requirements under federal, state and local laws.


Blue Ridge Risk Partners is a top 75 independent insurance agency in the United States. With 21 offices throughout Maryland, Pennsylvania, and West Virginia and access to hundreds of carriers, we are able to meet your unique insurance needs.