In-House and Employment Lawyers Beware!

While the U.S. Chamber of Commerce has been promoting the use of AI for hiring, the EEOC was busy settling its first-ever AI discrimination lawsuit. As first reported by the MSBA in EEOC Issues Guidance on AI Use in Employment Decisions, in May 2023 the EEOC issued guidance on the use of artificial intelligence in hiring, transfer, demotion, performance monitoring, pay determination, and other employment selection procedures. The EEOC warned that, without proper safeguards, these AI tools may have a disparate impact on certain minority groups. Consequently, employers were placed on notice that any software, algorithm, or AI tool they use for screening or for any other employment decision may produce discriminatory outcomes. The EEOC recommended that employers frequently audit the systems they use to ensure the technologies are not violating pre-existing law on disparate impact.

Fast forward to the first week of August 2023, when the EEOC disclosed in a New York federal court filing that iTutorGroup Inc. agreed to pay $365,000 to settle the EEOC’s lawsuit filed on behalf of more than 200 applicants, alleging that its AI-powered hiring selection tool was programmed to automatically reject women applicants over 55 and men over 60 in violation of the Age Discrimination in Employment Act (ADEA).

This is the first of many boundaries the EEOC and other government agencies are drawing to limit bias and discrimination in the use of AI. See ABA Guidelines for AI that Maryland Lawyers Should Know. The goal should be to strike a balance that promotes technological advancement while safeguarding the rights and interests of individuals and society as a whole. The Society for Human Resource Management reports that approximately 79 to 85 percent of employers now use some form of AI in recruiting and hiring. AI can help automate aspects of the screening process. The problem lies not in the use of AI for recruiting and hiring, but in the data fed into the algorithm, or the programming of the algorithm, used to screen applicants.

Ensure Your Recruiting And Screening Tools Are Not Biased

Use facially neutral parameters in your AI recruiting and screening tools that do not violate existing discrimination laws. For example, do not program the algorithm to exclude individuals in legally recognized protected classes. Many employers program their AI screening tools to perform a skills assessment, which can select or exclude candidates for legitimate, unbiased reasons. All recruiting and screening AI tools should be audited frequently to ensure compliance with state and federal anti-discrimination laws. It is also wise to keep a human involved in the selection process to exercise sound judgment. This settlement is a wake-up call for all employers and their counsel to ensure that any AI used in the recruiting and hiring selection process does not result in discrimination.
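To make the auditing advice above concrete, one common screening test is the "four-fifths rule" of thumb from the EEOC's Uniform Guidelines on Employee Selection Procedures: a selection rate for any group that is less than 80 percent of the rate for the highest-scoring group may be evidence of adverse impact. The sketch below is a minimal, hypothetical illustration of that arithmetic — the function names, group labels, and counts are invented for the example and are not real hiring data or a substitute for a legal compliance review.

```python
# Hypothetical sketch of a disparate-impact check using the EEOC's
# "four-fifths rule" of thumb. All names and numbers are illustrative.

def selection_rates(outcomes):
    """outcomes maps group -> (selected, total applicants); returns group -> rate."""
    return {group: selected / total for group, (selected, total) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Flag each group whose selection rate falls below `threshold`
    times the highest group's selection rate."""
    rates = selection_rates(outcomes)
    benchmark = max(rates.values())
    return {group: (rate / benchmark) < threshold for group, rate in rates.items()}

# Illustrative screening results from an AI tool (not real data)
results = {
    "group_a": (48, 100),  # 48% selected
    "group_b": (30, 100),  # 30% selected; 0.30 / 0.48 = 0.625, below 0.8
}
flags = four_fifths_check(results)
print(flags)  # {'group_a': False, 'group_b': True}
```

A flagged group does not by itself prove unlawful discrimination, but under the guidance discussed above it is exactly the kind of result that should prompt a closer review of the tool's training data and selection criteria.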