EEOC Digs into Harassment, Pay Discrimination and AI

Sexual harassment, pay discrimination and artificial intelligence (AI) are among the priorities that the U.S. Equal Employment Opportunity Commission (EEOC) will focus on this year in its guidance and enforcement efforts.

The agency is carefully considering comments from employers and the public right now as it finalizes its latest strategic enforcement plan, EEOC Vice Chair Jocelyn Samuels told attendees at the SHRM Employment Law & Compliance Conference in Washington, D.C., on Feb. 28.

Sexual Harassment

Later this year, the EEOC plans to provide updated guidance for employers to prevent sexual harassment in the workplace, Samuels said. In the meantime, the agency offers best practices for employers in its 2017 guidance.

“In the wake of #MeToo, the number of sexual harassment charges that we received has really dramatically increased, and we know things now that we didn’t know” 10 years ago, Samuels said. The agency believes there is significant underreporting of sexual harassment, and “that is an issue that really concerns us.”

Industries such as retail, restaurants and fast food report higher rates of sexual harassment, which “often involves really vulnerable people, like teenagers,” Samuels said. Employers in those sectors should “take a look at your workplaces and ensure you have in place the kinds of processes you need to become aware of potentially illegal harassment and address it before it becomes a legal liability.”

Pay Discrimination

Even though pay discrimination is illegal in the U.S., “we still have a problem” with pay disparities, Samuels said. “Clearly, there is still work to be done.”

In 2021, women earned 82 cents for every dollar that men earned, according to the U.S. Government Accountability Office.

Under federal law, employers cannot pay workers differently based on their race, color, religion, sex, pregnancy, gender identity, sexual orientation, national origin, disability, age or genetic information.

“We know not every pay disparity is the product of discriminatory conduct,” Samuels said. “We’re committed to trying to address the discriminatory factors that may be leading to that portion of the disparity.”

Illegal pay discrimination can be difficult to detect and contest. “Often people don’t know what their co-workers are paid,” Samuels said. “It’s often invisible. That’s what led the EEOC in 2016 to adopt a pay data collection instrument.”

The agency halted its pay data collection in 2019, following employer complaints about the reporting being too burdensome. A 2022 study by the National Academies of Sciences, Engineering, and Medicine found the pay data was useful in identifying discrimination based on sex and race.

However, employers should know that the EEOC will never hold an employer liable for discrimination based solely on its pay data submitted to the agency, Samuels said. Instead, the EEOC wants to identify patterns of systemic discrimination and target geographic areas and industries “where outreach and education would be more useful.”

It remains unclear whether the EEOC will resume collecting pay data from employers. “If we move forward, we are committed to ensuring that the pay data we collect is salient and does offer a useful way to identify disparities that could be discriminatory and minimizing the burden on employers to the extent possible,” Samuels said. “Stay tuned for more information on that.”

Artificial Intelligence

The draft version of the EEOC’s strategic plan for fiscal years 2023-2027 includes a new focus on AI, which businesses are increasingly using to screen job candidates. Some employers use AI to scan through applicants’ social media posts and analyze their body language, eye contact, facial expressions and speaking tones during job interviews.

Using AI in employment decisions “opens a Pandora’s box of equal employment opportunity concerns,” Jim Banks, SHRM’s general counsel, said at the conference.

The EEOC, the Federal Trade Commission and the Justice Department are looking for ways to ensure AI tools and algorithms don’t discriminate against protected groups.

“You can expect to see a lot more from [the EEOC] over the next couple of years” regarding AI, Samuels said. “I really do appreciate how beneficial technology can be in promoting efficiency, streamlining processes, enabling employers to consider thousands of applicants. … Technology is not the villain, but we have to be sure we are using the technology that’s developing in ways that don’t violate the employment discrimination laws.”

AI tools could be considered discriminatory when they have a disparate impact on protected groups. If an AI tool gives a protected group a selection rate that’s less than four-fifths (or 80 percent) of the rate for the group with the highest rate, that’s generally considered a disparate impact.

However, the four-fifths rule is “just a rule of thumb,” and “it doesn’t mean that you’re off the hook” if you meet it, Samuels said.
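To make the rule of thumb concrete, here is a minimal sketch of the four-fifths calculation in Python. The group names and applicant counts are hypothetical, and, as Samuels notes, a passing ratio is only a rough screen, not a legal safe harbor:

```python
# Sketch of the "four-fifths" (80 percent) rule of thumb for disparate impact.
# Selection rate = selected / applicants per group; each group's rate is then
# compared to the highest group's rate. A ratio below 0.8 is a red flag, not a
# definitive finding of discrimination. All numbers here are hypothetical.

def selection_rates(groups):
    """groups: dict mapping group name -> (selected, applicants)."""
    return {name: selected / applicants
            for name, (selected, applicants) in groups.items()}

def four_fifths_ratios(groups):
    rates = selection_rates(groups)
    highest = max(rates.values())
    # Impact ratio: each group's selection rate relative to the highest rate.
    return {name: round(rate / highest, 3) for name, rate in rates.items()}

# Hypothetical applicant pools (selected, total applicants).
applicants = {
    "Group A": (48, 80),  # 60% selection rate
    "Group B": (12, 30),  # 40% selection rate
}
ratios = four_fifths_ratios(applicants)
for name, ratio in ratios.items():
    flag = "below 4/5 threshold" if ratio < 0.8 else "ok"
    print(f"{name}: impact ratio {ratio} ({flag})")
```

Here Group B's impact ratio is 0.4 / 0.6 ≈ 0.667, below the 0.8 threshold, so the tool's results would warrant a closer look.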

“One way that you can have a problem with bias is [if] the data used to train the AI tool is biased,” said Savanna Shuntich, a lawyer with Fortney & Scott in Washington, D.C., at the conference.

Employers should tell job applicants if AI will be used during the hiring process so they can request a disability accommodation if they need it, Samuels said. Some states require employers to provide notice or get consent from job applicants before they use AI tools as part of the hiring process.

When hiring assessments use AI, it’s not always clear to applicants what is being evaluated, which makes it hard for applicants to know when they should request an accommodation, Shuntich said. For example, if the AI is rating voice tone, a person with hearing loss might need an accommodation.

Record-keeping is another compliance concern when employers incorporate AI in employment decisions. Employers should “be intentional about what you’re keeping,” Shuntich said.

In addition, vendors are becoming an important factor in AI adoption. “Now more than ever, have close relationships with your vendors” regarding AI tools and record-keeping, especially as state and federal laws may change, Eric Dunleavy, vice president of employment and litigation services at DCI Consulting Group in Washington, D.C., said at the conference.
