Four federal agencies recently pledged to collaborate closely to prevent discrimination resulting from the use of artificial intelligence and automated decision tools in the workplace, signaling the heightened scrutiny that employers deploying AI have faced over the past year.
The U.S. Equal Employment Opportunity Commission (EEOC), the U.S. Department of Justice (DOJ), the Consumer Financial Protection Bureau (CFPB) and the Federal Trade Commission (FTC) released a joint statement and held a press conference on April 25, 2023, highlighting their commitment to enforcing existing civil rights and consumer protection laws as they apply to AI in the workplace.
“We are going to hold companies responsible for deploying these technologies, so it is all in compliance with existing law,” said CFPB Director Rohit Chopra. “We have such a clear focus today on making sure that this does not become a vector for infringing civil rights.”
Kristen Clarke, assistant attorney general for civil rights at the DOJ, called the coordination among the agencies a “whole-of-government approach.”
Potential Discrimination
Some employers have started using AI and algorithms to inform decisions about recruiting, interviewing, hiring, pay and promotions. However, AI has the potential to perpetuate unlawful bias, automate unlawful discrimination and produce other harmful outcomes based on race, gender, disability or other protected characteristics, the joint statement noted.
“Unchecked AI poses threats to fairness in ways that are already being felt,” Chopra said. “While machines crunching numbers might seem capable of taking human bias out of the equation, that’s not what’s happening.”
Outcomes from AI tools, including employment decisions, can be skewed by datasets with unrepresentative or imbalanced data, historical bias, or other types of errors, the joint statement noted.
For example, AI can result in discriminatory employment decisions if it relies on datasets based on a workforce that is predominantly white and male, Clarke explained.
“AI poses some of the greatest modern-day threats when it comes to discrimination today,” she said. “We have an arsenal of bedrock civil rights laws that do give us the accountability to hold bad actors accountable.” Those laws include the Civil Rights Act of 1964, the Americans with Disabilities Act (ADA), the Fair Credit Reporting Act, and the Equal Credit Opportunity Act.
“Expansion of AI and automated systems cannot come at the price of civil rights and racial equity,” Clarke said. “We must be vigilant these tools do not exacerbate racial inequities or perpetuate the racial divide.”
The EEOC’s latest draft Strategic Enforcement Plan outlined the agency’s priorities, including preventing technology from perpetuating discrimination.
Employer Liability
In the press conference, federal agency leaders made clear that employers can be held liable for the ways they deploy AI tools developed by technology companies. Companies aren’t off the hook just because they didn’t develop the tool themselves. “Claims of innovation must not be cover for legal violations,” said FTC Chair Lina Khan. “There is no AI exception to the laws on the books.”
Some employers use AI to present games or tests that screen job applicants. However, such tools may screen out people with disabilities or prevent them from seeking a reasonable accommodation, which is illegal, said EEOC Chair Charlotte Burrows.
Furthermore, it’s unlawful for employers to ask questions about disabilities during the application process, so employers cannot use AI tools in a way that inadvertently uncovers information about an applicant’s disability.
For example, an AI tool programmed to screen for certain personality traits might ask an applicant, “Do you wake up in the morning feeling optimistic about the day?” That would be illegal if it screens out people with chronic depression, which may qualify as a disability under the ADA, Burrows said.
“Rapid adoption of AI and other automated systems has truly opened a new civil rights frontier,” she said. “We cannot leave the American people unprotected.”