While New York City employers have more time to prepare for the enforcement of the automated employment decision tool (AEDT) law in April, there’s still doubt that companies will be ready to meet the law’s requirements. Some even suggest that the law will delay the use of artificial intelligence tools in hiring and promotion decisions.
The AEDT law, which passed in December 2021, had an enforcement date of Jan. 1, 2023, but that date has been pushed back to April 15, a decision that New York City employers hope will give them enough time to conduct bias audits on their AEDTs.
In an effort to address employers’, vendors’ and other stakeholders’ concerns, the city’s Department of Consumer and Worker Protection held its second public hearing on the AEDT law in January, but changes to the law have yet to be made.
The Society for Human Resource Management hosted a panel discussion about the law on Feb. 16, allowing the employer community to share feedback with council members and staff from the mayor’s office.
In the meantime, Terry Baker, president and CEO of recruitment marketing technology firm PandoLogic, thinks the delay is a sign that New York companies’ hiring systems may not meet the law’s requirements.
“There is recognition that the market is not ready for compliance and adoption,” he said. “Most employers have not yet audited the tools they use which will be subject to the AEDT governance. In addition, the NYC Department of Consumer and Worker Protection still needs to clarify what constitutes an adequate audit under the law.”
Baker’s organization will be subject to the law both as a company headquartered in New York City and as a vendor that offers its customers a talent acquisition platform and a conversational AI tool that automates the collection of information from candidates for the purpose of vetting them for specific job openings.
“We certainly fall under this law both as an employer and as a technology provider, and we are particularly attuned to what the requirements are,” Baker said.
However, because the requirements to successfully audit automated hiring systems aren’t clear, both employers and vendors are in a difficult position, he said.
“The law defines impact ratio but does not yet define the kind of auditing procedures that would determine whether the audit was done correctly or adequately. Vendors that provide these tools are not yet prepared to demonstrate compliance for that reason,” Baker said.
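To make the concept concrete, the impact ratio the law refers to is generally understood as the selection rate for one demographic category divided by the selection rate of the most-selected category. The sketch below is purely illustrative, not the city's prescribed audit procedure, and the group names and numbers are hypothetical.

```python
# Illustrative only: computing impact ratios from hypothetical selection data.
# An impact ratio compares each category's selection rate to the highest
# selection rate across categories; 1.0 means parity with the top group.

def impact_ratios(results: dict[str, tuple[int, int]]) -> dict[str, float]:
    """For each category, return (its selection rate) / (highest selection rate).

    results maps category name -> (candidates selected, candidates assessed).
    """
    rates = {cat: selected / assessed for cat, (selected, assessed) in results.items()}
    top_rate = max(rates.values())
    return {cat: rate / top_rate for cat, rate in rates.items()}

# Hypothetical audit data: 40 of 100 selected in one group, 24 of 100 in another.
ratios = impact_ratios({"group_a": (40, 100), "group_b": (24, 100)})
print(ratios)  # group_a has ratio 1.0; group_b's rate is 0.24 / 0.40 = 0.6
```

The open question Baker raises is not this arithmetic, which is straightforward, but what auditing procedures around it would count as adequate under the law.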
The law states that an automated employment decision tool is any computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that generates a score, classification, or recommendation used to substantially assist in making employment decisions.
Because employers are liable under the AEDT law, Baker said, it’s critical that vendors with AI and other automated tools used to score and select candidates explain how their tools work, but that’s not easy to do.
“Vendors must provide their customers with exposure and transparency because it’s the employers that have the liability and it’s the employers that have to communicate with the candidates what this process looks like to establish the fairness of the process. They can’t do that if they are using third-party tools that don’t provide that level of transparency,” Baker said.
He added that many companies rely on platforms that haven’t gone through an audit, observing: “it’s very difficult when using a third-party product to understand what’s happening within that code base.”
Hilke Schellmann, an assistant professor of journalism at New York University, and Mona Sloane, a senior research scientist at the NYU Center for Responsible AI, have concluded that AI in hiring is undergoing an earth-shaking revolution.
“Many Fortune 500 companies employ AI-based solutions to weed through the millions of job applications the companies receive every year,” stated Schellmann and Sloane in a project titled Holding Hiring Algorithms Accountable and Creating New Tools for Humanistic Research.
“The problem: many companies don’t want to reveal what technology they are using and vendors don’t want to reveal what’s in the black box, despite evidence that some automated decision making systems make biased and/or arbitrary decisions.”
As companies gauge the damage to their reputation and the costs they’ll incur if they are found in violation of the law, Baker believes employers in New York City should consider placing more responsibility on vendors and paying lawyers to add protective amendments to their service level agreements.
“If employers are smart, they are going to pass the onus of this law to the vendors. Anybody that is hiring at scale in New York City is investing in a lot of third-party products. The average company uses more than 10 third-party tools to enable their entire talent acquisition and employment process,” Baker said.
Data from Aptitude Research shows that 63 percent of companies are using more talent acquisition solutions today than in the pre-COVID-19 period. As talent acquisition tools evolve, AI remains the common denominator, researchers say.
The cost to employers of breaking the law is another consideration. Each violation of the AEDT law can cost companies up to $1,500. That’s a punishment Baker thinks will convince employers to delay their use of AI in hiring.
“I think the law will slow the adoption of AI, unfortunately, because there’s a lot of fear with regard to compliance, and there are some stiff penalties,” he said.
One company that is adamant that its technology will escape the law’s penalties is SeekOut, a Seattle-based company with a significant number of New York City clients that use SeekOut’s AI-driven talent acquisition and management platform.
According to Sam Shaddox, vice president and head of the legal department at SeekOut, the company has intentionally designed its AI solutions so that the technology doesn’t make hiring decisions, but instead finds diverse candidates for employers to reach out to.
“One of the clarifications that the draft implementing regulations make is that the audit and transparency requirements only apply to AI tools that are being used for candidates that have applied to a position,” Shaddox said.
Shaddox added that the AEDT law and similar legislation will force small and medium-size businesses, as well as vendors, to have a more robust compliance team.
To the question of whether the law will limit the use of AI tools in hiring, Shaddox said companies will take a variety of approaches to using AI and automation tools to support their hiring objectives.
“Some employers will reduce the use of AI systems, and some will have greater trust and will use AI systems more. Over the next two to four years, we will see how the law turns out before we start to see an emerging trend on whether employers are excited or not with their use of AI in hiring processes,” Shaddox said.
Nicole Lewis is a freelance journalist based in Miami.