How Equitable Technology Can Boost Diversity

Almost three-quarters of Americans oppose the use of artificial intelligence (AI) in hiring decisions, according to a 2023 survey by Pew Research Center, yet almost 25% of companies surveyed by SHRM report using AI for HR-related tasks.

To make advanced tech a useful, ethical, and legal part of people operations, employers can proceed with enthusiasm, as long as it’s tempered with caution. This was the subject of an executive panel titled “How Equitable Technology Can Boost Diversity,” which I moderated during From Day One’s November virtual conference.

Given the panelists’ expertise in talent acquisition, the conversation centered on how tech is used in hiring decisions, specifically how AI and bias-limiting tools are deployed.

When Tech Makes Decisions, and When People Make Decisions

Tech has proven it can outperform recruiters on many hiring tasks, like making connections across vast amounts of information to solve problems faster, said Rebecca Warren, who leads customer success at talent intelligence platform Eightfold.

There’s a difference between new AI-powered tools and the more familiar ones that use data analytics and machine learning. Warren considers AI to be largely proactive, while data analytics is reactive.

“AI are systems or machines that are replicating human intelligence,” she said. “Data analytics uses the insights, patterns, and trends from data to help make decisions. If you were to think about providing actionable information to improve operations, that would come from your data analytics, as opposed to using AI, which helps to either eliminate extra work or make connections faster. They should be used in conjunction, but they have different purposes.”

One signal of a good tech tool is that it streamlines the interview process, said Stacey Olive, VP of talent acquisition and employer branding at Medidata Solutions, which builds software for clinical trials. “Anything that levels the playing field is going to be helpful. Sometimes that’s the luxury of an applicant tracking system that allows you to customize a feedback form or to make sure that hiring can’t take place unless you have a diverse candidate slate.”

It appears that a sizable share of workers are comfortable with companies using AI to screen candidates. Pew found that 47% of Americans think AI would be better at evaluating job applicants than humans are. Only 15% believe it would be worse.

But there are some parts of the hiring process that even the most sophisticated tech can’t replicate, like networking. “Many people find their jobs through their network and by word of mouth. Some people don’t even like to apply, so you’ve got to always be networking, and that’s constant hard work,” Olive noted.

Spotting the Bad Actors, Finding the Good

Resist the allure of all things shiny and new, the panelists said. Don’t chase technology for technology’s sake, adopting a buzzy tech trend before you’re equipped to do it well. “Like any new technology, people need to think about the problem they’re trying to solve,” said Josh Brenner, CEO of job search platform Hired. Instead of finding a problem to fit the technology, find tech tools that solve problems you already know about.

Journalist Emily McCrary-Ruiz-Esparza moderated the panel among Josh Brenner of Hired, Rebecca Warren of Eightfold, Nicholas Mailey of Equinix, and Stacey Olive of Medidata Solutions (photo by From Day One)

Larger firms should make sure the tools they’re picking are set up for enterprise-level application, “especially when it comes to compliance,” said Brenner. “There are a lot of challenges within recruiting around privacy, salary transparency, and AI biases. Make sure your vendor can provide a third-party audit.” As of July 5, employers in New York City are required to submit AI tools used in hiring to an annual audit under the US’s first law designed to limit such bias. State and federal laws may follow, and some agencies already have recourse to challenge AI tools that cause harm.

All employers should look out for AI platforms making big, vague claims. “If your vendor can’t explain how the platform or the process works, that’s a red flag,” said Warren. “You need to make sure that the person who is telling you what you need to buy actually understands how it works.”

Not everything billed as AI is truly artificial intelligence. Some might be more accurately labeled as machine learning or data analysis. If you’re unsure of your vetting capabilities, bring in an expert. “Even if you’re a small organization, it’s absolutely worth the money to bring in a consultant to give you an independent view of whether that technology is going to solve your problems,” Warren said. “Even if it’s just five hours or one week, bring in an expert to make sure you’re not causing more harm than good.”

Governing Your Company Data

To better govern its data, enterprise network provider Equinix developed a governance board comprising representatives from information security, legal, IT, and HR analytics. The team built a process for reviewing the way data is handled, who has access (and whether they still need it), how it’s used, and who governs it. The group meets at least monthly, sometimes more, to review new data practices and audits.

“Then we run water through the pipes,” said the company’s VP of talent acquisition, Nicholas Mailey. “You want to play out the implementation of different solutions or processes, look at the outcome of those processes or practices to ensure that you know, to the extent that you can, that you have control and a sense for whether the outcome is ultimately equitable.”

The test-run exercise has worked, prompting the company to correct itself before making privacy-violating mistakes. “We started to implement AI technology in certain areas, then we candidly thought that we were out over our skis,” Mailey said. At the time, Equinix asked job seekers to self-disclose demographic information in the application process. So when Equinix wanted to create diverse talent pipelines for jobs, it was those self-disclosures they thought of first. But at disclosure, applicants had been assured that such data would not be factored into hiring decisions.

“Fortunately, we caught it,” he said. “But if we hadn’t had a governing board in place looking at how we were approaching these issues, we would have made the mistake of leveraging that data.” Instead, the company took its time and got it right. “It was another year and a half before we started implementing again.”

Recruiters like to move quickly and get things done. But HR isn’t a tech department (at least, not entirely), and it’s best to proceed with caution, Mailey said. “It’s safest for companies to go slow in order to go fast—and ensure you’re doing the right thing.”

Emily McCrary-Ruiz-Esparza is a freelance journalist and From Day One contributing editor who writes about work, the job market, and women’s experiences in the workplace. Her work has appeared in the BBC, The Washington Post, Quartz, Fast Company, and Digiday’s Worklife.