Days after mass layoffs trimmed 12,000 jobs at Google, hundreds of former employees flocked to an online chatroom to commiserate about the seemingly erratic way they had suddenly been made redundant.
They swapped theories on how management had decided who got cut. Could a “mindless algorithm carefully designed not to violate any laws” have chosen who got the ax? one person wondered in a Discord post that The Washington Post could not independently verify.
Google says there was “no algorithm involved” in its job-cut decisions. But former employees are not wrong to wonder, as a fleet of artificial intelligence tools becomes ingrained in office life. Human resources managers use machine-learning software to analyze millions of employment-related data points, churning out recommendations of whom to interview, hire, promote or help retain.
But as Silicon Valley’s fortunes turn, that software is likely dealing with a more daunting task: helping decide who gets cut, according to human resources analysts and workforce experts.
A January survey of 300 human resources leaders at U.S. companies found that 98 percent of them say software and algorithms will help them make layoff decisions this year. And as companies lay off large swaths of workers — with cuts creeping into the five digits — the task becomes too big for humans to handle alone.
Big firms, from technology titans to companies that make household goods, often use software to find the “right person” for the “right project,” according to Joseph Fuller, a professor at Harvard Business School who co-leads its Managing the Future of Work initiative.
These products build a “skills inventory,” a powerful database on employees that helps managers identify what kinds of work experiences, certifications and skill sets are associated with high performers for various job titles.
These same tools can help in layoffs. “They suddenly are just being used differently,” Fuller added, “because that’s the place where people have … a real … inventory of skills.”
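To make that concrete, here is a minimal sketch, in Python, of what a query against such a skills inventory could look like. The data model, field names and scoring rule are illustrative assumptions, not any particular vendor's product.

```python
from dataclasses import dataclass, field


# Hypothetical, simplified "skills inventory" record: each employee entry
# links experience, certifications and skills so managers can query who
# best matches a given role or project profile.
@dataclass
class EmployeeProfile:
    employee_id: str
    title: str
    years_experience: float
    certifications: set[str] = field(default_factory=set)
    skills: set[str] = field(default_factory=set)


def match_score(profile: EmployeeProfile, required_skills: set[str]) -> float:
    """Fraction of a project's required skills this employee already has."""
    if not required_skills:
        return 0.0
    return len(profile.skills & required_skills) / len(required_skills)


def rank_for_project(profiles: list[EmployeeProfile],
                     required_skills: set[str]) -> list[tuple[str, float]]:
    """Rank employees by how well their recorded skills cover a project."""
    scored = [(p.employee_id, match_score(p, required_skills)) for p in profiles]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

In a layoff, the same inventory could simply be queried in reverse, ranking people by how little their recorded skills overlap with the work a company plans to keep: the “used differently” shift Fuller describes.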
Human resources companies have taken advantage of the artificial intelligence boom. Firms such as Eightfold AI use algorithms to analyze billions of data points scraped from online career profiles and other skills databases, helping recruiters find candidates whose applications might not otherwise surface.
Since the 2008 recession, human resources departments have become “incredibly data driven,” said Brian Westfall, a senior HR analyst at Capterra, a software review site. Turning to algorithms can be particularly comforting for some managers while making tricky decisions such as layoffs, he added.
Many companies use software that analyzes performance data. Seventy percent of HR managers in Capterra’s survey said performance was the most important factor when assessing whom to lay off.
Other metrics used to lay people off might be less clear-cut, Westfall said. For instance, HR algorithms can calculate which factors make someone a “flight risk,” or more likely to quit the company.
This raises numerous issues, he said. If an organization has a problem with discrimination, for instance, people of color may leave the company at higher rates, but if the algorithm is not trained to know that, it could consider non-White workers a higher “flight risk,” and suggest more of them for cuts, he added.
“You can kind of see where the snowball gets rolling,” he said, “and all of a sudden, these data points where you don’t know how that data was created or how that data was influenced suddenly lead to poor decisions.”
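Here is a toy illustration, in Python, of the snowball Westfall describes: a naive “flight risk” score built from historical quit rates inherits whatever discrimination shaped that history. The groups, numbers and scoring rule are invented for illustration.

```python
from collections import defaultdict

# Invented example data: (demographic_group, quit_within_a_year).
# If discrimination pushed group_b out at higher rates, that history is
# baked into the records before any model ever sees them.
historical_records = [
    ("group_a", False), ("group_a", False), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", True), ("group_b", False), ("group_b", True),
]


def naive_flight_risk(records):
    """Score each group by its historical quit rate, with no notion of *why* people quit."""
    quits, totals = defaultdict(int), defaultdict(int)
    for group, quit in records:
        totals[group] += 1
        quits[group] += int(quit)
    return {group: quits[group] / totals[group] for group in totals}


print(naive_flight_risk(historical_records))
# {'group_a': 0.25, 'group_b': 0.75} -- the score brands group_b a higher
# "flight risk" and, fed into a layoff tool, could push more of its members
# onto the list, reproducing the original discrimination.
```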
Jeff Schwartz, vice president at Gloat, an HR software company that uses AI, said his company’s software operates like a recommendation engine, similar to how Amazon suggests products, and helps clients figure out whom to interview for open roles.
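As a rough sketch of that recommendation-engine approach, the Python below scores how closely each person’s skill profile matches an open role using cosine similarity, the way a product recommender ranks items against a shopper’s history. The skill vectors, weights and names are assumptions made for illustration, not Gloat’s actual method.

```python
import math


def cosine_similarity(a: dict[str, float], b: dict[str, float]) -> float:
    """How closely two weighted skill profiles point in the same direction."""
    dot = sum(a[k] * b[k] for k in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


# Invented skill weights for an open role and two employees.
open_role = {"python": 1.0, "data_modeling": 0.8, "communication": 0.5}
candidates = {
    "emp_101": {"python": 0.9, "data_modeling": 0.7, "sales": 0.4},
    "emp_102": {"java": 0.8, "communication": 0.9},
}

# Rank people by similarity to the role and surface the top matches.
ranked = sorted(candidates.items(),
                key=lambda item: cosine_similarity(item[1], open_role),
                reverse=True)
print([name for name, _ in ranked])  # ['emp_101', 'emp_102']
```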
He doesn’t think Gloat’s clients are using the company’s software to draw up layoff lists. But he acknowledged that HR leaders need to be transparent about how they make such decisions, including how extensively algorithms were used.
“It’s a learning moment for us,” he said. “We need to uncover the black boxes. We need to understand which algorithms are working and in which ways, and we need to figure out how the people and algorithms are working together.”
The reliance on software has ignited a debate about the role algorithms should play in stripping people of their jobs, and how transparent employers should be about the reasons behind a job loss, labor experts said.
“The danger here is using bad data,” said Westfall, “[and] coming to a decision based on something an algorithm says and just following it blindly.”
But HR organizations have been “overwhelmed since the pandemic,” and they’ll continue using software to help ease their workload, said Zack Bombatch, a labor and employment attorney and member of Disrupt HR, an organization that tracks advances in human resources.
Given that, leaders can’t let algorithms alone decide whom to cut, and they need to review the software’s suggestions to make sure they aren’t biased against people of color, women or older workers — which would bring lawsuits.
“Don’t try to pass the buck to the software,” he said.