What’s on Your Mind? Bosses Are Using Artificial Intelligence to Find Out
AI tools give companies instant insights from employee
surveys that once took months to process
By Imani Moise
March 28, 2018 11:35 a.m. ET
Human-resource departments are becoming a bit less human
as companies turn to artificial intelligence for help with hiring and
firing—and to learn how employees really feel about their bosses.
Every year at SPS Companies Inc., most of the steel
processor’s 600 employees, from warehouse staffers to top executives, fill out
a 30-minute confidential survey that asks, among other things, whether they
feel micromanaged and whether they feel their managers support their
professional growth. One question challenges survey-takers to gauge how
respected and valued they feel within the organization.
This year, for the first time, the Manhattan, Kan.-based
company tapped an artificial-intelligence tool called Xander to analyze
responses. Xander can determine whether an employee feels optimistic, confused
or angry, and provide insights to help manage teams, the tool’s developers at
Ultimate Software Group Inc. said.
The software analyzes free-text answers to
open-ended questions, using language patterns and other data to assign
attitudes or opinions to the employees who wrote them.
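In its simplest form, this kind of text analysis can be illustrated
with a lexicon-based scorer. The short Python sketch below is a
hypothetical illustration with made-up word weights and labels; it is
not Xander's proprietary model, which is far more sophisticated.

    # Hypothetical lexicon-based sentiment scorer; the words, weights and
    # labels are illustrative assumptions, not the vendor's implementation.
    LEXICON = {
        "supported": 2, "valued": 2, "respected": 1, "fair": 1,
        "micromanaged": -2, "overwhelmed": -2, "confused": -1, "angry": -2,
    }

    def score_response(text):
        """Label a survey answer by summing the weights of known words."""
        words = text.lower().replace(",", " ").replace(".", " ").split()
        total = sum(LEXICON.get(w, 0) for w in words)
        if total > 0:
            return "optimistic"
        if total < 0:
            return "frustrated"
        return "neutral"

    print(score_response("My manager is fair and I feel respected."))   # optimistic
    print(score_response("The plan left everyone confused and overwhelmed."))  # frustrated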
One top executive at SPS learned from recent survey
analysis that he needed to work on his temper. “One of my lowest scoring items
was maintaining my composure under stress,” he said of the feedback from his
direct reports. On the bright side, Xander reported that the manager’s staff
felt he was fair and honest.
Research shows that emotions are key to understanding
what motivates employees. How people feel often determines if they go above and
beyond in the workplace or underperform, said Jason Hite, chief people
strategist at HR consultancy Daoine Centric LLC. It can also explain why people
leave.
Companies have used technology to track employee actions
and help boost productivity for years, but now some are turning to software to
sniff out differences between what employees say and how they feel.
At First Horizon National Corp., a regional bank based in
Memphis, it once took a team of six human-resources staffers three months to
pore over 3,500 surveys. Managers would take another five months to submit
action plans based on the data.
“By the time they got started we were getting ready to do
another survey,” said Mario Brown, manager of leadership assessment and
development at First Horizon.
Using Xander, First Horizon could slice and dice the
feedback as soon as the survey closed. One insight the company gained from the
survey was that it needed to work on its training program.
Steel company SPS streamlined its health-care plan
offerings after survey results showed the options confused and overwhelmed
employees. HR staffers have used some of the time saved processing survey
results to start new mental and physical health initiatives for employees,
including a wellness blog.
More than 40% of employers world-wide have implemented
artificial intelligence processes of some kind, according to a recent study
from Deloitte.
But as AI tools infiltrate HR departments, regulators are
struggling to keep up.
A number of software companies including HireVue Inc. and
Syndio offer artificial-intelligence tools to help make decisions about hiring,
firing and compensation. That worries employees who are wary of being
psychoanalyzed by software, and some employment lawyers fret that AI programs
might contain biases that could lead to workplace discrimination.
“I’m fully aware of a handful of people who didn’t want
to take the survey because they had a fear of being tracked,” said Corey
Kephart, vice president of human resources at SPS.
Since most emotions are communicated nonverbally,
programs that rely solely on text can miss the bigger picture, said Julie
Albright, a digital sociologist at the University of Southern California.
Artificial intelligence might one day be trained to recognize signs of
depression and other emotions in facial expressions and voice tones, she said,
but the technology isn’t there yet.
Any algorithmic bias is likely to have an outsize impact
on minorities and other protected classes of employees, said Garry Mathiason,
an attorney at Littler Mendelson P.C. who specializes in artificial
intelligence and employment law. A hiring algorithm might notice a higher rate
of absences for people with disabilities and recommend against employing them,
for example.
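As a hypothetical illustration of that proxy effect, the sketch below
uses invented data and an invented cutoff, not anything drawn from a
real hiring tool. It shows how a rule that screens on absence days
alone can fail the "four-fifths" rule of thumb regulators use to flag
adverse impact, even though the rule never sees group membership.

    # Synthetic (group, absence_days) pairs. Group "A" stands in for a
    # protected class whose members average more absences.
    applicants = [
        ("A", 8), ("A", 6), ("A", 7), ("A", 3), ("A", 9),
        ("B", 2), ("B", 4), ("B", 1), ("B", 3), ("B", 5),
    ]

    CUTOFF = 5  # assumed rule: advance only applicants with 5 or fewer absences

    def selection_rate(group):
        days = [d for g, d in applicants if g == group]
        return sum(d <= CUTOFF for d in days) / len(days)

    ratio = selection_rate("A") / selection_rate("B")
    # Selection rates: A = 0.2, B = 1.0. A ratio below 0.8 would flag
    # adverse impact under the informal four-fifths guideline.
    print(ratio)  # 0.2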
The Equal Employment Opportunity Commission, the U.S.
regulator that enforces laws preventing workplace discrimination, hasn’t issued
official rules determining how artificial intelligence can be used in
human-resource decisions, but a panel convened by the agency in 2016 concluded
that the technology could create new barriers to opportunity.
Mr. Mathiason said he expects official guidelines from
the EEOC in the near future. In the meantime, companies can avoid legal gray
areas by keeping human review as part of any AI-enabled decision-making process
and by disclosing how the AI is being used, he said.
Though confidential, the surveys aren’t anonymous. Xander
can take into account an employee’s demographic data, previous surveys, and
other background information when analyzing responses. Ultimate Software said
the tool has safeguards in place to protect confidentiality. For example, a
manager may need a certain number of direct reports to respond to a survey
before gaining access to verbatim responses, making it harder to identify who
said what.
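A minimal sketch of how such a response threshold might work, assuming
a cutoff of five respondents; the article does not disclose Ultimate
Software's actual number or logic.

    MIN_RESPONSES = 5  # assumed cutoff; the real threshold isn't disclosed

    def verbatims_for_manager(responses):
        """Release verbatim comments only when enough direct reports have
        answered, so no single response can be traced to its author."""
        if len(responses) < MIN_RESPONSES:
            return None  # suppressed: too few responses to stay confidential
        return [r["comment"] for r in responses]

    # With only one response, the comment stays hidden from the manager.
    print(verbatims_for_manager([{"comment": "I feel micromanaged."}]))  # None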
The company said Xander can’t always get it right, but neither can people.
The tool still relies on humans to pick up body-language cues,
and “even humans only catch sarcasm half the time,” said Suhail Halai, Ultimate
Software’s head of customer experience.