Using Algorithms to Determine Character
By QUENTIN HARDY
July 26, 2015 5:30 AM
Computers aren’t just doing hard math problems and
showing us cat videos. Increasingly, they judge our character.
Maybe we should be grateful.
A company in Palo Alto, Calif., called Upstart has over
the last 15 months lent $135 million to people with mostly negligible credit
ratings. Typically, they are recent graduates without mortgages, car payments
or credit card settlements.
Those are among the things that normally earn a good or
bad credit score, but these people haven’t been in the working world that long.
So Upstart looks at their SAT scores, what colleges they attended, their majors
and their grade-point averages. As much as job prospects, the company is
assessing personality.
“If you take two people with the same job and
circumstances, like whether they have kids, five years later the one who had
the higher G.P.A. is more likely to pay a debt,” said Paul Gu, Upstart’s
co-founder and head of product. “It’s not whether you can pay. It’s a question
of how important you see your obligation.”
The idea, validated by data, is that people who did
things like double-checking their homework or studying extra in case there was a
pop quiz are thorough and likely to honor their debts.
Analytics, meet judgment of people. “I guess you could
call it character, though we haven’t used that label,” said Mr. Gu, who is 24.
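To make the approach concrete, here is a minimal sketch in Python. It is not Upstart's model; the features (G.P.A., a scaled SAT score, years since graduation), the training numbers and the choice of a scikit-learn logistic regression are all invented, only to illustrate how school records could feed an estimate of repayment probability.

# A hypothetical sketch, not Upstart's system: school-record features
# feed a simple model that estimates the probability a borrower repays.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: G.P.A. (0-4), SAT score scaled to 0-1, years since graduation.
# The rows and labels are invented for illustration.
X_train = np.array([
    [3.8, 0.95, 1.0],
    [2.1, 0.60, 2.0],
    [3.2, 0.80, 0.5],
    [2.5, 0.55, 3.0],
])
y_train = np.array([1, 0, 1, 0])  # 1 = repaid on schedule, 0 = defaulted

model = LogisticRegression()
model.fit(X_train, y_train)

applicant = np.array([[3.6, 0.90, 1.5]])
p_repay = model.predict_proba(applicant)[0, 1]
print(f"Estimated probability of repayment: {p_repay:.2f}")

A real lender would train on thousands of loans and many more signals; the point of the sketch is only that "willingness to pay" becomes, in practice, a predicted probability.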
The same personality dynamic holds for people who did not go to great
schools or earn top grades. Douglas Merrill, the founder and chief executive of
ZestFinance, is a former Google executive whose company writes loans to
subprime borrowers using nonstandard data signals.
One signal is whether someone has ever given up a prepaid
wireless phone number. Where housing is often uncertain, those numbers are a
more reliable way to find you than addresses; giving one up may indicate you
are willing (or have been forced) to disappear from family or potential
employers. That is a bad sign.
Zest recently branched into “near prime” borrowers, who
have either fallen from the prime category or risen from subprime. The question
is why these people have changed categories, and Zest tries to figure out if a
potentially reliable borrower has had some temporary bad luck, like a one-time
medical expense.
“‘Character’ is a loaded term, but there is an important
difference between ability to pay and willingness to pay,” said Mr. Merrill.
“If all you look at is financial transactions, it’s hard to say much about
willingness.”
Mr. Merrill, who also has a Ph.D. in psychology (from
Princeton, in case Mr. Gu wants to lend him money), thinks that data-driven
analysis of personality is ultimately fairer than standard measures.
“We’re always judging people in all sorts of ways, but
without data we do it with a selection bias,” he said. “We base it on stuff we
know about people, but that usually means favoring people who are most like
ourselves.” Familiarity is a crude form of risk management, since we know what
to expect. But that doesn’t make it fair.
Character (though it is usually called something more
neutral-sounding) is now judged by many other algorithms. Workday, a company
offering cloud-based personnel software, has released a product that looks at
45 employee performance factors, including how long a person has held a
position and how well the person has done. It predicts whether a person is
likely to quit and suggests steps, like a new job or a transfer,
that could persuade that kind of person to stay.
It can also characterize managers as “rainmakers” or
“terminators,” depending on how well they hold talent. Inside Workday, the
company has analyzed its own sales force to see what makes for success. The top
indicator is tenacity.
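The general shape of such a system can be suggested with a short, hypothetical sketch. It is not Workday's product; the handful of invented tenure and performance features below stands in for the 45 factors the company describes, and the classifier is an arbitrary choice.

# A hypothetical sketch, not Workday's product: a few employee features
# feed a classifier that scores the risk that the employee quits.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Columns: years in current role, last performance rating (1-5),
# months since last promotion. Rows and labels are invented.
X = np.array([
    [0.5, 4, 6],
    [4.0, 3, 40],
    [1.5, 5, 10],
    [6.0, 2, 60],
    [2.0, 4, 18],
    [5.0, 3, 48],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = quit within a year, 0 = stayed

clf = GradientBoostingClassifier(n_estimators=50)
clf.fit(X, y)

employee = np.array([[3.0, 3, 30]])
quit_risk = clf.predict_proba(employee)[0, 1]
print(f"Estimated quit risk: {quit_risk:.2f}")

A score like this is what would let such software flag who might leave and suggest a transfer before they do.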
“We all have biases about how we hire and promote,” said
Dan Beck, Workday’s head of technology strategy. “If you can leverage data to
overcome that, great.”
People studying these traits will be encouraged to adopt
them, he said, since “if you know there is a pattern of success, why wouldn’t
you adopt it?”
In a sense, it’s no different from the way people read
the biographies of high achievers, looking for clues for what they need to do
differently to succeed. It’s just at a much larger scale, based on observing
everybody.
There are reasons to think that data-based character
judgments are sounder. Jure Leskovec, a professor of computer science
at Stanford, is finishing a study comparing the predictions of data analysis
against those of judges at bail hearings, who have just a few minutes to size
up defendants and decide whether they could be risks to society. Early results
indicate that the data-driven analysis is 30 percent better than the judges at
predicting crime, Mr. Leskovec said.
“Algorithms aren’t subjective,” he said. “Bias comes from
people.”
That is only true to a point: Algorithms do not fall from
the sky. Algorithms are written by human beings. Even if the facts aren’t
biased, design can be, and we could end up with a flawed belief that math is
always truth.
Upstart’s Mr. Gu, who said he had perfect SAT scores but
dropped out of Yale, wouldn’t have qualified for an Upstart loan using his own
initial algorithms. He has since changed the design, and he said he is aware of
the responsibility of the work ahead.
“Every time we find a signal, we have to ask ourselves,
‘Would we feel comfortable telling someone this was why they were rejected?’ ”
he said.
A version of this article appears in print on July 27, 2015,
on page B5 of the New York edition with the headline: Determining Character
With Algorithms.