What Should Be the Role of Computer Algorithms in Sentencing?
Sent to Prison by a Software Program’s Secret Algorithms
By ADAM LIPTAK | MAY 1, 2017
When Chief Justice John G. Roberts Jr. visited Rensselaer
Polytechnic Institute last month, he was asked a startling question, one with
overtones of science fiction.
“Can you foresee a day,” asked Shirley Ann Jackson,
president of the college in upstate New York, “when smart machines, driven with
artificial intelligences, will assist with courtroom fact-finding or, more
controversially even, judicial decision-making?”
The chief justice’s answer was more surprising than the
question. “It’s a day that’s here,” he said, “and it’s putting a significant
strain on how the judiciary goes about doing things.”
He may have been thinking about the case of a Wisconsin
man, Eric L. Loomis, who was sentenced to six years in prison based in part on
a private company’s proprietary software. Mr. Loomis says his right to due
process was violated by a judge’s consideration of a report generated by the
software’s secret algorithm, one Mr. Loomis was unable to inspect or challenge.
In March, in a signal that the justices were intrigued by
Mr. Loomis’s case, they asked the federal government to file a
friend-of-the-court brief offering its views on whether the court should hear
his appeal.
The report in Mr. Loomis’s case was produced by a product
called Compas, sold by Northpointe Inc. It included a series of bar charts that
assessed the risk that Mr. Loomis would commit more crimes.
The Compas report, a prosecutor told the trial judge, showed
“a high risk of violence, high risk of recidivism, high pretrial risk.” The
judge agreed, telling Mr. Loomis that “you’re identified, through the Compas
assessment, as an individual who is a high risk to the community.”
The Wisconsin Supreme Court ruled against Mr. Loomis. The
report added valuable information, it said, and Mr. Loomis would have gotten
the same sentence based solely on the usual factors, including his crime —
fleeing the police in a car — and his criminal history.
At the same time, the court seemed uneasy with using a
secret algorithm to send a man to prison. Justice Ann Walsh Bradley, writing
for the court, discussed, for instance, a report from ProPublica about Compas
that concluded that black defendants in Broward County, Fla., “were far more
likely than white defendants to be incorrectly judged to be at a higher risk of
recidivism.”
Justice Bradley noted that Northpointe had disputed the
analysis. Still, she wrote, “this study and others raise concerns regarding how
a Compas assessment’s risk factors correlate with race.”
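The dispute turns on error rates rather than overall accuracy: a tool can be roughly as accurate for two groups while still mislabeling one group’s non-reoffenders as high risk far more often. A minimal Python sketch, with all counts and group names invented purely for illustration, shows how such a false-positive-rate gap is measured:

```python
# Hypothetical illustration of the disparity ProPublica measured:
# among defendants who did NOT go on to reoffend, how often were
# they nonetheless labeled high risk (the false positive rate)?

def false_positive_rate(records):
    """records: list of (labeled_high_risk, actually_reoffended) pairs."""
    non_reoffenders = [labeled for labeled, reoffended in records if not reoffended]
    return sum(non_reoffenders) / len(non_reoffenders)

# Invented counts, chosen only to show the shape of the analysis.
group_a = ([(True, True)] * 30 + [(False, True)] * 20      # reoffenders
           + [(True, False)] * 22 + [(False, False)] * 28)  # non-reoffenders
group_b = ([(True, True)] * 25 + [(False, True)] * 25
           + [(True, False)] * 10 + [(False, False)] * 40)

print(f"group_a false positive rate: {false_positive_rate(group_a):.0%}")  # 44%
print(f"group_b false positive rate: {false_positive_rate(group_b):.0%}")  # 20%
```

The point of the sketch is only that the labels can diverge sharply among people who never reoffended, which is the kind of statistic ProPublica reported.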
In the end, though, Justice Bradley allowed sentencing
judges to use Compas. They must take account of the algorithm’s limitations and
the secrecy surrounding it, she wrote, but she said the software could be helpful
“in providing the sentencing court with as much information as possible in order
to arrive at an individualized sentence.”
Justice Bradley made Compas’s role in sentencing sound
like the consideration of race in a selective university’s holistic admissions
program. It could be one factor among many, she wrote, but not the
determinative one.
In urging the United States Supreme Court not to hear the
case, Wisconsin’s attorney general, Brad D. Schimel, seemed to acknowledge that
the questions in the case were substantial ones. But he said the justices
should not move too fast.
“The use of risk assessments by sentencing courts is a
novel issue, which needs time for further percolation,” Mr. Schimel wrote.
He added that Mr. Loomis “was free to question the
assessment and explain its possible flaws.” But it is a little hard to see how
he could do that without access to the algorithm itself.
The company that markets Compas says its formula is a
trade secret.
“The key to our product is the algorithms, and they’re
proprietary,” one of its executives said last year. “We’ve created them, and we
don’t release them because it’s certainly a core piece of our business.”
Compas and other products with similar algorithms play a
role in many states’ criminal justice systems. “These proprietary techniques
are used to set bail, determine sentences, and even contribute to
determinations about guilt or innocence,” a report from the Electronic Privacy
Information Center found. “Yet the inner workings of these tools are largely
hidden from public view.”
In 1977, the Supreme Court ruled that a Florida man could
not be condemned to die based on a sentencing report that contained
confidential passages he was not allowed to see. The Supreme Court’s decision
was fractured, and the controlling opinion appeared to say that the principle
applied only in capital cases.
Mr. Schimel echoed that point and added that Mr. Loomis
knew everything the court knew. Judges do not have access to the algorithm,
either, he wrote.
There are good reasons to use data to ensure uniformity
in sentencing. It is less clear that uniformity must come at the price of
secrecy, particularly when the justification for secrecy is the protection of a
private company’s profits. The government can surely develop its own algorithms
and allow defense lawyers to evaluate them.
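What an open alternative might look like is no mystery. The sketch below is a hypothetical points-based instrument, not any agency’s actual tool: every factor, weight, and cutoff is published, so a defendant’s score can be recomputed, and contested, line by line.

```python
# Hypothetical open risk instrument: every factor, weight, and cutoff
# is public, so any score can be recomputed and challenged in court.
WEIGHTS = {
    "prior_convictions": 2,        # points per prior conviction (capped at 4)
    "age_under_25": 3,
    "prior_failure_to_appear": 2,
}
CUTOFFS = [(4, "low"), (8, "medium")]  # <= 4 low, <= 8 medium, else high

def score(defendant):
    pts = min(defendant["prior_convictions"], 4) * WEIGHTS["prior_convictions"]
    pts += WEIGHTS["age_under_25"] if defendant["age"] < 25 else 0
    pts += WEIGHTS["prior_failure_to_appear"] if defendant["failed_to_appear"] else 0
    for limit, label in CUTOFFS:
        if pts <= limit:
            return pts, label
    return pts, "high"

points, risk = score({"prior_convictions": 1, "age": 31, "failed_to_appear": False})
print(f"{points} points -> {risk} risk")  # 2 points -> low risk
```

Whether these particular factors and weights are the right ones is exactly the kind of question an open formula lets defense lawyers raise.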
At Rensselaer last month, Chief Justice Roberts said that
judges had work to do in an era of rapid change.
“The impact of technology has been across the board,” he
said, “and we haven’t yet really absorbed how it’s going to change the way we
do business.”