Artificial Intelligence Could Soon Enhance Real-Time Police Surveillance
Companies are working with departments to develop body
cameras that could identify faces in real time; privacy groups fear loss of
anonymity
By Shibani Mahtani and Zusha Elinson
Updated April 3, 2018 1:51 p.m. ET
CHICAGO—Several technology companies are working with
police departments across the U.S. to develop the capability to add artificial
intelligence to video surveillance and body cameras that could identify faces
in real time, potentially expanding the reach of police surveillance.
The body-camera technology, expected to be ready by the
fall, hasn’t yet been purchased by police departments and is still in the
development stage. Police departments, including the New York Police
Department, already use facial recognition to review surveillance footage after
a crime has occurred.
The new software uses an algorithm to tell an officer on the spot, through a body camera or a video surveillance camera, that it has found a suspect. The officer could then decide whether to stop the suspect or take some other action.
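How such a system might flag a face in a live feed can be sketched in a few lines. The example below is illustrative only and is not Motorola’s or Neurala’s software: the watchlist, the embedding vectors and the 0.6 alert threshold are hypothetical stand-ins for whatever face-embedding model and tuning a vendor would actually use.

```python
# Illustrative sketch only, not any vendor's actual system. The embeddings
# and the 0.6 threshold are hypothetical placeholders.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_frame(frame_embedding: np.ndarray, watchlist: dict, threshold: float = 0.6):
    """Compare one face seen in a frame against a watchlist of embeddings.

    Returns the best-matching watchlist name if its similarity clears the
    alert threshold, otherwise None. The officer, not the software, would
    still decide whether to act on the alert.
    """
    best_name, best_score = None, threshold
    for name, ref_embedding in watchlist.items():
        score = cosine_similarity(frame_embedding, ref_embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

The threshold captures the trade-off critics describe: set lower, it flags more people and produces more false matches; set higher, it misses more genuine ones.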
The technology underscores law enforcement’s growing dependence on software and high-tech tools, including gunshot-detection technology and predictive analytics. The tools have been hailed by law enforcement but often raise concerns about privacy.
Chicago-based Motorola Solutions, a maker of police
communications and body-camera technology, has partnered with
artificial-intelligence company Neurala to produce a body-worn camera, ready
for deployment this fall, that executives say will learn to identify a suspect
or a missing child and spot them in a crowd. The technology would get smarter
by taking in more data over time.
“This frees up some of your cognitive space so you aren’t
trying to do a thousand things at one time,” said a sergeant at a Midwest
police force, who is working with Motorola to provide feedback on the
technology. His department was interested in acquiring it when it rolls out, he
said.
Motorola said it is working with a number of departments
around the country. Several declined to comment.
Moving the technology into real time creates the possibility of police mistakes based on software that may not always be accurate—it is especially error-prone for darker-skinned faces, which it matches less reliably than lighter-skinned ones—and raises alarms over the loss of privacy, rights groups say.
“All of the sudden we have lost our ability to be
relatively anonymous in society, to be able to walk about without fear that the
government is tracking our every move,” said Jennifer Lynch, an attorney with
the Electronic Frontier Foundation, a nonprofit privacy organization based in
San Francisco, which recently wrote a report highlighting issues with the use
of facial-recognition technology by law enforcement.
Companies say they have taken steps to avoid bias by
feeding millions of publicly available photos into the algorithm and testing it
to identify false positives and matches.
“We’ve worked really hard on training [the algorithm]
with a diverse data set to make sure that it is balanced and unbiased,” said
Paul Steinberg, Motorola Solutions’ chief technology officer.
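What “testing it to identify false positives” could look like in practice is sketched below. This is a hypothetical evaluation harness, not any vendor’s published protocol: the tuple format and group labels are assumptions, and the point is simply that false-positive rates can be compared across demographic groups to check for the kind of imbalance researchers have documented.

```python
# Hypothetical evaluation sketch; no vendor has published its test protocol.
# Each result is (demographic_group, predicted_match, is_true_match).
from collections import defaultdict

def false_positive_rates(results):
    """Share of non-matching faces wrongly flagged as matches, per group."""
    flagged = defaultdict(int)
    negatives = defaultdict(int)
    for group, predicted_match, is_true_match in results:
        if not is_true_match:          # only true non-matches can be false positives
            negatives[group] += 1
            if predicted_match:
                flagged[group] += 1
    return {g: flagged[g] / negatives[g] for g in negatives if negatives[g]}

# A large gap between groups would indicate the kind of imbalance that
# rights groups and researchers have warned about.
print(false_positive_rates([
    ("group_a", True, False), ("group_a", False, False),
    ("group_b", False, False), ("group_b", False, False),
]))  # -> {'group_a': 0.5, 'group_b': 0.0}
```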
Several smaller companies are working to integrate artificial intelligence into existing video-surveillance systems, enabling them to identify subjects or people of interest by matching faces against a database of millions of photos.
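At the scale of millions of photos, that matching amounts to a nearest-neighbor search over face embeddings. The brute-force sketch below is illustrative only; the function name, the unit-normalized gallery array and the cosine-similarity scoring are assumptions, and production systems would typically rely on an approximate nearest-neighbor index rather than scanning every row.

```python
# Illustrative brute-force search, not any company's product. Real systems
# would generally use an approximate nearest-neighbor index at this scale.
import numpy as np

def top_matches(probe, gallery, ids, k=5):
    """Return the k gallery IDs most similar to the probe face.

    probe:   (d,) embedding of the query face
    gallery: (N, d) array of unit-normalized embeddings, one per photo
    ids:     list of N identifiers (e.g., mug-shot record numbers)
    """
    probe = probe / np.linalg.norm(probe)
    scores = gallery @ probe                 # cosine similarity via dot products
    best = np.argsort(scores)[::-1][:k]      # indices of the highest scores
    return [(ids[i], float(scores[i])) for i in best]
```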
TaeWoo Kim, chief scientist at One Smart Labs, a New
York-based startup that is working on such software, said the technology is
“creepy and a bit Big Brother-y,” but said it is “purely intended to fight
crime, terrorism and track wanted subjects.”
Dozens of U.S. law-enforcement agencies use facial
recognition to run photos of suspects through databases of mug shots or
driver’s license photos. Researchers at the Georgetown Law School estimated in 2016 that one in every two American adults—117 million people—is in a facial-recognition network used by law enforcement in the country, a number that is likely higher today.
In New York City, a detective investigating a crime pulls
an image from one of the thousands of surveillance cameras in the city and
forwards it to investigators in the facial-identification section, who
cross-reference it with a database of mug shots to see if there is a match.
This process, however, only happens after a crime is
committed. It also includes a peer review session, in which a group of officers
must review the match to confirm the result. Those safeguards could be removed
with the new real-time technology.
William Bratton, the former commissioner of the NYPD, said the public was similarly worried about DNA testing when that technology first emerged. DNA testing has since been credited with freeing wrongfully convicted people from prison.
“From my perspective, the plus far outweighs the
minuses,” Mr. Bratton said.
Others, however, say that even if the technology worked
well, it wouldn’t be immediately embraced by police departments.
“There’s a finite pool of money to purchase this sort of
thing and it is super controversial,” said Daniel Zehnder, a recently retired
Las Vegas police captain who ran the department’s body-camera program.
“Civil-liberties groups would certainly have many questions and issues with
it.”
—Zolan Kanno-Youngs contributed to this article.