Facial Recognition Software Moves From Overseas Wars to Local Police
By TIMOTHY WILLIAMS AUG. 12, 2015
SAN DIEGO — Facial recognition software, which American
military and intelligence agencies used for years in Iraq and Afghanistan to
identify potential terrorists, is being eagerly adopted by dozens of police
departments around the country to pursue drug dealers, prostitutes and other
conventional criminal suspects. But because it is being used with few
guidelines and with little oversight or public disclosure, it is raising
questions of privacy and concerns about potential misuse.
Law enforcement officers say the technology is much
faster than fingerprinting at identifying suspects, although it is unclear how
much it is helping the police make arrests.
When Aaron Harvey was stopped by the police here in 2013
while driving near his grandmother’s house, an officer not only searched his
car, he said, but also took his photograph and ran it through the software to
try to confirm his identity and determine whether he had a criminal record.
Eric Hanson, a retired firefighter, had a similar
experience last summer. Stopped by the police after a dispute with a man he
said was a prowler, he was ordered to sit on a curb, he said, while officers
took his photo with an iPad and ran it through the same facial recognition
software. The officers also used a cotton swab to collect a DNA sample from the
inside of his cheek.
Neither man was arrested. Neither had consented to being
photographed. Both said the officers had told them that they were using facial
recognition technology.
“I was thinking, ‘Why are you taking pictures of me,
doing this to me?’ ” said Mr. Hanson, 58, who has no criminal record. “I felt
like my identity was being stolen. I’m a straight-up, no lie, cheat or steal
guy, and I get treated like a criminal.”
Lt. Scott Wahl, a spokesman for the 1,900-member San
Diego Police Department, said that until June 19, it did not have a written
policy regulating facial recognition software and only recently began training
officers on its lawful use. Nor does it require police officers to file a
report when they use the equipment but do not make an arrest. The department
has no record of the stops involving Mr. Hanson and Mr. Harvey, but Lieutenant
Wahl did not dispute their accounts.
“It is a test product for the region that we’ve allowed
officers to use,” he said of facial recognition software and the hand-held devices
the police use to take pictures. “We don’t even know how many are out there.”
But county documents show that over a 33-day period in January
and February, 26 San Diego law enforcement agencies used the software to try to
identify people on more than 20,600 occasions — although officers found a match
to criminal records only about 25 percent of the time.
Lieutenant Wahl said the department was not aware of any
complaints about the software or about the policy of collecting DNA samples
that Mr. Hanson and others have described.
The department uses the technology judiciously,
Lieutenant Wahl said. “We don’t just drive around taking people’s picture and
start swabbing them,” he said.
Others say misuse is common.
“I get a call about facial recognition maybe twice a
month,” said Victor Manuel Torres, a San Diego civil rights lawyer. “The
complaint is always that they did it and didn’t get permission. ‘The police put
me in cuffs and I’m on the curb, and they pull out an iPad and are taking
pictures.’ ”
The Police Department, which the Justice Department
recently determined to have a history of serious misconduct, has also been
found to disproportionately stop and search African-Americans. But there is no
similar racial breakdown for facial recognition checks, in part because the
department does not keep the data.
“It is not as if there is the identification of a
specific crime problem; they are simply collecting a lot of information that
could impact a lot of completely innocent people,” said Michael German, a
fellow at the Brennan Center for Justice and a former F.B.I. agent. “There is
very little oversight on the local level, and little concern from the federal
agencies providing the grants.”
Facial recognition technology was first developed in the
1960s, but only recently became accurate enough for widespread use. It is among
an array of technologies, including StingRay tracking devices and surveillance
aircraft with specialized cameras, that were used in overseas wars but have
found their way into local law enforcement.
The software can identify 16,000 distinct points on a
person’s face — to determine the distance between the eyes or the shape of the
lips, for instance — and compare them with thousands of similar points in
police booking or other photos at a rate of more than one million faces a
second.
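As a rough illustration of the matching step described above (a sketch, not any
vendor's actual system), the example below assumes each face has already been
reduced to a short vector of landmark measurements and compares a probe photo
against a small gallery of booking photos by geometric distance. Every name,
value and threshold in it is hypothetical.

```python
import numpy as np

def best_matches(probe, gallery, top_k=3, threshold=0.6):
    """Return the gallery records whose face vectors sit closest to the probe."""
    ids = list(gallery.keys())
    vectors = np.stack([gallery[i] for i in ids])     # shape: (n_records, n_features)
    dists = np.linalg.norm(vectors - probe, axis=1)   # distance from the probe to every record
    order = np.argsort(dists)[:top_k]                 # indices of the nearest candidates
    return [(ids[i], float(dists[i])) for i in order if dists[i] < threshold]

# Hypothetical gallery of booking-photo vectors and one probe photo.
gallery = {
    "booking_001": np.array([0.42, 0.31, 0.77, 0.12]),
    "booking_002": np.array([0.40, 0.29, 0.80, 0.15]),
    "booking_003": np.array([0.90, 0.65, 0.10, 0.55]),
}
probe = np.array([0.41, 0.30, 0.79, 0.14])

print(best_matches(probe, gallery))  # nearest records that fall under the distance threshold
```

A deployed system would measure thousands of points per face and index millions
of booking photos, which is where throughput figures like a million comparisons a
second come in; the underlying principle, distance between feature vectors, is the
same.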
The technology is so new that experts say they are
unaware of major legal challenges. In some cities, though, a backlash is
stirring.
In Northern California, the Oakland City Council, under
pressure from residents and civil liberties advocates, scaled back plans this
year for a federally financed center that would have linked surveillance
equipment around the city, including closed-circuit cameras, gunshot microphones
and license plate readers. It also formed a committee to limit the use of this
equipment and to develop privacy standards, like how long data may be kept and
who will have access to it.
The authorities in Boston tested facial recognition
technology but decided in 2013 not to adopt it, saying it crossed an ethical
line. In the test, surveillance cameras secretly recorded the faces of thousands
of people at outdoor concerts in the city center, and the images were then fed
into the recognition software for analysis.
“I don’t want people to think we’re always spying on
them,” said William B. Evans, Boston’s police commissioner.
Yet the F.B.I. is pushing ahead with its $1 billion Next
Generation Identification biometric program, in which the agency will gather
data like fingerprints, iris scans and photographs, as well as information
collected through facial recognition software. That software is capable of
analyzing images from driver’s license photos and the tens of thousands of
surveillance cameras around the country.
The F.B.I. system will eventually be made accessible to
more than 18,000 local, state, federal and international law enforcement
agencies.
But people who are not criminal suspects are included in
the database, and the error rate for the software is as high as 20 percent —
meaning the authorities could misidentify millions of people.
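To make the arithmetic behind that concern concrete, the sketch below multiplies
the reported worst-case error rate by an assumed volume of searches; the number of
searches is a purely illustrative assumption, not a figure from this article.

```python
# Back-of-the-envelope arithmetic behind the misidentification concern.
error_rate = 0.20        # worst-case error rate cited for the software
searches = 10_000_000    # assumed number of searches across thousands of agencies (illustrative)

erroneous_results = int(searches * error_rate)
print(f"{erroneous_results:,} searches could return a wrong identification")
# -> 2,000,000 at these assumed figures
```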
Among the cities that use facial recognition technology
are New York and Chicago, which has linked it to 25,000 surveillance cameras in
an effort to fight street crime.
In many ways, though, San Diego County is at the
forefront.
Here, beat cops, detectives and even school police
officers are using hand-held devices to create a vast database of tens of
thousands of photos of people like Mr. Harvey and Mr. Hanson — some suspected
of committing crimes, others not — usually without the person’s consent.
Not everyone is opposed to such programs. Last year, Tom
Northcutt, a San Diego property manager, took an iPhone photo of a man moments
before the man struck him in the arm with a two-by-four and fled. Mr.
Northcutt, who did not know his assailant, immediately sent the image to the
police by email.
Less than 10 minutes later, a detective matched the man
to a booking photograph of a suspect, who was arrested and later convicted of
assault.
“It felt good knowing that they could do that,” Mr.
Northcutt said.
Mr. Harvey, 27, remains upset about what happened to him.
He said that when he refused to consent to having his picture taken, the
officer boasted that he could do so anyway.
“He said, ‘We’re going to do this either legally or
illegally,’ and pulled me out of the car,” Mr. Harvey said.
Mr. Harvey, who is African-American, said the San Diego
Police had stopped him as a suspected gang member more than 50 times because
his neighborhood, Lincoln Park, is among the city’s most violent.
He said he had been told he was in a gang database, even
though he has never been a gang member. He recently spent nearly a year in jail
on gang conspiracy charges that were dismissed in March.
“I don’t know how good a gang member I could have been,
not having a criminal record,” he said.
Mr. Hanson, who is white and lives in the city’s upscale
Ocean Beach neighborhood, said his treatment by officers had been as intrusive
as it was frightening.
“I’m not a lawyer,” he said, “but they didn’t appear to
be following the law.”