Amazon's facial-recognition technology is supercharging local police
Drew Harwell, The Washington Post. Published 2:57 p.m. PDT, Tuesday, April 30, 2019
The Washington County Sheriff's Office was the first
law-enforcement agency in the country to use Amazon's artificial-intelligence
tool Rekognition.
HILLSBORO, Ore. - When workers at an Ace Hardware here
reported that a woman had walked out of the store with an $11.99 tank of
welding gas that she hadn't paid for in her tote bag, an elaborate high-tech
crime-fighting operation sprang into action.
A Washington County sheriff's detective, working with the
agency's Special Investigations Unit, ran the store's surveillance footage
through an internal facial-recognition program built by Amazon, revealing a
possible match.
That woman's license plate was flagged and, three months
later, a narcotics officer in an unmarked SUV saw it and radioed other patrol
deputies to stop her. A deputy clapped a pair of handcuffs around her wrists,
an arrest report states. She said she'd needed the gas to fix her car.
Deputies in this corner of western Oregon outside
ultraliberal Portland used to track down criminals the old-fashioned way,
faxing caught-on-camera images of a suspect around the office in hope that
someone might recognize the face.
Then, in late 2017, the Washington County Sheriff's
Office became the first law enforcement agency in the country known to use
Amazon's artificial-intelligence tool Rekognition, transforming this thicket of
forests and suburbs into a public testing ground for a new wave of experimental
police surveillance techniques.
Almost overnight, deputies saw their investigative powers
supercharged, allowing them to scan for matches of a suspect's face across more
than 300,000 mug shots taken at the county jail since 2001. A grainy picture of
someone's face - captured by a security camera, a social media account or a
deputy's smartphone - can quickly become a link to their identity, including
their name, family and address. More than 1,000 facial-recognition searches
were logged last year, said deputies, who sometimes used the results to find a
suspect's Facebook page or visit their home.
But Washington County also became ground zero for a
high-stakes battle over the unregulated growth of policing by algorithm.
Defense attorneys, artificial-intelligence researchers and civil rights experts
argue that the technology could lead to the wrongful arrest of innocent people
who bear only a resemblance to a video image. Rekognition's accuracy is also
hotly disputed, and some experts worry that a case of mistaken identity by
armed deputies could have dangerous implications, threatening privacy and
people's lives.
Some police agencies have in recent years run
facial-recognition searches against state or FBI databases using systems built
by contractors such as Cognitec, IDEMIA and NEC. But the rollout by Amazon has
marked perhaps the biggest step in making the controversial face-scanning
technology mainstream. Rekognition is easy to activate, requires no major
technical infrastructure and is offered to virtually anyone at bargain-basement
prices. Washington County spent about $700 to upload its first big haul of
photos, and now, for all its searches, it pays about $7 a month.
It's impossible to tell, though, just how accurate or
effective the technology has been during its first 18 months of real-world
tests. Deputies don't have to note in arrest reports when a facial-recognition
search was used, and the exact number of times it has resulted in an arrest is
unclear. Sheriff's officials said the software has led to dozens of arrests for
theft, violence or other crimes, but a public-records request turned up nine
case reports in which facial recognition was mentioned.
"Just like any of our investigative techniques, we
don't tell people how we catch them," said Robert Rookhuyzen, a detective
on the agency's major crimes team who said he has run "several dozen"
searches and found it helpful about 75% of the time. "We want them to keep
guessing."
Sheriff's officials say face scans don't always mark the
end of the investigation: Deputies must still establish probable cause or find
evidence before charging a suspect with a crime. But the Sheriff's Office sets
its own rules for facial-recognition use and allows deputies to use the tool to
identify bodies, unconscious suspects and people who refuse to give their names.
The search tool's imperfect results raise the risk of an
innocent person being flagged and arrested, especially in cases of the scanned
images being blurred, low-quality or partially concealed. Deputies are also
allowed to run artist sketches through the search, an unusual use that AI
experts said could more often lead to a false match.
Amazon's guidelines for law enforcement say officials
should use Rekognition's results only when the system is 99% confident in a
match. But deputies here are not shown that search-confidence measurement when
they use the tool. Instead, they are given five possible matches for every
search, even if the system's certainty in a match is far lower.
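The gap between Amazon's guidance and the county's interface can be sketched in a few lines of Python. This is an illustrative sketch only; the mugshot IDs and confidence scores below are hypothetical, though the 99 percent threshold and the always-five-results behavior come from the reporting above:

```python
# Hypothetical search results: (mugshot ID, confidence score reported by the system)
results = [
    ("mugshot_114", 99.4),
    ("mugshot_379", 88.2),
    ("mugshot_052", 71.5),
    ("mugshot_903", 64.0),
    ("mugshot_617", 41.8),
]

# Amazon's guideline: act only on matches at or above 99% confidence.
high_confidence = [r for r in results if r[1] >= 99.0]

# Washington County's interface: always show the top five candidates,
# with the confidence scores stripped out.
top_five = [name for name, _ in sorted(results, key=lambda r: r[1], reverse=True)[:5]]

print(high_confidence)  # only one candidate survives the 99% bar
print(top_five)         # all five names are shown, regardless of certainty
```

The difference matters: under the guideline, four of these five candidates would never reach a deputy's screen; under the county's setup, all five appear with no indication of how weak the weakest match is.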
After fielding questions from The Washington Post, Amazon
added language to those guidelines, stating that officers should manually
review all matches before detaining a suspect and that the search
"shouldn't be used as the sole determinant for taking action."
The relationship between Amazon and Oregon's
third-largest law enforcement agency is mutually beneficial: The Sheriff's
Office is helping to refine the system, which Amazon hopes to sell across the
country. But Amazon's push into law-enforcement sales has alarmed some legal
advocates who say the system poses too many risks to civil liberties. (Amazon
founder and CEO Jeff Bezos owns The Post.)
"The government is incredibly powerful, and they
bring a lot to bear against an individual citizen in a case," said Mary
Bruington, the director of the Washington County Public Defender's Office,
which represents defendants who can't afford an attorney. "You couple that
with Amazon? That's a powerful partnership."
Matt Wood, the general manager of artificial intelligence
for the company's cloud-computing division, Amazon Web Services, said in a
statement that Rekognition is just "another input among many other leads
for a 100 percent human-driven investigation."
Still, the company faces criticism on many fronts: Top AI
researchers, members of Congress and civil rights groups - as well as some of
Amazon's own investors and employees - have urged the company to stop providing
the technology to law enforcement, pointing to studies that have found that the
system is less accurate with dark-skinned faces. Amazon has disputed that
research.
Some of Amazon's rivals have spurned similar contracts.
Microsoft President Brad Smith said in April that the company had recently
declined to provide its facial-recognition software to a California law
enforcement agency that wanted to run a face scan anytime its officers pulled
someone over, but that it had approved a deal putting the technology in a U.S.
prison. Microsoft declined to provide details.
Amazon investors will vote in May on a proposal, backed
by a group of activist shareholders, that would prevent the company from
selling Rekognition to government agencies unless the company's board
determines that it doesn't pose a risk to human rights.
The Sheriff's Office allowed Post journalists to spend
two days in March in its squad cars, detectives' offices and county jail,
observing how deputies have folded the technology into their daily caseload.
Most of those interviewed said the software had saved them time, boosted their
arrest numbers and helped them process the growing glut of visual evidence. To
date, no legal challenge has been made to an arrest on the grounds that the
photo match was mistaken, both deputies and public defenders said.
But lawyers in Oregon said the technology should not be seen, as many see it, as an inevitable step forward for the future of policing, and they frame the system not as a technical milestone but as a moral one: Is it OK to nab more bad guys if more good guys might get arrested, too?
"People love to always say, 'Hey, if it's catching
bad people, great, who cares,' " said Joshua Crowther, a chief deputy
defender in Oregon, "until they're on the other end."
- - -
'Indistinguishable from magic'
When Amazon revealed Rekognition in 2016, the company
called it a breakthrough for a potent style of deep-learning artificial
intelligence that showed results "indistinguishable from magic." In a
blog post illustrated with a photo of an executive's dog, the company offered
some general ideas for how people could begin using it, including for security
checkpoints or billboards wired to gather data from a viewer's face.
The unveiling caught the eye of Chris Adzima, a former
eBay programmer who had been hired at the Washington County Sheriff's Office to
work on an iPhone app that deputies use to track inmates' behavior. His agency
had hundreds of thousands of facial photos already online and no real way to
analyze them. Using Amazon's AI, he got a system up and running in less than
three weeks.
"They didn't really have a firm idea of any type of
use cases in the real world, but they knew that they had a powerful tool that
they created," said Adzima, a senior information systems analyst who works
in a small cubicle at the sheriff's headquarters. "So, you know, I just
started using it."
Deputies immediately began folding facial searches into
their daily beat policing, and Adzima built a bare-bones internal website that
let them search from their patrol cars. He dropped the search-confidence
percentages and designed the system to return five results, every time: When
the system returned zero results, he said, deputies wondered whether they'd
messed something up. To spice it up, he also added an unnecessary purple
"scanning" animation whenever a deputy uploaded a photo - a touch he
said was inspired by cop shows like "CSI."
As he started flooding Amazon's servers with image data,
account executives there took notice, he said, and some voiced their surprise
and excitement that he was using it for police work. In one 2017 email first
revealed last year as part of an American Civil Liberties Union public-records
request, an Amazon account executive asked to introduce Adzima to an executive
at a police-body-camera company who wanted to understand how he "overcame
stakeholder resistance." "You're AWS-famous now," the executive
wrote, with an emoji of a smiley face.
Deputies here say the system is a huge hit. Chris Lee,
who has used the search in five cases of burglary and theft, said many of his
colleagues have become prolific users, eager to find a simple resolution to an
otherwise-difficult hunt. "You're always like: Is it going to show us
something?" he said.
For training, deputies are emailed only a printout of the
office's facial-recognition policy and a short PowerPoint presentation
cautioning them to be careful with the results. One slide shows how the system
responded to an uploaded mug shot of O.J. Simpson: by returning a photo of a
white man with a beard. "As you can see," the slide reads, the system
"still requires human interpretation."
The agency's four-page policy requires staffers to use
the system only in cases of a "criminal nexus" and prohibits its use
in "mass surveillance" or to monitor people based on their religion,
political activities or race. But it also offers several exceptions, including
allowing facial searches in cases of "significant threat to life" or
when deputies believe a felony suspect will be at a certain place at a specific
time.
The search has helped deputies devise unconventional
techniques. In one case, an inmate was talking to his girlfriend on a jailhouse
phone when she said there was a warrant out for her arrest. Deputies went to
the inmate's Facebook page, found an old video with her singing and ran a
facial-recognition search to get her name; she was arrested within days.
Deputies can also run black-and-white police sketches
through the system looking for results; in one test case, they said, it pointed
to a man they'd already flagged as their suspect. Amazon said that running
sketches through Rekognition does not violate its rules but that it expects
human reviewers to "pay close attention to the confidence of any matches
produced this way."
Bruington, from the county public defender's office, said
Rekognition's low price and ease of use could tempt police agencies into
experimenting with a system they may not fully understand. She also worried
that the system's dependence on mug shots meant that anyone previously brought
in by police would be that much more likely to resurface in a criminal search.
"Innocent people go through the criminal justice
system every day," she said.
- - -
'Look at the bird'
Facial-recognition technology had for decades been a
police agency's dream: a simple, stealthy way to identify anyone from afar,
without their knowledge or consent. But only in recent years - thanks to
improvements in imaging and computer power, and plunging data-storage costs -
has the technology become affordable and widespread, used in tagging Facebook
photos and unlocking iPhones.
Today's systems break down people's facial photos into
long strands of code, called "feature vectors" or
"faceprints," that can be rapidly compared with other portraits
across a vast database. But while "computer-vision" algorithms are
adept at pattern recognition, they match pixels, not clues, and can miss
inconsistencies that would seem staggeringly obvious to the human eye.
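In rough terms, comparing two faceprints means measuring how close their feature vectors are, for instance with cosine similarity. A minimal sketch in Python, using hypothetical four-dimensional vectors (real systems use vectors with hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Compare two faceprints (feature vectors) by the angle between them.
    Returns a value near 1.0 for near-identical vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical faceprint extracted from a probe photo.
probe = [0.2, 0.8, 0.1, 0.5]

# Hypothetical database of faceprints computed from mug shots.
database = {
    "mugshot_001": [0.21, 0.79, 0.12, 0.49],  # nearly identical to the probe
    "mugshot_002": [0.90, 0.10, 0.70, 0.20],  # very different from the probe
}

for name, vec in database.items():
    print(name, round(cosine_similarity(probe, vec), 3))
```

The comparison is purely numerical: a blurred or partially concealed photo yields a distorted vector, and the system will still dutifully rank its nearest neighbors, which is why low-quality inputs raise the risk of a confident-looking false match.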
Still, the promise of cheap and easy identification has
proved too compelling for many companies to ignore. The federal agency that
assesses facial-recognition algorithms, the National Institute of Standards and
Technology, recently said it had tested 127 systems from 44 companies on their
"scalability to large populations" and accuracy in identifying
"noncooperative subjects" photographed "in the wild." The
top-ranking algorithms, from Microsoft and the Chinese start-up Yitu
Technology, could match a face photo across a database of millions of images
with 99% accuracy.
Amazon has previously declined to submit Rekognition for
this assessment, saying the test, which studies an isolated version of the core
search algorithm, wouldn't work on its complicated cloud-based search. But an
NIST official said that fact has not impeded other companies with similar
searches. An Amazon official said the company had launched a
"substantive" effort to "redesign critical components" of
the system so it could participate.
The FBI said it ran more than 52,000 facial-recognition
searches in the past fiscal year, and in 2016, researchers from the Georgetown
University law school found at least 52 state or local agencies that had at
some point relied on a facial-search system built by federal contractors or
surveillance firms. But Amazon has made it simple for any new police force to
get started, charging a cut-rate fee based partially on the number of
"faces stored."
No federal laws govern the use of facial recognition. But
a bipartisan bill introduced in the U.S. Senate in March and a proposed bill in
Amazon's home state of Washington could impose new rules that would, for
instance, require companies to notify passersby that their faces are being
scanned. San Francisco leaders are expected to vote next week on a proposal,
opposed by police, that would make the tech capital the first city in America
to ban local agencies from using facial-recognition software.
Amazon executives say they support national
facial-recognition legislation, but they have also argued that "new
technology should not be banned or condemned because of its potential
misuse." FBI agents and Orlando, Florida, police say they have tested the
system, and Amazon has pitched it to government agencies, including Immigration
and Customs Enforcement.
Lawyers in Washington County, Oregon, said they're just
starting to see the technique show up in arrest reports, and some are preparing
for the day when they may have to litigate the system's admissibility in court.
Marc Brown, a chief deputy defender working with Oregon's Office of Public
Defense Services, said he worried the system's hidden decision-making could
improperly tilt the balance of power: Human eyewitnesses can be questioned in
court, but not this "magic black box," and "we as defense
attorneys cannot question, you know, how did this process work."
The system's results, Brown added, could pose a huge
confirmation-bias problem by steering how deputies react. "You've already
been told that this is the one, so when you investigate, that's going to be in
your mind," he said. "The question is no longer who committed the
crime, but where's the evidence to support the computer's analysis?"
Amazon's software is rapidly becoming more advanced. The
company last month announced a Rekognition update that would, among other
things, improve the accuracy of the system's "emotion detection"
feature, which automatically speculates on how someone is feeling based on how
they look on camera. It includes "7 supported emotions: 'Happy,' 'Sad,'
'Angry,' 'Surprised,' 'Disgusted,' 'Calm' and 'Confused.' "
Amazon also owns Ring, the maker of a popular doorbell
camera, which applied last year for a facial-recognition patent that could flag
"suspicious" people at a user's doorstep. A Ring spokeswoman said the
company's patent applications are intended to "explore the full
possibilities of new technology."
The Washington County Sheriff's Office's face database,
meanwhile, is always growing, by roughly 19,000 jail bookings a year. When people
are arrested, they're brought to a bustling intake room where they get their
picture taken by a webcam topped with a red Beanie Babies cardinal. "Look
at the bird," they're told.
Those photos become the inmates' identities throughout
the county's penal system, and an internal jail website and iPhone app display
the images in a large grid so deputies can quickly track their food intake,
behavior and suicide risk.
Rekognition isn't used once an inmate is in lockup, but
it has nevertheless left a subtle impact behind bars. Standing in the
guardhouse nerve center of Pod 3, the maximum-security wing that inmates call
"the hole," deputy Brian van Kleef put it this way: "This is
where we gather our database."