AI detective analyses police data to learn how to crack cases
A system called VALCRI should do the laborious parts of a
crime analyst's job in seconds, while also suggesting new lines of enquiry and
possible motives
By Timothy Revell 10 May 2017
MOVE over, Sherlock. UK police are trialling a computer
system that can piece together what might have happened at a crime scene. The
idea is that the system, called VALCRI, will be able to do the laborious parts
of a crime analyst’s job in seconds, freeing them to focus on the case, while
also provoking new lines of enquiry and possible narratives that may have been
missed.
“Everyone thinks policing is about connecting the dots,
but that’s the easy bit,” says William Wong, who leads the project at Middlesex
University London. “The hard part is working out which dots need to be
connected.”
VALCRI’s main job is to help generate plausible ideas
about how, when and why a crime was committed, as well as who did it. It scans
millions of police records, interviews, pictures, videos and more, to identify
connections that it thinks are relevant. All of this is then presented on two
large touchscreens for a crime analyst to interact with.
Spotting patterns
The system might spot that shell casings were found at
several recent crime scenes including the one the police are focusing on now,
for example. “An analyst can then say whether this is relevant or not and
VALCRI will adjust the results,” says Neesha Kodagoda, also at Middlesex.
Thanks to machine learning, the system improves its searches on the basis of
such interactions with analysts, who can raise or lower the importance of
different sets of criteria with a swipe.
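The article doesn't describe VALCRI's actual algorithm, but the interaction it sketches resembles classic relevance feedback: candidate links are scored against weighted criteria, and an analyst's "relevant / not relevant" judgement nudges the weights of whichever criteria drove the match. A minimal illustrative sketch, with all names and numbers invented for the example:

```python
# Hypothetical relevance-feedback sketch (NOT VALCRI's real algorithm):
# candidate records are scored as a weighted sum over matching criteria,
# and analyst feedback raises or lowers each criterion's weight in
# proportion to how strongly it contributed to the match.

weights = {"location": 1.0, "time": 1.0, "modus_operandi": 1.0}

def score(record):
    """Weighted sum of how strongly a record matches each criterion (0-1)."""
    return sum(weights[c] * record["match"][c] for c in weights)

def feedback(record, relevant, step=0.5):
    """Adjust criteria weights up or down based on the analyst's judgement."""
    for criterion, strength in record["match"].items():
        delta = step * strength
        weights[criterion] += delta if relevant else -delta
        weights[criterion] = max(weights[criterion], 0.0)  # never negative

candidates = [
    {"id": "A", "match": {"location": 0.9, "time": 0.5, "modus_operandi": 0.1}},
    {"id": "B", "match": {"location": 0.1, "time": 0.3, "modus_operandi": 0.9}},
]

feedback(candidates[0], relevant=False)  # analyst dismisses the location-driven hit
ranked = sorted(candidates, key=score, reverse=True)
print([r["id"] for r in ranked])  # the location-driven record drops in the ranking
```

After the analyst dismisses record "A", the location criterion loses influence and the modus-operandi-driven record "B" rises to the top, mirroring the swipe-to-reweight interaction Kodagoda describes.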
When an unsolved crime lands on an analyst’s desk, one of
the first things they have to do is search police databases for incidents that
could be related based on their location, time or modus operandi, and collect
details of all of the people involved. “An experienced analyst needs 73
individual searches to gather all of this information, before manually putting
it into an easily digestible form,” says Kodagoda. “VALCRI can do this with a
single click.”
This is no mean feat. A lot of the information recorded
in police reports is in side notes and descriptions, but the algorithms
powering VALCRI can understand what is written – at a basic level.
For example, interviews with people at three different
crime scenes may describe an untidy person nearby. One person might have used
the word “scruffy”, another “dishevelled” and the third “messy”. A human would
have no trouble considering that all three might be describing the same person.
Improvements in artificial intelligence mean VALCRI can make such links too.
The system can also use face recognition software to identify people in CCTV
footage or pictures taken at a scene.
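One standard way to link "scruffy", "dishevelled" and "messy" is to represent words as vectors and treat near-neighbours as matches. The sketch below uses tiny hand-made vectors purely for illustration; VALCRI's actual language pipeline is not public, and a real system would use learned embeddings:

```python
import math

# Toy word vectors standing in for a real embedding model (illustration
# only -- these numbers are invented so that the three near-synonyms
# cluster together and an unrelated word does not).
EMBEDDINGS = {
    "scruffy":     [0.90, 0.80, 0.10],
    "dishevelled": [0.85, 0.82, 0.15],
    "messy":       [0.88, 0.75, 0.20],
    "tall":        [0.10, 0.20, 0.90],
}

def cosine(a, b):
    """Cosine similarity between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def similar(word_a, word_b, threshold=0.95):
    """Treat two witness descriptions as matching if their vectors are close."""
    return cosine(EMBEDDINGS[word_a], EMBEDDINGS[word_b]) >= threshold

print(similar("scruffy", "dishevelled"))  # near-synonyms match
print(similar("scruffy", "tall"))         # unrelated descriptions do not
```

With real embeddings trained on large text corpora, words used in similar contexts end up with similar vectors, which is how a system can flag that three differently worded witness statements may describe the same person.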
West Midlands Police in the UK are currently testing
VALCRI with three years’ worth of real but anonymised data, totalling around
6.5 million records. Police in Antwerp, Belgium, are trialling a version of the
system too.
The next stage is to let VALCRI loose on new, non-anonymised
data as crimes happen. This has been agreed in principle, but getting the
final go ahead is a delicate process. Police techniques used during an
investigation can be challenged in court, so deploying VALCRI too soon or
incorrectly could cause cases to collapse. And laws vary between countries on
how police data can be used.
An added complication is that many people would be
uncomfortable with computers determining the probability of different
narratives explaining a crime. “The data in a crime case is simply not good
enough to do that, so VALCRI doesn’t either,” says team member Ifan Shepherd at
Middlesex. “A human analyst always has to call the shots.”
Having humans in charge won’t solve everything. “Machine
learning can help the police, but it will introduce new biases too,” says Mark
Riedl at Georgia Tech in Atlanta. It will be easy for analysts to assume the
system has identified every relevant characteristic, when it is bound to miss
some.
VALCRI tries to counteract this by making the whole
process transparent. Results are never hidden, and every decision can be
retraced. Overall, this could lead to increasingly detailed cases being put to
juries, says Michael Young at the University of Utah in Salt Lake City.
“Narratives could be constructed in a way that preserves provenance,” he says.
In other words, things that would have been left out to
make a case fit together can be included digitally, along with an explanation.
This could be used by both the defence and the prosecution in court to make
each side’s assumptions more transparent, says Young. Sherlock Holmes might be
edged out, but he’d approve.
This article appeared in print under the headline “Robot
detective gets on the case”