Who's driving? Autonomous cars may be entering the most dangerous phase
Autopilot controls are not yet
fully capable of functioning without human intervention – but they’re good
enough to lull us into a false sense of security
When California police officers
approached a Tesla stopped in the centre of a
five-lane highway outside San Francisco last week, they found a man asleep at
the wheel. The driver, who was arrested on suspicion of drunk driving, told
them his car was in “autopilot”, Tesla’s semi-autonomous driver assist system.
In a separate incident this week, firefighters in Culver City reported that
a Tesla rear-ended their parked fire truck as it attended an accident on the
freeway. Again, the driver said the vehicle was in autopilot.
A California Highway Patrol tweet about the incident read: “When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard).” pic.twitter.com/4NSRlOBRBL
The oft-repeated promise of driverless technology is
that it will make the roads safer by reducing human error, the primary cause of
accidents. However, automakers have a long way to go before they can eliminate
the driver altogether.
What’s left is a messy interim period when cars are being
augmented incrementally with automated technologies such as obstacle detection
and lane centring. In theory, these can reduce the risk of crashes, but they
are not failsafe. As a Tesla spokeswoman put it: “Autopilot is intended for use
only with a fully attentive driver.”
However, research has shown that drivers get lulled into a false sense of security to
the point where their minds and gazes start to wander away from the road.
People become distracted or preoccupied with their smartphones. So when the car
encounters a situation where the human needs to intervene, the driver can be
slow to react.
At a time when there is already a surge in collisions caused by drivers distracted
by their smartphones, we could be entering a particularly dangerous period of
growing pains with autonomous driving systems.
“People are already inclined to be distracted. We’re on our phones, eating burgers,
driving with our knees,” said Nidhi Kalra, senior information scientist at the
Rand Corporation. “Additional autonomy gives people a sense that something else
is in control, and we have a tendency to overestimate the technology’s
capabilities.”
Steven Shladover, of the University of California, Berkeley’s Path
programme, was more sharply critical of car manufacturers: “These
companies are overselling the capabilities of the systems they have and the
public is being misled.”
Waymo, Google’s self-driving car spin-off,
discovered the handoff problem when it was testing a “level 3” automated
driving system – one that can drive itself under certain conditions, but in
which the human still needs to take over if the situation becomes tricky. The
next level, four, is what most people consider “fully autonomous”.
Most of the advanced driver assist features introduced by Tesla, Mercedes, BMW and
Cadillac are categorised as level 2 automation.
During testing, Waymo recorded what its CEO, John Krafcik, described as “sort of
scary” video footage of drivers texting, applying makeup and even sleeping
behind the wheel while their cars hurtled down the freeway. This led Waymo to
decide to leapfrog level 3 automation altogether, and focus on full autonomy
instead.