The public thinks Tesla’s Autopilot is safer than it is, an insurance group says
An insurance industry study says the Autopilot name for Tesla’s driver-assist system is misleading. Tesla disagrees.
RUSS MITCHELL | JUNE 20, 2019, 3 AM
For years, automakers have been offering driver-assist technologies on many new cars. The software and sensors can help drivers stay in their lane, maintain a steady speed and, in some cases, change lanes to avoid other vehicles.
But drivers don’t fully understand the technologies’ capabilities and limitations, according to a study released Thursday by a leading insurance industry group.
That disconnect between what tech can do and what people think it can do leads to risky driver behavior that has resulted in crashes, injuries and deaths, and it could provoke a backlash against potentially life-saving driverless cars, said the Insurance Institute for Highway Safety, which is funded by auto insurance companies.
“Driver assistance systems show safety benefits. The last thing we want to see is those benefits eliminated by the automakers introducing other risks into the vehicle,” IIHS President David Harkey said.
Tesla Inc., which is mentioned prominently in the study, disputed the findings.
“This survey is not representative of the perceptions of Tesla owners or people who have experience using Autopilot, and it would be inaccurate to suggest as much,” the electric-car company said in a statement. “If IIHS is opposed to the name ‘Autopilot,’ presumably they are equally opposed to the name ‘Automobile.’”
In all current versions of driver-assist technology, drivers are supposed to stay attentive even as robotic systems take over some driving duties. These systems share some characteristics with the completely driverless cars under development, but they are fundamentally different because a human must remain in control.
The IIHS study questioned more than 2,000 people about the names used by five automobile companies to market systems intended to relieve driver stress and improve safety. Its results indicated that names such as Tesla’s Autopilot create consumer confusion about how much attention a driver really has to pay. The other technologies covered in the study are Audi and Acura Traffic Jam Assist, Cadillac Super Cruise, BMW Driving Assistant Plus and Nissan ProPilot Assist. (The automakers’ names were not mentioned in the study.)
Autopilot created the most confusion among study respondents. Nearly half — 48% — thought it would be safe to take hands off the steering wheel when using the system. The Tesla manual says the driver should hold the wheel at all times, and — when used as intended — Autopilot warns drivers if they do not. Only 21% to 33% of study respondents thought (incorrectly) that it would be safe for a driver to take hands off the wheel when using the other driver-assist systems.
Twenty percent or more thought it would be safe to talk on the phone while using any of the systems. A smaller percentage thought texting, watching a video or taking a nap would be safe. Sixteen percent said it would be safe to text while using Autopilot; less than 10% thought so for the other driver-assist systems. Six percent said it would be safe to nap while using Autopilot; about 3% thought the same about the other systems.
The assumption that Tesla’s Autopilot was more capable of handling intrinsically unsafe behavior didn’t surprise Harkey.
“The name implies something to the consumer about automation, that the vehicle is designed with automated capabilities that allow them to take on other tasks,” he said.
“We’ve seen a number of unfortunate fatal crashes involving Tesla Autopilot,” Harkey said. “It’s apparent from crashes that have happened that people are improperly using the system and improperly inferring the system does more than it is designed to.”
He mentioned a crash in Delray Beach, Fla., in March in which a Tesla with Autopilot engaged drove underneath a semitrailer, shearing off the top of the car and killing the Tesla driver. A similar fatal crash occurred between a Tesla on Autopilot and a semitrailer in 2016.
Since Autopilot was introduced in 2014, videos have circulated showing drivers abusing the system, including one that made the rounds last week showing a driver apparently snoozing in his Tesla as it traveled along the 405 Freeway in Los Angeles.
Tesla said it gives drivers clear direction about Autopilot’s capabilities and how it should be used.
“Tesla provides owners with clear guidance on how to properly use Autopilot, as well as in-car instructions before they use the system and while the feature is in use,” the company said. “If a Tesla vehicle detects that a driver is not engaged while Autopilot is in use, the driver is prohibited from using it for that drive.”
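The lockout Tesla describes amounts to a warn-then-disable policy: the car nags an inattentive driver, and repeated ignored warnings shut the feature off until the next drive. A minimal sketch of such a policy in Python; all names and thresholds here are hypothetical illustrations, not Tesla’s actual software:

    # Hypothetical sketch of a warn-then-lock-out engagement policy.
    # Names and thresholds are illustrative, not Tesla's implementation.
    class AssistLockout:
        MAX_IGNORED_WARNINGS = 3  # assumed strike limit per drive

        def __init__(self):
            self.ignored_warnings = 0
            self.locked_out = False  # resets only when a new drive begins

        def on_inattention_alert(self, driver_responded):
            """Handle one hands-off-the-wheel alert cycle."""
            if self.locked_out:
                return "assist unavailable for the rest of this drive"
            if driver_responded:
                return "warning cleared"
            self.ignored_warnings += 1
            if self.ignored_warnings >= self.MAX_IGNORED_WARNINGS:
                self.locked_out = True
                return "assist disabled until the car is restarted"
            return "escalating warning"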
Tesla also said “the survey did not expose participants to any of the materials or education that we provide drivers and owners with before they use Autopilot.”
The company said the IIHS study implies that the deaths in the two crashes involving semitrailers were the result of misunderstanding by the Tesla drivers. Tesla said family statements about the drivers’ familiarity with Autopilot showed that was not the case, at least in the 2016 crash. The March crash in Delray Beach is still under investigation by federal safety officials.
The insurance group said the study was meant to look at how mass-market consumers, who may or may not be familiar with driver-assist technology, react to brand names.
Kelly Funkhouser, head of connected and automated vehicles at Consumer Reports, said companies must do a better job of communicating the limitations of driver-assist systems.
“As these get better, it’s human nature to check out,” she said. “We get bored. It’s second nature to pick up your phone. The problem is, these systems are still in their infancy. They’re good enough to give you a sense of confidence in them but not enough to maintain accurate control as the car moves through different conditions.”
Funkhouser researched the driver-robot connection in self-driving cars at the University of Utah before she joined Consumer Reports last year. The better the driver-assist system, she said, the more time it took for a human driver to react to problems.
She also studied brand names at Utah and found that people overestimated Autopilot’s capabilities but underestimated the capabilities of Cadillac’s Super Cruise, whose abilities are similar to Autopilot’s but which uses an eye tracker instead of steering-wheel torque measurement to monitor driver attention. (Autopilot won’t notice a driver is sleeping if he or she keeps a hand on the wheel, Funkhouser noted, but the Cadillac with its eye tracker would register a problem and the car would move to the side of the road.)
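The distinction Funkhouser draws comes down to which signal the car treats as a proxy for attention: torque sensing only confirms a hand on the wheel, while camera-based tracking checks whether the driver’s eyes are open and on the road. A rough sketch of the two checks, with hypothetical function names and an assumed threshold:

    # Hypothetical contrast of two driver-monitoring signals.
    # A sleeping driver with a hand resting on the wheel passes the
    # torque check but fails the eye-tracking check.
    def torque_check_ok(wheel_torque_nm):
        # Steering-wheel torque sensing: any light grip reads as "attentive".
        return abs(wheel_torque_nm) > 0.1  # assumed detection threshold

    def eye_tracker_ok(eyes_open, eyes_on_road):
        # Camera-based monitoring: requires open eyes looking at the road.
        return eyes_open and eyes_on_road

    # Sleeping driver, hand resting on the wheel:
    print(torque_check_ok(0.3))          # True: the torque check is fooled
    print(eye_tracker_ok(False, False))  # False: the eye tracker flags trouble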
It’s important for people to see beyond marketing hype, Funkhouser said. “A lot of people with a lot of money are promising things that aren’t real or haven’t even been invented yet.”