Flaw discovered in Driverless Cars - They obey the law all the time
Humans Are Slamming Into Driverless Cars and Exposing a Key Flaw
By Keith Naughton
December 17, 2015 — 4:01 PM PST Updated on December 18, 2015 — 3:30 AM PST
The self-driving car, that cutting-edge creation that’s
supposed to lead to a world without accidents, is achieving the exact opposite
right now: The vehicles have racked up a crash rate double that of those with
human drivers.
The glitch?
They obey the law all the time, as in, without exception.
This may sound like the right way to program a robot to drive a car, but good
luck trying to merge onto a chaotic, jam-packed highway with traffic flying
along well above the speed limit. It tends not to work out well. As the
accidents have piled up -- all minor scrape-ups for now -- the arguments among
programmers at places like Google Inc. and Carnegie Mellon University are
heating up: Should they teach the cars how to commit infractions from time to
time to stay out of trouble?
“It’s a constant debate inside our group,” said Raj
Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving
Collaborative Research Lab in Pittsburgh. “And we have basically decided to
stick to the speed limit. But when you go out and drive the speed limit on the
highway, pretty much everybody on the road is just zipping past you. And I
would be one of those people.”
Last year, Rajkumar offered test drives to members of
Congress in his lab’s self-driving Cadillac SRX sport utility vehicle. The
Caddy performed perfectly, except when it had to merge onto I-395 South and
swing across three lanes of traffic in 150 yards (137 meters) to head toward
the Pentagon. The car’s cameras and laser sensors detected traffic in a
360-degree view but didn’t know how to trust that drivers would make room in
the ceaseless flow, so the human minder had to take control to complete the
maneuver.
“We end up being cautious,” Rajkumar said. “We don’t want
to get into an accident because that would be front-page news. People expect
more of autonomous cars.”
Not at Fault
Turns out, though, their accident rates are twice as high
as for regular cars, according to a study by the University of Michigan’s
Transportation Research Institute in Ann Arbor, Michigan. Driverless vehicles
have never been at fault, the study found: They’re usually hit from behind in
slow-speed crashes by inattentive or aggressive humans unaccustomed to machine
motorists that always follow the rules and proceed with caution.
“It’s a dilemma that needs to be addressed,” Rajkumar
said.
It’s similar to the thorny ethical issues driverless car
creators are wrestling with over how to program them to make life-or-death
decisions in an accident. For example, should an autonomous vehicle sacrifice
its occupant by swerving off a cliff to avoid killing a school bus full of
children?
California is urging caution in the deployment of
driverless cars. It published proposed rules this week that would require a
human always to be ready to take the wheel and also compel companies creating
the cars to file monthly reports on their behavior. Google -- which developed a
model with no steering wheel or gas pedal -- said it is “gravely disappointed”
in the proposed rules, which could set the standard for autonomous-car
regulations nationwide.
Fast Track
Google is on a fast track. It plans to make its
self-driving-cars unit a stand-alone business next year and eventually offer a
ride-for-hire service, according to a person briefed on the company’s strategy.
Google cars have been in 17 minor crashes in 2 million
miles (3.2 million kilometers) of testing and account for most of the reported
accidents, according to the Michigan study. That’s partly because the company
is testing mainly in California, where accidents involving driverless cars must
be reported.
The most recent reported incident was Nov. 2 in Mountain
View, California, Google’s headquarters, when a self-driving Google Lexus SUV
attempted to turn right on a red light. It came to a full stop, activated its
turn signal and began creeping slowly into the intersection to get a better
look, according to a report the company posted online. Another car stopped
behind it and also began rolling forward, rear-ending the SUV at 4 mph. There
were no injuries and only minor damage to both vehicles.
Robot-Car Stop
Ten days later, a Mountain View motorcycle cop noticed
traffic stacking up behind a Google car going 24 miles an hour in a busy 35 mph
zone. He zoomed over and became the first officer to stop a robot car. He
didn’t issue a ticket -- who would he give it to? -- but he warned the two
engineers on board about creating a hazard.
“The right thing would have been for this car to pull
over, let the traffic go and then pull back on the roadway,” said Sergeant Saul
Jaeger, head of the police department’s traffic-enforcement unit. “I like it
when people err on the side of caution. But can something be too cautious?
Yeah.”
While Google rejects the notion that its careful cars
cause crashes, “we err on the conservative side,” said Dmitri Dolgov, principal
engineer of the program. “They’re a little bit like a cautious student driver
or a grandma.”
More Aggressive
Google is working to make the vehicles more “aggressive”
like humans -- law-abiding, safe humans -- so they “can naturally fit into the
traffic flow, and other people understand what we’re doing and why we’re doing
it,” Dolgov said. “Driving is a social game.”
Google has already programmed its cars to behave in more
familiar ways, such as inching forward at a four-way stop to signal they’re
going next. But autonomous models still surprise human drivers with their quick
reflexes, coming to an abrupt halt, for example, when they sense a pedestrian
near the edge of a sidewalk who might step into traffic.
“These vehicles are either stopping in a situation or
slowing down when a human driver might not,” said Brandon Schoettle, co-author
of the Michigan study. “They’re a little faster to react, taking drivers behind
them off guard.”
That could account for the prevalence of slow-speed,
rear-end crashes, he added.
Behave Differently
“They do behave differently,” said Egil Juliussen, senior
director at consultant IHS Technology and author of a study on how Google leads
development of autonomous technology. “It’s a problem that I’m sure Google is
working on, but how to solve it is not clear.”
One approach is to teach the vehicles when it’s OK to
break the rules, such as crossing a double yellow line to avoid a bicyclist or
road workers.
“It’s a sticky area,” Schoettle said. “If you program
them to not follow the law, how much do you let them break the law?”
Initially, crashes may rise as more robot autos share the
road, but injuries should diminish because most accidents will be minor,
Schoettle said.
“There’s a learning curve for everybody,” said Jaeger, of
the Mountain View Police, which interacts more with driverless cars than any
other law-enforcement unit.
“Computers are learning, the programmers are learning and
the people are learning to get used to these things.”