Who's driving? Autonomous cars may be entering the most dangerous phase

Autopilot controls are not yet fully capable of functioning without human intervention – but they’re good enough to lull us into a false sense of security
When California police officers approached a Tesla stopped in the centre of a five-lane highway outside San Francisco last week, they found a man asleep at the wheel. The driver, who was arrested on suspicion of drunk driving, told them his car was in “autopilot”, Tesla’s semi-autonomous driver assist system.
In a separate incident this week, firefighters in Culver City reported that a Tesla rear-ended their parked fire truck as it attended an accident on the freeway. Again, the driver said the vehicle was in autopilot.

When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard). pic.twitter.com/4NSRlOBRBL
The oft-repeated promise of driverless technology is that it will make the roads safer by reducing human error, the primary cause of accidents. However, automakers have a long way to go before they can eliminate the driver altogether.
What’s left is a messy interim period when cars are being augmented incrementally with automated technologies such as obstacle detection and lane centering. In theory, these can reduce the risk of crashes, but they are not fail-safe. As a Tesla spokeswoman put it: “Autopilot is intended for use only with a fully attentive driver.”
However, research has shown that drivers get lulled into a false sense of security to the point where their minds and gazes start to wander away from the road. People become distracted or preoccupied with their smartphones. So when the car encounters a situation where the human needs to intervene, the driver can be slow to react.
At a time when there is already a surge in collisions caused by drivers distracted by their smartphones, we could be entering a particularly dangerous period of growing pains with autonomous driving systems.
“People are already inclined to be distracted. We’re on our phones, eating burgers, driving with our knees,” said Nidhi Kalra, senior information scientist at the Rand Corporation. “Additional autonomy gives people a sense that something else is in control, and we have a tendency to overestimate the technology’s capabilities.”
Steven Shladover, of the University of California, Berkeley’s Path programme, was more sharply critical of car manufacturers: “These companies are overselling the capabilities of the systems they have and the public is being misled.”
Waymo, Google’s self-driving car spin-off, discovered the handoff problem when it was testing a “level 3” automated driving system – one that can drive itself under certain conditions, but in which the human still needs to take over if the situation becomes tricky. The next level, four, is what most people consider “fully autonomous”.
Most of the advanced driver assist features introduced by Tesla, Mercedes, BMW and Cadillac are categorised as level 2 automation.
During testing, Waymo recorded what its CEO, John Krafcik, described as “sort of scary” video footage of drivers texting, applying makeup and even sleeping behind the wheel while their cars hurtled down the freeway. This led Waymo to decide to leapfrog level 3 automation altogether, and focus on full autonomy instead.
“We found that human drivers over-trusted the technology and were not monitoring the roadway carefully enough to be able to safely take control when needed,” said the company in its 2017 safety report.
Ian Reagan from the Insurance Institute for Highway Safety (IIHS) shares Waymo’s caution, although he acknowledges that the safety potential for automated systems is “huge”.
“There are lots of potential unintended consequences, particularly with level 2 and 3 systems,” he said, explaining how the IIHS had bought and tested several cars with level 2 automation including vehicles from Tesla, Mercedes and BMW. “Even the best ones do things you don’t expect,” he said.
During tests the IIHS recorded a Mercedes having problems when the lane on the highway forked in two. “The radar system locked onto the right-hand exit lane when the driver was trying to go straight,” he said.
Tesla’s autopilot suffered from a different, repeatable glitch that caused it to veer into the guardrail when approaching the crest of a hill. “If the driver had been distracted, that definitely would have caused a crash,” he said.
Concern over this new type of distracted driving is forcing automakers to introduce additional safety features to compensate. For example, GM has introduced eye-tracking technology to check that the driver’s eyes are on the road, while Tesla drivers can be locked out of autopilot if they ignore warnings to keep their hands on the steering wheel.
That hasn’t stopped some enterprising owners from finding a way to trick the autopilot warning system by wedging an orange or a water bottle into the steering wheel.
