Google has been pressing NHTSA to decide legal and technical questions about self-driving cars under Department of Transportation law. The agency said earlier this year that it would respond quickly to requests from technology companies and auto suppliers. A letter sent to Google suggests that the software controlling an autonomous car could be considered “the driver.”
The letter, interpreting the legal status of Google’s computer-controlled cars that operate without human drivers, was sent to Chris Urmson on February 4 by NHTSA attorney Paul A. Hemmersbaugh.
“NHTSA will interpret ‘driver’ in the context of Google’s described motor vehicle design as referring to the (self-driving system), and not to any of the vehicle occupants,” the letter stated.
Laws still require braking systems activated by foot control, and it is not clear “whether and how Google could certify that the system meets a standard developed and designed to apply to a vehicle with a human driver.”
Consumer Watchdog says the National Highway Traffic Safety Administration is wrong to count the artificial intelligence guiding an autonomous robot car as the driver. The group notes that Google’s own test data demonstrate the need for a human driver who can take control when necessary.
“Google says its robot technology failed and handed over control to a human test driver 272 times and the driver was scared enough to take control 69 times,” said John M. Simpson, Consumer Watchdog’s Privacy Project Director. “The robot cars simply cannot reliably deal with everyday real traffic situations. Without a driver, who do you call when the robots fail?”
Consumer Watchdog reiterated its support for regulations proposed by the California Department of Motor Vehicles covering the general deployment of autonomous robot cars on the state’s highways.
“The DMV would require a licensed driver behind the wheel,” Simpson noted. “If you really care about the public’s safety, that’s the only way to go.”
Commenting on NHTSA’s interpretation that the robot technology can count as a driver, Transportation Secretary Anthony Foxx said, “We are taking great care to embrace innovations that can boost safety and improve efficiency on our roadways. Our interpretation that the self-driving computer system of a car could, in fact, be a driver is significant. But the burden remains on self-driving car manufacturers to prove that their vehicles meet rigorous federal safety standards.”
Consumer Watchdog said it will press NHTSA and the DOT to ensure that robot car manufacturers prove their cars are safe. The group also called on NHTSA to learn from California’s experience with self-driving robot cars.
The companies’ own data in reports filed with the California DMV make clear that a human driver able to take control of the vehicle is necessary to ensure the safety of both robot vehicles and other vehicles on the road, Consumer Watchdog said.
Google, which logged 424,331 “self-driving” miles over the 15-month reporting period, said a human driver had to take over 341 times, an average of 22.7 times a month. The robot car technology failed 272 times and ceded control to the human driver; the driver felt compelled to intervene and take control 69 times, according to its “disengagement report” filed with the DMV.
Other testing companies, driving far fewer autonomous miles than Google, also reported substantial numbers of disengagements to the DMV. Bosch reported 625 disengagements over 934.4 miles driven; Nissan, 106 over 1,485 miles; Mercedes-Benz, 1,031 over 1,738 miles; Delphi, 405 over 16,662 miles; and Volkswagen, 260 over 10,416 miles. Tesla claimed it had none, but did not say how many miles it drove.
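The figures above are easier to compare when normalized to a common rate. A minimal Python sketch, using only the numbers reported in the DMV filings cited in this article, might look like this (company totals and Google’s 15-month reporting period are taken from the text):

```python
# Disengagement rates per 1,000 autonomous miles, from the figures
# in the California DMV reports cited above.
reports = {
    "Google":        {"miles": 424_331, "disengagements": 341},
    "Bosch":         {"miles": 934.4,   "disengagements": 625},
    "Nissan":        {"miles": 1_485,   "disengagements": 106},
    "Mercedes-Benz": {"miles": 1_738,   "disengagements": 1_031},
    "Delphi":        {"miles": 16_662,  "disengagements": 405},
    "Volkswagen":    {"miles": 10_416,  "disengagements": 260},
}

for company, r in reports.items():
    rate = r["disengagements"] / r["miles"] * 1_000
    print(f"{company}: {rate:.1f} disengagements per 1,000 miles")

# Google's monthly average over the 15-month reporting period,
# matching the "22.7 times a month" figure in the text.
monthly_average = 341 / 15
print(f"Google monthly average: {monthly_average:.1f}")
```

On these numbers, Google’s rate works out to well under one disengagement per 1,000 miles, while the companies with far less test mileage show rates orders of magnitude higher.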
It’s important to understand that these “disengagements” were prompted by real situations that drivers routinely encounter on the road, Consumer Watchdog said. Among the reasons cited by Bosch were failures to detect traffic lights and heavy pedestrian traffic.
Google’s robot technology quit 13 times because it couldn’t handle the weather conditions. Twenty-three times the driver took control because of reckless behavior by another driver, cyclist or pedestrian. The report said the robot car technology disengaged for a “perception discrepancy” 119 times. Google defines such a discrepancy as occurring when the car’s sensors don’t correctly perceive an object, for instance overhanging branches. The robot technology was disengaged 55 times for “an unwanted maneuver of the vehicle.” An example would be coming too close to a parked car. The human took over from Google’s robot car three times because of road construction.
“What the disengagement reports show is that there are many everyday routine traffic situations with which the self-driving robot cars simply can’t cope,” said Simpson. “Self-driving vehicles simply aren’t ready to safely manage many routine traffic situations without human intervention.”