As Google tries to steer around U.S. automotive safety laws while testing self-driving cars in Phoenix, two separate studies reveal two kinds of features needed for autonomous driving—cameras facing the driver and real-time maps. Strategy Analytics reports that direct sensing of the driver’s eyes is important for handing control of the vehicle back to the driver. ABI Research reports that real-time maps are essential for the next phase of self-driving autonomy.
Google Tries to Drive Through NHTSA But Gets Stopped by Watchdog
Google announced today that it will test its Lexus self-driving cars in Phoenix, Arizona, exposing them to a desert environment. Google claims the company has racked up more than 1.5 million miles and is currently testing on the streets of Mountain View, CA; Austin, TX; Kirkland, WA; and metro Phoenix, AZ.
A little-publicized proposal by Google that would allow its driverless cars to sidestep U.S. auto safety laws threatens public safety and security, Consumer Watchdog said in a letter sent today to Secretary of Transportation Anthony Foxx and the National Highway Traffic Safety Administration (NHTSA).
Citing Google’s consistent refusal to disclose information about its driverless cars’ crashes, how its software algorithms will make life-and-death decisions, the cars’ vulnerability to hackers, and their collection of personal data, the non-profit advocacy group Consumer Watchdog urged the Department of Transportation and NHTSA to require Google to answer ten questions about the safety of its robot car program within thirty days.
NHTSA will hold its very first public hearing on the subject tomorrow, Friday April 8, in Washington, D.C. Consumer Watchdog’s Privacy Director John Simpson will testify at the hearing, as will former NHTSA head Joan Claybrook and Clarence Ditlow, Executive Director of the Center for Auto Safety.
Federal law requires that automobile manufacturers demonstrate the safety of their vehicles to NHTSA through a formal regulatory process that enables American taxpayers and consumers to monitor and participate in the agency’s decision-making process. A series of recent proclamations by NHTSA suggest that the agency is considering abandoning its statutory responsibility to set federal safety standards.
Driver monitoring systems are growing in importance as a way to determine whether the driver is alert and ready to take back control of the car. Meanwhile, crowdsourcing of maps will be equally important.
Most driver monitoring is currently inferred from existing sensors fitted to the vehicle, as mandated for antilock brake and electronic stability control systems, and from electric power steering systems that enhance comfort and fuel economy. Deployments of these inferred systems will grow from 8.7 million units in 2016 to 16.8 million by 2022.
The Strategy Analytics report, “Semi-Autonomous Applications Accelerate Development in Automotive Driver Monitoring Systems,” tracks the growing importance of direct sensing of the driver’s eyes, face and head to ensure the accurate assessment of the driver’s ability to retake control from a semi-autonomous vehicle.
“As we have seen from numerous cockpit concepts at CES 2016, interior camera-based driver monitoring systems will be required for semi-autonomous vehicles, as inferred systems using accelerometers, steering angle, wheel speed sensors and the front windshield camera cannot accurately determine the driver’s state,” said Kevin Mak, Senior Analyst for Powertrain, Body, Chassis & Safety at Strategy Analytics.
He added, “While inferred systems only add software algorithms to existing sensors, the interior camera-based systems have greater technical challenges, such as the varying levels of sunlight in the cabin and the complexity in observing the human face – to detect the tell-tale signs of distraction and drowsiness. The current system supplied to the Lexus LS sedan has not progressed beyond a single auto maker and beyond the luxury model segment. New developments aim to extend the accuracy of driver monitoring and to deploy the system more widely to other auto makers. Allied with the potential demand from semi-autonomous vehicles and the requirement to add other functions using the same interior camera, Strategy Analytics expects demand for interior camera driver monitoring to grow to 3.7M units by 2022.”
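The inferred approach the analyst describes can be pictured with a toy heuristic: an alert driver makes frequent small steering corrections, while a drowsy driver shows long steady stretches broken by sudden large corrections. The following sketch is purely illustrative (the function names, thresholds, and units are assumptions, not any supplier’s actual algorithm):

```python
# Hypothetical sketch of inferred driver monitoring from steering-angle data.
# Real systems fuse steering angle, wheel speed, accelerometer, and camera
# inputs; this toy version only counts steering "micro-correction" reversals.

def steering_reversal_rate(angles, dead_band_deg=0.5):
    """Count direction reversals in a window of steering-angle samples (degrees)."""
    reversals = 0
    last_direction = 0
    for prev, curr in zip(angles, angles[1:]):
        delta = curr - prev
        if abs(delta) < dead_band_deg:
            continue  # ignore sensor noise below the dead band
        direction = 1 if delta > 0 else -1
        if last_direction and direction != last_direction:
            reversals += 1
        last_direction = direction
    return reversals

def driver_seems_drowsy(angles, min_reversals=3):
    """Flag drowsiness when micro-corrections fall below an assumed baseline."""
    return steering_reversal_rate(angles) < min_reversals
```

As the report notes, such inferred signals are cheap because they only add software to mandated sensors, but they cannot directly observe the driver’s face, which is why camera-based systems are expected to complement them in semi-autonomous vehicles.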
Accurate Maps Required
ABI Research reports that accurate, real-time maps are an essential next step as the automotive industry steers toward a future of fully driverless cars. Autonomous and driverless vehicle maps will need to combine accuracy, environmental models, and real-time attributes that allow positional and temporal awareness.
“Crowdsourcing is crucial,” says Dominique Bonte, Managing Director and Vice President at ABI Research. “As connected vehicles include more low-cost, high-resolution sensors, cars will capture and upload this data to a central, cloud-based repository so that automotive companies, such as HERE, can crowdsource the information to build highly accurate, real-time precision maps. This is fueled by the rapid adoption of a wide range of active safety systems with more than 94 million longitudinal assistance ADAS systems expected to ship in 2026.”
The new 3D, dynamic maps will provide a complementary data set to ADAS sensors for an overall smoother driving experience. Whereas sensors provide real-time visibility on a vehicle’s immediate vicinity for last-minute obstacle detection and collision avoidance, maps extend this visibility to allow vehicles to anticipate those situations long before the sensors would even have to detect them. Maps will work in harmony with ADAS sensors to dramatically improve overall accuracy and predictability.
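The division of labor described above—sensors for the immediate vicinity, maps for the road far ahead—can be sketched in a few lines. Everything here is a simplified illustration under assumed values (the 150 m sensor horizon, the route-distance hazard format, and the function names are all hypothetical):

```python
# Illustrative sketch: an HD map lets the vehicle anticipate hazards that lie
# beyond the forward sensing horizon of its onboard ADAS sensors.

SENSOR_RANGE_M = 150  # assumed forward sensing horizon of the ADAS sensors

def upcoming_map_hazards(position_m, map_hazards, lookahead_m=1000):
    """Return map-annotated hazards ahead of the vehicle but beyond sensor reach.

    map_hazards: list of (distance_along_route_m, description) tuples.
    """
    horizon = position_m + lookahead_m
    return [
        (dist, desc)
        for dist, desc in map_hazards
        if position_m + SENSOR_RANGE_M < dist <= horizon
    ]

hazards = [(100, "stalled car"), (400, "sharp curve"), (2000, "roadworks")]
# At position 0: the stalled car (100 m) is already inside sensor range and the
# roadworks (2000 m) lie beyond the lookahead, so only the sharp curve at 400 m
# is something the map, rather than the sensors, lets the vehicle anticipate.
```

This is the sense in which maps and sensors "work in harmony": the map supplies early warning for planning, while the sensors remain authoritative for last-minute obstacle detection and collision avoidance.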
The battle to control crowdsourced HD map technology for driverless cars is heating up. Daimler, which jointly owns HERE with BMW and Audi, confirmed talks with Amazon and Microsoft about joining the consortium. Mobileye signed agreements with GM, VW, and Nissan to use its Road Experience Management (REM) mapping platform. And at the GPU Technology Conference (GTC), NVIDIA announced its new HD mapping approach based on its DRIVE™ PX machine vision hardware platform. Car OEM Toyota also announced its own mapping platform, working with mapping supplier Zenrin in Japan.
ABI Research argues that the biggest challenge for the new mapping paradigm is the lack of standards coupled with high levels of fragmentation in the automotive industry. Despite HERE’s efforts to assemble the industry around its Sensor Integration Standard for real-time map attributes, many players, like ADAS vendor Mobileye, are vying to play a role in map data crowdsourcing and proposing and/or imposing their own proprietary approaches.