How Tesla’s Driver Monitoring for Autonomous Driving Was Proven to Be Faulty

The Dawn Project uncovered that Tesla’s driver monitoring safety system, which is supposed to check that the driver is paying attention to the road ahead, is defective. The system also fails to detect when the driver has both hands off the steering wheel when the vehicle is in Full Self-Driving mode.

Tesla documents numerous limitations of its Full Self-Driving system on its website. It has problems with interactions with pedestrians, bicyclists, and other road users; unprotected turns; multi-lane turns; simultaneous lane changes; narrow roads; rare objects; merges onto high-traffic, high-speed roads; debris in the road; construction zones; and high curvature roads. The website also warns “rain, snow, direct sun, fog, etc. can significantly degrade performance.”

Tesla goes on to warn that the car “may quickly and suddenly make unexpected maneuvers or mistakes that require immediate driver intervention. The list above represents only a fraction of the possible scenarios that can cause Full Self-Driving (Beta) to make sudden maneuvers and behave unexpectedly. In fact, [the car] can suddenly swerve even when driving conditions appear normal and straight-forward.”

Tesla also warns that the software “may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road”.

Regulators have allowed this ineffective software to be sold to 400,000 ordinary consumers only on the condition that a driver is in the car, paying attention to the road, with both hands on the steering wheel, and ready to take over immediately.

Research has shown that the only way to ensure a driver is paying attention is to implement an effective driver monitoring system using cameras.

Tesla states that “the cabin camera can determine driver inattentiveness and provide you with audible alerts, to remind you to keep your eyes on the road when Autopilot is engaged.”

However, The Dawn Project’s tests of Tesla’s driver monitoring system reveal that the internal camera fails to recognise the following actions commonly performed by an inattentive driver:

  • Looking out of the side window for a prolonged period of time
  • Eating a meal while actively not paying attention to the road ahead
  • Turning around and looking in the back seat
  • Placing a weight on the steering wheel to simulate a driver’s hands

Tesla convinced NHTSA and the California DMV to designate Full Self-Driving as a Level 2 Advanced Driver Assistance System, while simultaneously marketing it as a fully autonomous car.

Joshua Brown’s fatal self-driving collision with a tractor-trailer in May 2016 was attributed to driver inattention by the National Transportation Safety Board (NTSB). The NTSB said the truck should have been visible to Brown for at least seven seconds before impact. Brown “took no braking, steering or other actions to avoid the collision,” the NTSB report said. As a result, NHTSA required Tesla to add a driver monitoring system.

Dan O’Dowd, Founder of The Dawn Project, commented: “Tesla’s driver monitoring system is not fit for purpose. It is simply paying lip service to the regulators who were misled into allowing Tesla to ship an unsafe self-driving car to more than a million ordinary consumers, on the basis that there was an effective driver monitoring system in place.

“Tesla’s driver monitoring safety system has been programmed to pass a cursory test but is completely ineffective in many real-world scenarios. Tesla has access to the cabin camera footage from all 840 crashes. Tesla must know that many of the drivers were not paying attention at the time of the crash. 23 people have already perished in situations where Tesla’s self-driving software was active.

“The Dawn Project’s safety tests show that Tesla drivers can stare out of the window for minutes at a time, look in the back seat for over five minutes, eat a meal and completely remove their hands from the steering wheel while the vehicle is in self-driving mode, a clear failure of Tesla’s ineffective driver monitoring system.

“The chilling reality of Tesla’s failure to develop a fully functional driver monitoring system is that over a million self-driving Teslas on the road today are unable to detect an inattentive driver, and no doubt many of the 840 crashes and 23 self-driving deaths would have been avoided if Tesla had implemented an effective driver monitoring system.

“This typifies the lack of care and consideration that goes into the design and testing of Tesla’s Full Self-Driving safety features. Tesla’s driver monitoring system satisfies the regulators but is often unable to detect driver inattentiveness.

“The lack of an effective driver monitoring system that requires people to keep their hands on the steering wheel and pay attention to the road means that Tesla’s Full Self-Driving system is a breach of NHTSA’s requirements. The regulator must immediately recall Full Self-Driving and put an end to the deployment of Tesla Full Self-Driving.

“Without an effective driver monitoring system, over a million Tesla drivers are in Joshua Brown’s seat.”

Dan O’Dowd is an entrepreneur and CEO with over 40 years’ experience in designing and writing secure, safety-critical software. Dan has built operating systems for the U.S. military’s fighter jets and for some of the world’s most trusted organizations, including NASA, Boeing, and Airbus.

In 2021, Dan O’Dowd founded The Dawn Project, which campaigns to make computers safe for humanity by ensuring all software in safety-critical infrastructure never fails and can’t be hacked. The first danger The Dawn Project is tackling is Tesla’s deployment of unsafe Full Self-Driving cars on our roads.