Tesla Autopilot Recall Is Not Enough and Does Not Solve Dangerous Problems

Software expert Dan O’Dowd, Founder of safety advocacy group The Dawn Project, commented on NHTSA’s recall of over two million Tesla vehicles, saying that the recall does not go far enough to address critical safety issues in Tesla’s self-driving software.

Commenting on the recall, O’Dowd said:

“NHTSA’s recall misses the point: Tesla must address and fix the underlying safety concerns that have been raised regarding its self-driving software to prevent further deaths. Without addressing these critical safety issues, the public will continue to serve as crash test dummies for Tesla’s self-driving experiment. The only way to protect road users is to ban Tesla’s self-driving software. Minor software updates to Tesla’s driver monitoring system will not be enough.

“This recall does not solve the underlying problems associated with Tesla’s software – namely, that it does not reliably recognize objects and does not stop for hazards such as school buses and stop signs. Until we take this dangerous technology off the roads, and until Tesla stops testing its systems in real-life situations, we continue to needlessly put Americans at risk.

“NHTSA must now act swiftly to ban Tesla Full Self-Driving from public roads until all safety defects have been fixed. Allowing Tesla to issue a voluntary recall is completely inadequate and regulators must compel Tesla to fix these defects and set a deadline for determining whether Tesla has addressed these issues.”

NHTSA’s recall requiring improvements to Tesla’s driver monitoring system follows The Dawn Project’s campaign to highlight serious concerns about this software.

The Dawn Project’s tests revealed that Tesla’s driver monitoring system did not detect an inattentive driver.

Videos of these tests can be found here: