Tesla Robotaxis to Roll in Austin on June 22?

Tesla’s much-hyped Robotaxi rollout—slated for June 22, 2025—has captured headlines and stirred excitement in the auto and tech worlds. Elon Musk announced last month that Tesla will deploy 10 to 20 fully autonomous Model Ys, outfitted without steering wheels or pedals, as a driverless ride-hailing fleet in Austin, Texas.

While the promise of autonomous Tesla taxis roving the streets of Austin is tantalizing, industry analysts, longtime Musk-watchers, and even some Tesla investors are approaching the date with skepticism—because the Tesla CEO has a long history of missing deadlines, stretching definitions, and overpromising on automation.

A History of Moving Targets

Elon Musk first promised a self-driving Tesla would complete a cross-country trip from Los Angeles to New York by the end of 2017. That drive never happened. In 2019, he stated with confidence that “a year from now,” Tesla would have “over a million cars with full self-driving, software, everything” on the road. Five years later, none of Tesla’s vehicles are considered fully autonomous by any industry standard.

Musk has repeatedly predicted the arrival of Level 5 autonomy—defined by the Society of Automotive Engineers as a vehicle capable of operating without any human intervention under all conditions. Yet, Tesla’s current “Full Self-Driving” (FSD) software remains at best Level 2, requiring a driver to remain alert and responsible for the vehicle.

This gap between projection and reality has raised flags in both the tech and auto sectors. Critics argue that Musk’s announcements often serve more as strategic distractions or investor bait than as concrete roadmaps.

What’s New This Time?

Tesla says the upcoming Robotaxi launch in Austin is different. The company has reportedly retrofitted Model Ys with custom interiors that eliminate the steering wheel and pedals entirely. The fleet will operate in a geofenced zone, a more limited approach compared to the original cross-country ambitions. If true, this represents a cautious and pragmatic step forward.

Tesla is not alone in this race. Alphabet’s Waymo and Amazon’s Zoox already operate robotaxis in select areas, using lidar-based navigation systems. Tesla, by contrast, relies exclusively on cameras and neural nets, rejecting radar and lidar. That philosophical difference is both the basis of Tesla’s promise—and its greatest liability.

While Waymo has steadily scaled its driverless services in Phoenix, San Francisco, and Los Angeles, Tesla’s approach has not yet delivered similar results in real-world conditions.

One of the biggest unknowns surrounding the June 22 launch is regulation. The National Highway Traffic Safety Administration (NHTSA) has yet to approve the deployment of vehicles without traditional controls. Any last-minute regulatory pushback could delay or dilute the launch.

Then there are safety concerns. Tesla’s FSD software is still under federal investigation following a series of accidents, and there is no public data showing that its driverless system meets or exceeds the safety levels of human drivers.

Even Tesla bulls admit there’s reason to be cautious. “If anyone else promised what Musk does, they’d be laughed out of the room,” said Colin Rusch, senior analyst at Oppenheimer. “But he’s delivered enough in the past to keep people listening.”

Meanwhile, the Dawn Project has demonstrated a Tesla in FSD mode running over a child dummy on a street in Austin.

A live demonstration held in Austin on June 12, 2025, revealed alarming failures in Tesla’s Full Self-Driving (FSD) software. Organized by public safety advocacy group The Dawn Project, in collaboration with Tesla Takedown and ResistAustin, the event aimed to expose critical flaws in the latest version of the FSD system—version 13.2.9.

During the demonstration, a Tesla operating in Full Self-Driving mode was repeatedly tested in a controlled environment in which a child-sized mannequin crossed the road in front of a stopped school bus. The school bus displayed its red flashing lights and extended stop sign, as required by law. In all eight test runs, the FSD-equipped Tesla failed to stop: it struck the child mannequin in every attempt and illegally passed the stopped school bus.

Observers noted that at no point did the software disengage or alert the driver after the collisions. According to The Dawn Project, these repeated failures highlight a serious public safety risk in the real-world deployment of Tesla's autonomous software.

ResistAustin organizer Nevin Kamath commented: “Austinites are not Elon’s personal crash-test dummies. ResistAustin is appalled that Musk chose our town as the launching pad for this dangerous technology, and we encourage the people of Austin to boycott this dangerous service.”