Amid heated debate over the safety of autonomous vehicles, Waymo’s co-founder publicly criticized Tesla’s approach to Full Self-Driving technology even as his own company’s driverless taxis face federal scrutiny for repeatedly and illegally passing stopped school buses.
This story began attracting widespread attention after John Krafcik, one of the original architects of Waymo’s autonomous driving efforts, appeared on a popular industry podcast to call out Tesla’s camera-only system as “myopic” and insufficient for truly safe autonomy.
Krafcik’s remarks were widely reported and amplified largely because they came from a veteran of the autonomous driving field confronting one of its most high-profile competitors. This is not the first public clash between the two companies.
Waymo’s Robotaxis Run Afoul of School Bus Laws
But the sharp critique of Tesla’s approach was quickly overshadowed by renewed focus on a concrete, legally unambiguous safety issue involving Waymo vehicles themselves.
Multiple videos and reports from the Austin Independent School District (ISD) documented at least two dozen instances in which Waymo robotaxis maneuvered around school buses that were stopped with flashing red lights and deployed stop arms, actively loading or unloading children.
In every U.S. state, traffic laws require vehicles approaching a school bus with its red signals activated to come to a full stop until the bus resumes motion.
The incidents primarily occurred in Austin, Texas, where local authorities and school officials recorded videos clearly showing the automated vehicles either drifting around a stopped bus or failing to halt entirely. One particularly startling occurrence involved a Waymo vehicle passing a stopped bus shortly after a young student had crossed the street to board it.
These repeated infractions prompted federal regulators to step in. The National Highway Traffic Safety Administration’s Office of Defects Investigation opened a formal probe in late 2025 after receiving multiple reports and evidence of Waymo vehicles failing to comply with school bus stop laws.
In response to federal pressure, Waymo agreed to issue a voluntary software recall affecting more than three thousand of its driverless cars as the company sought to correct the underlying behavior.
Recall Fails to Fix Problem, NTSB Steps In
Image Credit: Daniel Ramirez from Honolulu, USA, CC BY 2.0, Wikimedia.
Despite these updates, the school district in Austin reported that at least five subsequent violations occurred even after the software changes were deployed. Frustrated officials asked Waymo to temporarily suspend robotaxi operations during school pickup and drop-off periods until the company could demonstrate compliance with the law. Waymo declined to pause service during those time windows.
The situation escalated further this week when the National Transportation Safety Board (NTSB) announced its own investigation. The NTSB, which typically conducts thorough, independent accident and incident reviews across transportation modes, said investigators would travel to Austin to gather data on the series of illegal school bus passing events.
A preliminary NTSB report is expected within 30 days, with a final report likely taking up to two years. While the NTSB cannot levy fines, its findings often shape regulatory recommendations and national safety policy.
Waymo’s Defense and the Unforgiving Bar for Autonomy
Image Credit: Waymo.
Waymo, for its part, insists that overall safety performance around school buses remains strong and that none of the recorded incidents have resulted in collisions. Mauricio Peña, Waymo’s chief safety officer, has reiterated that the company believes its autonomous driving system handles school bus encounters at a level that is at least on par with, if not superior to, human drivers and that continuous improvements are being integrated into its software.
“We safely navigate thousands of school bus encounters weekly across the United States, and the Waymo Driver is continuously improving,” Peña reportedly said in a statement to TechCrunch. “There have been no collisions in the events in question, and we are confident that our safety performance around school buses is superior to human drivers.”
Industry analysts point out that autonomous driving technology is extraordinarily complex and that edge-case scenarios like school bus interactions remain a significant challenge for all developers. While human drivers also regularly violate school bus stop laws, the bar for autonomous systems is higher because regulators and the public expect them to operate flawlessly within the rules of the road.
A Clash of Philosophies and Expansion Pressures
Image Credit: Waymo.
The public controversy over Waymo’s school bus behavior and Krafcik’s critique of Tesla’s safety approach illustrates the broader tensions within the autonomous vehicle industry.
Companies are racing to expand services nationwide, with Waymo recently launching a driverless robotaxi operation in Miami alongside existing services in cities like San Francisco, Los Angeles, Atlanta, and Phoenix. But expansion brings increased scrutiny.
Krafcik’s remarks landed like a flashbang in the autonomy debate, dismissing Tesla’s safety philosophy as fundamentally flawed and arguing that relying primarily on cameras rather than a sensor fusion approach with lidar and radar puts the public at risk. He framed Tesla’s Full Self-Driving strategy as an experiment conducted on open roads rather than a rigorously constrained system.
Tesla has not issued a direct response to Krafcik’s comments.
Elon Musk has long maintained that vision-only autonomy mirrors how humans drive and is ultimately safer and more scalable. Supporters within Tesla’s orbit point instead to billions of miles driven on Autopilot and FSD as rebuttal by data rather than rhetoric.
As regulators dig into these compliance issues, the industry’s leaders face mounting pressure to prove that autonomous vehicles can not only match but exceed human performance on road safety.