- Tesla’s Full Self-Driving system has once again come under scrutiny
- Former Uber self-driving chief crashes Model X using FSD
- A Cybertruck also crashed violently into an overpass barrier
Two recent high-profile Tesla accidents have once again put the spotlight on the company’s Full Self-Driving (FSD) technology. They highlight the problem of “asking humans to supervise systems designed to make supervision seem pointless,” as Uber’s former head of autonomous vehicles wrote in an article for The Atlantic after his Model X crashed into a wall.
Raffi Krikorian, chief technology officer at Mozilla and former head of self-driving at Uber, said he was navigating residential streets in the San Francisco Bay Area in his Model X with FSD engaged.
Moments before hitting a wall, Krikorian says he turned the wheel to take control, but it was too late.
In a similar, but much scarier scenario, dash cam footage captured by Justine Saint Amour’s Cybertruck reveals the moments just before the vehicle crashed into an overpass barrier at high speed, nearly sending it over the edge and potentially killing the driver and her one-year-old son.
‘TERRIFYING’: Dashcam video shows the moment a Tesla Cybertruck, reportedly operating in self-driving mode, nearly sent a Houston mother and her baby off a bridge before crashing violently into an overpass barrier. The woman claims she suffered multiple injuries from the incident… pic.twitter.com/DgcnHp2FtZ (March 17, 2026)
Saint Amour says the vehicle was in Full Self-Driving mode before the incident, but her attorney, Bob Hilliard, acknowledged that his client disabled the system moments before impact, a fact Tesla CEO Elon Musk was quick to seize on.
But Krikorian argues in his article that drivers need “five to eight seconds to mentally reconnect after an automated driving system returns control.” In his opinion, this middle ground simply does not work.
“A machine that constantly fails keeps you on your toes. A machine that works perfectly needs no supervision. But a machine that works almost perfectly? That’s where the danger lies,” he writes.
Saint Amour suffered two herniated discs in her lower back and one in her neck, along with sprained tendons in her wrist, and experienced numbness and weakness in her right hand, according to Electrek. She is suing Tesla for more than $1 million as a result.
Analysis: Tesla’s messaging has long been misleading
Tesla may have stopped referring to its advanced cruise control system simply as “Autopilot,” but that hasn’t stopped the company from continuing to promote its Full Self-Driving technology.
Despite adding “Supervised” to the product’s name, Elon Musk has long perpetuated the myth that his technology is more capable than it actually is.
Back in 2013, he stated that a Tesla should be able to drive autonomously for “90 percent of the miles,” and he has claimed on multiple occasions that drivers could soon “go to sleep in their car and wake up at their destination,” most recently late last year.
The growing number of high-profile court cases and accidents is testament to the fact that we’re still a long way from that point, and even though Tesla’s website warns Full Self-Driving users not to “become complacent,” customers are clearly putting more trust in the system than they should.
As Raffi Krikorian writes: “When a car’s marketing says ‘autonomous driving’ but the fine print says ‘responsible driver,’ it’s a warning sign.”