BotBeat

INDUSTRY REPORT · Tesla (FSD/Optimus) · 2026-03-17

Former Uber Self-Driving Chief Crashes Tesla on FSD, Exposes Fundamental Supervision Problem

Key Takeaways

  • Even autonomous vehicle experts are vulnerable to Tesla FSD's "vigilance decrement trap," in which near-perfect performance paradoxically encourages drivers to stop paying attention
  • Tesla's Level 2 classification places full liability on drivers, while the company retains extensive telemetry data it uses to shift blame post-crash, creating an asymmetric accountability problem
  • Psychological research confirms that monitoring a system that works almost perfectly creates dangerous attention gaps of 5-8 seconds, longer than typical emergency response times
Source: Hacker News — https://electrek.co/2026/03/17/former-uber-self-driving-chief-tesla-fsd-crash-supervision-problem/

Summary

Raffi Krikorian, Mozilla's CTO and former head of Uber's autonomous vehicle division, totaled his Tesla Model X while using Full Self-Driving on a residential street with his children in the back seat. In an essay published in The Atlantic, Krikorian provides an informed critique of Tesla's Level 2 autonomy approach, describing how the system suddenly lost control during a turn he had navigated hundreds of times. Despite his extensive expertise in building self-driving systems and training safety drivers at Uber—where pilot programs achieved zero injuries—Krikorian was unable to intervene in time to prevent the crash, which left him with a concussion and neck injury.

Krikorian's analysis identifies a fundamental flaw in Tesla's supervised autonomy model: the system is designed to work so reliably that it creates a dangerous trap where drivers gradually stop paying attention. He explains how the progression from highway use (where FSD worked well) to local roads conditioned him to trust the system, eventually leading to the accident. The incident also raises critical questions about data accountability, as Krikorian's name appeared on the insurance report rather than Tesla's, despite the company collecting extensive telemetry data on driver behavior that it has used post-crash to shift blame onto drivers.

Psychological research supports Krikorian's concerns, documenting a phenomenon called "vigilance decrement" where monitoring a nearly-perfect system leads to mind-wandering and boredom. Studies show drivers take 5 to 8 seconds to mentally reengage after an automated system transfers control back, yet emergencies often unfold faster than that window. Krikorian's critique comes amid broader concerns about Tesla's data practices, particularly after a landmark $243 million wrongful-death verdict in Florida where plaintiffs had to hire a hacker to recover evidence Tesla claimed was unavailable.
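To put the 5-8 second reengagement window in perspective, a rough back-of-the-envelope calculation shows how far a car travels while the driver is still mentally catching up. The 25 mph speed is an assumed typical residential limit, not a figure from the article:

```python
# Distance covered during the driver's mental reengagement gap.
MPH_TO_MS = 0.44704  # miles per hour to meters per second

def gap_distance_m(speed_mph: float, gap_s: float) -> float:
    """Meters traveled at a constant speed over the reengagement gap."""
    return speed_mph * MPH_TO_MS * gap_s

for gap in (5, 8):
    # At an assumed 25 mph residential speed limit:
    print(f"{gap} s gap at 25 mph -> {gap_distance_m(25, gap):.0f} m traveled")
```

Even at residential speeds, the car covers roughly 56-89 meters (well over half a football field) before the driver is fully back in the loop, which is why emergencies that unfold inside that window are effectively unsupervised.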

Tesla's current supervised autonomy model is fundamentally broken: it asks humans to supervise a system specifically designed to make supervision feel unnecessary.

Editorial Opinion

Coming from someone with genuine expertise in autonomous systems, Krikorian's account is a damning indictment of Tesla's approach to Level 2 autonomy. The core insight—that near-perfect reliability paradoxically creates worse outcomes than either fully manual or truly autonomous systems—challenges the entire premise of "supervised" self-driving. Until regulatory frameworks align liability with data access, and until Tesla genuinely solves the vigilance problem rather than exploiting it, FSD remains a fundamentally flawed technology masquerading as a safety feature.

Autonomous Systems · Transportation · Regulation & Policy · Ethics & Bias · AI Safety & Alignment · Jobs & Workforce Impact

More from Tesla (FSD/Optimus)

Tesla (FSD/Optimus)
POLICY & REGULATION

California Regulator Confirms Tesla's 'Robotaxi' Is Just a Chauffeur Service, Exempt From Safety Reporting

2026-03-25
Tesla (FSD/Optimus)
POLICY & REGULATION

Tesla Faces Wider Federal Probe of Self-Driving Feature Amid Robotaxi Rollout Plans

2026-03-23
Tesla (FSD/Optimus)
POLICY & REGULATION

Tesla's Full Self-Driving Faces Potential Recall as NHTSA Investigates Degradation Detection System Failures

2026-03-19

Suggested

Oracle
POLICY & REGULATION

AI Agents Promise to 'Run the Business'—But Who's Liable When Things Go Wrong?

2026-04-05
Anthropic
POLICY & REGULATION

Anthropic Explores AI's Role in Autonomous Weapons Policy with Pentagon Discussion

2026-04-05
Perplexity
POLICY & REGULATION

Perplexity's 'Incognito Mode' Called a 'Sham' in Class Action Lawsuit Over Data Sharing with Google and Meta

2026-04-05
© 2026 BotBeat