
Technical Feasibility of Tesla Robotaxi

8.4B FSD cumulative miles, growing ~1B per 50 days

Tesla's robotaxi thesis rests on a camera-only perception stack achieving Level 4 autonomy — something no company has yet commercially deployed without LiDAR. FSD v14 reaches 9,200 miles between critical interventions, a dramatic improvement but still far from L4 requirements of millions of miles between disengagements.

- 9,200 miles between critical interventions (FSD v14, up from hundreds in v11)
- 4x the human crash rate for Tesla's Austin robotaxi fleet (based on NHTSA SGO data)
- 92% fewer serious-injury crashes vs human drivers (Waymo's safety record)

NHTSA Investigation EA26002

On March 18, 2026, NHTSA escalated its FSD camera degradation investigation to Engineering Analysis covering 3.2 million vehicles. This is the most advanced stage before a potential recall.

The camera-only approach has a fundamental cost advantage — no LiDAR ($267/unit at scale), lower compute requirements — but a fundamental data disadvantage: no direct depth sensing. NVIDIA research confirmed log-linear data scaling laws for end-to-end autonomous driving, validating Tesla's data-centric approach, but the question remains whether camera-only can reach the safety bar.
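The scaling question can be made concrete with a toy calculation. The sketch below calibrates a log-linear law (performance proportional to the log of cumulative data) to two points: ~9,200 miles between interventions at today's ~8.4B fleet miles, and an assumed ~1,000 miles at the 2024 total of 2.25B miles. Pairing v12-era performance with the 2024 fleet total is an illustrative assumption, not a fitted model.

```python
import math

def mtbi_loglinear(fleet_miles, a, b):
    """Toy scaling law: miles between critical interventions grows
    linearly in log10(cumulative fleet miles). Coefficients are
    calibrated below, not fitted to real FSD telemetry."""
    return a + b * math.log10(fleet_miles)

# Two calibration points (second one is an illustrative assumption):
x1, y1 = 2.25e9, 1_000   # ~1,000 mi at 2.25B cumulative miles
x2, y2 = 8.4e9, 9_200    # ~9,200 mi at 8.4B cumulative miles (v14)
b = (y2 - y1) / (math.log10(x2) - math.log10(x1))
a = y2 - b * math.log10(x2)

# Cumulative miles this law implies for an illustrative L4 bar of
# 1,000,000 mi between interventions:
target = 1_000_000
needed = 10 ** ((target - a) / b)
print(f"{needed:.2e} cumulative miles needed")  # ~1e79
```

Even under this generous reading, a pure log-linear law implies astronomically more data than any fleet could collect, which is why the bet hinges on architecture and data quality changing the slope, not on volume alone.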

The key question

Can end-to-end neural networks fully compensate for the lack of active depth sensing (LiDAR/radar)?

Tesla Is the Only Company Pursuing Camera-Only L4

Every deployed L4 system (Waymo, Zoox, Pony.ai, and Baidu Apollo Go) uses LiDAR. The camera-only approach faces three empirical challenges: NHTSA investigations of visibility failures, 15-43% camera accuracy retention in adverse weather, and Tesla's Austin fleet crashing at 4x the human rate.

| System | Cameras | LiDAR | Radar | Other sensors |
| --- | --- | --- | --- | --- |
| Tesla FSD | 8 (HW4) | None | None | Ultrasonic (some models) |
| Waymo 6th-gen | 13 | 4 | 6 | Audio receivers |
| Zoox | Yes | Yes | Yes | Thermal cameras (FLIR) |
| Pony.ai 7th-gen | 14 | 9 | 4 | Thermal, audio |
| Baidu Apollo Go | Multiple | Multiple | Multiple | 360-degree fusion |

The Core Bet

If camera-only works, Tesla wins on cost and scale. If it doesn't, Tesla is years behind competitors who chose LiDAR. This is the single most consequential technical question in the Tesla thesis.

Compute Sufficiency: Is HW4 Enough for L4?


HW4 Is 7-10x Below NVIDIA's L4 Reference Spec

Tesla's HW4 delivers an estimated 100-150 INT8 TOPS. NVIDIA's DRIVE AGX Thor delivers 1,000 INT8 TOPS. Tesla's AI5 chip (2,000-2,500 TOPS) has been delayed to late 2026 / mid-2027, meaning the Cybercab will launch on HW4.

| Spec | HW3 | HW4 | AI5 | NVIDIA Thor |
| --- | --- | --- | --- | --- |
| INT8 TOPS | 36/SoC | 100-150 | 2,000-2,500 | 1,000 |
| Process node | Samsung 14nm | Samsung 7nm | TSMC 3nm | TSMC 5nm |
| Memory | LPDDR4, 96 GB/s | 16GB GDDR6, 384 GB/s | TBD | HBM |
| Status | In fleet (4M+ cars) | Current production | Delayed to late 2026+ | Shipping to partners |
| Can run latest FSD? | No | Yes | N/A | Yes |
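The headline 7-10x gap is simple arithmetic on the INT8 TOPS estimates quoted in this section; a quick sanity check:

```python
# INT8 TOPS figures quoted in this section (estimates, not official specs).
thor_tops = 1_000
hw4_low, hw4_high = 100, 150
ai5_low, ai5_high = 2_000, 2_500

gap_low = thor_tops / hw4_high    # best case for HW4: ~6.7x below Thor
gap_high = thor_tops / hw4_low    # worst case: 10x below Thor
print(f"HW4 vs Thor: {gap_low:.1f}x-{gap_high:.1f}x deficit")

# AI5, if it ships as specified, would flip the comparison:
print(f"AI5 vs Thor: {ai5_low / thor_tops:.1f}x-{ai5_high / thor_tops:.1f}x advantage")
```

Raw TOPS is a crude proxy, since memory bandwidth and model efficiency matter at least as much, which is exactly the efficiency argument Tesla makes.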

Efficiency vs Raw Power

Tesla argues its end-to-end neural network is more computationally efficient than traditional modular stacks, achieving comparable real-world driving performance with fewer TOPS. This is plausible but unproven at L4. The Cybercab launching on HW4 rather than AI5 is a real constraint that limits the neural network models it can run.

Data Moat: Does More Data = Better AI?

8.4B+ Tesla FSD cumulative miles (vs Waymo's 170.7M rider-only miles)

Tesla's FSD fleet has surpassed 8.4 billion cumulative miles as of February 2026, growing exponentially: from 6M in 2021 to 2.25B in 2024 to 4.25B in 2025. This dwarfs Waymo's 170.7 million rider-only miles by roughly 50x. Tesla collects more data per day than all competitors combined.
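The volume comparison reduces to two ratios, both derivable from the figures above (8.4B cumulative miles, 170.7M rider-only miles, ~1B new miles per 50 days):

```python
tesla_miles = 8.4e9        # cumulative FSD miles (Feb 2026)
waymo_miles = 170.7e6      # Waymo rider-only miles

ratio = tesla_miles / waymo_miles
print(f"Tesla's fleet total is ~{ratio:.0f}x Waymo's rider-only miles")  # ~49x

# "~1B miles per 50 days" implies a daily collection rate of:
daily = 1e9 / 50
print(f"~{daily / 1e6:.0f}M new miles per day")  # ~20M miles/day
```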

Quantity vs Quality

Tesla has overwhelming data volume but all of it is camera-only, L2 supervised driving on mapped roads. Waymo has less data but it is L4 autonomous driving across diverse conditions with LiDAR ground truth. The question is whether Tesla's volume compensates for Waymo's quality and diversity.

FSD Improvement Trajectory: v12 through v14


Improving But Far From Waymo's Safety Benchmarks

FSD v14 achieved approximately 1,454 miles between critical disengagements overall (834 in city), a 3-4x improvement over v13.2. But crowdsourced data shows sharp regressions between minor versions, and the system remains far from the safety threshold needed for unsupervised commercial operation.
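A back-of-envelope extrapolation shows how far the trajectory still has to run. Assuming, optimistically, that each major FSD version keeps delivering a 3-4x gain (the text notes regressions between minor versions, so this is a strong assumption), and taking 1M miles between critical disengagements as an illustrative L4 bar:

```python
import math

current = 1_454          # mi between critical disengagements, FSD v14 (crowdsourced)
target = 1_000_000       # illustrative L4 bar ("millions of miles")

# Major versions needed if each one multiplies the interval by `gain`:
versions_needed = {}
for gain in (3, 4):
    versions_needed[gain] = math.ceil(math.log(target / current) / math.log(gain))
    print(f"{gain}x per major version -> ~{versions_needed[gain]} more versions")
# 3x per version -> ~6 more versions; 4x per version -> ~5 more versions
```

Even on this optimistic compounding assumption, reaching the bar takes five to six more v13-to-v14-sized jumps.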

Open questions

- Will NHTSA's EA26002 probe result in a recall that forces Tesla to add redundant sensors?
- Is AI5 (2,000+ TOPS) necessary for unsupervised L4, or can HW4 (100-150 TOPS) suffice with software optimization?
- Why does Tesla's robotaxi crash rate (one crash per ~57K miles) so sharply contradict its FSD safety claims (one major collision per 5.1M miles)?
- Does Tesla's 8.4B-mile data advantage translate to safety parity with Waymo's curated, simulation-augmented approach?
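The crash-rate contradiction raised above can be quantified from the two figures as stated, though note they measure different things (any robotaxi crash vs a "major collision" under supervised FSD), which accounts for part of the gap:

```python
robotaxi_rate = 1 / 57_000        # crashes per mile, Austin robotaxi fleet (per this report)
fsd_claim_rate = 1 / 5_100_000    # major collisions per mile, Tesla's FSD safety claim

discrepancy = robotaxi_rate / fsd_claim_rate
print(f"Robotaxi fleet crashes ~{discrepancy:.0f}x more often per mile "
      f"than the FSD claim implies")  # ~89x
```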