Introduction by your Director of Humor. I asked the Deep Research feature of ChatGPT the following: “How does Tesla, Inc use AI in their electric cars? Details and source by car model name and corporate use.” It returned the following 4,467-word article. The research was completed in 15 minutes, with 36 citations and 266 searches. I believe it’s the most thorough deep dive on the face of planet Earth or Mars. Enjoy 🙂
Deep Research Report on AI Use in Tesla Electric Cars
Executive summary
Tesla’s in-vehicle AI is centered on camera-based, neural-network-driven driver assistance—marketed as Autopilot and Full Self-Driving (Supervised)—with most real-time inference executed onboard using dedicated “AI computer” hardware inside the vehicle. In corporate disclosures, Tesla describes autonomy as a core strategy: it trains “artificial neural networks” using massive fleet data, accelerates them with “in-house, custom designed inference chips,” and improves deployed behavior through frequent over-the-air (OTA) updates. [1]
Across the modern lineup (Model S/3/X/Y and Cybertruck), AI-driven features are broadly similar; differences are driven more by build year / sensor set / onboard computer generation than by model name. Tesla’s own manuals emphasize that Self-Driving functions depend on camera inputs and an in-vehicle “AI computer” that “process[es] neural networks” to guide driving, while also enforcing driver supervision via cabin-camera-based attentiveness checks that “cannot be disabled.” [2]
A major strategic pivot on the sensing side is Tesla’s shift to Tesla Vision: Tesla states it removed front radar from Model 3/Y (starting 2021) and later Model S/X (2022), then removed ultrasonic sensors (USS) from Model 3/Y (2022) and S/X (2023), replacing USS inputs with a vision-based occupancy network used in FSD (Supervised). [3]
At the corporate level, Tesla reports scaling its AI training infrastructure with a large training cluster (“Cortex”) and building “Cortex 2,” disclosing “Cortex 1 >100k H100e” and plans to expand onsite compute in 2026. [4] Tesla also continues to reference (in recruiting and technical forums) auto-labeling infrastructure, large-scale dataset curation, and high-throughput training on GPU clusters and “soon, our supercomputer Dojo,” though corporate materials in 2025–2026 emphasize Cortex more prominently than Dojo. [5]
Safety and regulatory scrutiny is material and ongoing. Official recall documents show that (a) Tesla issued an OTA recall remedy for FSD Beta behavior in 2023 (NHTSA Recall 23V-085), explicitly describing it as SAE Level 2 requiring constant supervision; and (b) Tesla issued an OTA remedy for Autosteer misuse controls in 2023 (NHTSA Recall 23V-838), adding controls/alerts and potential suspensions. [6] As of March 2026, public reporting indicates the National Highway Traffic Safety Administration [7] has escalated an engineering analysis into crashes involving FSD under reduced-visibility conditions. [8]
Vehicle AI architecture and platform
Functional stack in Tesla vehicles
Tesla’s filings describe a standard autonomy software decomposition—perception and “artificial neural networks” trained on “massive amounts of field data,” deployed to customer vehicles and iteratively improved via OTA updates. [9] Tesla’s support documentation frames Autopilot as a camera-based system using “neural net processing” and “a powerful onboard computer” that processes inputs in milliseconds, while remaining a hands-on driver assistance system. [10]
From Tesla’s owner manuals, the Self-Driving control loop can be summarized as:
- Sensors: multiple exterior cameras (front/rear/side) plus interior sensors (cabin camera; and on some vehicles, cabin radar), with availability varying by configuration and manufacture date. [11]
- Onboard inference: an “AI computer installed in [the vehicle]” that “process[es] neural networks” to form a model of surroundings and make driving decisions. [12]
- Driver supervision: cabin camera monitoring for attentiveness during FSD (Supervised), with explicit user-interface indicators and escalating warnings; Tesla states this monitoring “cannot be disabled.” [13]
- Fallback / degradation: if cameras are obstructed or visibility is degraded, Self-Driving features may become limited or unavailable, with on-screen alerts. [14]
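The loop described above can be sketched as a simple state machine. This is a hypothetical illustration of the manual’s described behavior, not Tesla’s implementation; the `SensorFrame` fields, states, and strike handling are all assumptions for the sketch.

```python
from dataclasses import dataclass
from enum import Enum

class FeatureState(Enum):
    ACTIVE = "active"            # Self-Driving features fully available
    DEGRADED = "degraded"        # limited features, with on-screen alert
    UNAVAILABLE = "unavailable"  # e.g. cameras obstructed or blinded

@dataclass
class SensorFrame:
    cameras_ok: bool        # exterior cameras unobstructed
    visibility_ok: bool     # environment not degrading camera input
    driver_attentive: bool  # cabin-camera attentiveness estimate

def control_step(frame: SensorFrame, strikes: int) -> tuple[FeatureState, int]:
    """One pass of the supervision/fallback loop summarized above."""
    if not frame.cameras_ok:
        return FeatureState.UNAVAILABLE, strikes   # fallback: features unavailable
    if not frame.visibility_ok:
        return FeatureState.DEGRADED, strikes      # fallback: limited, with alerts
    if not frame.driver_attentive:
        strikes += 1                               # escalating warnings ("strikes")
    return FeatureState.ACTIVE, strikes
```

The point of the sketch is the ordering: sensor health gates feature availability before driver-attentiveness enforcement even applies.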
Sensors and the Tesla Vision transition
Tesla’s official Tesla Vision update describes a two-step sensor simplification:
- Radar removal: radar removed from Model 3/Y starting 2021 and Model S/X in 2022, shifting these vehicles to a “camera-based Autopilot system.” [3]
- Ultrasonic sensor removal: USS removed from Model 3/Y in 2022 (most markets) and from all Model S/X in 2023; Tesla states it launched a vision-based occupancy network (used in FSD (Supervised)) to replace USS-generated inputs. [3]
Tesla’s manuals for newer vehicles emphasize camera placement and camera care, and repeatedly warn that environmental factors can reduce camera performance. [15]
Driver monitoring and interior sensing
Tesla describes cabin camera functionality as both a driver-monitoring and safety feature:
- Cabin camera availability: Tesla states it is present in Model 3, Model Y, Cybertruck, and Model S/Model X produced in 2021 or later. [16]
- By default, cabin camera images/video “do not leave the vehicle,” and are only shared if the driver enables data sharing (with additional special-case sharing in safety events). [17]
- During FSD (Supervised), the cabin camera monitors attentiveness and “cannot be disabled.” [18]
Some newer platforms also include cabin radar for occupant detection. Tesla’s Model Y documentation states cabin radar supports vehicle/safety features including driver detection, seat occupancy, the Occupant Classification System, and auto parking brake engagement; older Model Y documentation ties cabin radar to front passenger airbag occupancy detection and FMVSS 208 compliance. [19]
Onboard compute and hardware generations
Tesla does not publish a single, exhaustive public mapping of “AI computer” generations to every VIN; instead, it tells owners how to check “AI computer type” in the vehicle UI. [10] In filings, Tesla describes onboard inference acceleration using “in-house, custom designed inference chips.” [1]
For next-generation compute, Tesla’s Q4 2025 deck describes ongoing work on “in-house, custom designed” inference chips (AI5 and AI6), targeting a “step-function improvement” over AI4, including claims of increased compute, memory, and hardened quantization/softmax. [20]
For training infrastructure (offboard), Tesla’s disclosures emphasize “AI infrastructure” and owned data centers, and explicitly name the Texas-based “Cortex” cluster and “Cortex 2.” [21]
Model-by-model AI features and platform differences
Tesla’s own FSD (Supervised) support page is explicit that feature availability varies by “vehicle configuration, hardware, software version, region, model, vehicle trim and model year.” [22] As a result, the most accurate “model-by-model” description is a platform + build-era view: the same named model may exist in the fleet with different sensors (radar/USS present in older vehicles; vision-only in newer ones) and different onboard computers.
Cross-model comparison table
| Vehicle | Current Tesla-described camera layout in manual (representative) | Driver monitoring & interior sensing | Tesla Vision / radar / USS status (per Tesla) | Self-Driving software status (Tesla terminology) | Notes on HW/SW compatibility |
| --- | --- | --- | --- | --- | --- |
| Model S (incl. Plaid) | Cameras include rear plate, door pillars, windshield-mounted cameras, fenders, and front fascia camera. [23] | Cabin camera present on 2021+ Model S (Tesla support). [16] | Radar removed in 2022; USS removed in 2023; occupancy network used to replace USS inputs. [3] | FSD (Supervised) described as Level-2-like driver assistance needing active supervision; improves via OTA. [24] | Owners must check “AI computer type” in-car; features vary by build date/hardware. [25] |
| Model X (incl. Plaid) | (Not exhaustively enumerated on one table page; similar external camera suite in manuals and dependence on camera visibility.) [26] | Cabin camera present on 2021+ Model X (Tesla support); FSD attentiveness monitoring cannot be disabled. [27] | Radar removed in 2022; USS removed in 2023; occupancy network replacement. [3] | Same Autopilot / FSD (Supervised) descriptions and warnings. [28] | Same “check AI computer type” guidance. [25] |
| Model 3 | Exterior camera positions: rear plate; door pillars; two windshield cameras; front fender cameras; optional front bumper/fascia camera “if equipped.” [29] | Cabin camera monitors attentiveness during FSD (Supervised); cannot be disabled; sunglasses still allow monitoring. [30] | Radar removed starting 2021; USS removed starting 2022 (most markets); occupancy network replacement. [3] | Manual states FSD (Supervised) uses camera inputs and an “AI computer” to process neural nets; upgrades via OTA. [12] | Tesla explicitly notes that using devices to circumvent occupancy detection violates terms and can lead to permanent disablement. [12] |
| Model Y | Exterior camera positions: front bumper/fascia camera; rear plate; door pillars; two windshield cameras; front fender cameras. [31] | Cabin camera + cabin radar (occupancy detection and safety features). [32] | Radar removed starting 2021; USS removed starting 2022 (most markets); occupancy network replacement. [3] | Autosteer described as “BETA”; Self-Driving depends on camera perception, and can degrade if cameras are obstructed. [33] | Tesla’s deck states Robotaxi operations currently use Model Y vehicles. [1] |
| Cybertruck | Camera positions: tailgate; door pillars; two windshield cameras; front wheel cameras; front bumper camera. [34] | Cabin camera: driver inattentiveness + active safety enhancements; default no sharing; limited sharing on safety-critical event if data sharing enabled. [35] | Tesla Vision policy applies broadly; Tesla’s Vision update page discusses S/3/X/Y changes, while Cybertruck manual documents its camera suite. [36] | FSD (Supervised) driver attentiveness enforced by cabin camera; cannot be disabled; green indicator shown when active. [37] | Cybertruck-specific features like Auto Shift out of Park reference “detected surroundings,” implying sensor-driven inference. [38] |
| Roadster (next-gen, planned) | Not applicable (not publicly detailed in Tesla owner manuals as a current consumer vehicle). | Not applicable. | Not applicable. | Not applicable. | Tesla’s Q4 2025 update deck lists Roadster as “Design development,” with production timing not specified there. [39] |
Model S and Model X including Plaid variants
AI-driven features (vehicle-facing). For Model S/X, Tesla positions Autopilot/FSD (Supervised) as advanced driver assistance that still requires the driver to remain fully engaged; Tesla describes ongoing improvement via OTA updates. [40] Tesla Vision is now the baseline strategy: Tesla states it removed radar (S/X in 2022) and USS (S/X in 2023), with a vision-based occupancy network replacing USS inputs. [3]
Sensors and monitoring. The Model S camera layout described in Tesla’s manual includes multiple cameras around the vehicle (rear/license plate, pillars, windshield, fenders, front fascia). [23] For cabin monitoring, Tesla states the cabin camera exists in Model S and Model X produced in 2021 or later and can determine driver inattentiveness (with privacy controls). [16]
Variants (Plaid). Tesla’s filings describe tri-motor powertrain technology in certain versions of Model S and Model X, but do not describe variant-specific differences in Autopilot/FSD AI capability; thus Plaid’s AI features appear to be primarily shared with other trims, subject to the same sensor/computer generation constraints. [41]
Model 3
AI-driven driving features. Tesla’s Model 3 manual explicitly states FSD (Supervised) uses camera inputs (front/rear/left/right) and the in-vehicle “AI computer” to process neural networks and guide the driver to the destination, and that the system improves via OTA updates. [12] The FSD support page also describes functional categories: “Drives to You” (Summon variants), “Drives for You” (broad roadway operation), and “Parks for You” (Autopark), while stressing that none of these features make the vehicle autonomous. [22]
Driver monitoring. The Model 3 FSD (Supervised) manual section describes cabin-camera attentiveness monitoring with a green indicator, states the system “cannot be disabled,” and notes it does not require full visibility of the driver’s eyes (e.g., sunglasses). [30]
Sensors and feature dependency. Model 3’s “Cameras” page in the manual documents external camera placements and notes that an additional front-facing camera may be present “if equipped,” including a front camera sprayer nozzle in that case. [29] Tesla also cautions that if cameras are obstructed or blinded, Self-Driving features may not be available. [42]
Model Y
AI-driven driving features. Model Y’s manual describes Self-Driving as using vehicle cameras and states it includes a cabin camera that monitors driver attentiveness, emphasizing the driver must be ready to take immediate action. [43] Tesla also labels Autosteer as a “BETA feature.” [44]
Sensor stack and interior occupancy. Model Y’s camera layout in Tesla’s manual shows multiple cameras and includes a front-facing camera above the grille/front fascia, with a sprayer nozzle, and a cabin camera. [45] Tesla’s Model Y documentation also includes cabin radar that supports occupancy detection and safety features (driver detection, seat occupancy, OCS, auto parking brake engagement). [19]
Operational limits. Tesla’s troubleshooting guidance highlights frequent real-world degradation modes—camera blockage or limited visibility—and explains that Self-Driving features may be reduced or unavailable until visibility/calibration is restored. [46]
Cybertruck
AI-driven features and monitoring. Cybertruck’s manual details both the exterior camera suite and cabin camera. The cabin camera “can determine driver inattentiveness,” “enhances active safety features,” and supports data sharing only if enabled; in a safety-critical event (e.g., collision), Tesla states Cybertruck can share “short cabin camera video clips” to develop future safety enhancements and improve cabin-camera-dependent intelligence. [35]
Cybertruck’s FSD (Supervised) manual section describes continuous cabin-camera attentiveness monitoring, emphasizes it cannot be disabled, and documents the UI warning behavior and the green attentiveness indicator. [37]
Vehicle-specific automation that likely relies on inference. Cybertruck’s “Auto Shift out of Park” describes automatic selection of Drive/Reverse based on “detected surroundings,” which implies sensor interpretation (though Tesla does not publicly specify the exact model used). [38]
Roadster
Tesla’s Q4 2025 quarterly update deck lists Roadster under automotive capacity planning as “Design development” (with other entries indicating planned ramps for other products) but does not provide a production owner manual or a public sensor/AI specification. [39] Accordingly, Roadster AI capabilities are unknown beyond the general expectation that Tesla would seek platform alignment with its autonomy stack; any more detailed architecture claims would be speculative.
Data collection, training, and OTA deployment pipeline
Data collection and privacy controls
Tesla’s privacy documentation draws a clear boundary between in-vehicle processing and optional data sharing:
- Tesla’s privacy notice states “Fleet Learning camera recordings” use external cameras to learn features such as lane lines, street signs, and traffic lights, and that sharing requires consent via in-car controls; it also states recordings are limited (example: 30 seconds) and “remain anonymous.” [47]
- Tesla’s privacy-support page states Sentry Mode and Dashcam recordings are not shared with Tesla (as a default policy statement). [48]
- Tesla’s vehicle security page states cabin camera images/video do not leave the car unless data sharing is enabled, and shared cabin camera images are not linked to VIN. [16]
Data selection and active learning mechanisms
Tesla has patented a structured mechanism for selecting which data to transmit from vehicles for training. The “System and method for obtaining training data” patent describes:
- applying a neural network to sensor data,
- applying a trigger classifier to an intermediate result to compute a score, and
- transmitting sensor data (and metadata such as location, timestamp, operating conditions) based on whether the score exceeds thresholds and other conditions. [49]
This patent supports the widely discussed “data engine” concept: rather than uploading all driving video, the system identifies high-value examples for training.
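As a toy illustration of the patent’s trigger mechanism: a backbone produces an intermediate result, a trigger classifier scores it, and only above-threshold data (with metadata) is transmitted. The backbone and scorer below are stand-ins I invented for the sketch, not Tesla’s networks, and the threshold is arbitrary.

```python
def backbone(frame):
    # Stand-in for the perception network's intermediate output:
    # here, just per-value deviation from the frame mean.
    mean = sum(frame) / len(frame)
    return [abs(x - mean) for x in frame]

def trigger_classifier(features):
    # Stand-in scorer: treats high deviation as an "interesting" scene.
    # In the patent, a small classifier scores the intermediate result.
    return sum(features) / len(features)

def maybe_transmit(frame, metadata, threshold=0.25):
    """Upload the frame plus metadata only if the trigger score clears the threshold."""
    score = trigger_classifier(backbone(frame))
    if score > threshold:
        return {"frame": frame, "score": score, **metadata}
    return None  # low-value data never leaves the vehicle
```

Running the trigger classifier on an intermediate result (rather than raw pixels) is the patent’s key efficiency idea: the expensive backbone pass is already being computed for driving.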
Tesla also states in recruiting materials that it builds and scales “neural network training and auto-labeling infrastructure” and that engineers can decide what data to collect and how to label it, with deployment to millions of robotic platforms. [50]
Auto-labeling and labeling at scale
Tesla does not publish full internal labeling pipelines, but two credible public signals exist:
- Tesla’s recruiting/event materials explicitly reference “auto-labeling infrastructure” for Autopilot and humanoid robotics and training “on very large amounts of data across large-scale GPU clusters and, soon, our supercomputer Dojo.” [50]
- Third-party coverage of Tesla engineers’ technical talks at Hot Chips describes Dojo as optimized for large-scale video AI, which is consistent with the data modality implied by Tesla’s autonomy approach. [51]
Where Tesla does not disclose details (exact neural network architectures, labeling model architectures, human-in-the-loop staffing levels), those specifics should be treated as unknown.
OTA update process and recall remedies
Tesla’s platform is designed for frequent fleet-wide updates:
- Tesla’s 10-K states it improves driver assistance functions via OTA software updates and “update[s] our vehicles’ software regularly through over-the-air updates.” [1]
- Tesla’s support page explains that updates can be checked/installed from the vehicle “Software” tab and the mobile app. [52]
- In recall contexts, both Tesla and NHTSA documentation show that software can be the remedy: the NHTSA recall report for FSD Beta (23V-085) describes an OTA software update remedy; Tesla’s own recall support page references an OTA rollout timeline and specific FSD Beta version thresholds for the remedy. [53]
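A recall remedy expressed as a firmware-version threshold reduces to a numeric field-by-field comparison. This sketch assumes Tesla’s dotted `year.week.patch` version format (e.g. “2023.44.30”); the comparison logic is mine, not Tesla’s, and any specific threshold values would come from the recall documents.

```python
def parse_version(v: str) -> tuple[int, ...]:
    # Firmware versions like "2023.44.30" must be compared numerically,
    # field by field, not as strings ("2023.9.1" < "2023.44.30").
    return tuple(int(part) for part in v.split("."))

def remedy_applied(installed: str, remedy_min: str) -> bool:
    """True if the installed firmware is at or above the OTA remedy version."""
    return parse_version(installed) >= parse_version(remedy_min)
```

For example, `remedy_applied("2023.38.8", "2023.44.30")` is `False`, while any 2024 build clears a 2023 threshold.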
Fleet-to-training-to-fleet flowchart
```mermaid
flowchart LR
  subgraph Vehicle["In-Vehicle (Onboard Inference)"]
    S["Sensors<br/>(exterior cameras, interior cabin camera;<br/>+ cabin radar on some vehicles)"]
    P["Perception & world model<br/>(neural nets + post-processing)"]
    A["Planning/Control outputs<br/>(driver-assist actions)"]
    D["Driver monitoring<br/>(cabin camera attentiveness; warnings/strikeouts)"]
    L["Local logging buffer<br/>(event clips, metadata)"]
    S --> P --> A
    S --> D
    P --> L
    D --> L
  end
  subgraph Uplink["Uplink & Data Governance"]
    C["Consent gates<br/>(Data Sharing settings)"]
    F["Fleet learning uploads<br/>(short clips; anonymized per policy)"]
    S3["Secure storage<br/>(training datasets)"]
    L --> C --> F --> S3
  end
  subgraph Train["Training & Validation (Offboard)"]
    AL["Auto-labeling + human review<br/>(where used)"]
    T["Training compute<br/>(Cortex cluster; Dojo referenced in recruiting/technical talks)"]
    V["Validation<br/>(sim, regression tests, gating)"]
    S3 --> AL --> T --> V
  end
  subgraph Deploy["Deployment"]
    OTA["OTA software update<br/>(vehicle firmware + NN weights)"]
    V --> OTA --> Vehicle
  end
```
This flow reflects Tesla’s public descriptions of (a) onboard camera-based inference, (b) consent-gated fleet-learning uploads with limited clip duration and anonymization, (c) large-scale training infrastructure (Cortex; Dojo referenced in some contexts), and (d) OTA deployment/recall remediation. [54]
Timeline of key publicly documented shifts
```mermaid
timeline
    title Tesla autonomy-related platform shifts (publicly documented)
    2014-2016 : Early Autopilot hardware era (1 camera + radar + ultrasonics; retrofit limits)
    2021 : Tesla begins removing radar from Model 3/Y (Tesla Vision transition)
    2022 : Radar removal extends to Model S/X : USS removal begins for Model 3/Y (most markets)
    2023 : USS removal extends to Model S/X : NHTSA recall 23V-085 (FSD Beta) software remedy : NHTSA recall 23V-838 (Autosteer misuse controls) software remedy
    2024 : Tesla expands cabin-camera-based monitoring language in manuals : regulatory scrutiny continues
    2025 : Tesla disclosures emphasize scaling AI training compute (Cortex) and an "end-to-end foundation model" for FSD (Supervised)
    2026 : Tesla plans to more than double onsite compute in Texas : active regulatory investigations into FSD performance continue
```
The timeline is grounded in Tesla’s own Tesla Vision update, NHTSA recall documents, and Tesla’s 2025–2026 corporate disclosures about training compute and FSD foundation models. [55]
Corporate-level AI uses tied to the vehicle program
AI training infrastructure and data centers
Tesla’s 2025 10-K treats AI infrastructure as a major capital and depreciation category (“AI infrastructure includes our owned data centers”) and describes investing in compute hardware to process massive field data and continually train neural networks. [56]
In the Q4 2025 deck, Tesla provides unusually concrete training-compute disclosures: “Cortex 1 >100k H100e” in production and “Cortex 2” in construction, plus a statement that Tesla plans to more than double onsite compute in Texas in the first half of 2026 (measured in H100 equivalents). [57]
Dojo vs. Cortex
Tesla’s current corporate reporting emphasizes Cortex: “Dojo” is not mentioned in the Q4 2025 deck excerpt, and a text search of the 2025 10-K HTML performed for this report found no mention of it. [58] However, Tesla continues to reference Dojo in recruiting/event materials (e.g., Tesla x NeurIPS 2024) as an upcoming training platform and references tooling to “deploy trained neural nets to Tesla hardware.” [50]
On the technical side, third-party reporting of Tesla engineers’ Hot Chips 2022 talks provides detailed descriptions of the Dojo architecture: D1 die/node organization, SRAM emphasis, tile-based scaling, proprietary transport protocol concepts, and the motivation of large-scale video training. [59] These sources are not SEC filings, but they are grounded in a major semiconductor conference and are consistent with Tesla’s public autonomy narrative.
Fleet learning as a product and business strategy
Tesla’s consumer-facing FSD page claims FSD (Supervised) is trained on large volumes of “anonymous real-world driving scenarios” and references a fleet size (millions of vehicles). [60] The Q4 2025 deck quantifies this more aggressively, claiming the global fleet can collect “over 500 years of continuous driving data per day,” and describes FSD (Supervised) versioning (v14) as an “end-to-end foundation model” trained on customer and Robotaxi data. [61] These are Tesla claims to investors; Tesla does not publicly disclose full methodological details (dataset composition, exact architecture, evaluation protocol), and those should be treated as proprietary.
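The “over 500 years of continuous driving data per day” claim is straightforward to sanity-check with unit conversion. The one-driving-hour-per-vehicle-per-day figure below is my assumption for illustration, not a Tesla disclosure.

```python
YEARS_PER_DAY = 500              # Tesla's claimed daily fleet data collection
HOURS_PER_YEAR = 365.25 * 24     # 8,766 hours in an average year

hours_per_day = YEARS_PER_DAY * HOURS_PER_YEAR   # ~4.38 million driving hours/day
# If the average reporting vehicle drives ~1 hour per day (assumption),
# the claim implies a reporting fleet on the order of millions of vehicles:
implied_vehicles = hours_per_day / 1.0
print(f"{hours_per_day:,.0f} hours/day ≈ {implied_vehicles/1e6:.1f}M vehicles at 1 h/day")
```

That order of magnitude is at least consistent with Tesla’s public “millions of vehicles” fleet-size references, which is as far as the public numbers allow checking.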
Energy platforms and “AI” beyond vehicles
Although your question focuses on electric cars, Tesla’s corporate disclosures tie vehicle AI infrastructure to broader energy business software:
- Tesla’s 2025 10-K describes remote control/dispatch platforms for energy storage (“Autobidder” and “Powerhub”) and frames energy products as capable of firmware enhancement and optimization leveraging AI. [62]
- The Q4 2025 deck references large-scale Virtual Power Plant events (Powerwall network), indicating Tesla’s broader software/optimization strategy at grid scale. [39]
These are “AI-adjacent” in the sense of optimization platforms and large-scale control systems; Tesla does not detail the ML methods used in these energy platforms in the cited documents.
Manufacturing automation and robotics
Tesla frames itself as moving from a “hardware-centric business to a physical AI company,” pairing autonomy with robotics (Optimus) and vertical integration in AI silicon. [63] The Q4 2025 deck contains a dedicated “Robotics” section describing Optimus program progress and production line preparations, indicating shared AI infrastructure requirements. [39]
In addition, Tesla disclosures mention factory simulation/automation efforts and “simulations modeling” capabilities prior to construction, but do not provide ML-specific disclosures for manufacturing QA/robotics on a per-vehicle-model basis in the cited text. [64]
Safety, regulation, limitations, and controversies
Regulatory posture: Level 2 framing and driver responsibility
U.S. safety recall documentation is explicit that Tesla’s driver assistance remains Level 2 in regulatory framing. NHTSA Recall 23V-085 describes “FSD Beta” as an “SAE Level 2 driver support feature” requiring constant supervision and driver responsibility for operation whenever engaged. [65] Tesla’s own FSD (Supervised) support page likewise emphasizes active supervision and non-autonomy. [22]
Recalls and enforcement via OTA
Two major OTA-remedied recalls illustrate how safety/regulation intersects Tesla’s AI feature deployment:
- FSD Beta driving operations (23V-085): NHTSA describes risks where FSD Beta could “infringe upon local traffic laws or customs” in specific scenarios (intersections, speed zones, lane changes), with an OTA update improving behavior. [66]
- Autosteer misuse controls (23V-838): NHTSA documents an OTA remedy adding “additional controls and alerts” and an “eventual suspension” policy for repeated failure to demonstrate driving responsibility; the remedy depends on vehicle hardware. [67]
Tesla’s owner manuals also describe Self-Driving suspensions via “strikeouts”: repeated inattentiveness strikes can lead to a one-week suspension. [68]
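The strikeout policy reduces to a counter with a time-boxed penalty. A minimal sketch, assuming a strike limit of 5 (illustrative only; the cited pages do not fix this number, and the actual limit has varied by software version) and the one-week suspension the manuals describe:

```python
from datetime import datetime, timedelta

SUSPENSION_DAYS = 7   # "one-week suspension" per the owner manuals
STRIKE_LIMIT = 5      # illustrative assumption; not fixed in the cited pages

def record_strike(strikes: int, now: datetime):
    """Add one inattentiveness strike; return (new_count, suspension_end or None)."""
    strikes += 1
    if strikes >= STRIKE_LIMIT:
        # Assume the strike count resets when the suspension is imposed.
        return 0, now + timedelta(days=SUSPENSION_DAYS)
    return strikes, None
```

The escalation-then-suspension shape mirrors the 23V-838 remedy language of “additional controls and alerts” followed by “eventual suspension.”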
Current investigations and camera-visibility concerns
In March 2026, reporting indicates NHTSA escalated an engineering analysis into crashes involving FSD under reduced visibility (glare, dust, airborne particles), focusing on whether the camera-based system appropriately detects degraded visibility and alerts drivers. [8] This scrutiny is closely related to Tesla’s sensor strategy (moving away from radar/USS toward vision). [3]
Tesla’s own documentation acknowledges camera degradation as a limiting condition: manuals warn that camera obstruction/blinding can make Self-Driving features unavailable, and troubleshooting guidance describes blocked or blinded cameras as a direct cause of Self-Driving unavailability. [69]
Known practical limitations documented by Tesla
Tesla’s manuals and support pages document several recurring limitation classes:
- Environmental and camera condition limits: rain, residue, faded lane markings, dirty/obstructed lenses can degrade performance; vehicles may disable features and display warnings. [70]
- High driver workload expectation: FSD (Supervised) warnings emphasize that behavior can be inconsistent from a driver perspective and that drivers must be prepared to intervene immediately. [12]
- Feature gating by configuration/region: Tesla’s FSD support page and Tesla Vision update both note that feature availability and parity can vary by region/hardware and be restored via OTA over time. [71]
Privacy-related controversies
Tesla’s privacy posture emphasizes consent and limited sharing, but controversies have emerged:
- Tesla states fleet-learning camera recordings require consent and are anonymized/limited in duration. [47]
- Tesla states cabin camera images/video do not leave the vehicle unless data sharing is enabled and are not linked to VIN; Cybertruck manual similarly states cabin camera does not perform facial recognition or identity verification. [17]
- Reporting by Reuters[72] described internal sharing of sensitive customer images by Tesla workers (historical reporting), raising questions about governance and controls around access to collected media. [73]
- Consumer Reports[74] has also raised consumer-facing concerns about in-car cameras and privacy implications (including questions around how “anonymized” footage is in practice). [75]
Litigation and corporate risk disclosures
Tesla’s 2025 10-K explicitly notes ongoing claims and regulatory scrutiny regarding Autopilot/FSD and states it receives requests/subpoenas from multiple agencies including NHTSA, the National Transportation Safety Board, the U.S. Securities and Exchange Commission[76], and the U.S. Department of Justice[77], among others. [78] It also discloses litigation related to alleged misrepresentations and, separately, a product liability verdict related to alleged Autopilot technology use. [78]
Sources
Tesla primary and official materials
- Tesla, Form 10‑K for year ended 2025 (business/technology sections on AI, neural networks, inference chips, OTA updates, AI infrastructure/data centers, energy optimization platforms). [79]
- Tesla Investor Relations, Q4 2025 Quarterly Update Deck (AI training compute: Cortex 1/2; FSD (Supervised) end-to-end foundation model claims; inference chip roadmap AI5/AI6; Roadster listed as design development). [80]
- Tesla Support, Tesla Vision update (radar and USS removal; occupancy network; feature parity and OTA restoration). [3]
- Tesla Support, Full Self‑Driving (Supervised) (feature definitions, supervision requirements, variability by hardware/region). [22]
- Tesla public page, FSD (Supervised) training claims and fleet scale. [60]
- Tesla Support, Vehicle Safety and Security Features (cabin camera policy and data sharing controls). [16]
- Tesla Legal, Customer Privacy Notice (fleet learning recordings; consent; anonymization/limits). [47]
- Tesla Events, Tesla x NeurIPS 2024 (auto-labeling infrastructure; GPU clusters; Dojo referenced; data collection/labeling decisions). [50]
- Tesla Owner Manuals (selected pages used above):
- Model 3: Cameras; FSD (Supervised) driver attentiveness; FSD (Supervised) AI computer description. [81]
- Model Y: Cameras; cabin camera; cabin radar; Self-Driving overview; camera-visibility troubleshooting. [82]
- Model S: Cameras. [23]
- Model X: FSD (Supervised) driver attentiveness. [83]
- Cybertruck: Cameras; cabin camera data sharing policy; FSD (Supervised) driver attentiveness. [84]
- Tesla Support, Software Updates (user process for checking/installing). [52]
- Tesla Support, Autopilot and Full Self‑Driving Capability (hardware-check instructions; vision + neural net processing; driver supervision framing). [10]
Regulatory and safety documents
- NHTSA Recall 23V‑085, Part 573 Safety Recall Report (FSD Beta described as SAE Level 2; defect conditions; OTA remedy). [66]
- NHTSA Recall 23V‑838, Part 573 Safety Recall Report (Autosteer misuse controls; OTA remedy; suspension policy elements). [85]
- Reporting on 2026 NHTSA engineering analysis into FSD reduced-visibility incidents. [8]
Patents and technical disclosures
- Tesla-associated patent application US20210271259A1 / EP3850549A1, “System and method for obtaining training data” (trigger classifier applied to intermediate NN outputs to decide whether to transmit sensor data for training; metadata transmission). [49]
- Coverage of Tesla engineers’ Hot Chips 2022 Dojo talks (microarchitecture; tile-based scaling; video-training focus). [59]
Independent reputable reporting on controversies
- Reuters[72] (2023 reporting on internal sharing of sensitive images from customer cars). [73]
- Consumer Reports[74] (privacy concerns related to in-car cameras). [75]
[1] [9] [21] [40] [41] [56] [62] [64] [78] [79] https://www.sec.gov/Archives/edgar/data/1318605/000162828026003952/tsla-20251231.htm
[2] [12] [13] [18] [30] https://www.tesla.com/ownersmanual/model3/en_us/GUID-2CB60804-9CEA-4F4B-8B04-09B991368DC5.html
[3] [36] [55] [74] [76] https://www.tesla.com/en_gb/support/transitioning-tesla-vision
[4] [20] [39] [57] [58] [61] [63] [80] https://assets-ir.tesla.com/tesla-contents/IR/TSLA-Q4-2025-Update.pdf
[5] [50] https://www.tesla.com/event/tesla-x-neurips-24
[6] [53] [65] [66] https://static.nhtsa.gov/odi/rcl/2023/RCLRPT-23V085-3451.PDF
[7] [34] [35] [84] https://www.tesla.com/ownersmanual/cybertruck/en_us/GUID-60A2EFBA-78D7-47BB-BCDE-77582529EB48.html
[8] US agency upgrades probe into 3.2 million Tesla vehicles over FSD crashes
[10] [25] [54] https://www.tesla.com/en_gb/support/autopilot
[11] [15] [29] [81] https://www.tesla.com/ownersmanual/model3/en_us/GUID-682FF4A7-D083-4C95-925A-5EE3752F4865.html
[14] [42] [69] [70] https://www.tesla.com/ownersmanual/model3/en_us/GUID-20F2262F-CDF6-408E-A752-2AD9B0CC2FD6.html
[16] [17] [27] [77] https://www.tesla.com/support/vehicle-safety-security-features
[19] [72] https://www.tesla.com/ownersmanual/modely/en_us/GUID-4ED40B33-6E34-4E58-B8C3-9D0810DB87D6.html
[22] [24] [28] [71] https://www.tesla.com/support/fsd
[23] https://www.tesla.com/ownersmanual/models/en_us/GUID-682FF4A7-D083-4C95-925A-5EE3752F4865.html
[26] https://www.tesla.com/ownersmanual/modelx/en_us/GUID-8EA7EF10-7D27-42AC-A31A-96BCE5BC0A85.html
[31] [45] [82] https://www.tesla.com/ownersmanual/modely/en_us/GUID-682FF4A7-D083-4C95-925A-5EE3752F4865.html
[32] [43] https://www.tesla.com/ownersmanual/modely/en_us/GUID-101D1BF5-52D2-469A-A57D-E7230BBEE94B.html
[33] [44] https://www.tesla.com/ownersmanual/modely/en_us/GUID-20F2262F-CDF6-408E-A752-2AD9B0CC2FD6.html
[37] https://www.tesla.com/ownersmanual/cybertruck/en_us/GUID-2CB60804-9CEA-4F4B-8B04-09B991368DC5.html
[38] https://www.tesla.com/ownersmanual/cybertruck/en_us/GUID-87DD5C22-8D11-4E00-8A04-1D198116B859.html
[46] https://www.tesla.com/ownersmanual/modely/en_us/GUID-9A3F0F72-71F4-433D-B68B-0A472A9359DF.html
[47] Privacy Notice | Tesla
https://www.tesla.com/legal/privacy
[48] Obtain a Copy of the Data Associated With Your …
https://www.tesla.com/support/privacy
[49] https://patents.google.com/patent/US20210271259A1/en
[51] https://www.servethehome.com/tesla-dojo-custom-ai-supercomputer-at-hc34/
[52] Software Updates | Tesla Support
https://www.tesla.com/support/software-updates
[59] https://www.servethehome.com/tesla-dojo-ai-system-microarchitecture/
[60] Full Self-Driving (Supervised) | Tesla
[67] [85] https://static.nhtsa.gov/odi/rcl/2023/RCLRPT-23V838-8276.PDF
[68] https://www.tesla.com/ownersmanual/models/en_us/GUID-20F2262F-CDF6-408E-A752-2AD9B0CC2FD6.html
[73] Tesla workers shared sensitive images recorded by …
[75] https://www.consumerreports.org/electronics/privacy/teslas-in-car-cameras-raise-privacy-concerns-a9884415005/
[83] https://www.tesla.com/ownersmanual/modelx/en_us/GUID-2CB60804-9CEA-4F4B-8B04-09B991368DC5.html