There are 9 owner-reported driver assist & ADAS complaints for the 2024 Tesla Cybertruck in NHTSA's database. These are unverified consumer reports and may not reflect confirmed defects.
All the road safety features stopped working suddenly. The auto headlights defaulted to high beams. Lane departure warning stopped, cruise control stopped, forward emergency braking stopped, rear cross-traffic alert stopped, blind spot monitoring stopped, and other safety features I am not sure of.
On December 23, 2025, my 2024 Tesla Cybertruck (Foundation Series) was operating with Full Self-Driving (FSD – Supervised) engaged. While FSD was active, the vehicle executed an unsafe driving trajectory. I attempted to intervene and regain control; however, the system did not disengage as expected and did not yield control appropriately to driver input. Despite driver intervention attempts, the vehicle continued along the unsafe trajectory, resulting in a loss of control and collision with roadside objects. There were no mechanical warnings or alerts prior to the incident. The issue appears related to system behavior, disengagement logic, and driver override response while FSD was active. The vehicle is currently preserved under a formal litigation hold. No inspection, teardown, or data access has occurred. Tesla has confirmed in writing that it does not modify, delete, or alter vehicle data it has received. I am submitting this complaint due to concerns regarding the safety behavior of Full Self-Driving (Supervised), including the system’s failure to disengage upon driver input and the human-machine interface governing control authority.
The contact’s father owns a 2024 Tesla Cybertruck. The contact’s father stated that while driving at an undisclosed speed, the Autopilot function became inoperable. The contact stated that while engaging the Autopilot function and attempting to make a turn in a residential area, the vehicle unexpectedly accelerated and collided with a tree. The air bags did not deploy during the incident. As a result of the crash, both the contact’s father and nephew sustained injuries and received medical attention at a local emergency room. The contact’s father sustained a rib injury and bruising, while the contact’s nephew suffered an ongoing back injury due to the incident. A police report was filed. The vehicle was towed and taken to a body shop. The dealer was contacted; however, the vehicle was not diagnosed or repaired. The manufacturer was made aware of the failure, and a case was opened. The approximate failure mileage was 1,500.
The car crosses solid yellow lines all the time. It hits rumble strips. It is generally unsafe and annoying, and it is getting worse. This is not to mention the constant brake checking when engaging. It does not change lanes when prompted, even when it is safe.
I am writing to report an incident that occurred on [XXX] [XXX] while my Cybertruck was operating in Full Self-Driving (FSD) mode. While driving on [XXX] Brooklyn NY with no other moving traffic present, my Cybertruck did not detect a vehicle that was parked improperly—sticking out too far into the lane. The FSD system failed to adjust its path accordingly and struck the side view mirror of my vehicle, causing significant damage to the mirror glass and housing. I understand that FSD is a driver-assist feature that requires supervision, and I was attentive during the drive. However, the clearance of side view mirrors from the driver’s perspective is extremely difficult to judge at street level, and I believe stationary obstacles of this nature should be reliably detected and avoided by the FSD system. This type of incident raises concerns about the performance of FSD in handling stationary objects, especially when the vehicle was otherwise properly supervised and operating in a straightforward environment. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
On May 15, 2025, at around 11:30 AM, my Tesla was involved in a collision while Full Self-Driving (FSD) was engaged. The incident occurred at the Tesla West Covina Dealership. The vehicle struck a barrier wall without any collision warning, suggesting a failure of both the sensors and the FSD system. After the incident, Tesla staff directed me to their West Covina Service Center. A service advisor reviewed the dashcam footage with me and acknowledged the crash appeared to result from an FSD malfunction. I was told Tesla would cover the repair. However, when I returned on May 26, 2025, I was informed that the dashcam footage was no longer available. I did not delete it and suspect it may have been removed by Tesla personnel. They then claimed they would not cover any repairs. I have written to their legal department to formally request the video footage and any other data they have regarding the crash. Despite follow-up, Tesla has not responded or released any data. I am reporting this as a possible safety defect and mishandling of evidence. I request NHTSA investigate: The failure of Tesla’s FSD and collision avoidance system. The disappearance of crash footage after review by Tesla staff. Tesla’s lack of response to requests for data and documentation. Thank you for your attention.
On [XXX], at approximately [XXX], an incident occurred involving the Auto Park feature of a Tesla vehicle in a quiet parking lot. The conditions were clear, with no adverse weather, obstacles, or surrounding vehicles. The Auto Park function was engaged, and the vehicle reversed and moved forward as expected. However, it failed to stop and collided with a yellow pole directly in front. The front camera feed was visible on the screen, but the vehicle did not detect or recognize the pole, and no collision alert was issued. The incident occurred too quickly to allow manual intervention via the brake pedal. The collision resulted in minor front bumper damage, with repair costs estimated at over $1,500. All incident details and vehicle data have been provided to Tesla for investigation. They declined to provide a front plastic bumper replacement. [XXX] [XXX] INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
I am filing this complaint regarding Tesla's failure to include the advertised Autosteer feature in my new Cybertruck, which I believe raises serious safety concerns. At the time I ordered my vehicle, Tesla's website and the official Cybertruck Owner’s Manual (archived October 15, 2024) clearly listed Autosteer as a feature included with Autopilot. Tesla also markets Autopilot as a system that “enhances safety and convenience,” which formed a core part of my purchasing decision. I previously owned both a Model 3 and Model Y, both of which had Autosteer included in the base Autopilot, reinforcing that expectation. Upon delivery of my Cybertruck, I learned that Autosteer was not actually included, and Tesla now requires customers to pay for Full Self-Driving (Supervised) to access this core safety function. This change was not communicated prior to purchase or delivery and contradicts both historical Tesla offerings and the documentation available at the time of sale. Tesla’s decision to restrict a safety-related feature—one that assists with lane keeping and crash avoidance—may put my family and others at greater risk. Drivers may assume Autosteer is included based on Tesla’s marketing, which could lead to misuse or over-reliance on the limited Base Autopilot. I contacted Tesla directly and requested a resolution. Despite multiple follow-ups, I received no meaningful response. I believe this matter represents a pattern of deceptive practices and raises legitimate safety concerns regarding what features are expected and what is actually delivered. I respectfully request that NHTSA investigate Tesla’s removal of Autosteer from Base Autopilot and its potential implications for consumer safety and transparency in driver-assistance system advertising.
NHTSA has created a safety risk by forcing FSD to have extreme monitoring. While in FSD, I have to go back to normal driving to change the station on the radio or look at my phone. This is something I do in my non-FSD vehicles all the time. The reason for my purchasing FSD is to be safer. Your ridiculous overreach may actually kill me. The entire purpose is to reduce distracted driving, yet your ridiculous rules force an unsafe environment. I hope whoever pushed these restrictions realizes they are accountable for the crashes and deaths they create by forcing people out of FSD.
Complaints are unverified consumer reports submitted to NHTSA. A high complaint count may reflect vehicle popularity, not defect severity. Data sourced from NHTSA public records.
Data synced from NHTSA on May 4, 2026