There are 50 owner-reported driver assist & ADAS complaints for the 2020 Tesla Model 3 in NHTSA's database. These are unverified consumer reports and may not reflect confirmed defects.
Driving S/B on Ann Arbor Saline Road. Had FSD on the Tesla engaged. Needed to make left turn onto Maple. Left arrow light to Maple was red. Traffic light to continue straight on Ann Arbor-Saline was green. Despite left arrow being red, Tesla did not slow and ran red light to make left turn onto Maple.
When adaptive cruise control is enabled (Autopilot, not equipped with Full Self Driving) car will sometimes brake hard as if there is an obstacle ahead, with the words "Curve Assist Active" flashing on the screen. Seems to happen most often on state highways, vs interstate. In one case last night car slowed from 65mph to 30mph, on a straight road, with no object ahead. Thankfully no cars were immediately behind or I would have been rear-ended. Happened 4 times in 500 miles of highway travel.
Incident Date: Feb 1, 2026, 2:45 PM PST Location: Camarillo, CA (Pleasant Valley Rd & Village Commons Blvd) Description: The vehicle (Tesla Model 3, FSD Supervised) attempted an unprotected left turn. The system identified a gap but paused excessively before initiating the maneuver. After the pause—when the gap was no longer safe—the system proceeded to turn anyway, directly into the path of a speeding oncoming vehicle. The system failed to abort the maneuver after its own hesitation. It also failed to accelerate with the urgency required to clear the path. I was forced to manually override with emergency acceleration and steering into an oncoming lane to avoid a high-speed T-bone collision. This appears to be a "stale data" failure where the car executed an old plan that was no longer valid.
Vehicle Information: 2020 Tesla Model 3 AWD (Leased), Full Self-Driving (FSD) enabled. Summary of Safety Issue: While Full Self-Driving (FSD) was engaged, the vehicle failed to detect roadside infrastructure and collided with multiple fixed objects, including a route sign board, a walkway sign board, and an underground electrical cable. The system did not provide adequate warnings or corrective action before impact. The system jerked the steering wheel, confirming it was engaged, but did not brake; it drove into a non-drivable area without giving any disengagement warning, and the car was totaled. Incident Details: • Date of incident: [XXX] • Location: On [XXX] [XXX] Piscataway, NJ 08854 • Road type: [XXX] • Weather/visibility: Clear • Speed at time of incident: <40 MPH (under the speed limit) Description: As soon as I triggered FSD, it immediately tried to navigate left or right, failed, and unexpectedly veered into roadside infrastructure. The system did not slow down, steer away, or alert the driver in time to prevent the collision. The impact caused significant damage to public property and the vehicle is totaled. No injuries occurred. Evidence Available: • Photos of the scene and damage • Police report (number: [XXX]) • Tow documentation • Tesla collision center estimate Reason for Reporting: This incident suggests a potential safety defect in Tesla’s FSD system related to object detection, path planning, and collision avoidance. The failure occurred without driver input and raises concerns about the system’s reliability in detecting fixed roadside objects. I request that NHTSA review this incident as part of ongoing evaluations of Tesla’s driver-assistance systems. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
Full Self Driving mode (with Hardware 3) consistently enters the carpool lane when I am driving solo. In the vehicle navigation settings, I have "use carpool lanes" unselected. It doesn't seem to use this input in its decision to enter the carpool lane. Sometimes it signals and enters the carpool lane so quickly that I can't respond fast enough to correct it. However, it only seems to need one correction for it to ignore the carpool lane for the remainder of the trip. This can then repeat on each new trip (it doesn't always occur; sometimes it won't make an attempt to get into the carpool lane). I have sent Tesla dozens of recorded messages that it needs to be fixed. It will enter at an illegal point, crossing the solid line; this can sometimes be a dangerous maneuver as well as being illegal for a solo occupant. Also, I don't want to be ticketed. It's quite a stressful situation each time this occurs.
The contact owns a 2020 Tesla Model 3. The contact stated that while attempting to exit a parking lot, the rear sensors failed to operate as designed. The contact stated that the rearview sensor failed to provide an audible warning, which resulted in the vehicle bumping into another vehicle. The dealer was contacted. The vehicle was not diagnosed or repaired. The manufacturer was made aware of the failure. The failure mileage was approximately 68,238.
I am reporting a critical safety incident involving my vehicle, which occurred during rush hour on a busy highway with no traffic lights or stop signs. While waiting in the left turn lane at a center divide, my emergency autopilot braking system activated unexpectedly, causing my car's left front side to encroach into oncoming traffic. Despite my attempts to move the car, it became immobile. I attempted to reset the car by turning it off and on, but the vehicle’s computer system malfunctioned entirely. With oncoming traffic traveling at speeds of 70 mph, I had no choice but to exit the vehicle for safety. I left the car in the center divide. This incident was extremely frightening, and I am deeply concerned about the potential dangers posed by the malfunctioning autobraking system. Thank you for your attention to this serious matter.
Driving at highway speed (75 mph) using Autopilot, the car rapidly applied hard braking. There were no obstacles, shadows, or anything visible that should have caused this. It occurred twice over two days in different locations. If anyone had been too close behind me they would have collided with me. No other vehicle I’ve owned has done this with its implementation of adaptive cruise & lane assist. My car used to use radar and I never had this issue until vision-based detection was implemented. It has occurred numerous times over the years, but never with braking this rapid.
I have taken my vehicle into Tesla to have the issue fixed twice. They don't know what's happening or how to fix it. My car is randomly taking the wheel and steering me into other cars / off the road. It happens once a month or so. I have a newborn baby I drive in the car and I don't feel safe driving. I've recorded the incident and Tesla Service said "that should not be happening unless I'm using Autopilot," which I never do. I've looked into lemon law, but I did not purchase the car from Tesla directly; I bought it from a used car dealership. I'm not sure what to do. I can't sell the car because I would still owe $10,000 after the sale. Please help any way you can, I'm desperate for anything.
While driving on cruise control at highway speeds (75-80 mph) on I-80, the car braked VERY hard and unexpectedly MANY times. There were no other cars around me, the weather was clear, and the road was straight. No obstacles were noticed that could cause this reaction. If a car had been behind me I would have caused it to rear-end me, because there was no apparent reason for the severe braking. I reported four incidents of this type to Tesla during the drive, which I believe were date/time stamped so they could diagnose (using the "report" function). I matched these with my own observations, which I also provided to the service department. I have had the car diagnosed by Tesla and they claim they could not duplicate the problem and/or fix the problem.
Since the recall for Autopilot / Full Self Driving which required driver alerts (often called the steering wheel nag), the alerts are now so frequent and recurring that they force you to stare at the screen and not the road. If you look at the road with both hands on the steering wheel, every few seconds, as little as 10 seconds since the previous alert, the screen begins to flash (only at the top) indicating that you must now slightly shift your hands. These are often hard to see unless you stare at the screen, and they give no other noticeable indicator. Recently I was awarded a strike for not adjusting to it, about 30 seconds after the previous indication; it had no audible sound or indication, and the only warning apparently was a visual one. The screen is not in front of the driver like a heads-up display; it's off to the side. Other vehicles warn you not to use the on-screen displays while driving, even Tesla, so why does this feature force you to stare at the screen your entire drive? It seems unsafe to force constant checks of the screen from the driver to determine if you're in Autopilot compliance.
When driving in "Autopilot" mode, which uses adaptive speed control, the car suddenly slows down without any warnings. This happens even when there is no vehicle in front of the car and therefore slowing down is not expected. This has happened several times while driving on freeways. If it weren't for my immediate reaction, serious multiple-car accidents would have happened. My reaction has been to quickly disengage the "Autopilot" system and manually accelerate to avoid rear collisions. The other issue is that the car suddenly steers the wheel and drags the car to the side even when there is no reason to do so. In such cases, I have to forcefully steer the wheel in the opposite direction to avoid collision with other cars moving on either side, or parked cars on the streets.
Rear camera works intermittently. It says it's temporarily unavailable and then it restores the next time the car is started after exiting. Seems to be a sensor or camera cable issue, because the camera works fine when it does work.
FSD Beta 11.4.9 and past versions. A possible explanation for some degree of phantom braking on controlled access roadways. FSD Beta reacts to a stop light at the end of an offramp when the direction of travel is inline with the ramp. This reaction does not occur in daylight, only in darkness. It occurs across multiple versions of FSD Beta and is consistent and repeatable at night under similar conditions. The point at which normal speed is resumed appears to coincide with the beginning of the offramp, but that may be coincidental. Since it doesn't occur in daylight it is less likely that the reaction is triggered by lane topography or some other factor relating to the converging right hand lane that occurs after the ramp is passed. In one test pass that was not recorded, the stoplight at the top of the ramp turned green while the braking was happening and the car seemed to react to the green light and resume speed, but that may have been coincident to the turn in the roadway or being adjacent to the beginning of the offramp. Another example where the light turned green has not happened again during tests. This test was at 3:08 am on a weekday with almost zero traffic in this direction and there were no vehicles within 0.25 mile behind the test vehicle at the time this was recorded. LINK TO VIDEO: [XXX] INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
I am writing to provide feedback on the recent mandated Tesla Full Self Driving software update (recall). When I keep both hands on the steering wheel, I now get constant warnings. Specifically, even when I drive with both hands firmly on the wheel at 9 and 3 o'clock and maintaining what I feel is adequate control, the system nags me with disruptive alerts. It tells me to apply pressure and flashes visuals that my hands are not properly positioned, despite both remaining clearly in contact with the wheel the whole time. I typically drive responsibly with proper grip, yet these frequent notifications are frustrating and stress-inducing. The sensors seem to be calibrated too sensitively if they are detecting grip issues when both my hands are correctly and securely holding the wheel. Prior to the mandated update, I didn't receive nearly as many warnings. The update was not helpful and makes driving more stressful and less safe if anything. I understand the goal is to enforce safe driving, but the current warnings are excessive even when drivers are maintaining proper hand positioning. Perhaps the regulations could be relaxed so alerts only activate when grip is legitimately inadequate, rather than routinely throughout normal driving. Please let me know if any further details on my experience with both hands on the wheel would be helpful. I would be happy to provide additional information to aid review of this system. Thank you for your consideration.
This morning, I was waiting in the left turn lane on [XXX] turning east onto [XXX] in Moorpark, CA. Waiting at the light, I tapped to turn on autopilot and expected it to turn left… it tried to make a right turn from the left turn lane. This is not the first time autopilot has tried to make an unsafe and illegal turn from the wrong lane. I once had it attempt a U turn from the right lane on [XXX] . INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
At 11:47 AM Pacific time on December 23, 2023, my Tesla Model 3 Autopilot system failed after a recent update mandated by the NHTSA. Ever since the update, the vehicle's cruise control and Autopilot systems no longer function properly. While I was using cruise control and Autopilot, traffic stopped ahead and the Autopilot system disengaged while it was slowing down, which has never happened in four years of ownership and over 60,000 miles of driving. The new recall software update has made the vehicle unsafe and it should be retracted and corrected. What once was a safe feature is now unsafe. Please investigate this with Tesla to prevent possible injury or death resulting from this programming failure.
Your latest mandates have arguably made the car less safe by removing what allowed the system to operate safely. Please roll back this “recall”; it’s not a recall in the slightest, it’s a shot taken by someone or multiple people who dislike Tesla.
I am writing to urgently express my concerns about the latest software update for the Tesla Model 3’s autopilot system, which, in my view, significantly compromises safety. As a Model 3 owner for over two years, I have generally found the vehicle and its features to be reliable and safe. However, the recent update has introduced an overly stringent hands-on-wheel detection mechanism that is not only inconvenient but also potentially hazardous. The new update requires frequent and often forceful interaction with the steering wheel to assure the system of driver presence. This change is drastically different from my previous experience, where I received only one hands-on-wheel violation in two years. The current sensitivity of the system disrupts the smooth operation of the autopilot, leading to frequent and abrupt disengagements. I have found myself struggling to maintain the system’s activation, inadvertently causing the vehicle to exit autopilot mode multiple times. This issue goes beyond mere inconvenience; it actively detracts from driving safety. The need for constant and sometimes aggressive adjustments to satisfy the system’s requirements is distracting and can lead to erratic vehicle behavior. The irony is stark: a system designed to enhance driving safety and ease is now a source of potential danger and stress. The unpredictability and over-sensitivity of the updated system could lead to dangerous situations, especially on highways or in heavy traffic, where sudden disengagement of the autopilot can be particularly risky. As a driver, I now find myself more focused on keeping the autopilot engaged than on the actual driving conditions and surroundings, which is surely contrary to the feature’s intended purpose. You need to investigate this issue as a matter of urgency. Adjustments are necessary to prevent potential accidents and ensure that the tech helps, not hinders.
Phantom braking occurred twice while driving east on highway SR 152. Our car was traveling straight with no cars in front of us. I was using cruise control set at the speed limit when the car suddenly slammed on the brakes; the car slowed from 65 mph to 20 mph in a few seconds. This happened twice in a matter of 10 minutes. I no longer use cruise control. Apparently there is no fix. I have two relatives who have experienced the same problem with their Teslas. I will suggest that they report to NHTSA.
Complaints are unverified consumer reports submitted to NHTSA. A high complaint count may reflect vehicle popularity, not defect severity. Data sourced from NHTSA public records.
Data synced from NHTSA on May 4, 2026