There are 50 owner-reported Driver Assist & ADAS complaints for the 2023 Tesla Model Y in NHTSA's database. These are unverified consumer reports and may not reflect confirmed defects.
Hello, I am reporting a safety issue with the Full Self-Driving (FSD) system on my 2023 Tesla Model Y. The vehicle has been taken in for service four separate times for this issue. During these visits, Tesla service technicians were able to reproduce and confirm the problem. I have been explicitly advised in writing by Tesla service not to use the Full Self-Driving system due to safety concerns and to use standard AutoSteer instead. While FSD is engaged, the vehicle exhibits dangerous and unpredictable behavior, including: crossing over double yellow lines into opposing traffic; driving significantly below the speed limit (e.g., ~4 mph in a 25 mph zone); stopping in the roadway without any traffic or obstructions; swerving within the lane while attempting to maintain position; and hesitating during turns, at stop signs, and at intersections. Tesla has attempted multiple repairs, including replacing steering-related components and multiple cameras across separate service visits. These repairs did not resolve the issue. At my last service visit, Tesla indicated that the problem is likely software- or firmware-related, that there is currently no available fix, and that I should wait for a future update on an unknown timeline. If that statement is correct, this issue should affect every vehicle on the same software. At this time, I have been advised not to use a system that controls steering, braking, and acceleration due to safety concerns, and no timeline or resolution has been provided. I am submitting this complaint because the system behavior presents a potential safety risk to myself, my passengers, and other drivers, and the manufacturer has not provided a fix despite multiple confirmed service visits. I am attaching only my last service invoice, where they indicated not to use FSD, but I can provide other documentation showing the replacement of cameras and steering parts, as well as videos, if needed.
The vehicle has a phantom braking problem. It occurs about three times daily on the same route in the same place. This is a serious safety concern. The same problem occurs with both Autopilot and standard cruise control.
I was driving with the car in Autopilot in the center lane of the highway. I could see a white vehicle move close on my right, and I confirmed with a quick glance at my Tesla screen that it had crossed over into my lane with both its front and back left wheels. I then tried to maneuver to the left lane (which my screen showed was clear), but the steering wheel had significant tension. All of a sudden, I heard the chime signaling that Autopilot disengaged, the car jerked hard, and my car was headed toward the median. I swerved quickly to the right and hydroplaned. As I straightened out, my vehicle hit a vehicle in the right-hand lane. The strong tension, followed by the jerk and quick release of tension, prevented me from having full control of my vehicle. I believe that if Autopilot had not been engaged, I could have safely moved into the left lane with control. The police arrived, but I was very shaken up and simply said I swerved to avoid a car in my lane; in retrospect I wish I had gone into detail. I’d also like to note that I have a clean driving record, was not in a hurry, did not have loud music on, had both hands on the wheel at 10 and 2 o’clock, and was focused on the road.
Make/Model: Tesla
Year: 2023
Component/System: Autopark system (vehicle’s automated parking feature)
Description of Failure/Malfunction: While reversing under Autopark control, the vehicle collided with a parked vehicle whose door was open. No driver input was provided, and the system gave no warnings prior to the collision. The incident appears to result from a failure or limitation of the Autopark system, not driver error.
Safety Risk: The vehicle moved autonomously without any driver input, creating a risk of injury to pedestrians or damage to nearby vehicles. Had someone been standing near the open door, they could have been struck.
Reproduction/Confirmation: The malfunction has not yet been reproduced or confirmed by Tesla, a dealer, or an independent service center.
Inspections Conducted: No inspection has yet been performed by the manufacturer, police, insurance representatives, or others. Photographs of the incident have been documented and attached.
Warning Lamps/Messages/Symptoms: No warning lamps, alerts, or system messages were displayed prior to the incident. The system gave no indication of a failure during the parking maneuver.
Incident Details: Date/Time: Saturday [XXX]; Location: [XXX]; Weather Conditions: Light drizzle
Attachments/Evidence: Photographs of both vehicles involved; incident report; request for preservation of vehicle telemetry and Autopark logs
Additional Notes/Requests: I request that all relevant vehicle data be preserved in their original form to allow a full investigation. The incident occurred solely under system control without driver input, and I assert that responsibility lies with the system malfunction.
INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
The computer in my car has been short-circuiting, rendering all driver safety features inoperable. This is a known problem with Hardware 4 and has been reported in many places. I was backing out of my garage while the rear camera and automatic braking were not functional, and it is nearly impossible to see out the back without them. When backing out, I struck another car in the driveway, resulting in damage to both cars. Had the system been functional, I would have had visibility and the automatic braking would have avoided the collision. The vehicle has been seen by Tesla, and I have been told that the warranty has expired and the computer must be replaced at my cost. Tesla has replaced other computers under a recall for this same reported issue. This is a serious safety issue: without a functional computer, features such as software updates, navigation, blind spot monitoring, lane keeping, cruise control, all cameras (rear, side, front), Autopilot, automatic braking, etc. are nonfunctional.
While operating my Tesla using the Auto Park function (on 12/8/2025 between 8:45 am and 9:00 am), which is part of the Full Self-Driving (FSD) system, the vehicle initiated an automated parking maneuver. During the maneuver, the vehicle was reversing into a parking space when it failed to detect a stationary wall located behind the vehicle. The vehicle continued reversing without issuing any visual or audible collision warnings and without performing automatic braking. The Auto Park system did not disengage or request driver intervention prior to impact. The vehicle subsequently collided with the wall, resulting in damage to the vehicle. Based on the vehicle behavior, it appears that the object detection and/or obstacle avoidance functions—relying on onboard cameras and proximity sensing—did not properly recognize the stationary obstacle during the automated maneuver. Following the incident, I contacted Tesla to request a review of vehicle telemetry and system logs to determine whether a fault existed within the Auto Park software, perception system, or associated sensors. Tesla advised that a service diagnostic fee would be required solely to access and review the logs, despite the incident occurring during use of an automated driving feature.
I was trying to park my vehicle on a normal street when Tesla's Automatic Emergency Braking (alert APP_w050) kicked in and braked aggressively as I tried to parallel park.
FSD tried to pass on the shoulder multiple times and brake-checked other drivers multiple times. FSD is unsafe and getting worse. The speed cannot be controlled. There is NO WAY TO CONTROL SPEED! Did you know this? Did you approve it? Did Elon convince you it is OK?
The Full Self-Driving (Supervised) ADAS in this vehicle will react to shadows on the road and can put the car or other cars at risk. We have had multiple instances where, on a sunny day, the car will react to shadows cast on the road and take evasive action like braking or switching lanes abruptly. This can be hazardous due to the random nature of the occurrences and because they happen at times when a driver may be lulled into a false sense that the car is driving itself well. So far we have been lucky that no other vehicles were traveling too close behind or to the side of us. We have experienced this behavior multiple times, and other people online have reported it as well. The car gave no warning indicators before, during, or after the occurrences. Note that our car uses a Hardware 3 computer. I believe that this issue may have been fixed on newer Tesla cars that use Hardware 4.
The forward-facing camera system, Automatic Emergency Braking (AEB), and Autopilot/cruise control have repeatedly failed on a 2023 Tesla Model Y with approximately 22,000 miles. The AEB system becomes unavailable and the Autopilot disengages without warning, including at highway speeds of 70 mph and city speeds of 30 mph. The active safety systems are functional less than 10% of the time. The sudden, unexpected loss of active safety systems at speed presents a serious risk of accident and injury. The problem has been reproduced and confirmed by an authorized Tesla service center on three separate occasions. Repairs attempted include replacement of the triple forward-facing camera assembly (twice) and replacement of the car computer (HW3) with a firmware update. Despite these repairs, the defect recurred within days of each service visit. A fourth service appointment is now scheduled for the same issue. Warning alerts appeared on the vehicle display prior to each failure, specifically "Camera Blocked" and "Automatic Emergency Braking Unavailable." These alerts were first detected in August 2025 and have recurred through January 2026. All three repair attempts were covered under the manufacturer's Basic Vehicle Limited Warranty, confirming acknowledgment of a manufacturing defect. The vehicle and components have been inspected by the manufacturer's authorized service technicians. The vehicle is available for inspection upon request.
While driving a Tesla Model Y on a freeway in Arkansas, the vehicle was operating in Autopilot mode (Tesla's driver-assistance system). During a highway merge, the vehicle followed a state trooper's car too closely and failed to slow down appropriately. The trooper had to apply his brakes to avoid a collision. After stopping the vehicle, the trooper issued a written warning, explicitly attributing the issue to equipment failure and advising immediate correction. This issue appears to involve a failure of the adaptive cruise control and forward collision detection system while Autopilot was engaged. No collision occurred, but the situation posed a clear safety risk to both the trooper and my vehicle. This was not the first issue with the vehicle's automation. In a previous incident (November 2024), the vehicle exceeded the posted speed limit while on Autopilot, resulting in a traffic citation. Tesla has been notified, and a service request is open. Diagnostic review is pending. There were no warning lights or alerts from the vehicle before the incident. The vehicle remains available for further inspection if required.
- As a regular driver in the DC area, I have repeatedly experienced the Lane Departure Avoidance system disabling itself mid-drive. This is particularly troubling in city driving conditions marked by unpredictable traffic, unclear lane markings, and frequent distractions—scenarios where these safety features are most essential. Lane Departure Avoidance helps prevent accidents by alerting drivers or gently correcting steering if the vehicle unintentionally drifts out of its lane.
- I don't know if there was a specific concern about the safety of others, but there may have been.
- Unknown
- Not needed
- No - just happened
Statement of Facts Regarding Tesla Accident [XXX] On [XXX], I was involved in an accident with a Tesla that I had rented directly from Tesla’s rental program. The vehicle was equipped with Tesla’s full software capabilities, including AutoDrive and the Smart Summon feature. At no time during the rental process was I given warnings, training, or any overview about how these features work, the risks they pose, or the long history of reported incidents involving Teslas colliding with stationary objects while operating under self-driving or summoning functions. On [XXX], I used the Summon feature for the first time in the parking lot of my business at [XXX]. The car was parked in spot [XXX], where I regularly park. I attempted to use Summon while standing in full view of the vehicle, but the software repeatedly displayed “lost connection.” After disabling Wi-Fi, I held the Summon button again and the car successfully pulled out of the spot, drove forward, and arrived directly in front of me. This demonstrated the feature was functioning properly on that day. On [XXX], while again parked in the same spot, I attempted to demonstrate Summon to my wife. I was standing in clear view, approximately 50–75 feet away, near the complex entrance. As before, the software repeatedly displayed “lost connection.” I attempted to reengage by pressing the Summon button to resend the command. That did not work. When I walked up to the vehicle, I discovered that the front quarter panel near the wheel had come into contact with the covered parking pole. At no time was I aware that the car had hit anything, nor did the system provide any alert that it had engaged with an object. This stands in stark contrast to the way Tesla’s software consistently warns when the car is being manually driven — giving constant alerts when drifting out of a lane or approaching another vehicle.
The absence of such a warning during Summon demonstrates a failure of the software’s sa INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
While attempting to park my Tesla, I accidentally struck another vehicle’s side step with the front passenger side of my car. The other vehicle was not damaged, but my Tesla sustained visible damage. The vehicle’s camera system failed to detect or display the side step, giving the impression that there was sufficient clearance. As a result, I relied on what I believed to be accurate visual guidance from the Tesla camera system. During a follow-up visit to the Tesla Service Center in Gilroy, California, I was told that similar incidents have occurred before. According to the technicians, the camera system has known limitations in detecting low-profile or side-mounted obstacles, such as running boards or side steps. This presents a significant safety concern. Tesla vehicles rely heavily on cameras and sensors for driver awareness, particularly during parking maneuvers. When key blind spots exist in the system—especially in areas the driver may not be able to visually confirm easily—it puts both property and people at risk. Drivers are conditioned to trust the vehicle’s spatial guidance systems, but in this case, that trust may lead to preventable accidents. Given Tesla’s growing market share and the increasing use of its camera-based safety and autonomy features, I believe this issue could affect many other drivers. It may warrant broader review to determine whether a design or software update is necessary to improve detection of low-profile obstacles and better inform drivers in parking scenarios.
The Tesla “self driving” features are inconsistent at best and frequently fail. I cannot believe that the government approves these capabilities or that Tesla is accepting liability for road use. If these features are not reliable under supervised conditions, why would they be considered safe for robotaxi release? I have repeatedly tried to report this to Tesla, and they say the systems are operating as designed.
My 2023 Tesla Model Y has produced concerningly loud mechanical noises while parked, plugged in, and not being driven. I’ve experienced this on two separate occasions—[XXX] and [XXX]—after the vehicle had been parked for several hours in a garage. Upon waking the car (without driving it), it emitted a very loud and persistent mechanical sound, both while plugged in and even after unplugging. The volume and intensity of these sounds were far beyond normal operation and raised immediate concerns of overheating, component failure, or an underlying safety defect. The experience felt unsafe enough that we considered exiting the vehicle both times. We attempted to schedule a service appointment through the Tesla app, but received no response, and the request appears to have disappeared with no record remaining in the app. Based on the nature of the sound, we’re unsure what it could be; it may indicate a malfunction in the cooling system or battery thermal management. I have video documentation of both incidents, along with video from other nights with similar temperatures where this did not occur, for comparison. I’m submitting this report out of genuine concern for vehicle safety and reliability, especially given Tesla’s known issues with thermal and battery behavior in high-heat conditions. These noises began recently; we have owned the car since 2022 and have driven it consistently in high Texas heat every summer. This suggests a new or developing malfunction, not a feature of normal operation. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
I am writing to raise a safety concern regarding the use of Tesla’s Full Self-Driving (FSD) system on my 2023 Tesla Model Y when a rear hitch-mounted bike rack is installed. Numerous users, including myself, have experienced dangerous or erratic behavior from FSD when carrying bicycles on a rear-mounted rack. Specific issues include: False detection of a trailing vehicle directly behind the car. Phantom braking or swerving, including abrupt lane changes or acceleration. Unreliable navigation at intersections (e.g., nearly running stop signs) due to apparent misinterpretation of sensor data. In some cases, tailgating or aggressive following behavior by the vehicle. These behaviors appear to stem from the FSD system misinterpreting the presence or visual signature of the rack and bikes as another vehicle. In several online reports, this has led to unsafe maneuvers. To temporarily mitigate the issue, some drivers have resorted to taping over the rear camera, which is neither safe nor user-friendly. While I recognize Tesla has made improvements such as camera obstruction warnings (e.g., in FSD v12.5.6+), these do not address the core misclassification problem. My requests: That Tesla officially acknowledge and investigate FSD misbehavior in the presence of rear-mounted bike racks. That Tesla provide clear driver guidance on whether FSD should be used when such a rack is installed. That the FSD system include an option for a “Bike Rack Mode” or alternative logic to prevent rear-camera misinterpretation. That this concern be escalated within your safety and AI development teams to ensure user safety is prioritized.
The issue lies with Tesla’s Full Self-Driving (Supervised) system 12.6.4 on HW3. While driving on roads with lower speed limits (such as 35 or 45 MPH), the vehicle occasionally brakes abruptly for no apparent reason, even when there are no vehicles or obstacles ahead. This unexpected braking behavior can be dangerous, as it may lead to rear-end collisions if the driver behind is not attentive or prepared to stop suddenly.
This started at about 65,000 miles in my 2023 Model Y. The rear camera first lost its picture, and I received a warning that it may be intermittent. It has progressed now to the point that Autopilot, navigation, all cameras, cruise control, auto headlights, and auto wipers are all nonfunctional. The car does not know its location. All safety features such as lane keeping, cross traffic alert, and automatic braking are disabled. The rearview camera no longer functions. Software updates no longer complete downloading. While there are numerous reports of computer short circuits in Hardware 4 cars, the only solution I am offered is to have the computer replaced at my own cost of just under $3,000, and that replacement would be warranted for only one year or 12,500 miles. Reviewing the submissions on the NHTSA website as well as searching online, this problem is not infrequent. I do not feel safe letting my wife or daughter drive the car. Certain 2024 and 2025 models with this problem have been recalled for new computers. This recall must be expanded to include 2023 models as well.
This is my 7th report. On [XXX] [XXX] pm I was traveling home from Albuquerque and was just past Isleta Resort and Casino heading south on [XXX]. I don't use cruise control very often because of the emergency braking that occurs (which I have reported several times before). I was on a long drive and decided to try it on a short section of [XXX] on my first day, and found that it slowed on its own when I approached a car from behind. Two days later, on the way home, when there was no one in front of me, it slowed significantly without needing to, and I thought this was better than the past rapid braking... As I was thinking about this, the car abruptly "slammed on the brakes". I was forced forward but had my seat belt on. Items fell off the back seat onto the floor. The Igloo cooler on the front passenger floor toppled forward. I'm quite sure there would have been an accident had someone been behind me. Additionally, the car is not recording my tire rotation from 16,000 miles and continually says the last tire rotation was at 8,000. It is a good thing Discount Tire keeps track, as I am now overdue for another rotation (25k+ miles). INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
Showing 1–20 of 50 complaints
Complaints are unverified consumer reports submitted to NHTSA. A high complaint count may reflect vehicle popularity, not defect severity. Data sourced from NHTSA public records.
Data synced from NHTSA on May 4, 2026