2024 TESLA CYBERTRUCK Forward Collision Avoidance Problems
14 complaints about Forward Collision Avoidance
High Severity Issue
This component has been associated with crashes, fires, or deaths.
This Problem Across All Years
All Forward Collision Avoidance Complaints (14)
On December 23, 2025, my 2024 Tesla Cybertruck (Foundation Series) was operating with Full Self-Driving (FSD – Supervised) engaged. While FSD was active, the vehicle executed an unsafe driving trajectory. I attempted to intervene and regain control; however, the system did not disengage as expected and did not yield control appropriately to driver input. Despite driver intervention attempts, the vehicle continued along the unsafe trajectory, resulting in a loss of control and collision with roadside objects. There were no mechanical warnings or alerts prior to the incident. The issue appears related to system behavior, disengagement logic, and driver override response while FSD was active. The vehicle is currently preserved under a formal litigation hold. No inspection, teardown, or data access has occurred. Tesla has confirmed in writing that it does not modify, delete, or alter vehicle data it has received. I am submitting this complaint due to concerns regarding the safety behavior of Full Self-Driving (Supervised), including the system’s failure to disengage upon driver input and the human-machine interface governing control authority.
While exiting the Chick-fil-A drive-thru lane, I lightly pressed the accelerator to move into the next lane. Suddenly, the vehicle jerked forward very quickly, as if I had floored the accelerator. At the same time, the display showed "Emergency Braking in Progress." It felt as though the truck briefly braked and then rapidly released, causing unexpected acceleration. No collision occurred, but it was startling and very scary.
The contact’s father owns a 2024 Tesla Cybertruck. The contact’s father stated that while driving at an undisclosed speed, the Autopilot function became inoperable. The contact stated that while engaging the Autopilot function and attempting to make a turn in a residential area, the vehicle unexpectedly accelerated and collided with a tree. The air bags did not deploy during the incident. As a result of the crash, both the contact’s father and nephew sustained injuries and received medical attention at a local emergency room. The contact’s father sustained a rib injury and bruising, while the contact’s nephew suffered an ongoing back injury due to the incident. A police report was filed. The vehicle was towed and taken to a body shop. The dealer was contacted; however, the vehicle was not diagnosed or repaired. The manufacturer was made aware of the failure, and a case was opened. The approximate failure mileage was 1,500.
Mileage: 1,500
On May 15, 2025, at around 11:30 AM, my Tesla was involved in a collision while Full Self-Driving (FSD) was engaged. The incident occurred at the Tesla West Covina Dealership. The vehicle struck a barrier wall without any collision warning, suggesting a failure of both the sensors and the FSD system. After the incident, Tesla staff directed me to their West Covina Service Center. A service advisor reviewed the dashcam footage with me and acknowledged the crash appeared to result from an FSD malfunction. I was told Tesla would cover the repair. However, when I returned on May 26, 2025, I was informed that the dashcam footage was no longer available. I did not delete it and suspect it may have been removed by Tesla personnel. They then also stated they would not cover any repairs. I have written to their legal department to formally request the video footage and any other data they have regarding the crash. Despite follow-up, Tesla has not responded or released any data. I am reporting this as a possible safety defect and mishandling of evidence. I request NHTSA investigate: the failure of Tesla’s FSD and collision avoidance system; the disappearance of crash footage after review by Tesla staff; and Tesla’s lack of response to requests for data and documentation. Thank you for your attention.
I was driving [XXX] going through Saint Paul when the truck sped up and became uncontrollable. I braked as hard as I could, but it rear-ended another vehicle. I have tried to contact Tesla about this, and they have not responded to me. I was told the engineering team would get back to me about the possible cause of the accident, but I never heard back, and they could not provide me any of the vehicle data or video, which they told me at the service center they would be able to find when they diagnosed it. They did fix the vehicle, and I ended up selling it due to not being too excited about driving ever again. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
I am writing to report an incident that occurred on [XXX] [XXX] while my Cybertruck was operating in Full Self-Driving (FSD) mode. While driving on [XXX] Brooklyn NY with no other moving traffic present, my Cybertruck did not detect a vehicle that was parked improperly—sticking out too far into the lane. The FSD system failed to adjust its path accordingly and struck the side view mirror of my vehicle, causing significant damage to the mirror glass and housing. I understand that FSD is a driver-assist feature that requires supervision, and I was attentive during the drive. However, the clearance of side view mirrors from the driver’s perspective is extremely difficult to judge at street level, and I believe stationary obstacles of this nature should be reliably detected and avoided by the FSD system. This type of incident raises concerns about the performance of FSD in handling stationary objects, especially when the vehicle was otherwise properly supervised and operating in a straightforward environment. INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
On [XXX], at approximately [XXX], an incident occurred involving the Auto Park feature of a Tesla vehicle in a quiet parking lot. The conditions were clear, with no adverse weather, obstacles, or surrounding vehicles. The Auto Park function was engaged, and the vehicle reversed and moved forward as expected. However, it failed to stop and collided with a yellow pole directly in front. The front camera feed was visible on the screen, but the vehicle did not detect or recognize the pole, and no collision alert was issued. The incident occurred too quickly to allow manual intervention via the brake pedal. The collision resulted in minor front bumper damage, with repair costs estimated at over $1,500. All incident details and vehicle data have been provided to Tesla for investigation, but Tesla declined to cover replacement of the front plastic bumper. [XXX] [XXX] INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
The contact owns a 2024 Tesla Cybertruck. The contact stated that while driving at 70 MPH with "Full Self-Driving" (FSD) active, the vehicle ahead of the contact's vehicle abruptly slowed down. The contact noticed that the vehicle failed to detect the slowing vehicle, and the contact attempted to stop the vehicle manually. The contact crashed into the vehicle, which resulted in a three-vehicle collision. The air bags failed to deploy upon impact; as a result, the contact hit her chest on the steering wheel. The contact was taken to the hospital via an ambulance and was treated for neck, back and chest pain. The two other vehicle occupants sustained both neck and back injuries but did not seek medical treatment. The whereabouts of the other drivers were unknown. A police report was filed (report unavailable). The vehicle was initially towed to an independent tow yard. The manufacturer was notified of the failure and later towed the vehicle to an authorized Tesla service center where it remained in their possession. The current condition of the vehicle remains unknown. The failure mileage was 5,149.
Mileage: 5,149
I was driving in a rainstorm and the wiper started to malfunction. It would only clean half the windshield, then shut off completely. It made a weird noise and started smoking at the front, started working for one wipe, then did a 360 on the front end of the car. The wiper blade flew off and the wiper arm would not stay in position. I was able to get home but had to get the car towed to Tesla.
NHTSA has created a safety risk by forcing FSD to have extreme monitoring. While in FSD, I have to go back to normal driving to change the station on the radio or look at my phone. This is something I do in my non-FSD vehicles all the time. The reason for me purchasing FSD is to be safer. Your ridiculous overreach may actually kill me. The entire purpose is to reduce distracted driving, yet your ridiculous rules force an unsafe environment. I hope whoever pushed these restrictions realizes they are accountable for the crashes and deaths they create by forcing people out of FSD.