2019 TESLA MODEL 3 Lane Departure Problems
86 complaints about Lane Departure
High Severity Issue
This component has been associated with crashes, fires, or deaths.
I see in today's news that the NHTSA is investigating Tesla's FSD due to accidents in poor visibility, but the articles I've read are not discussing another visibility issue that may actually be more widespread. I already reported concerns with Tesla's FSD in incident 11618812, but I forgot to mention another issue I believe is crucial to the success or failure of Tesla's FSD, and I am surprised to find that nobody is talking about this serious flaw. Because Tesla's FSD relies on camera video only, the vehicle is blind in the dark. I live in a rural area, and on some roads without streetlights or other vehicles nearby illuminating the areas around the vehicle, Tesla's FSD falsely assumes that the side- and rear-facing cameras are obstructed because they cannot see in the dark. This was also an issue when my son and I drove home through the night from South Carolina. When we hit North Carolina on [XXX] in the middle of the night, between cities and with no streetlights, traffic was sparse, so there was nobody behind or to the sides of the vehicle. Every few minutes, FSD falsely warned us that the cameras were obstructed and alerted us that FSD's capabilities were limited. One point of view is that there was nobody there, so my car was clear, and FSD could assume that everything is okay and suppress the warning; but what if the driver of another vehicle at night failed to have their lights on? And how will FSD ever know whether the camera is obstructed or simply in pitch dark? Take an average of the low light across multiple cameras? Is that risky? In my opinion, Tesla's FSD cannot rely on video alone and needs to use LIDAR, night vision, or 3D radar to see more accurately around the vehicle in poor visibility conditions. The answer to low light, bright low sun, fog, or dust is probably the same solution.
But we were all promised FSD, so will Musk upgrade the hardware in our cars at Tesla's expense, or expect all of us to pay for the upgrade to get what we were all promised? INFORMATION REDACTED PURSUANT TO THE FREEDOM
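[Editor's note: the complainant above asks whether averaging low light across multiple cameras could distinguish an unlit road from a truly obstructed lens. As a purely illustrative, hypothetical sketch of that idea (not Tesla's actual method; the function name, thresholds, and inputs are all invented), one could compare each camera's mean luminance against the cross-camera average:]

```python
import statistics

def classify_low_light(frames, dark_threshold=0.05, outlier_ratio=0.5):
    """Hypothetical sketch of the complainant's idea: tell ambient darkness
    apart from a single obstructed camera by comparing mean luminance
    across cameras.

    frames: dict mapping camera name -> mean pixel luminance in [0, 1].
    Returns a dict of camera name -> "ok", "dark_scene", or "obstructed".
    """
    overall = statistics.mean(frames.values())
    status = {}
    for cam, lum in frames.items():
        if lum >= dark_threshold:
            status[cam] = "ok"
        elif overall < dark_threshold:
            # Every camera is dark at once: likely an unlit road,
            # not a physical obstruction on one lens.
            status[cam] = "dark_scene"
        elif lum < overall * outlier_ratio:
            # One camera is far darker than the cross-camera average:
            # suspect a blocked or fouled lens.
            status[cam] = "obstructed"
        else:
            status[cam] = "dark_scene"
    return status
```

[The risk the complainant flags remains: an unlit vehicle beside an unlit road looks identical to "nothing there" under this scheme, which is why they argue a non-visual sensor would be needed.]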
The car intermittently gives me a "reduced power steering" error, making it impossible to steer. I had to stop in the fast lane on the freeway and reset the car just to get the error to go away; I could have been killed. Tesla refuses to issue a recall for my year and model, and I must pay out of pocket for a new steering rack. It is not fair that other models are covered by a recall but not a 2019 Model 3.
Nothing happened; my cameras just stopped working. I was told it was a defective sensor.
Safety features on my car are currently unavailable. I called the dealership and was told it was due to a defective sensor in the back windshield, and that I would have to pay to replace the entire windshield, even though it is the sensor that is defective.
With the latest update of V12.3 FSD, the Full Self-Driving cannot predict obstacles clearly, resulting in an incorrect left-turn maneuver, running over a central divider curb, and damaging the tires. Who knows what could have happened if a human had been standing there instead!
Ever since Tesla updated my vehicle to 2023.44.30.8, the Autopilot has become effectively unusable. It now nags and beeps at me whenever I am not looking straight ahead. It is ridiculous that Tesla has ruined one of the best aspects of my car due to (ostensibly?) NHTSA pressure. Please allow their previous driver-attentiveness system, using the steering wheel, to be reinstated. It is so miserable to have my car's nanny camera flip out on me every time I look anywhere other than straight ahead. At the very least, every other manufacturer should be forced to have just as annoying a system if my Tesla had to be ruined. It's wrong to punish every Tesla owner because some people misused Autopilot.
The right camera and its coax cable have gone bad. The coax cable is corroded, and there is water inside the camera. This prevents the car from using cruise control and the Autosteer function.
The lane keeping assistance will send the car jerking left or right without reason or warning. (I’m not sure if the car is reacting to a shadow on the road or for some other reason.). This has occurred a number of times with an abrupt and violent movement. The last time this occurred was at around 6:50pm on Wednesday April 5, 2023 while driving westbound approaching the northernmost Palm Beach Island bridge. The car made a violent and completely unnecessary shift out of the lane and I had to make a forceful corrective steering response that scared both myself and my passenger.
The contact owns a 2019 Tesla Model 3. The contact stated that while his partner was driving at approximately 60 MPH in inclement weather, the lane departure feature malfunctioned, causing the vehicle to hydroplane and crash into the median. No air bags were deployed. The contact's partner sustained a sore neck and back; however, medical attention was not received. A police report was filed. The vehicle was towed to the contact's residence and was not yet deemed totaled by the insurance company. The manufacturer was not made aware of the failure. The failure mileage was approximately 66,000.
Mileage: 66,000
At night, both left and right side turn signal cameras (repeater cameras) become heavily obscured with glare when using turn signals. This interferes with lane keep assist, blind spot/collision warning, autopilot functions, and the ability for a driver to safely discern the presence of a vehicle in the vehicle’s blind spot. In November 2022 Tesla stated “It is a characteristic of the product… that has been design enhanced in newer vehicle production.” The issue is caused by 3 small holes located on the repeater camera’s printed circuit board (PCB) that allow internal light from the turn signal repeater to shine through the PCB to the camera sensor side, directly obscuring most of the camera sensor. The new “design enhanced” version covers these 3 holes with tape, eliminating the problem. Tesla does not view this as a concern and does not cover this safety issue under warranty. A very large number of owners have filed complaints with Tesla. Some owners had the parts replaced under “good will,” while most have been charged for the replacement of the side repeaters, as Tesla doesn’t classify this as a defect in “materials or workmanship”. Despite this, Tesla felt compelled to “enhance the design” by eliminating this problem. It’s absurd that Tesla knows about the issue and went as far as correcting it to prevent further problems, but won’t issue a TSB for vehicles affected by this “characteristic” that poses a major safety concern.
I was using the Autopilot feature on the vehicle on Thanksgiving, going about 75 miles per hour, when suddenly Autopilot failed and the car lost control. I had to quickly regain control of the vehicle before I crashed. I took photos of the error messages and reported the incident to Tesla via the Tesla app. Typically, Autopilot fails gracefully and provides notifications. I’ve never seen it fail with no warning and loss of control of the vehicle.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle made a turn left and found itself in a lane that was coned off. Traffic was queuing in the adjacent lane. The vehicle edged forward into the closed lane and waited to join the lane to the right. Even though other drivers were giving way, the vehicle failed to move. The driver took manual control in order to complete the manoeuvre. There is Go_Pro video here - https://vimeo.com/738880343/09033b5c2a
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. Cruising at 30+ mph the vehicle was late to slow down ahead of a left turn junction. The vehicle then cut across the junction as it was moving too fast to make the turn correctly. There is Go_Pro footage of the incident here... https://vimeo.com/738674524/5db5854102
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. After successfully crossing a junction the vehicle accelerated too quickly on a narrow residential road with another vehicle approaching from the opposite direction. The vehicle was travelling at 27 mph when it was forced to brake hard. There is Go_Pro footage of the incident here... https://vimeo.com/738677992/0a3c3d4713
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle stopped short of a STOP sign at a junction then slowly moved to the middle of the junction, failing to make a right turn. After manual re-routing the vehicle brakes for no reason other than a pedestrian is close by. The vehicle then fails to make a left turn successfully, cutting the corner. There is Go_Pro video here.. https://vimeo.com/738680260/7c4eeba7e2
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle approached a busy intersection to turn left. Although drivers were giving way to the vehicle, it failed to complete the turn in a timely manner. The vehicle edged forward and then rapidly made the turn at 12 mph. The driver had to take over as the FSD made the turn too sharply and too close to a bollard. There is Go_Pro video here https://vimeo.com/738682868/d5b572a69d
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle was navigating through roadworks with temporary signage and cones in place. This seemed to interfere with the FSD with the vehicle moving left to right in an erratic manner and accelerating to 28mph whilst still in the roadworks zone. There is Go_Pro video of the incident here https://vimeo.com/736104870/581215a7e7
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. At 10mph the vehicle attempted to change lanes whilst there was another vehicle in the other lane. If the driver had not intervened there may have been a collision or the other vehicle would have to brake hard to avoid contact. There is Go_Pro video of the incident here. https://vimeo.com/736107290/2a01155c84
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. At approx 20 mph the vehicle entered a stretch of road works with temporary signage and cones. The vehicle performed an emergency stop for no reason to the frustration of other road users. The driver had to take control of the vehicle. There is Go_Pro video of the incident here. https://vimeo.com/736110202/bcedc1c516
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. At 25 mph the vehicle veered across two solid white lines and into the cycle lane. There is Go_Pro video of the incident here. https://vimeo.com/736112946/2d0d7a4ed3
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle failed to recognise a speed limit change approaching a sharp left turn. The vehicle took the turn too fast at 29 mph and veered into the oncoming lane. The quick reactions of the driver, who took control, averted a collision. There is Go_Pro video of the incident here https://vimeo.com/736115753/289c3413e9
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. On a section of road with a 15 mph limit the vehicle was travelling at 39 mph. It then attempted to take a corner at 25 mph veering into the oncoming lane. Vimeo Link - https://vimeo.com/735797650/15baa75187
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. On a section of road with tight, blind turns, the vehicle accelerated from 23mph to 27 mph into the approach to a turn. In making the turn the vehicle crossed the median into the oncoming lane. There is a Go-Pro video of the incident here https://vimeo.com/735805944/eab3a31083
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. On a wide residential road, with a speed limit of 30 mph, the vehicle accelerated to 44 mph. The display on the vehicle incorrectly showed the limit as 65 mph before the driver had to manually adjust the maximum speed. There is Go-Pro Video of the incident here https://vimeo.com/735815741/e651f8ab92
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. At 45 mph and without indicating etc, the vehicle swapped lanes from left to right on a dual lane highway. There is Go_Pro video of the incident here. https://vimeo.com/735819637/601febb81d
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. While FSD Beta was engaged, the road transitioned from being single lane and one-way only to two lanes and with traffic travelling in both directions. As we travelled over an intersection, the Tesla ignored this change in its surroundings and proceeded to travel in the wrong lane, towards oncoming traffic. Its failure to read road markings could have seriously endangered the driver and other road users. A link to the clip of the incident can be found below: https://vimeo.com/733967979/01bc3c3795
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. While FSD was engaged, the driver approached a road closure sign due to road works occurring. Ignoring these well-marked signs, which clearly said 'Road Closed', the Tesla proceeded past these signs until the driver had to perform a stop to prevent the vehicle from colliding with the road works. A link to the clip of the incident can be found below: https://vimeo.com/733970645/9ef2ad21d2
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. While FSD Beta was engaged, the driver approached an intersection, the FSD stopped correctly to yield to oncoming traffic on a one-way street. The vehicle was travelling across the intersection according to its navigation. There was also a cyclist ahead of the Tesla at the intersection. As the cyclist entered the intersection, the Tesla followed and attempted to travel around the cyclist to complete the manoeuvre and continue travelling straight ahead. However, the Tesla then deviated from its course and instead travelled left at the intersection, travelling the wrong way on one-way street, causing the driver to perform an emergency stop and reverse out of the dangerous situation. The Tesla appeared to panic at the presence of the cyclist when attempting to travel around it, and then headed directly into oncoming traffic while travelling the wrong way up the one-way street. A link to the clip of the incident can be found below: https://vimeo.com/733972674/ce244b3a69
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. While travelling at 32 mph the vehicle moves to the right in preparation for an upcoming right turn. However, there is a parked truck in the lane the Tesla intended to enter, and the vehicle passes far too close to it before over-correcting and veering back into the original lane before once again moving back to the right-hand lane. A link to the clip of the incident can be found below: https://vimeo.com/733567653/6d929fc7d6
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. Whilst indicating to turn left the vehicle carried on straight. The driver had to disengage from FSD mode and complete the turning. The unexpected late deviation from its plotted course gave the driver very little time to respond, and presented a potential hazard for other road users. A link to the clip of the incident can be found below: https://vimeo.com/733569655/e65e5888e5
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle, when FSD was engaged, made a left turn at an intersection when in the right turn lane. Its lane position was entirely incorrect, and is a consistent failure within this software. A link to the clip of the incident can be found below: https://vimeo.com/733571685/fb22a84a02
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. Whilst preparing to make a right turn the vehicle moved lanes too quickly, at just under 20 mph, and came very close to parked vehicles before correcting and completing the manoeuvre. A link to the clip of the incident can be found below: https://vimeo.com/733575296/1285b49f58
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. While cruising at 30 mph the vehicle crossed a solid white line veering left into a bike lane. The vehicle then corrected itself, though this is an incredibly dangerous defect which could seriously endanger cyclists and other road users. This is a consistent error which is committed by Tesla Full Self-Driving Beta version 10.12.2 software. A link to the clip of the incident can be found below: https://vimeo.com/733576651/b62fd06772
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. Cruising at 49mph on a road that was opening up from one lane, into two, the vehicle veered to the right where a cyclist was present in the cycle lane. The driver had to take action to ensure the vehicle did not come too close to the cyclist. A link to the clip of the incident can be found below: https://vimeo.com/732397659/c27b5b751d
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. Entering an area where there were roadworks, at more than 30 mph, in FSD mode the vehicle veered to the right into a lane that was closed off. The driver had to take back control of the vehicle and put it back into the correct position on the road to avoid colliding with the concrete barriers in the road works. A link to a clip of the incident can be found below: https://vimeo.com/732399072/8ae3aeaaee
My friend was driving the vehicle (Tesla Model 3, 2019) while Full Self-Driving Beta 10.12.2 mode was engaged. The FSD Tesla makes a left turn at an intersection on a protected left turn, but slows down considerably just before the traffic lights - which did not apply to the Tesla but rather the traffic from the opposite side of the intersection - after making the turn, stopping in the middle of the lane. The driver disengages and accelerates to complete the manoeuvre, to avoid a rear collision. The Tesla may have mistaken the red traffic light showing for other traffic as one it should pay attention to, almost causing a collision and displaying the inadequacies of its vision and cameras. A link to the clip of the incident can be found below: https://vimeo.com/732401882/8837a257a0
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle completed a right turn at an intersection then veers to the left, encroaching into the next lane. The FSD display shows no vehicle present at the driver's side left, but the driver had to take back control of the vehicle in order to avoid a collision as there was a car in the lane the Tesla was attempting to enter. A link to the clip of the incident can be found below: https://vimeo.com/732403629/851c8377fc
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle was travelling at speeds between 12 to 18 mph. As two lanes merged into one the Tesla came dangerously close to the vehicle in front and the driver had to take action in order to avoid a collision. A link to a clip of the incident can be found below: https://vimeo.com/731683998/17e89131be
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. Cruising at 30 mph, on a very straight road, the vehicle twice veered across solid white road markings and into the cycle lane before correcting. This is a repeated problem with FSD Beta 10.12.2 and presents a very real danger to cyclists. A link to a clip of the incident can be found below: https://vimeo.com/731686163/0ec7f7f428
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle was cruising at 35 mph whilst indicating right, as per the satellite navigation instruction. In FSD mode the vehicle ignored the junction and continued straight ahead, deviating from its plotted course on the navigation. A link to the clip of the incident can be found below: https://vimeo.com/731688075/d3f31fd060
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle was travelling at approximately 35 mph, too fast for a narrow and winding road. The vehicle crossed the central line into the oncoming lane and a moment later veered towards the right, nearly striking the curb and causing a collision. The driver had to take back control of the vehicle. A link to the clip of the incident can be found below: https://vimeo.com/731689228/485219aa71
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. Travelling on a narrow and winding road, the driver had set the max speed to 20 mph. After re-engaging FSD mode, the Tesla “reset” the max speed to 35 mph, clearly too fast for the road conditions; the vehicle had difficulty negotiating the road safely and almost veered onto the wrong side of the road. A link to a clip of the incident can be found below: https://vimeo.com/731691467/e336068d60
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. On approach to a right turn at an intersection, which was necessary to follow the route on the navigation, the vehicle positioned itself poorly several yards from the car in front and did not take the space in the right hand lane to make the turn. The driver had to take control of the vehicle in order to make progress and not inconvenience other road users. The vehicle was also permitted to turn right on red, and failed to do so. A link to a clip of the incident can be found below: https://vimeo.com/731694412/aeedeeccfc
1. Phantom braking while on Autopilot on a freeway (I-15, I-215, etc.). 2. Autopilot disengaging while navigating a curve from one freeway to another (San Diego freeway).
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle approached an intersection, on a bend in the road, at more than 30 mph and braked to a near halt whilst in the middle of the intersection. The speed limit is signed at 15 mph. The vehicle then proceeded in FSD mode and took another corner 5 to 10 mph in excess of the 20 mph speed limit. A link to a clip of the incident can be found below: https://vimeo.com/731419136/51bfb24df3
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. Whilst cruising at 30 mph, the vehicle left its lane, crossing two solid white lines and veering into the adjacent cycle lane, before correcting and returning to the original lane whilst maintaining speed. A link to a clip of the incident can be found below: https://vimeo.com/731422280/c39fc14a23
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle was travelling at 29 mph, slowing to 15 mph, as it approached an intersection on a dual lane highway. As the vehicle slowed it veered to the right, crossing into the adjacent lane, even though there was another vehicle present. The driver had to intervene and take over control of the vehicle from FSD mode to prevent the risk of a collision. A link to the clip of the incident can be found below: https://vimeo.com/731425264/4b09cb1e7a
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle was stationary at a “no right turn on red light” intersection. In spite of the sign, in FSD, the vehicle took the turn in “autopilot creeping forward” mode, making an illegal turn at a busy intersection. Please see a link to a clip of the incident below: https://vimeo.com/731427985/e4d022bdea
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. Whilst travelling at 30 mph, the vehicle left its lane and swerved across a solid white line into the clearly marked bicycle lane. The speed of the vehicle remained at 30 mph. The vehicle then returned to the correct lane, though the incident could have been incredibly dangerous had there been a cyclist in that lane. A link to a clip of the incident can be found below: https://vimeo.com/730261900/e4357f500c
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle approached a stop sign ahead of a single-lane bridge at 14 mph. The vehicle braked and left the road, coming to a halt on a private driveway. In FSD mode the vehicle then attempted to approach a metal gate. The driver had to disengage FSD mode to safely manoeuvre the vehicle. A link to the clip of the incident can be found below: https://vimeo.com/730266676/f170446e7d
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The Tesla failed to slow down when approaching a curve in the road, taking the corner far too quickly despite a sign by the road warning drivers to slow down to 15 mph. The Tesla also failed to recognise the change in speed limit ahead of an approaching blind turn in the road; the car continued at a rapid speed (30 mph) and even turned left when the road curved toward the right, leading to the driver taking control to steer the car back on path. During the turn, the Tesla was engaged in FSD Beta 10.12.2 and crossed the dividing line into the opposite lane. Had another vehicle been approaching, this incident would have resulted in a collision, as the Tesla strayed into the path of oncoming traffic. A link to a clip of the incident can be found below: https://vimeo.com/730269602/ae8bd9f3b1
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The vehicle was negotiating a series of bends and sharp corners, travelling at speeds between 11 mph and 34 mph. The vehicle failed to respond to the advisory speed signs telling drivers to slow down for sharp corners, took corners too fast, and crossed the median line into the oncoming lane. The driver had to take action to adjust the lane position of the vehicle to avoid a collision. A link to a clip of the incident can be found below: https://vimeo.com/730270943/1bdcab7a7a
My friend was driving the car (a 2019 Tesla Model 3) while it was engaged in Full Self-Driving Beta 10.12.2 mode. The Tesla attempted to cut across the freeway while merging from the on-ramp. The auto-steer function attempted to drive perpendicular to the on-ramp and cross the lanes on the freeway, forcing the driver to disengage Full Self-Driving mode to avoid a collision with high-speed traffic approaching from the rear. The extremely dangerous event was recorded by using a GoPro camera, and a link to a video of the incident can be found below: https://vimeo.com/729927123/a5cd8eeb15 The car was originally positioned in the wrong lane of the on-ramp, and then suddenly lurched to the left while making the right-hand turn to merge with the freeway. With traffic approaching at such a high speed, the sudden movement into the middle of the freeway was highly dangerous and the FSD Beta 10.12.2 did not give the driver much time to react.
My friend was driving my car (a Tesla Model 3) at the time of this incident, with the Tesla’s Full Self-Driving Beta system (version 10.12.2) active. The Tesla failed to slow down when approaching a series of sharp bends in the road, taking the first corner far too quickly. The posted speed limit according to the sign was 15 mph; however, the Tesla was travelling at approximately 30 mph, double the posted speed limit. The Tesla therefore drove into the turn too wide and tried to steer into the oncoming traffic lane, forcing the driver to intervene quickly to correct the manoeuvre and reposition the car back into its lane. During the entire incident, the car was travelling, autonomously, at 25 mph. Please see below a link to a short video showing this incident: https://vimeo.com/720341558/004264b711
My friend was driving my Tesla Model 3 while Full Self-Driving Beta 10.12.2 was engaged. The Tesla Full Self-Driving Beta software caused the car to stop at a right turn onto a highway on-ramp and hesitate whilst it waited for traffic to pass, causing a queue of cars behind it and frustrated drivers honking their horns. The Tesla could have taken the turn into the right-hand lane and merged ahead. Instead, it pulled out into the turn lane while traffic was passing, and other vehicles had to steer around the Tesla to avoid colliding with it. Please find a link to the clip of the incident below: https://vimeo.com/729932395/315ccb9ffb
I was driving on Tesla “autopilot” and, instead of noticing the exit from the freeway, the car thought it was still on the freeway even though it had to take the next exit. It went into the exit at full speed; it did not brake or slow down, and just before the sharp exit turn, at 65 mph, it disengaged autopilot with the message “system error”. Luckily I was aware and was able to avoid a major crash. If I had been a second late, the car would have gone through the road barrier. I’m sure if I had crashed, Tesla would find reasons to blame me for the crash even though it was its software. Autopilot is a junk piece of software with little regard for customer safety. I am not sure how something like this is even allowed to be sold and used on public roads.
Even after the firmware update that was supposed to make FSD Beta come to a full stop, the car still does not come to a full stop. The car slows to around 0.5 mph at a stop line with a stop sign and continues on if the way is clear; it only comes to a full stop if it believes it needs to yield to other vehicles. Videos of other vehicles demonstrating the capabilities of FSD Beta with the latest firmware version show the same behavior.
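The rolling-stop behavior described above is what you would expect if "stopped" were tested against a small speed tolerance rather than a true zero-speed sample. The sketch below is purely illustrative (the threshold value, function names, and logic are hypothetical, not Tesla's actual code); it shows how a thresholded check can pass while the vehicle never actually halts:

```python
# Hypothetical illustration of a rolling stop: if the "stopped" test uses a
# speed tolerance instead of requiring a true 0 mph sample, a creeping
# vehicle satisfies the check without ever coming to a legal full stop.

STOP_THRESHOLD_MPH = 0.5  # assumed tolerance; the real value is unknown

def satisfies_stop(speed_mph: float, threshold: float = STOP_THRESHOLD_MPH) -> bool:
    """Return True when speed is at or below the 'stopped' tolerance."""
    return speed_mph <= threshold

def came_to_full_stop(speed_samples_mph: list[float]) -> bool:
    """A legal full stop requires at least one sample at exactly 0 mph."""
    return any(s == 0.0 for s in speed_samples_mph)

# Speed trace for a car that slows to a creep at a stop sign and continues:
creep_trace = [14.0, 8.0, 3.0, 0.5, 0.5, 2.0, 10.0]

# The thresholded check passes during the creep...
assert any(satisfies_stop(s) for s in creep_trace)
# ...but the vehicle never actually reached 0 mph.
assert not came_to_full_stop(creep_trace)
```

Under this (assumed) model, no firmware tweak to the tolerance alone fixes the complaint; the check would have to require a zero-speed sample held for some dwell time.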
While the vehicle’s “autopilot” is engaged, with the screen showing the blue lane border lines, the lane keeping assistance will suddenly and sharply veer out of a straight, clearly painted lane without any warning, forcing a dangerous reaction from the driver to correct (and possibly overcorrect) the abrupt and unexpected departure from the lane.
The contact owns a 2019 Tesla Model 3. The contact stated that while operating the vehicle, the Lane Departure, steering assist, and braking assist warning lights illuminated. During the failure, the steering wheel became difficult to turn in either direction. Additionally, while releasing the brake pedal, the vehicle continued to accelerate instead of decelerating as normal. The vehicle was taken to the local service center, which performed unknown repairs; however, after the vehicle was retrieved, the steering failure recurred. The vehicle was towed back to the service center. No further information was available. The manufacturer was not notified of the failure. The failure mileage was 30,000.
Mileage: 30,000
While driving with Tesla Autopilot enabled, the car experiences phantom braking. This has happened multiple times already. The car will brake unexpectedly for no good reason even if there is no obstruction on the road. The problem typically happens on the freeway at high speeds and presents a huge safety risk to the car occupants and the car behind, since it increases the risk of a rear-end collision. The problem is pretty common; a simple internet search for Tesla Autopilot phantom braking will yield many results. The problem started getting worse after Full Self-Driving Beta was enabled around January 2022. When the problem happens, the car beeps loudly and brakes really hard.
Automatically activated Blind Spot camera system view is filled over 50% with glare from blinker while driving at night. This renders the driver safety feature unusable during nighttime driving and creates additional driver distraction with the bright light flashing on the vehicle display every time the turn signal is activated at all driving speeds.
The screen on my Tesla fails routinely (every day) while driving and then reboots after about 3 minutes. This happens abruptly with no warning. Tesla has told me (verbally) that it was reproduced at their service location in Palo Alto. During this time, I have no idea of my speed or any other vehicle status, such as lane keeping, collision warning, navigation, heating, etc. I have tried to remedy this issue with Tesla service repeatedly over two months, and the interactions have been unsatisfactory. Tesla does not seem to appreciate the potential seriousness of this issue and the risk to ourselves and others. Tesla Service at Sunnyvale, CA, encouraged me to continue driving, even though I pointed out the risk of "driving blind". My family is very wary of this vehicle now.
After getting a shingles vaccination and eating lunch, a vaccine side-effect suddenly caused me to become extremely sleepy while driving. But AutoPilot drove me safely to my destination parking lot where I took a two hour nap. AutoPilot saved my life and car from an accident.
I was driving on I-80 eastbound between Dixon, CA and Davis, CA with the Tesla Autopilot engaged (adaptive cruise control and lane assistance, NOT full navigation), hands on the wheel, in the rightmost lane. The Autopilot suddenly started drifting to the left, and the car partially entered the lane to my left before I disengaged Autopilot and resumed control of the vehicle. At the same time, the center console display froze, "glitched", and became unresponsive. The car steering, acceleration, and braking worked as normal, but the turn signals did not work and I couldn't see my speed (the display had frozen). After a few seconds, the display turned off (still unresponsive). I pulled over to the shoulder and put the car into park, and the display automatically rebooted (the screen turned back on and displayed the Tesla logo). After a minute, the car had completed rebooting and appeared to operate as normal again.
I was going 45 mph with Autopilot engaged, and in fact the Autopilot actually caused the accident. The car suddenly turned to the right and hit the curb, then spun 180 degrees in midair, landed on the sidewalk, and then rolled back, fishtailing into a wooden pole.
B-pillar cameras tend to get condensation in them, causing them to work intermittently. This has been reported to Tesla service in Rockville, MD, to no effect. According to Tesla, they do not have a solution for it, yet the cameras continue to intermittently fail to provide coverage, especially in the weather conditions when they are most needed. Usually the car is in motion when this issue arises, including while driving at highway speeds.
Mileage: 25,000