
Tesla Full Self Driving requires human intervention every 21 kilometers

    An independent automotive testing company evaluated Tesla FSD and came up with some disturbing results.

    PonyWang/Getty Images

    Tesla’s controversial “Full Self Driving” is now capable of some pretty advanced driving. But that may be leading to undeserved complacency, according to independent tests. The partially automated driving system exhibited dangerous behavior that required human intervention more than 75 times during more than 1,000 miles (1,600 km) of driving in Southern California, with an average of one intervention every 13 miles (21 km).

    AMCI Testing evaluated FSD builds 12.5.1 and then 12.5.3 in four different environments: city streets, rural two-lane roads, mountain roads, and interstate highways. And as the accompanying videos demonstrate, FSD sometimes showed quite sophisticated driving behavior, such as merging into a gap between parked cars to let an oncoming vehicle pass, or pulling to the left to give space to pedestrians waiting in a crosswalk for a light to turn green. AMCI also praised FSD for how it handled blind curves on rural roads.

    “There is no denying that FSD 12.5.1 is impressive, given the wide variety of human-like responses it is able to achieve, especially for a camera-based system,” said Guy Mangiamele, director of AMCI Testing.

    “But its seeming infallibility in the first five minutes of FSD use creates a sense of awe that inevitably leads to dangerous complacency. When drivers are operating with FSD enabled, keeping their hands in their laps or away from the steering wheel is incredibly dangerous. As you will see in the videos, the most critical moments of FSD miscalculation are split-second events that even professional drivers, driving with a test mentality, have to focus on to catch,” Mangiamele said.

    The dangerous behavior AMCI encountered included running a red light and crossing into the oncoming lane on a curving road while another car was approaching the Tesla. What made matters worse was that FSD’s behavior proved unpredictable, perhaps a consequence of Tesla’s reliance on machine learning, a probabilistic black box.

    “Whether it's a lack of processing power, a buffering issue when the car gets ‘behind’ in calculations, or a minor detail of the environmental assessment, it's impossible to know. These errors are the most insidious. But there are also persistent errors from simple programming shortcomings, such as not initiating lane changes toward a freeway exit until a mere tenth of a mile before the exit itself, which hampers the system and casts doubt on the overall quality of the basic programming,” Mangiamele said.