Today, federal safety investigators opened yet another probe into Tesla's electric vehicles, the 14th by the National Highway Traffic Safety Administration and one of several currently open. This time, the automaker's highly controversial “full self-driving” feature is in the crosshairs. NHTSA says it has received four reports of Teslas operating under FSD that crashed after the camera system encountered fog, sun glare or airborne dust.
Of the four crashes that prompted this investigation, one killed a pedestrian, who was struck by a Model Y in Rimrock, Arizona, in November 2023.
NHTSA has a standing general order requiring automakers to notify it when a car crashes while operating under partial or full automation. Fully automated or autonomous means cars that can be described as “actually self-driving,” like the Waymos and Zooxes that clutter the streets of San Francisco. These four-wheeled test beds are adorned with dozens of external sensors and drive around, usually without passengers, gathering data for later training, with no human supervising behind the wheel. (This is also known as SAE Level 4 automation.)
But the systems that come in cars you or I might buy are far less advanced. These systems, also called “level 2+” (which include Tesla Autopilot, Tesla FSD, GM's Super Cruise, BMW Highway Assistant and Ford BlueCruise, among others), are partially automated, not autonomous. They will steer, accelerate and brake for the driver, and they can even change lanes without explicit instruction, but the human behind the wheel is always meant to remain in charge, even when the car is driving hands-free.