This week, a US Department of Transportation report detailed crashes involving advanced driver assistance systems over the past year. Tesla’s advanced features, including Autopilot and Full Self-Driving, accounted for 70 percent of the nearly 400 incidents, far more than previously known. But the report may raise more questions about this safety technology than it answers, researchers say, because of data blind spots.
The report examined systems that promise to take some of the tedium and danger out of driving by automatically changing lanes, staying within lane lines, braking for collisions, slowing down for sharp bends in the road, and, in some cases, operating on highways without driver intervention. The systems include Autopilot, Ford’s BlueCruise, General Motors’ Super Cruise, and Nissan’s ProPilot Assist. While the report shows these systems aren’t perfect, there’s still a lot to learn about how this new breed of safety features actually works on the road.
That’s largely because automakers have wildly different ways of submitting their crash data to the federal government. Some, such as Tesla, BMW, and GM, can wirelessly extract detailed data from their cars after a crash has occurred, which allows them to quickly comply with the government’s 24-hour reporting obligation. But others, such as Toyota and Honda, do not have these capabilities. Chris Martin, an American Honda spokesperson, said in a statement that the automaker’s reports to the DOT are based on “unverified customer statements” about whether their advanced driver assistance systems were on when the crash occurred. The automaker can later extract “black box” data from its vehicles, but only with customer approval or at the request of law enforcement, and only with specialized equipment.
Of the 426 crash reports detailed in the government report’s data, only 60 percent came through automotive telematics systems. The other 40 percent came from customer reports and claims, sometimes trickling in through diffuse dealer networks, as well as media reports and law enforcement. As a result, the report does not allow anyone to make “apples-to-apples” comparisons between safety features, said Bryan Reimer, who studies automation and vehicle safety at MIT’s AgeLab.
Even the data that the government does collect lacks full context. For example, regulators do not know how often a car with an advanced assistance feature crashes per mile driven. The National Highway Traffic Safety Administration, which released the report, warned that some incidents could appear more than once in the dataset. And automakers with high market share and good reporting systems, especially Tesla, are likely to be overrepresented in crash reports simply because they have more cars on the road.
It’s important that the NHTSA report not discourage automakers from providing more comprehensive data, said Jennifer Homendy, chair of the federal watchdog National Transportation Safety Board. “The last thing we want is to penalize manufacturers that collect robust safety data,” she said in a statement. “What we do want is data that tells us what safety improvements need to be made.”
Without that transparency, it can be difficult for drivers to understand, compare, and even use their cars’ features, and for regulators to keep track of who is doing what. “As we gather more data, NHTSA will be able to better identify any emerging risks or trends and learn more about how these technologies are performing in the real world,” said Steven Cliff, the agency’s administrator, in a statement.