The NHTSA investigation into Tesla’s autonomous and semi-autonomous driving tech has been upgraded to an “engineering analysis” following reports that six additional crashes involving Teslas and first-responder emergency vehicles have been added to its scope, yet another sign of increased scrutiny of the electric car brand.
The probe, which was initiated last year, initially covered 11 crashes with stationary first-responder vehicles since 2018. Those 11 crashes resulted in 17 injuries and one death, but as of Friday, June 10th, the scope seems to have expanded, with NHTSA reportedly identifying and adding six more incidents to the official investigation. It’s believed that those crashes led to 15 injuries (including one fatality), and that they were added in part because they represented cases in which Autopilot “gave up” control of the car just before impact. It’s worth noting, though, that Automatic Emergency Braking seems to have intervened in at least half of those crashes.
That’s not just random speculation, by the way. That’s from the NHTSA itself:
“The agency’s analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that consequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.”
The agency also revealed some details about the specific crashes, showing that — in most cases — it had reason to believe the Tesla drivers would have been able to see the stopped emergency vehicles an average of 8 seconds prior to impact, but that none of them took evasive action on their own. That’s baffling to me, but you can read the report and let me know if you interpret it differently:
“All subject crashes occurred on controlled-access highways. Where the incident video was available, the approach to the first responder scene would have been visible to the driver an average of 8 seconds leading up to impact. Additional forensic data available for eleven of the collisions indicated that no drivers took evasive action between 2–5 seconds prior to impact, and the vehicle reported all had their hands on the steering wheel leading up to the impact. However, most drivers appeared to comply with the subject vehicle driver engagement system as evidenced by the hands-on wheel detection and nine of eleven vehicles exhibiting no driver engagement visual or chime alerts until the last minute preceding the collision (four of these exhibited no visual or chime alerts at all during the final Autopilot use cycle).”
The way I read this — and I know I have to tread lightly here — is that many of these cases involved inattentive drivers who were abusing the Tesla Autopilot feature. Which, frankly, isn’t a new thing. Below is one idiot who kept hanging out in the back seat of his Tesla while on Autopilot, despite having previously been arrested for hanging out in the back seat of his Tesla on Autopilot. (!) See for yourself.
Backseat Tesla Driver Unapologetic After Arrest
This is abuse, sure. This isn’t what these systems are intended for. But the argument here is somewhat nuanced. It’s not, as some Tesla/Elon Musk defenders might say, about manipulating statistics or calling out supposed edge cases when a Tesla drives itself into an airplane. It’s about whether or not Tesla has done enough to prevent the abuse of those systems by cognitively impaired drivers (read: high/drunk) or even dubiously competent in-duh-viduals.
The NHTSA Probe Doesn’t End There
Tech magazine Engadget is also reporting that, while the NHTSA’s probe into Tesla’s self-driving tech is focused on emergency vehicle crashes, it isn’t limited to them. The agency is looking into 191 additional crashes not involving first responders. In at least 53 of those incidents, the agency found the Tesla driver to be “insufficiently responsive,” as evidenced by their failure to intervene when needed.
To me, that seems to suggest that even drivers who the systems believe to be complying with directions to keep their hands on the wheel at all times are not necessarily paying enough attention to the world around them to keep themselves — and others — safe. And, of course, you have dozens of companies selling “Autopilot hacks” to defeat the “hands on wheel” warnings, and even people like this oxygen-thief who are blasting down the road at 86 mph with a piece of citrus manning the helm.
Hacking Tesla Autopilot | ORANGE TRICK
In addition to the probe’s upgrade to engineering analysis status, the Wall Street Journal is reporting that the NHTSA investigation has been expanded under the guidance of Transportation Secretary Pete Buttigieg, and now covers 830,000 units. Or, to put it another way, nearly every Model S, 3, X, and Y that the company has sold in the United States since 2014. As WaPo explains, an engineering analysis is the final stage of an investigation, and in most cases NHTSA decides within a year whether there will be a recall or whether the probe will be closed without further action.
Either way the NHTSA’s investigation goes, however, I’ll close this out by advising you all to stay safe out there. You never know when you’re about to be rear-ended by some literal fruit driving a Tesla!
NOTE: A 2016 NHTSA investigation into Tesla Autopilot concluded that crash rates were reduced by 40% in vehicles equipped with the technology. Tesla itself cited the agency’s findings in its own marketing until the study was later retracted as having been based on fundamentally flawed data.
Related story: New Research Finds Tesla Drivers 50% Less Likely To Crash