Some interesting bits from the article:
1. The company said that the driver had his hands off the steering wheel and was not responding to warnings to retake control.
2. Tesla said the crash was so severe because the attenuator (the impact-absorbing crash barrier) had been crushed in a prior accident and never replaced. “We have never seen this level of damage to a Model X in any other crash,” the company said.
3. Tesla claimed that Autopilot successfully completed “over 200 trips per day on this exact stretch of road.”
4. Autopilot was engaged, with the adaptive cruise control follow-distance set to the minimum.
https://electrek.co/2018/03/30/tesla-autopilot-fatal-crash-data/
Based on this information, I think I would summarize the accident as follows:
1. The Tesla was on Autopilot and should not have crashed into the barrier. If Teslas really make over 200 trips per day on this exact road, why did it crash straight into a barrier? So shame on Tesla here.
2. The barrier had not been reset, creating a safety issue. It's likely that the driver would not have died if the barrier had been functioning. So shame on the state for not resetting it. Granted, it takes time to schedule a road crew, and we don't know how long the barrier had been in its crushed state, whether a repair was already scheduled, etc. In theory, once an accident is cleared & documented, a crew should be out shortly afterward to reset it.
3. The driver purposely set the safe-follow distance to the minimum. Per this 2015 article (not sure if anything has changed since then), there are 7 follow-distance levels available, and the severity of the accident might have been reduced had he used a larger one. On the flip side, per the video in the article below, leaving a larger follow distance does "invite lane changes", and having lived in California, where a lot of people drive fast & close, I don't think it's unreasonable to set the follow distance to the lowest setting. So you might think the driver is partly to blame for that setting, but (1) it's an option built into the car (so it wasn't being used out of spec), and (2) that's just the driving style in certain parts of the country.
https://www.teslarati.com/set-tesla-tacc-following-distance/
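For a rough sense of what a follow-distance setting means physically: Tesla doesn't publish the per-setting distances, but adaptive cruise systems are commonly described in terms of time gaps, so as a back-of-the-envelope illustration (the time-gap mapping is my assumption, not something from the article):

```python
# Rough illustration only. Assumption: follow-distance settings correspond
# roughly to time gaps (common for adaptive cruise systems); Tesla's actual
# per-setting distances aren't published in the linked articles.
MPH_TO_FPS = 5280 / 3600  # feet per second, per 1 mph

def gap_feet(speed_mph, gap_seconds):
    """Distance covered in gap_seconds at speed_mph, in feet."""
    return speed_mph * MPH_TO_FPS * gap_seconds

for gap in (1, 2, 3):
    print(f"{gap}s gap at 70 mph = {gap_feet(70, gap):.0f} ft")
```

The point is just that at highway speed, each extra second of follow distance buys roughly another hundred feet of reaction room.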
4. The driver did not have his hands on the wheel for at least 6 seconds before the accident. Per the terms of use, you're supposed to keep your hands on the wheel. But Tesla also advertises the car as self-driving, with what I currently consider the misnamed Autopilot feature.
5. The driver did not respond to multiple visual warnings and one audible warning. Based on the driving situation, the driver would have had roughly 5 seconds and nearly 500 feet of clear view of the divider.
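Those two figures are at least internally consistent; a quick sanity check shows that covering ~500 feet in ~5 seconds implies a closing speed of about 68 mph, i.e. normal highway speed:

```python
# Sanity check on the article's figures: ~500 ft of clear view in ~5 s
# of warning time implies a closing speed of roughly 68 mph.
distance_ft = 500
time_s = 5
fps = distance_ft / time_s    # 100 ft/s
mph = fps * 3600 / 5280       # convert ft/s to mph
print(f"implied speed: {mph:.1f} mph")  # ~68.2 mph
```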
So I think the first question here is: why was he not paying attention? Did he trust Autopilot too much & get lax? Or did he have a health issue like a heart attack? Regardless, technically the core fault is with the driver - he should have been paying attention, with his hands on the wheel, and taken control before the accident happened. However, I think people come to rely on the Autopilot system and settle into the flow of not having to drive themselves. For example, if he was playing on his phone or something, he might not have seen the visual warnings; he might have heard the single audible warning, looked up, and had only five seconds to mentally switch from whatever he was doing, and from letting the car drive itself, to deciding to grab the wheel & take control. And I would assume this was his standard daily commute, so he probably had no reason to suspect the car would self-drive into the barrier.
Which leads into the second question: if Teslas make hundreds of commutes a day on this exact stretch of road, then why exactly did this one crash? I'd imagine the barrier had not been reset in the past either, and Teslas in Autopilot mode had driven by it without incident (since we haven't heard of any other fatal Tesla crashes in this particular area). With that assumption: did the car detect that the driver wasn't paying attention and try to pull over, mistaking the open median in front of the barrier for a safe place to stop? I'm very curious what logic the car went through here. It sounds like it had plenty of data on this stretch of road, so the only explanation I can come up with is that it detected the driver's inattention & wanted to pull over. I've seen some videos of how that works, and it seems to vary based on how fast you're going, where you are, etc. If it only gave one audible alert, that seems too little to really warn the driver that it was going to stop. So something is funky with why Autopilot crashed in this situation.