Trooper rear-ended by self-driving Tesla:
Troopers were stopped on the side of the highway Saturday morning responding to a disabled vehicle when their car was hit by a Tesla driving on Autopilot.
www.wtnh.com
Guy said he was checking on his dog in the back seat while his car was on Autopilot. This precisely illustrates what I think the current danger of partial FSD is: the "lull of complacency". We are creatures of habit, and habit leads us to trust things more than we should. That's the case in the majority of Autopilot accidents...people are texting or eating or goofing off & get into an otherwise avoidable accident because they weren't paying attention, because they trusted the system too much, despite the warnings in the software. Those warnings skate by legally, but in practice they don't do enough to overcome human behavior. Tesla is in a tricky position because:
1. They need that Autopilot feature to help sell cars
2. They need that Autopilot feature running in the field to train their neural networks & make the software good enough for hands-free FSD on all roads
3. They could activate the internal camera on the Model 3 for gaze tracking (see the sketch after this list), but that would probably cost them sales because people don't want a nanny
4. The Lull of Complacency is real & kills people
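To be concrete about what I mean in point 3, here's a rough sketch of the kind of driver-monitoring loop a cabin camera could feed: chime when gaze leaves the road, hand control back if the chime gets ignored. Everything in it is made up for illustration (the gaze samples, the thresholds, the disengage behavior); it's not Tesla's code, just the shape of the "nanny" people would be objecting to:

```python
# Toy driver-monitoring loop: warn when gaze leaves the road, force a
# takeover if the warning is ignored. All names & thresholds here are
# hypothetical -- this is not Tesla's actual software.

WARN_AFTER_S = 2.0        # assumed: chime after 2 s of eyes off road
DISENGAGE_AFTER_S = 6.0   # assumed: force a takeover after 6 s

def monitor(gaze_samples, hz=10):
    """gaze_samples: booleans from a gaze tracker, True = eyes on road."""
    eyes_off_since = None
    warned = False
    for i, on_road in enumerate(gaze_samples):
        now = i / hz
        if on_road:
            eyes_off_since, warned = None, False   # attentive: reset timer
            continue
        if eyes_off_since is None:
            eyes_off_since = now                   # lapse just started
        lapse = now - eyes_off_since
        if lapse >= DISENGAGE_AFTER_S:
            print(f"{now:4.1f}s: lapse too long, disengaging Autopilot")
            return
        if lapse >= WARN_AFTER_S and not warned:
            print(f"{now:4.1f}s: chime -- eyes on the road")
            warned = True

# Simulated cabin-camera feed at 10 Hz: 3 s attentive, then 7 s looking
# at the dog in the back seat.
monitor([True] * 30 + [False] * 70)
```

The hard part isn't the timer logic, obviously...it's the gaze estimation itself & tuning those thresholds so the chime isn't constantly crying wolf.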
tbh, this is one of the reasons I haven't purchased a Tesla yet...I know that I would be alert & vigilant, because I'm a computer guy & I'm aware of the system's limitations, but (1) I would probably fall into that trap eventually, because I'm human & we grow complacent over time with the systems we operate, and (2) I don't want to expose my family to that trap & risk injury or death from a partially-working FSD system that they would end up relying on simply because it's available at all times.