Something doesn't seem right about that video. From what I could see, the woman didn't jump out of nowhere; she was walking her bike across the road at a normal pace. It only looks like she appeared suddenly because of the crappy camera: there is a weird black spot about 10 feet in front of the car, but that can't be what the computer is "seeing," since the car should have lidar.
She basically materializes out of nowhere in the middle of the road because the camera is so bad (poor low-light performance?). Unless I'm misinterpreting something, this looks like it could be a fault in the sensor system. It seems impossible to me that lidar would not have seen her: she had already walked across a full empty lane, and there was nothing obstructing her.
I agree with your assessment. The woman made a mistake crossing where she did, and a human might not have done any better a job of seeing her, but there is still something wrong with this system, because she had already crossed a full empty lane. From the video, she did not just jump out of nowhere; it only appears that way because of the crappy camera recording system. Something is also strange about the recording itself: 55 to 60 mph is roughly the point where you begin to overdrive your headlights, i.e., where your stopping distance exceeds what you can visually see at night (without the brights on).

I work with RADAR, LIDAR, Targeting Forward-Looking Infrared (TFLIR), and Distributed Aperture Systems (DAS), and she should have shown up on any one of these systems at some point prior to the collision (if they are not fusing multiple detection techniques, that is part of the problem). There was no attempt at all by the system to apply the brakes. Even slowing the vehicle by just 10 mph could have made a life-or-death difference.
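To put rough numbers on the braking point above, here is a minimal sketch using basic kinematics (v² = v₀² − 2ad). The 40 mph initial speed, 0.5 g deceleration, and 20 m braking distance are all assumed round numbers for illustration, not figures from the incident:

```python
import math

G = 9.81          # gravitational acceleration, m/s^2
MPH_TO_MPS = 0.44704

def impact_speed_mph(initial_mph, decel_g, braking_distance_m):
    """Speed remaining after braking over a given distance (v^2 = v0^2 - 2ad)."""
    v0 = initial_mph * MPH_TO_MPS
    v_sq = v0**2 - 2 * decel_g * G * braking_distance_m
    return math.sqrt(max(0.0, v_sq)) / MPH_TO_MPS

# Assumed scenario: 40 mph, with 0.5 g braking applied 20 m before the pedestrian.
v_impact = impact_speed_mph(40, 0.5, 20)

# Kinetic energy scales with v^2, so even a partial slowdown
# cuts the impact energy far more than the speed numbers suggest.
energy_ratio = (v_impact / 40) ** 2

print(f"impact speed: {v_impact:.1f} mph")
print(f"impact energy: {energy_ratio:.0%} of the unbraked case")
```

Under these assumed numbers, braking from only 20 meters out still drops the impact speed from 40 mph to roughly 25 mph and cuts the impact energy by more than half, which is the point being made: any braking at all would have mattered.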
My biggest point in this whole matter is that, unlike with aircraft systems, there are no NTSB safety guidelines whatsoever for these companies to follow. It isn't that any of these sensor systems are new; what is new is how differently they are applied across multiple companies and multiple platforms. I do not like the fact that this is being rushed out and tested on the general public. Murphy's Law has to be kept in mind at all times, and I don't like any of us being guinea pigs, considered disposable for the sake of profit.
This is an earlier example I gave, but everyone chose to ignore it: Phantom AI was experiencing numerous error codes on their auto-braking system, so they simply disabled it and then took the media out for a ride while testing one of their vehicles.
https://www.youtube.com/watch?v=zGoE6Hco4jE
This type of testing should not be allowed when the general public could be severely injured.
A lawsuit should be filed not only against Uber but against the NTSB as well, one that forces the NTSB to come up with safety requirements for the hardware, the software, and the system as a whole (this should not be very hard at all, because such requirements already exist for air vehicles). All vehicles should at least be tested to Safety Evidence Assurance Level 1 (SEAL1) prior to exposure to the general public.