
Uber Suspends Driverless Car Program After Pedestrian Is Struck and Killed

Not misinformed; there are plenty of incidents where the computer caused a problem and the crew did not know what to do. I will always prefer a human, who can adapt on the fly, at the controls of a car or airplane.

You must not drive or commute to work every day if you would trust a human over a computer at the controls of a car. Do you even see how many people are talking on the phone while driving, eating while driving, putting on makeup while driving, or screwing around with their music while driving?
 
You are so misinformed I don't know where to begin.

-KeithP

Rudimentary training for pilots was eye-opening. An airline is going to teach them the basics, toss them into a 200-ton, $250 million aircraft carrying 300 people, and hope for the best. lolwut?
 
Retired. And yes, I do see all those idiots not paying attention to the road. Too many gadgets in cars now. But I will still trust myself and not some computer to do the driving for me.

I trust myself too. But I don't trust anyone else. The day the computers take over, the roads will become a safer place, even if the good drivers are taken out of the equation. There are just too damn many idiots out there for anyone to be safe, even if that person drives perfectly.
 

Yeah, I've been crashed into twice by people doing completely moronic things that made no sense. Luckily there were no injuries, but these people should not be driving multi-ton vehicles at high speeds. If autonomous vehicles were on the road, we could dramatically raise the requirements for getting a driver's license without impeding people's livelihoods.
 
WRONG! In the U.S., pedestrians always have the right of way. It is amazing to me that they are trying to shift the blame for this person's death onto her. Sure, she was walking her bike across the street outside of the crosswalk, but she was still near the intersection, and the operator behind the wheel was not paying attention and failed to prevent this fatality.

No, they don't; that varies depending on the state.

ARS 28-793. Crossing at other than crosswalk

A. A pedestrian crossing a roadway at any point other than within a marked crosswalk or within an unmarked crosswalk at an intersection shall yield the right-of-way to all vehicles on the roadway.

B. A pedestrian crossing a roadway at a point where a pedestrian tunnel or overhead pedestrian crossing has been provided shall yield the right-of-way to all vehicles on the roadway.

C. Between adjacent intersections at which traffic control signals are in operation, pedestrians shall not cross at any place except in a marked crosswalk.

The car had the right of way. Period.
 
Just saw on the news the video from the Uber when it hit the woman. She comes right out of the dark and walks in front of the vehicle. The self-driving vehicle is a Volvo XC90, which, even without the Uber automated system, has pedestrian and auto collision systems. Though at a much slower speed, I had someone walk right in front of my vehicle, and my Volvo with its City Safety system stopped my vehicle instead of hitting her. I wonder why the vehicle's systems didn't detect the woman earlier.

Found the video:
 

Link for other anti-Facebook folks:
https://www.youtube.com/watch?v=8IqpUK5teGM

The Volvo system doesn't work above 30 MPH, AFAIK. The faster you go, the harder these things are to detect.

IMO, some people will claim they are superior drivers and wouldn't have hit her, but I watched the video, and while noting the actions of the car in front (look for the taillights), she appears practically out of nowhere. It's night, and she's invisible until the lights hit her, and then it is pretty much too late.
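The "too late once the lights hit her" point can be put in rough numbers. The sketch below estimates total stopping distance (reaction plus braking) at a few speeds; the reaction time, deceleration, and low-beam range of roughly 50-60 m are all illustrative assumptions, not Volvo or Uber specs.

```python
# Rough stopping-distance sketch: why a pedestrian who only becomes
# visible at headlight range may be impossible to avoid at speed.
# All numbers here are illustrative assumptions, not manufacturer specs.

def stopping_distance_m(speed_mph, reaction_s=1.5, decel_mps2=7.0):
    """Total distance to stop: reaction distance + braking distance."""
    v = speed_mph * 0.44704             # mph -> m/s
    reaction = v * reaction_s           # ground covered before braking starts
    braking = v * v / (2 * decel_mps2)  # v^2 / (2a) under constant deceleration
    return reaction + braking

# Assumed low-beam illumination: roughly 50-60 m ahead.
for mph in (25, 38, 45):
    print(mph, round(stopping_distance_m(mph), 1))
```

At the reported ~38 mph this works out to around 46 m under these assumptions, which is already most of a typical low-beam range before any margin for detection.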
 
Pedestrian laws are odd. While there are laws about when they can and can't be on the road, and situations where they can actually be charged (e.g., jaywalking), more often than not it's the motor vehicle driver who is held liable. If someone jumps in front of your car to commit suicide, you're the one that's responsible, even if there's nothing you could have done.

This is an interesting case though, because human error can somewhat be removed from the equation. Self-driving cars will create an interesting situation as far as accident investigation goes. The scary thing, though, is for the coders. Will it come to a point where the coders actually get blamed? Investigators could say that their code should have accounted for a specific situation, etc. On the other hand, as a drone operator, if your drone takes off on you and causes an issue, you're still responsible, so maybe the responsibility will always stay with the driver. Basically, you're still going to have to keep your hands on the wheel and pay attention to the road so you can intervene.
 
I don't know where you live, but around these parts, drivers almost always get away with few to no repercussions after killing vulnerable road users. And to top it off, roads are being inappropriately designed in some areas, totally favoring vehicles over any sort of walkability or safety of other users of the space.
 
As above. I only see drivers held accountable when they actually contributed to the collision.

In most places, if you are obeying the rules of the road, and someone enters the road in front of you, giving you no reasonable time to react, you won't face charges.
 

Correct. Red Squirrel just talks out of his ass and has little clue about the real world.
 
I don't know where you live, but around these parts, drivers almost always get away with few to no repercussions after killing vulnerable road users. And to top it off, roads are being inappropriately designed in some areas, totally favoring vehicles over any sort of walkability or safety of other users of the space.

A lot of the time you may be able to fight it, I think, but you still have to go through the whole court/legal process, and your insurance will most likely take a hit too. Though it's been a while since I've heard of any actual incidents like this, so it could be that things have changed too. Either way, hitting someone, whether or not it's my fault, is one of my worst nightmares. I imagine a dash cam would help a lot in a case like this too; if it can be proven that the pedestrian was at fault, then you're probably OK.
 
You obviously haven't seen any stories out of NYC: people legally crossing in crosswalks are killed or injured by vehicles, and the drivers are rarely held accountable, even when there is video evidence of the drivers failing to yield as they are supposed to. The drivers don't have to worry about fighting it, because they are almost never charged.
 
Link for other anti-Facebook folks:
https://www.youtube.com/watch?v=8IqpUK5teGM

The Volvo system doesn't work above 30 MPH, AFAIK. The faster you go, the harder these things are to detect.

IMO, some people will claim they are superior drivers and wouldn't have hit her, but I watched the video, and while noting the actions of the car in front (look for the taillights), she appears practically out of nowhere. It's night, and she's invisible until the lights hit her, and then it is pretty much too late.

Yeah, I don't see a human driver doing any better. I feel like somehow there has to be some way of improving the recognition there. Not even speaking just for self-driving cars, but for human drivers' benefit as well. I know some cars have "night vision," which would've helped a driver at least somewhat.

I feel like they've been trying to rush autonomous car development a bit too much. There are still a lot of improvements to sensors that need to be made. Like I said earlier in the thread, as part of the development I'd be intentionally running lots of worst-case scenarios (stuff like people throwing things off overpasses, people/objects falling into the roadway out of other vehicles, people/objects darting into the road from unseen spots), so they should have some data on how it handles that stuff and have something telling it to apply extra caution in areas where it was weak.
 
Something doesn't seem right about that video. From what I could see, the woman didn't jump out of nowhere; she was walking her bike across the road at a normal pace. It only looks like she jumped out of nowhere because of the crappy camera. There is this weird black spot about 10 feet in front of the car, but that cannot be what the computer is seeing, since it should have lidar.

She basically materializes out of nowhere in the middle of the road because the camera sucks (poor low-light performance?). Unless I am misinterpreting something, this looks like it could be a fault in the sensor system. It seems impossible to me that lidar would not have seen her; she had already walked across a full empty lane, and there was nothing obstructing her.
 


Possibly because the street lights are widely spaced, and she is in the dark patch between them; the bright patches swamp the darker area she is in. It's an effect that can also affect people's eyes, not just cameras. Plus, the bright patches of your headlights make areas outside them look darker as well. I can see her being essentially invisible until the headlights hit her, as I have seen the same thing driving at night with people without reflectors near poorly lit roads, and with deer, which tend to be invisible outside the headlights.

This does make the Uber system look very questionable. Lidar doesn't depend on lighting; it should have been able to detect that they were approaching something. It seems like even a dumb short-range radar of last resort should be able to trip emergency braking...
 
And as you've seen from the TED talk video I posted earlier on what a car would see, a predictive system should've at least slowed the car upon detecting her coming off the curb up ahead. Definitely should not depend on lighting. The system failed here IMO, especially if it's not doing prediction / defensive driving. And especially if it can't see in the dark.
 
I trust myself too. But I don't trust anyone else. The day the computers take over, the roads will become a safer place, even if the good drivers are taken out of the equation. There are just too damn many idiots out there for anyone to be safe, even if that person drives perfectly.

So you're saying whatever device you are using to view these forums has never locked up, reset, crashed, rebooted, resized, or behaved in any other unplanned or illogical manner?

I don't know about you, but electronics can be pretty vexing at times. Simple things like heat, moisture, cloud coverage, electrical storms, etc. can render them useless, if only for a moment.

Now you're expecting how many millions of vehicles to be on the road autonomously at the same time, using different competing systems and signals from space, in adverse weather conditions, all on human-built and human-maintained roads, without a lot of issues?

Yeah, OK. Wait for the first cluster of them to figure out what to do when a tornado sweeps in and everyone is trying to leave town. When 2' of snow hits NYC streets. When hackers override systems. When there's a pile-up. When there's a bug in the code. When an update breaks a safety feature.
 
Without seeing the actual video it would be impossible to say, but based on the sheriff's description, no driver, be it autonomous or human, would have been able to avoid the person who was hit.

In regards to driver's license, I would like to see more stringent testing for older people. Some of them really should not be behind a wheel anymore.

I see a handful of older bad drivers. In California, you can report them. I thought about reporting a few of them. But then I thought... fuk, even though they're bad, they're not nearly as bad as the people playing with their phones every day. Nor are they as bad as those who purposely run stop signs and red lights... or make a right, an immediate U-turn, and a right. Nor are they as bad as those who cannot even read street signs. In California, you cannot drive with headphones, but you can drive deaf. You can even drive with only one eye (not even sure how a one-eyed person can pass a parallel parking test?). In the grand scheme of things, at least the seniors are very cautious drivers.
 
Something doesn't seem right about that video. From what I could see, the woman didn't jump out of nowhere; she was walking her bike across the road at a normal pace. It only looks like she jumped out of nowhere because of the crappy camera. There is this weird black spot about 10 feet in front of the car, but that cannot be what the computer is seeing, since it should have lidar.

She basically materializes out of nowhere in the middle of the road because the camera sucks (poor low-light performance?). Unless I am misinterpreting something, this looks like it could be a fault in the sensor system. It seems impossible to me that lidar would not have seen her; she had already walked across a full empty lane, and there was nothing obstructing her.

I agree with your assessment. The woman made a mistake crossing where she did, and a human may or may not have done a better job of seeing her, but there is still something wrong with this system, as she had already crossed a full empty lane. From the video, she did not just jump out of nowhere; it only appears that way because of the crappy camera recording system. There is something strange with this recording, as 55 to 60 mph is the point where you begin to outdrive your headlights, i.e., what you will visually see at night (without the brights on). I work with radar, lidar, Targeting Forward Looking Infrared (TFLIR), and Distributed Aperture Systems (DAS), and she should have shown up on any one of these systems at some point prior to the collision (if they are not using multiple detection techniques, that is part of the problem). There was no attempt at all by the system to apply the brakes. Slowing the vehicle by just 10 mph through braking could have made a life-or-death difference.
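The "just 10 mph" point holds up arithmetically: kinetic energy, which roughly tracks impact severity, scales with the square of speed, so a modest speed reduction sheds a disproportionate share of energy. The sketch below uses an assumed 2,000 kg SUV mass purely for illustration.

```python
# Kinetic energy scales with speed squared, so shedding 10 mph removes
# far more than a quarter of the impact energy. Mass is an assumed figure.

def kinetic_energy_j(mass_kg, speed_mph):
    v = speed_mph * 0.44704  # mph -> m/s
    return 0.5 * mass_kg * v * v

car = 2000  # kg, assumed SUV mass
e40 = kinetic_energy_j(car, 40)
e30 = kinetic_energy_j(car, 30)
print(round(100 * (e40 - e30) / e40))  # prints 44: percent energy shed by slowing 10 mph
```

The mass cancels out of the ratio; braking from 40 to 30 mph removes about 44% of the impact energy regardless of the vehicle.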

My biggest point in this whole matter is that, unlike aircraft systems, there are no NTSB safety guidelines whatsoever for these companies to follow. It isn't that any of these sensor systems are new; how they are applied differently across multiple companies and multiple platforms is. I do not like the fact that this is being rushed out and tested on the general public. Murphy's Law has to be kept in mind at all times, and I don't like any of us being guinea pigs, considered disposable for the sake of profit.

This is an earlier example that I gave but everyone chose to ignore: Phantom AI was experiencing numerous error codes in their auto-braking system, so they just disabled it, then took the media out for a ride during a test of one of their vehicles. https://www.youtube.com/watch?v=zGoE6Hco4jE

This type of testing should not be allowed when the general public could be severely injured.
Not only should a lawsuit be filed against Uber, but against the NTSB as well, to force the NTSB to come up with safety requirements for the hardware, the software, and the system as a whole (this should not be very hard at all, because such requirements already exist for air vehicles). All vehicles should at least be tested at Safety Evidence Assurance Level 1 (SEAL1) prior to exposure to the general public.
 
Yeah, after rethinking it, there's something very off about this. Either it wasn't in autonomous mode and the driver was distracted (she kept looking down just before; maybe she was adjusting some settings, or getting weird readings and focused on them instead of remembering that she was piloting a 4,000 lb object down the road), or the sensors screwed up.

I wonder if there might be a "defeat" control, where the operator can override the sensors when they report something the person doesn't see. It might be useful for instances where the sensors wig out over something that isn't actually an issue (like detecting a person from a poster along the road or on another vehicle). So maybe the system was saying there was something there, but she couldn't see anything, so she dismissed it as a false identification when it wasn't.
 
she appears practically out of nowhere. It's night, and she's invisible until the lights hit her, and then it is pretty much too late.

Yeah, 99% of drivers out there would have also hit her; it may actually be 100%, because unless you were able to see her before the camera did, I really doubt she avoids being hit in some fashion.
 
My biggest point in this whole matter is that, unlike aircraft systems, there are no NTSB safety guidelines whatsoever for these companies to follow. It isn't that any of these sensor systems are new; how they are applied differently across multiple companies and multiple platforms is. I do not like the fact that this is being rushed out and tested on the general public. Murphy's Law has to be kept in mind at all times, and I don't like any of us being guinea pigs, considered disposable for the sake of profit.

Exactly, and by Uber, a for-profit company that stands to gain the most by getting rid of drivers. They want this tech out, like, yesterday. I don't recall if I said it in this thread, but it should only be allowed through a tech/car company, not a rideshare company.
 