Uber Suspends Driverless Car Program After Pedestrian Is Struck and Killed


slayer202

Lifer
Nov 27, 2005
13,679
119
106
Read the post I quoted; you summarized my argument as "who cares if AVs are bad drivers."

You are making a leap from "people are bad drivers (and technology should be able to be better in the future)" to "because people are bad drivers, who cares if autonomous vehicles are bad drivers, and we shouldn't regulate and should just send them out there," which is bullshit.

First of all, this is EXACTLY what you are claiming people in this thread are saying. You went further by somehow making it MY argument because I said "people are bad drivers." But wait, let's see how you interpreted this bold part of my post

BTW: You are now making the strawman because I never said that people said they were okay with AVs killing people

Now you're jumping to "killing people." Now that is some real bullshit. That's just a lie or ignorance. Either way, I stand by my statement, just fuck off.
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
Wait, I did not realize driverless cars were ACTUALLY live yet. I thought the tech was still very much in the prototype stage and that there was always a driver.

Interestingly enough, driverless cars and true "auto pilot" are probably more complicated than autopilot on a plane. A plane is in the open sky with nothing around it and has a lot of time and distance to react. Using various sensors (altitude, compass), plus ADS-B and radar, planes can more or less avoid each other. Cars, on the other hand, are always operating on split-second decisions, moving within feet of obstacles, and have to account for road conditions and constant stop-and-go. Planes don't stop; they just maneuver around anything, and they have a lot of time to do so.
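Rough numbers, just to illustrate the scale difference (every speed and distance below is an assumed ballpark figure, not measured data):

# Reaction-time budgets: car vs. plane (all values assumed, for scale only)
car_speed = 30.0            # m/s, roughly 65 mph on a highway
hazard_distance = 45.0      # m, an obstacle appearing ~150 ft ahead
print(hazard_distance / car_speed)      # ~1.5 s to perceive, decide, and act

plane_speed = 230.0         # m/s, a typical jet cruise speed
separation = 9260.0         # m, ~5 nautical miles of traffic separation
print(separation / plane_speed)         # ~40 s of closure time to work with

Even head-on, two planes at those numbers still have ~20 s; a car gets between one and two seconds.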

Then again, ships are probably even simpler than planes in terms of traffic avoidance, being on the open sea with relatively little traffic compared to a road, yet the US military has managed to crash at least two within a year. lol

It's far more complicated than autopilot on a plane, and the development of aircraft control software is subject to stringent regulation and oversight.

We will get autonomous cars, but the tech is way overhyped at this point.
 

Zorba

Lifer
Oct 22, 1999
15,613
11,255
136
First of all, this is EXACTLY what you are claiming people in this thread are saying. You went further by somehow making it MY argument because I said "people are bad drivers." But wait, let's see how you interpreted this bold part of my post

That is not what I am claiming. I am claiming that people have no idea whether they are dangerous or not without proper oversight and regulation. I assume, and I have stated as such in responses to you, that most people just ASSUME they will be safe. I never once said that anyone wants to knowingly release dangerous autonomous cars on the road. I said that people are more concerned with getting them on the road quickly than they are with actually setting up the proper legal framework for them.

Now you're jumping to "killing people." Now that is some real bullshit. That's just a lie or ignorance. Either way, I stand by my statement, just fuck off.

Considering the context of humans being bad drivers because of how many fatalities there are on the road (how my posts started), how is saying "who cares if autonomous vehicles are bad drivers" not implying "killing people"? How exactly would autonomous cars be bad drivers if they were not dangerous? I think it was pretty obvious that I was never talking about AVs having a rubbernecking problem.

You also said that my argument against rapidly releasing AV and semi-AV systems to production was a strawman, but you have refused to actually counter any of my examples of them currently being pushed rapidly into the real world.

Also, why are you so worked up? I am trying to debate the merits of regulation and oversight on something that could have massive safety and liability implications, and I said nothing against you personally until you told me to fuck off, but you feel the need to call me a jackass and tell me to fuck off. If you don't want to talk about this, stop replying to me.
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
For companies like Uber it's really just a marketing gimmick. :) Realistically, how many AVs are they going to deploy at a global level?

Wouldn't it still be cheaper to pay someone minimum wage, who then has to provide and maintain their own vehicle?

Autonomous Uber, OMFG! I wouldn't get in it personally. Even though most Uber drivers look like terrorists, I am still risk averse. :p

Most countries aren't just going to say "fuck me, autonomous cars, eh?" and let them start driving on our roads with no regulation or oversight.
 

Zorba

Lifer
Oct 22, 1999
15,613
11,255
136
While saying I was creating a strawman, YOU CREATED A STRAWMAN! If you were a troll I'd tell you that was too on the nose.
Then define what you meant by "bad at driving" if you didn't mean dangerous.

Regardless, I've made my point clear countless times; you just keep shitting in the thread and whining, but have yet to make a real argument or statement about anything. So I think you need to look up the definition of troll, because it is generally the person throwing out insults and adding zero content, not the person attempting to discuss the topic of the thread logically.
 

bononos

Diamond Member
Aug 21, 2011
3,938
190
106
Uber has one of the shadiest company histories that I've ever seen published. I trust Musk & his camera-based approach about a million times more than I do Uber's approach, and even his system has killed people before.
Uber has to be a shady business because its model is to grow as big and fast as possible, even using illegal means, to shut out the competition. And it's also bleeding cash, so I doubt it's going to change its ways and become 'fairer', because it can't. And in some markets it currently doesn't make sense to work for Uber because the pay isn't enough to cover fuel/insurance/maintenance.
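Back-of-envelope on that last point (every number below is an assumption for illustration, except the IRS mileage figure; none of these are Uber's actual rates):

# Hypothetical per-mile driver economics (illustrative assumptions only)
fare_per_mile = 1.00                # assumed gross fare, $/mile
platform_cut = 0.25                 # assumed commission fraction
vehicle_cost_per_mile = 0.545       # IRS 2018 standard mileage rate, $/mile
net = fare_per_mile * (1 - platform_cut) - vehicle_cost_per_mile
print(round(net, 3))                # ~$0.205/mile left, before taxes

At those assumed rates, a driver clears about 20 cents a mile before taxes, which is how "the pay isn't enough" happens.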
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
51,540
7,233
136
Uber has to be a shady business because its model is to grow as big and fast as possible, even using illegal means, to shut out the competition. And it's also bleeding cash, so I doubt it's going to change its ways and become 'fairer', because it can't. And in some markets it currently doesn't make sense to work for Uber because the pay isn't enough to cover fuel/insurance/maintenance.

I don't think they HAVE to be shady. Companies are like families - it all starts with the leadership, which flows down & creates a culture in the workplace. Sometimes that culture is toxic.
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
51,540
7,233
136
Tesla says Autopilot was engaged during fatal Model X crash
The company posted a statement online late Friday, after local news reported that the victim had made several complaints to Tesla about the vehicle’s Autopilot technology prior to the crash in which he died.
More to come:

https://www.theverge.com/2018/3/30/17182824/tesla-model-x-crash-autopilot-statement

Some interesting bits from the article:

1. The company said that the driver had his hands off the steering wheel and was not responding to warnings to re-take control.
The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

2. Tesla said the reason the crash was so severe was because the attenuator (the barrier crash reduction fence) had been crushed in a prior accident without being replaced. “We have never seen this level of damage to a Model X in any other crash,” the company said.

[attached image: screen-shot-2018-03-27-at-9-49-22-pm.jpg]

3. Tesla claimed that Autopilot successfully completed “over 200 trips per day on this exact stretch of road.”

https://electrek.co/2018/03/30/tesla-autopilot-fatal-crash-data/

4. Autopilot was engaged with the adaptive cruise control follow-distance set to minimum.

Based on this information, I think I would summarize the accident as follows:

1. The Tesla was on Autopilot and should not have crashed into the barrier. If Teslas really make over 200 trips per day on this exact road, why did it crash straight into a barrier? So shame on Tesla here.

2. The barrier had not been reset, creating a safety issue. It's likely that the driver would not have died if the barrier had been functioning. So shame on the state for not resetting it here. Granted, it takes time to schedule a road crew to get out there, and we don't know how long the barrier was inactive for, when the next scheduled reset was for, etc. In theory, after the accident is cleared & documented, they should have a crew there right after to reset it.

3. The driver purposely set the safe-follow distance to the minimum. From this 2015 article (not sure if anything has changed since then), there are 7 levels of follow distance available. It's possible the severity of the accident would have been reduced if he'd had a larger follow distance enabled (rough numbers are sketched just after this list). On the flip side, per the video in the article below, leaving a larger follow distance does "invite lane changes," and having lived in California, where a lot of people drive fast & close, it's not unreasonable to set the follow distance to the lowest setting. So you would think that perhaps the driver is to blame here for that setting, but (1) it is an available option built into the car (so it's not like it was being used out of spec), and (2) that's just the driving style in certain parts of the country.

https://www.teslarati.com/set-tesla-tacc-following-distance/

4. The driver did not have his hands on the wheel (for at least 6 seconds before the accident). Per the terms of use, you're supposed to keep your hands on the wheel. But Tesla also advertises the car as self-driving, with what I currently consider the misnamed Autopilot feature.

5. The driver did not respond to multiple visual warnings and one audible warning. Based on the driving situation, the driver would have had roughly 5 seconds and nearly 500 feet of clear view of the divider.
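
For scale, here's the arithmetic behind points 3 and 5 (a minimal sketch: the 150 m / 5 s figures come from Tesla's statement, while the per-setting time gaps are my assumptions, since I don't know Tesla's actual values):

# Point 5: what 150 m of clear view in 5 s implies
dist_m, time_s = 150.0, 5.0
speed = dist_m / time_s                 # 30 m/s, about 67 mph
print(dist_m * 3.28084)                 # ~492 ft, i.e. the "nearly 500 feet"

# Point 3: a follow-distance setting treated as a time gap (gap values assumed)
for gap_s in (1.0, 2.0, 3.0):
    print(gap_s, speed * gap_s)         # 30 m of buffer at 1 s vs. 90 m at 3 s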

So I think the first question here is: why was he not paying attention? Did he trust the Autopilot too much & get lax? Or did he have a health issue like a heart attack? Regardless, technically, the core fault is with the driver: he should have been paying attention with his hands on the wheel and taken control before the accident happened. However, I think people come to rely on the Autopilot system and get into the flow of not having to drive themselves. For example, in this case, if he was playing on his phone or something, he might not have seen the visual warnings, might have heard the single audible warning, looked up, and had only five seconds to mentally switch from whatever he was doing, and from letting the car drive itself, to making the decision to grab the wheel & take control. And I would assume that this was his standard daily commute route, so he probably had no reason to be suspicious of the car self-driving into the barrier.

Which leads into the second question: if Teslas make hundreds of commutes a day on this exact stretch of road, then why exactly did it crash? I'd imagine the barrier has gone un-reset in the past & Teslas in Autopilot mode have driven by it without incident (because we haven't heard of any other fatal crashes in a Tesla in this particular area). So with that assumption, did the car detect that the driver wasn't paying attention and try to pull over, thinking the open median in front of the barrier was a safe place to park? I'm very curious to know what logic the car went through here. It sounds like it had plenty of data on this stretch of road, so the only thing I can think of is that it detected that the driver wasn't paying attention & wanted to pull over. I've seen some videos on how that works, and it seems to vary based on how fast you're going, where you're at, etc. If it only gave one audible alert, that seems too little to really warn the driver that it was going to stop. So something is funky with why the Autopilot crashed in this situation.
 

slayer202

Lifer
Nov 27, 2005
13,679
119
106
Without getting too in depth, another accident in which there were multiple factors involved, multiple failures.
 

Zorba

Lifer
Oct 22, 1999
15,613
11,255
136
[quoting Kaido's full accident analysis from the post above]

This shows the problem (again) with semi-autonomous cars: people trust the system and check out. You also can't realistically expect a person who has checked out to jump back in and take over within a second or two, much less a fraction of a second.

As I said earlier, this works with aircraft because the autopilot generally isn't on for takeoff and landing, when things can go bad quickly. Outside of those phases there is much more time to react to an issue. There are also multiple warning systems and checks that alert the crew any time the autopilot is faulting. As a point of comparison, on US Airways 1549 (the Miracle on the Hudson), a flight known for extremely quick action and thinking, it took 5.5 seconds from engine rollback to the crew's first positive action (turning on engine ignition), 8.3 seconds before they started the APU, and 10.2 seconds before Sully took over control of the aircraft.

In a car it is easy to go from everything being fine to needing action within a second or two. It also appears that, outside of the warning to put your hands on the wheel, there was no active warning that Autopilot wasn't working in any of the well-known Autopilot accidents.
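
To put the 1549 timeline against highway speeds, a quick sketch (the 30 m/s, ~65 mph speed is an assumption):

# Distance covered while the human is still reacting
speed_ms = 30.0                         # m/s, assumed highway speed
for reaction_s in (1.5, 5.5, 10.2):     # alert driver / 1549 first action / Sully taking control
    print(reaction_s, speed_ms * reaction_s)
# Even an excellent "aviation-grade" 10.2 s response covers ~306 m in a car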

Companies hanging their hat on "well the driver should've taken over" is a pretty poor excuse, but it will be often repeated and people will likely accept it, even though there is plenty of evidence that it is not a realistic expectation.

That barrier is a concern too. It looks like they replaced the hit plate and got rid of the debris without actually rebuilding it. Around here they will place a crappy old truck with a crash pad on the back of it next to crushed barriers until they are rebuilt, usually within a week or so. They generally don't do anything to clean up the debris until they fully fix it, except to clear anything that would be a hazard.
 
  • Like
Reactions: DietDrThunder

urvile

Golden Member
Aug 3, 2017
1,575
474
96
Uber has to be a shady business because its model is to grow as big and fast as possible, even using illegal means, to shut out the competition. And it's also bleeding cash, so I doubt it's going to change its ways and become 'fairer', because it can't. And in some markets it currently doesn't make sense to work for Uber because the pay isn't enough to cover fuel/insurance/maintenance.

In at least one state in Aus, money comes out of each Uber trip and goes into a compensation fund for taxi drivers. The problem with "disrupting the industry" is that most countries are governed by laws, and if the taxi industry is regulated, then as far as lawmakers are concerned Uber should be too.

Taxi drivers in Aus have to pay for the privilege: they have to buy taxi plates. There is a reason they got so pissed off with Uber. It's affecting their livelihoods, so what do you expect? I always use Uber when overseas, though, because then I can't get ripped off like I have been by taxi drivers. :) Whenever a taxi driver asks if you have been to the country before, always say yes, unless you want to go on a scenic tour on the way to the hotel.
 

bononos

Diamond Member
Aug 21, 2011
3,938
190
106
I don't think they HAVE to be shady. Companies are like families - it all starts with the leadership, which flows down & creates a culture in the workplace. Sometimes that culture is toxic.

I don't think it has anything to do with someone's personal values or a company's culture if the intention is to dominate the market using shady tactics, pay workers poor wages, and then hike up the rates later. Changing Amazon's or Uber's executive positions isn't going to turn them into good companies. And they haven't turned a regular profit yet, to boot.
 

bononos

Diamond Member
Aug 21, 2011
3,938
190
106
In at least one state in Aus, money comes out of each Uber trip and goes into a compensation fund for taxi drivers. The problem with "disrupting the industry" is that most countries are governed by laws, and if the taxi industry is regulated, then as far as lawmakers are concerned Uber should be too.

Taxi drivers in Aus have to pay for the privilege: they have to buy taxi plates. There is a reason they got so pissed off with Uber. It's affecting their livelihoods, so what do you expect? I always use Uber when overseas, though, because then I can't get ripped off like I have been by taxi drivers. :) Whenever a taxi driver asks if you have been to the country before, always say yes, unless you want to go on a scenic tour on the way to the hotel.
It only makes sense that Uber is regulated. Otherwise you'll get robbers and rapists who are legit Uber drivers who signed up with fake IDs. And the number of taxis has to be capped, otherwise there will be too many taxis driving around empty.
Uber is good in places where taxis refuse to turn on their meters or where taxi drivers have to pay exorbitant rates for licenses due to corruption.
 

Zorba

Lifer
Oct 22, 1999
15,613
11,255
136
It only makes sense that Uber is regulated. Otherwise you'll get robbers and rapists who are legit Uber drivers who signed up with fake IDs. And the number of taxis has to be capped, otherwise there will be too many taxis driving around empty.
Uber is good in places where taxis refuse to turn on their meters or where taxi drivers have to pay exorbitant rates for licenses due to corruption.
Uber/Lyft is also good for cities that have basically no traditional taxi service, like much of the Midwest. But they are a taxi service and should follow the laws of one.
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
It only makes sense that Uber is regulated. Otherwise you'll get robbers and rapists who are legit Uber drivers who signed up with fake IDs. And the number of taxis has to be capped, otherwise there will be too many taxis driving around empty.
Uber is good in places where taxis refuse to turn on their meters or where taxi drivers have to pay exorbitant rates for licenses due to corruption.

Yep, and then there are the safety issues. That's why the taxi industry is regulated, and Uber offers the same service. In my home state Uber drivers need a charter vehicle license. It's probably the same in other countries, but Uber was actually illegal in some states in Aus until the kinks were worked out. Uber drivers were fined a lot of money*. I do think they shook the taxi industry up, though, which was needed.

*they are Australian heroes.
 

bononos

Diamond Member
Aug 21, 2011
3,938
190
106
The problem with Uber is that drivers can't make money even working over 100 hours/wk in some places, and taxi drivers have committed suicide. Uber is trying to 'disrupt' the market at a severe cost to society.
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
Taxi drivers have also bashed Uber drivers.
The problem with Uber is that drivers can't make money even working over 100 hours/wk in some places, and taxi drivers have committed suicide. Uber is trying to 'disrupt' the market at a severe cost to society.

ouch.

https://www.brisbanetimes.com.au/na...-million-in-three-months-20160719-gq8zv3.html

That's just one example. Don't worry though, they are martyring themselves for Dara Khosrowshahi's yearly bonus, so it is for a worthy cause.

I will add that QLD is Australia's most redneck state. Homosexuality was illegal there until 1991. God damn kids and their industry-disrupting apps.
 

urvile

Golden Member
Aug 3, 2017
1,575
474
96
There are constant crackdowns here on Uber drivers who refuse to do what they are told. The problem with these decentralised companies is: who do I complain to? Uber Eats is the same. I have had them not deliver food that I had already paid for, and I had to dispute the charge through PayPal. Pain in the arse, mate.

It's wonderful when it works as expected, but when it doesn't you are pretty much fucked.
 

Ichinisan

Lifer
Oct 9, 2002
28,298
1,235
136
Ahhh, but it has everything to do with the point I've been trying to make in this thread: that these systems are being released on the general public without proper testing and oversight.
Lots of testing and oversight. This was simply a failure, which will lead to corrections.
 

Ichinisan

Lifer
Oct 9, 2002
28,298
1,235
136
I agree, I don't understand why these cars aren't being held to something like DAL-A standards. Instead, they seem to be following the app-development model. The fact that safety-critical software and electromechanical systems are being released into the wild with no real regulations or standards and seemingly little testing is insane.

Airplanes would be much cheaper if everyone said "Well it's safer than a car, do we really need all this testing?"
I'm fairly certain there is a lot more testing and preparation within these limited test areas than you or I know.
 
  • Like
Reactions: slayer202