say some dumb cyclist suddenly crosses without paying attention to the stop sign or light...

brainhulk

Diamond Member
Sep 14, 2007
9,376
454
126
and a crash is imminent, how should A.I. be programmed to choose to kill?

[image: Moral Machine scenario - a self-driving car with failed brakes must either hit a concrete barrier or pedestrians in the crosswalk]


http://moralmachine.mit.edu

Taking the test, I chose to save fit people over fat, children over olds, educated people over criminals/homeless, and people obeying the traffic laws over jaywalkers.

lol @ A.I. that can tell if someone is fat
 
Last edited:

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
The self-driving car should stop at that intersection. It has a red light. Why would it get to blow right through? That picture is terrible. Are the people supposedly crossing against the light? Because it looks like they're doing everything exactly right.

Assuming the picture is stupid, this has nothing to do with cyclists. Suggest you edit the title. Cyclists take enough shit from bad drivers and the self-entitled.

In this case, the car should hit the barrier, but in actuality it will likely be able to stop in time and hit nothing. In this scenario, the car should be driving defensively. It's obviously in a construction zone in a city, with one lane closed and concrete barriers on either side of the road. If it is going faster than 40 km/h at that point - 30 ft from an intersection - it is a terrible self-driving car. If for some reason it is travelling at highway speeds, 100% it should hit the barrier. It has no business going that fast in this situation and should die for its own stupidity. It certainly should not take the lives of others. If it's travelling at a reasonable speed and can't stop, hit the barrier. Cars have safety gear for a reason. My shoes don't have airbags. Neither do my pants.
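
For what it's worth, the 40 km/h figure roughly checks out. A quick back-of-the-envelope, assuming working brakes and a hard-braking deceleration of around 8 m/s^2 (both numbers are my assumptions, not anything from the scenario):

# Stopping distance from 40 km/h under hard braking.
v = 40 / 3.6          # 40 km/h in m/s, about 11.1 m/s
a = 8.0               # assumed deceleration on dry pavement, m/s^2
d = v**2 / (2 * a)    # kinematics: v^2 = 2*a*d
print(f"{d:.1f} m = {d * 3.281:.0f} ft")  # ~7.7 m, ~25 ft: just inside 30 ft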
 

Matthiasa

Diamond Member
May 4, 2009
5,755
23
81
I think the correct answer is not to buy a car that is known to have its brakes suddenly fail... It shouldn't be that hard to add extra monitoring to something as important as brakes to make sure there are no major issues before going anywhere.
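
Something like this toy pre-drive self-test, say. Every sensor name and threshold below is invented (real brake health lives on proprietary vehicle buses), but the idea is cheap to sketch:

# Hypothetical pre-drive brake check: refuse to move if anything looks off.
BRAKE_FLUID_MIN_PSI = 800   # made-up minimum line pressure
PAD_WEAR_LIMIT_MM = 3.0     # made-up minimum pad thickness

def brakes_look_healthy(sensors: dict) -> bool:
    """Return True only if every brake check passes."""
    return all([
        sensors["fluid_pressure_psi"] >= BRAKE_FLUID_MIN_PSI,
        sensors["pad_thickness_mm"] >= PAD_WEAR_LIMIT_MM,
        not sensors["abs_fault_flag"],
    ])

# Example reading: worn pads should keep the car parked.
reading = {"fluid_pressure_psi": 950, "pad_thickness_mm": 2.1, "abs_fault_flag": False}
if not brakes_look_healthy(reading):
    print("Brake check failed - refusing to depart.")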
 

brainhulk

Diamond Member
Sep 14, 2007
9,376
454
126
The self-driving car should stop at that intersection. It has a red light. Why would it get to blow right through? That picture is terrible. Are the people supposedly crossing against the light? Because it looks like they're doing everything exactly right.

Assuming the picture is stupid, this has nothing to do with cyclists. Suggest you edit the title. Cyclists take enough shit from bad drivers and the self-entitled.

In this case, the car should hit the barrier, but in actuality it will likely be able to stop in time and hit nothing. In this scenario, the car should be driving defensively. It's obviously in a construction zone in a city, with one lane closed and concrete barriers on either side of the road. If it is going faster than 40 km/h at that point - 30 ft from an intersection - it is a terrible self-driving car. If for some reason it is travelling at highway speeds, 100% it should hit the barrier. It has no business going that fast in this situation and should die for its own stupidity. It certainly should not take the lives of others. If it's travelling at a reasonable speed and can't stop, hit the barrier. Cars have safety gear for a reason. My shoes don't have airbags. Neither do my pants.

Those people crossing are jaywalking; their light is red.

I think only people or cars that are not following proper traffic rules will be subject to this moral decision from the A.I.
The A.I. will know when a red light or a stop sign is coming.

I'm not getting into an A.I. car that doesn't choose to save the occupant every time.

bonus points for fatties
 
Last edited:

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
Those people crossing are jaywalking; their light is red.

I think only people or cars that are not following proper traffic rules will be subject to this moral decision from the A.I.
The A.I. will know when a red light or a stop sign is coming.

I'm not getting into an A.I. car that doesn't choose to save the occupant every time.

bonus points for fatties

There's a distinct difference between choosing to save the occupant and driving so badly that someone must die. I think you're forgetting that these self-driving cars will drive far more safely and carefully than any human does currently. That is, an AI car might make the "wrong" decision in your opinion, but the odds of it being in a similar situation are perhaps infinitesimal.

The AI car should make the decision that has the best chance of the fewest casualties.
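
In toy code, "best chance of the fewest casualties" just means minimizing expected casualties over the available options. The options, probabilities, and headcounts below are all invented for illustration:

# Pick the maneuver with the lowest expected casualties (probability x people).
def expected_casualties(option: dict) -> float:
    return option["p_collision"] * option["people_at_risk"]

options = [
    {"name": "brake hard in lane",         "p_collision": 0.30, "people_at_risk": 2},
    {"name": "swerve into barrier",        "p_collision": 0.90, "people_at_risk": 1},
    {"name": "continue through crosswalk", "p_collision": 0.95, "people_at_risk": 4},
]

best = min(options, key=expected_casualties)
print(best["name"])  # -> "brake hard in lane" (0.6 expected vs 0.9 and 3.8)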
 

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
Also, this thread still has nothing to do with cyclists. Suggest again that you change the title.
 

edro

Lifer
Apr 5, 2002
24,326
68
91
The cars will have airbags, so maybe they should take the impact.
On the other hand, the people are disobeying laws and the car should be able to brake, reducing the impact.
Hmmmm...

If a cyclist blows through a stop sign, they should be roadkill, the same as if a non-AI car/bus was in the situation.
 

WelshBloke

Lifer
Jan 12, 2005
33,327
11,480
136
The car's just going to stop and wait (possibly all day) for the people to cross.
Unless, suddenly, self-driving cars are allowed to go 90 mph in urban areas and turn their front sensors off.
 

child of wonder

Diamond Member
Aug 31, 2006
8,307
176
106
If any of the pedestrians or passengers are Trump supporters the AI should run over the pedestrians then back up and smash into the barrier. Afterwards its program will ascend to AWS where it will be awarded 72 blank hard drives.
 

gorcorps

aka Brandon
Jul 18, 2004
30,741
456
126
Any self-driving car will be driving at the speed limit in areas that have crosswalks, which is pretty slow. So it would most likely stop in time regardless, but if not, it would sacrifice itself into the barrier because the people in the car will be protected by the car itself.

The only situation where this would be an issue is somebody trying to bolt across a high-speed highway.
 

Humpy

Diamond Member
Mar 3, 2011
4,464
596
126
The crash is eminent? LOL

The A.I. should be programmed to delete this thread.
 
Feb 25, 2011
16,997
1,626
126
Most modern cars are much better at stopping/turning than human drivers realize or are capable of making the car perform. Program the AI to stop the car with full brake pressure, apply the hell out of the horn, weave around the opposite side of the barrier, etc., and nobody gets hurt. (Although the occupants might meet their breakfast again.)
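
Roughly this, as a sketch. The Car class and all of its methods are made up; the point is doing everything at once instead of choosing a victim:

class Car:
    """Stand-in for a real vehicle interface; every method here is hypothetical."""
    def apply_brakes(self, pressure):
        print(f"Braking at {pressure:.0%} pressure (ABS modulating)")
    def sound_horn(self):
        print("Horn on, continuously")
    def find_clear_path(self):
        # Pretend the sensors found a gap on the far side of the barrier.
        return "far side of the barrier"
    def steer_toward(self, path):
        print(f"Steering toward: {path}")

def emergency_stop(car):
    car.apply_brakes(pressure=1.0)  # maximum braking first, always
    car.sound_horn()                # warn the pedestrians while slowing
    gap = car.find_clear_path()
    if gap is not None:
        car.steer_toward(gap)       # weave around instead of picking a target

emergency_stop(Car())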
 

OverVolt

Lifer
Aug 31, 2002
14,278
89
91
The REAL answer, the way insurance companies, car companies, and the law are trending, is that no car will be fully autonomous, and as soon as the car needs to make a decision like that it hands control over to the driver to absolve them of the liability.

What this means IN PRACTICE or DE FACTO is that if autonomous cars become the norm and everyone loses their ability to drive because they hardly ever do it, then the autonomous car just thrusts a totally incapable person into an emergency for liability reasons. It's what you wished for.

I actually experienced this in pharmacy, where we had a computerized system for making IVs, and they'd stick dumb people in there and just rely on the system to prevent them from making mistakes, and LO AND BEHOLD you'd be making a stat neonatal intensive care emergency surgery sterile dilution and the fucking thing would crash.

So you guys will get what you wished for. I'll drive my cars manually to keep my skills sharp. And indeed I bypassed the IV system with permission some days so I retained my skills and made things "the old way~". Because I actually knew what I was doing, with 7 years of experience. The newbies don't stand a fuck'n chance. And neither will new drivers with autonomous cars.
 
Last edited:

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
The REAL answer, the way insurance companies, car companies, and the law are trending, is that no car will be fully autonomous, and as soon as the car needs to make a decision like that it hands control over to the driver to absolve them of the liability.

What this means IN PRACTICE or DE FACTO is that if autonomous cars become the norm and everyone loses their ability to drive because they hardly ever do it, then the autonomous car just thrusts a totally incapable person into an emergency for liability reasons. It's what you wished for.

I actually experienced this in pharmacy, where we had a computerized system for making IVs, and they'd stick dumb people in there and just rely on the system to prevent them from making mistakes, and LO AND BEHOLD you'd be making a stat neonatal intensive care emergency surgery sterile dilution and the fucking thing would crash.

So you guys will get what you wished for. I'll drive my cars manually to keep my skills sharp. And indeed I bypassed the IV system with permission some days so I retained my skills and made things "the old way~". Because I actually knew what I was doing, with 7 years of experience. The newbies don't stand a fuck'n chance. And neither will new drivers with autonomous cars.

You hand wrote this reply too, right? Sent it into ATOT via snail mail?
 

Paladin3

Diamond Member
Mar 5, 2004
4,933
878
126
Can't the autonomous cars just run over the "researchers" who create these moral dilemma tests?
 

[DHT]Osiris

Lifer
Dec 15, 2015
17,455
16,775
146
Hypothetical: AI researchers code the self-driving AI to prefer saving the higher body count (passengers vs. potential pedestrians), so people exploit it by putting 200 lb bags of sand in their car to simulate it being full of 5-15 passengers (depending on the size of the automobile).
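
If occupancy were naively inferred from seat weight alone, the sandbag trick would work; cross-checking against other cabin signals would kill it. A toy sketch (every field name here is invented):

# Naive count: anything heavy in a seat is a "passenger" - sandbags pass.
def naive_occupants(seat_weights_lb):
    return sum(1 for w in seat_weights_lb if w > 80)

# Cross-checked count: weight AND belt AND camera must agree - sandbags fail.
def cross_checked_occupants(seats):
    return sum(1 for s in seats
               if s["weight_lb"] > 80 and s["belt_latched"] and s["camera_sees_person"])

sandbags = [{"weight_lb": 200, "belt_latched": False, "camera_sees_person": False}] * 5
print(naive_occupants([s["weight_lb"] for s in sandbags]))  # 5 - fooled
print(cross_checked_occupants(sandbags))                    # 0 - not fooled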

Thoughts?
 

amdhunter

Lifer
May 19, 2003
23,332
249
106
I flat out refuse to get a self-driving car if it puts ANY LIFE ahead of mine. Period. I want to survive and that's that.
 

[DHT]Osiris

Lifer
Dec 15, 2015
17,455
16,775
146
Interesting that you'd choose to drive yourself, an inherently more dangerous activity, rather than trust an AI which is more likely to save your life (even with a preference toward saving others first).

Someone's probably doing/has done a sociology thesis on this.
 

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,743
4,707
75
and a crash is imminent, how should A.I. be programmed to choose to kill?

[image: Moral Machine scenario - a self-driving car with failed brakes must either hit a concrete barrier or pedestrians in the crosswalk]
Why is it reasonable to assume that a car crashing into a barrier will kill its occupants? Why is it reasonable to assume that a car crashing into people will not suffer similar damage and that a person it crashes into will not crash through the windshield and kill an occupant? I would tend to trust the car's designer to have included safety systems to protect its occupants in the event of a crash. There are also safety systems on the drawing board for pedestrians, but I really don't expect those to ever work well.
 

[DHT]Osiris

Lifer
Dec 15, 2015
17,455
16,775
146
It isn't; it's a dumb 'sociology' experiment with black-and-white results which are supposed to derive 'something', probably just to see what kind of hits/eyeballs they get for anything related to self-driving cars. Realistically, in every scenario listed, the car would bank into a wall and friction + failbrake itself to a stop instead of barreling over people or faceplanting into a concrete barrier.
 

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,743
4,707
75
I think the correct answer is not to buy a car that is known to have its brakes suddenly fail... It shouldn't be that hard to add extra monitoring to something as important as brakes to make sure there are no major issues before going anywhere.
Reading over the site more, it looks like brake failure is a consistent issue in these scenarios. An autonomous car with an electric or hybrid drive should reverse its drive, which should be somewhere between 50% and 200% as effective as the brakes. An autonomous car with only a gasoline engine should downshift to slow down, then throw the transmission into reverse, damaging or destroying the drivetrain to protect people.
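
A rough sketch of that fallback order, with an entirely hypothetical vehicle interface (the priority is the point, not the API):

class Car:
    """Hypothetical stand-in; no real car exposes controls like this."""
    def __init__(self, electric):
        self.electric = electric
    def regen_brake(self):
        print("Commanding maximum regenerative / reverse motor torque")
    def downshift(self):
        print("Downshifting for engine braking")
    def force_reverse_gear(self):
        print("Forcing reverse gear - the drivetrain is expendable, people are not")

def slow_without_brakes(car):
    if car.electric:
        car.regen_brake()         # electric drive can brake with the motor directly
    else:
        car.downshift()           # gasoline engine: engine braking first
        car.force_reverse_gear()  # then sacrifice the drivetrain

slow_without_brakes(Car(electric=True))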
Realistically, in every scenario listed, the car would bank into a wall and friction + failbrake itself to a stop instead of barreling over people or faceplanting into a concrete barrier.
That too. :thumbsup: