
I can imagine this scenario: on a bridge, with nowhere to go but ahead...

   -------------------------------------------------
      5 people <-- car w/ 1 person <-- heavy truck
   -------------------------------------------------
a) The car can break, but the heavy truck behind will certainly kill the car's only passenger, who's in the back seat

b) The car hits and potentially kills most of the people standing there, but saves its passenger

c) The car can jump into the void

Scary as it sounds, what the car would do depends on the software



On 4/28/2006, Radio Lab (an amazing PBS podcast) did a show on morality. They describe the moral choices people will make:

The program begins with a moral conundrum.

You're working with a crew doing track repairs (presumably, using heavy equipment such that you can't hear much else) when you happen to look up to see a locomotive rushing towards five of your buddies who are working on the tracks. But just as quickly you realize that you're standing next to a rail-switch which you could use to divert the train to a second track on which only one of your workmates is standing.

Should you throw the switch?

Ninety percent of a great many people who've been polled say they would throw the switch.

The next scenario has you and a coworker (a very large fellow) working on a footbridge above the tracks. Again, you look to see the locomotive bearing down upon five of your friends who are working on the tracks. You quickly realize that the only way you can stop the train is to push the big guy standing next to you off the footbridge, whereupon his massive body will stop the train (yes, I know, but just go with it).

Should you push your coworker in front of the locomotive?

Ninety percent of a great many people polled say they would not push the man in front of the train, even if by doing so they could save the other five men.

Now put yourself in a car that could choose to turn right or left into a tree, killing you, or continue straight and kill 5 children who are crossing the road. What would you want your car to do?


I honestly think that apparent flip-flop comes down entirely to how terribly ineffective the large human sounds at stopping the train. Telling people to pretend it works is not an effective measure. People can't cancel their biases just from being told to do so, and a bias against such an awful plan is quite reasonable.


Your explanation makes sense, but I was going to go with "we're fine flipping a switch, but actually pushing a person is taboo".


(d) The car would stop, because unlike human drivers, a self-driving car would be programmed to avoid going so fast that it can't stop if a sudden obstacle appears (aside from maybe something falling from the sky).


A human would drive too fast in icy conditions... if the computer knows there are icy conditions and it doesn't slow down before it even senses trouble, then the car was programmed to be going too fast. If conditions are so dangerous that there is no way to remove these scenarios (like a blizzard), then a human should be forced to override the system, in which case the human is at fault.

I trust sensors to detect icy conditions better than I trust myself.

[edit] Most bridges where I am have signs that explicitly warn that bridges freeze over. And people intuitively know... bridge + recent precipitation + cold weather = slow down. I don't know why a computer wouldn't know that.
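
As a minimal sketch of that rule (the function name and thresholds here are hypothetical, not taken from any real driving stack), it's just a conditional speed cap:

    def target_speed_kmh(base_limit_kmh, on_bridge, air_temp_c, recent_precipitation):
        # Hypothetical heuristic: bridge + recent precipitation + cold weather = slow down.
        # A real system would fuse far more sensor inputs than these three flags.
        possible_ice = on_bridge and recent_precipitation and air_temp_c <= 2.0
        return min(base_limit_kmh, 40.0) if possible_ice else base_limit_kmh

    # Approaching a bridge at 0 C shortly after rain:
    print(target_speed_kmh(90, on_bridge=True, air_temp_c=0.0, recent_precipitation=True))  # 40.0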


1000 m ahead, a group of 5 people walks the pavement along the road. Should the car slow down so that it can stop in time if they suddenly try to cross the road?

If so - self-driving cars will be going much slower than manually driven cars most of the time, which will probably hurt adoption.

I would like this to be the case, but I guess car companies will agree to some compromise to speed up adoption, and corner cases will remain. Should the software ignore them, or plan for them (= "planning to kill you")?
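
As a rough back-of-the-envelope sketch (the deceleration and reaction-time figures are assumptions, not specs from any vendor), the speed from which the car can still stop within a distance d comes from solving d = v*t + v^2/(2a):

    import math

    def max_speed_to_stop(distance_m, decel_mps2=7.0, reaction_s=0.5):
        # Solve d = v*t + v^2/(2a) for v; ~7 m/s^2 (hard braking on dry asphalt)
        # and a 0.5 s sensing delay are assumptions, not specifications.
        a, t = decel_mps2, reaction_s
        return a * (-t + math.sqrt(t * t + 2.0 * distance_m / a))

    for d in (1000, 100, 30):
        print(f"{d:4d} m -> {max_speed_to_stop(d) * 3.6:5.1f} km/h")
    # Roughly 413, 123, and 62 km/h respectively: at 1000 m the limit is academic;
    # the hard trade-offs only show up at short range.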


Bridges ice before roads. It is easy to imagine conditions where stopping distances on such a bridge suddenly jump.


With computer-controlled braking, the car could spin around to take the hit on a front corner. This would pose the least danger to the occupant while not hitting the people. Though it does assume the truck is not stopping. In all likelihood the car would just stop, assuming the truck would stop too... the same call any human driver would make if they weren't texting or otherwise distracted. It doesn't have to be perfect, just better odds than today.


No it couldn't. Spinning -> less traction -> less stopping power -> longer stopping distance.


Considering how efficiently trucks stop, I wouldn't be worried. Plus, that truck is probably going to be driven by a computer too.

A) all day long.


I don't think a car that breaks whenever placed in a challenging position is going to sell very well.

One that brakes, well that's different.



