Do we really want to turn this into a numbers game? The difference between those two scenarios is that different people will die. It also assumes we all adopt AVs. Of course everybody has equal value, but are you prepared to accept that a software bug and a lapse in human judgement are equally acceptable? Unless that equivalence is accepted at a societal level, the average person will not accept AVs.
Humans who cause accidents will have remorse and learn never to do it again, or be punished if alcohol was involved. With an AV, currently nobody knows. It could be as simple as "we've checked in the bugfix to prevent this from happening again" with no emotion behind it.
Hasn't the rise of Uber and Lyft already turned it into a numbers game? I get drivers that I would consider below average all the time, and I don't have any control over which driver I get. Even when you're the one driving, you have no control over whether the car in front of you is being driven by a good driver or somebody texting who might cause an accident that endangers you. I agree that it needs to be accepted on a societal level, but if you can definitively say "self driving cars are safer than the average Uber driver," then I don't think it will be that hard of a sell. That's especially true if it results in less expensive rides in addition to lower crash rates.
This. In the majority of fatalities, the driver who caused the accident is not among the dead; the victims are generally in another vehicle. Everything is random. Decreasing the odds of failure will still kill some individuals, but even a software edge case can cause a phone to explode. The assumption that we are always in control is what needs to change.
> Do we really want to turn this into a numbers game?
The reality is lots of people die on the road and the numbers show this sad fact.
Now the reality is that some of those who are killed die through no fault of their own. They die at the hands of the other driver.
The sooner we can take away that human element from driving the better, because that one act will end up saving lots of lives.
> Humans who cause accidents will have remorse and learn never to do it again
That is so untrue, at least for what happens here in Australia.
Here in Australia we have many hundreds if not thousands of repeat drink-driving and speeding offenders on the roads, and short of locking them up there is no way to keep them off the roads.
Taking away their license makes no difference as they just drive without a license.
The argument in this thread seems to suggest we should be slow to adopt AV technology because it could kill some innocent people.
Now I accept that point as a fact. There is no doubt people will die in AVs, just as people die in cars today.
Yet, on the other hand, we have human drivers killing innocent people today, which is treated as acceptable collateral damage, even though introducing AV technology would help to greatly reduce that number.
> Yet, on the other hand, we have human drivers killing innocent people today, which is treated as acceptable collateral damage, even though introducing AV technology would help to greatly reduce that number.
Speaking for no one other than myself, vehicular deaths are not "acceptable collateral damage" so much as a statistical inevitability, one we should seek to minimize as much as possible.
The problem with introducing autonomous vehicle technology, again IMHO, is the assumption that it "would help to greatly reduce that number."
> And without actually doing the implementation that question will never get answered.
Exactly. Which is why it is imperative the engineering community does not assume outcomes beforehand.
Many of the concerns you list I agree are likely benefits, with the "not drive drunk", "not get tired", and "not susceptible to road rage" items being a near certainty, since they come from eliminating biological factors.
What autonomous vehicles might not be better at could be:
- snow
- black ice
- mud
- severe thunderstorms
- operating in high-dust environments
- adapting to rapidly changing, unexpected operating conditions
- knowing when to run a red light (emergencies, external threats, some combination of the conditions above)
Are these edge conditions? Maybe. Or you could call them functional requirements not often discussed.
All I'm saying is that there is a lot more to general-purpose driving than what can be shown in a limited setting.
IOW, show me an autonomous vehicle which can complete the Paris-Dakar Rally[0] and I'll show you someone who will say it's time for people to stop driving ;-).
Humans have taken years to learn to drive. Initially we will be better, but as time goes on there is no chance against an automated solution. The whole industrialization process is just this: there were problems and bad things happened, but it works in the end.
I specifically added that clarification only because I suspected that without it, the response would have been "but what happens when a bug in the software causes the AV to crash, whereas humans don't have that problem" types of arguments.
Being a software developer, I have no doubt there will be bugs in these AV software systems.
But these systems will also include many fail-safe overrides designed to minimize the impact of those occasional bugs.
> You can't possibly think that drunk drivers and those driving without a license are the target market for self-driving cars.
You are correct, I don't think that.
That group just represents a small subset of the drivers that present a danger on the roads today.
Now since you asked, I would say the real target market for self-driving cars is big business.
They are the ones pushing hard for this technology, not for the safety aspects that I have focused on, but instead for the cost savings this technology will bring to their bottom line.
> Humans who cause accidents will have remorse and learn never to do it again, or be punished if alcohol was involved. With an AV, currently nobody knows.
Easy to turn that around. Only individual humans learn from their mistakes - and even then, not always. But with AI, the whole fleet learns that lesson.