-
It's an odd one, isn't it? I wonder if it's linked to the view that the human way to do something wrong is to fail morally, which is why it's so difficult to make driving prosecutions stick when the driver was simply incompetent: if they didn't mean to hurt anyone, then it's not their fault. Computers, by contrast, can't fail morally at something (the morality of embedding imperfect algorithms in them rests with the developer); they can only fail technically, so the bar of actual performance is set correspondingly higher.
-
It's so interesting, isn't it?
You see it in other areas too. Ecstasy, for example: it goes without saying that deaths from ecstasy are a tragedy, yet we happily accept deaths from alcohol or cars.
With cars, what's so odd is that there are still loads of easy measures we could take (0-60 acceleration limiters, maximum speed limiters, and so on) that are objectively sensible, yet people would go absolutely spare over them.
The self-driving thing I find interesting because, as a society (I don't mean us on this forum, I mean car-loving society at large), we seem able to accept a huge level of risk from cars as long as a human can be blamed.
How many innocent deaths are acceptable when humans are involved compared to when computers are in charge?
It's as if taking human control out of the equation suddenly makes "greater than zero" deaths unpalatable.