Self Driving Cars
Responsibility
There is some debate over whether we should embrace self driving cars because they are safer than human drivers, and whether we should embrace rolling them out in our cities and countries so that they can be optimized.
(Note: self driving cars are safer than human drivers. Human drivers are, on average, terrible. Not up for debate.)
Why not? If it reduces deaths, then it’s a good thing.
I think this is a flawed argument, or at least a malicious shift of responsibility.
Self driving cars have two important implications: They significantly reduce overall risk. They change the fault model from the individual to random.
Obviously, the second is the point here. Non-self-driving accidents happen because someone did something wrong; self driving accidents happen because of random glitches.
Imagine you have a 15 year old son. Would you rather a 1% chance he dies in a car accident because of a drunk driver or a 0.1% chance he dies in a car accident because the car thought a white truck was a cloud?
I’d hope that everyone would take the clouds, but it’s not so clear. The drunk driver can take responsibility for their actions; they can repent and seek forgiveness. There is a potential path forward, a way to grow from the situation, that doesn’t exist in the latter case.
What concerns me more, though, is that today the clouds are the responsibility of the person in the self driving car! We have been played for fools when Clown Corp can have a software failure and make that the responsibility of the user - bananas!
I think this is the point missing from the dichotomy people have invented. Clown Corp must be responsible when people die in their self driving cars. If they are not, then we have a system where Clown Corp can systematically, but randomly, murder a small percentage of people, and then blame that on the, also random, users of their software.
In no way is this acceptable.