Truth sayer sayeth truth. I go up and down US 281, and people are ****ing morons. Many of the trucks are, too, but the people around the trucks are even worse.
I think there might be some metaphysical discomfort in the idea of getting squashed by a computer instead of a person. A person can kill you by being dumb and/or evil, but he can feel bad about it, he can try to make amends, and he can be punished. But, just as ChatGPT feels nothing about its hallucinations, an AI will never feel remorse about hurting you: its apologies will not be sincere, it will never try to repair the damage it caused, and it will not suffer if it is punished. Your killer never even acknowledges your humanity as it decides to kill you. So maybe the computer kills only a tenth or a hundredth as often, but there's something so merciless about how it does so that each such death feels less acceptable.
When natural disasters strike, we accept them as tragic acts of nature with no one to blame. There is solace in not harboring hatred toward an impersonal force that is not truly responsible, unless one wants to rail irrationally against nature or God. Fully self-driving cars are not forces of nature; they are human creations. Companies develop and direct their systems. In the event of an incident, there could be culpable parties, especially if profits took priority over safety. Was it simply an unfortunate mistake, or did the company or designers exhibit gross negligence? Unlike traditional engineering projects where humans remain in control (or where no control is needed, as with a bridge), fully autonomous vehicles occupy a middle ground between nature's impartiality and human agency. Maybe the pain and blame also lie somewhere in the middle.
It's a good point. But somehow I think people will take it worse if a loved one is run over by an autonomous semi than if they are struck by lightning. And that might be because of the injustice of the diffusion of responsibility. It is table stakes when you're born that you might be unceremoniously struck down by some act of God. We're sad about it but mostly understand no one is to blame. But if you're killed by AI, or similarly if you're killed by a corporation, there is a sense that someone is to blame. Then the problem arises that, because there are so many people involved, each with their own tiny part to play, no one person really bears responsibility. And sure, you can sue the company, and shareholders will have to pay out. That might even incentivize them to take steps to make sure it doesn't keep happening. But the AI, the company, and the shareholders are all immune from guilt.
"Learn to prompt engineer" will be EZer than "learn to code." LLMs can get lazy, lie about earning rewards, and cover up mistakes that would earn punishments. Self-driving is mostly visual recognition and deep learning for now, but who's to say the underlying tech won't change. A bigger breakthrough for self-driving would be these companies releasing a video game sim for bad drivers/pedestrians with GTA-like physics and letting people loose on it, with incentives to break the law.