Complex Numbers
#1056913 added October 7, 2023 at 9:13am
Autonomy
It doesn't matter how safe they are; there are those who are scared shitless of them, and no amount of logic or number-talk will change their minds.

Are self-driving cars already safer than human drivers?
I learned a lot by reading dozens of Waymo and Cruise crash reports.


Because I don't have a vested interest in anything, I'll say up front that the answer to the headline question turns out to be "maybe."

There's not a lot I need to quote from the article; it's there if you want to read it. I doubt it'll convince anyone of anything, though.

Neither will I, but I'm going to talk about the subject anyway. I've done it before, in "Putting the Auto in Automobile" and others, and I'll try not to repeat myself too much.

Here's the important quote:

Of course self-driving cars are flawed—all technologies are. The important question is whether self-driving cars are safer than human-driven cars.

And that's where data analysis comes in. The issue, and this is touched on in the article, is that AV (autonomous vehicle) incidents are scrutinized in much finer detail than all but the worst HV (human-driven vehicle) incidents, and every little bump and ding is subject to that scrutiny. Imagine if, every time you hit a squirrel in the road or dinged someone's bumper in a parking lot, you had to fill out incident reports and have every decision you made leading up to the event questioned.

Point is, accurate comparisons are hard. And while traffic fatalities are, in one sense, common (usually on the order of 40,000 deaths per year in the US), in another sense, they're pretty rare: according to the linked article, an average of about one death per 100 million miles driven.
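Just to show how thin the fatality data gets, here's a rough back-of-the-envelope sketch using only the numbers above; the AV fleet mileage in it is a made-up placeholder, not an actual Waymo or Cruise figure.

    # Back-of-the-envelope: why fatal crashes are too rare to compare directly.
    # Figures from the post: ~40,000 US traffic deaths/year, ~1 death per 100 million miles.

    deaths_per_year = 40_000
    deaths_per_mile = 1 / 100_000_000  # human-driver baseline from the linked article

    # Implied total US miles driven per year at those rates (about 4 trillion;
    # the real figure is on the order of 3 trillion, so the numbers hang together).
    implied_miles_per_year = deaths_per_year / deaths_per_mile
    print(f"Implied US miles driven per year: {implied_miles_per_year:,.0f}")

    # Hypothetical AV fleet mileage -- a placeholder, NOT real Waymo/Cruise data.
    av_fleet_miles = 50_000_000

    # Expected fatalities if the AV fleet were exactly as (un)safe as human drivers:
    expected_fatalities = av_fleet_miles * deaths_per_mile
    print(f"Expected fatalities at human rates over {av_fleet_miles:,} AV miles: "
          f"{expected_fatalities:.2f}")
    # About 0.5 expected deaths: observing zero AV fatalities so far tells you almost nothing.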

And this is just me, but I'm not sure we can take fender-benders as a proxy. That is, with human drivers you expect x serious injuries for every y minor incidents, and the ratio of x to y is probably fairly constant year to year. But human drivers make one kind of mistake and robots make different mistakes, so there's no reason to assume their ratio comes out the same.
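If someone did want to use fender-benders as a proxy anyway, the arithmetic would look something like this; every number here is invented purely for illustration, and the final comment is the whole problem.

    # Proxy logic: estimate serious AV injuries from minor-incident counts using
    # the human-driver ratio, then ask whether that ratio even applies to robots.
    # All inputs below are invented for illustration -- not real crash statistics.

    human_serious_injuries = 2_000    # x: serious injuries (hypothetical)
    human_minor_incidents = 500_000   # y: fender-benders (hypothetical)
    human_ratio = human_serious_injuries / human_minor_incidents  # x / y

    av_minor_incidents = 1_000        # observed AV dings and bumps (hypothetical)

    # Naive projection: assume robots convert minor incidents into serious ones
    # at the same rate humans do.
    projected_av_serious = human_ratio * av_minor_incidents
    print(f"Projected serious AV injuries: {projected_av_serious:.1f}")

    # The catch: robots make *different* mistakes, so their actual ratio could be
    # higher or lower, and this projection could be badly wrong in either direction.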

On the other hand, we can't just set the robots loose, see how many people they kill, and then make a decision. That would be slightly unethical. No, we have to start somewhere.

"Waltz, we don't 'have to' start anywhere."

Yes, we do. I want my self-driving car so I can go bar-hopping. And I want it NOW. I'll even accept it before my promised flying car, thanks.

Anyway, getting back to serious. Data. Unlike some people, I can be persuaded by logic. If it turns out that reliable data show that AVs are safer, I'll be convinced.

But there will always be technophobes who will insist that it's better to risk being killed by a human driver than take a lower risk of being killed by a robot driver. Because robots are scary. This is akin to being scared shitless of flying, despite all the statistics. (I despise flying, myself, but that's because they've managed to turn it into the world's most uncomfortable travel experience; I feel fairly safe flying, just grumpy.)

And those are the people we need to convince. Not me. Problem is, I don't know how to appeal to emotion. So meanwhile, I'll just continue to drink at home, and in places where Uber can happen.

© Copyright 2023 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted InkSpot.Com, its affiliates and its syndicates non-exclusive rights to display this work.