Small-Scale Question Sunday for December 8, 2024

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


I guess it depends on how society approaches the problem.

It's possible that self-driving cars are already safer than humans. So let's say that in five years the death rates per 100 million miles look like this:

  • Perfect human driver: 0.01

  • Self-driving vehicle: 0.2

  • Actual humans: 1.2

Are we willing to accept a huge reduction in auto deaths with the caveat that self-driving cars will still be killing a couple thousand people annually due to errors? Probably not, unless there's money to be made, in which case... maybe.
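To put rough numbers on that, here is a minimal back-of-envelope sketch (Python) that converts those hypothetical per-mile rates into annual US deaths. The ~3.2 trillion annual US vehicle miles traveled is an assumed figure, not from the original comment, and the actual headcount would also scale with whatever share of total miles self-driving vehicles end up covering.

```python
# Back-of-envelope sketch: convert the hypothetical death rates above into
# annual US deaths. The total-mileage figure is an assumption, not from the
# original comment, and the result presumes full adoption of each mode.

US_ANNUAL_MILES = 3.2e12   # assumed annual US vehicle miles traveled (~3.2 trillion)
MILES_PER_UNIT = 1e8       # rates are quoted per 100 million miles

rates_per_100m_miles = {
    "Perfect human driver": 0.01,
    "Self-driving vehicle": 0.2,
    "Actual humans": 1.2,
}

for label, rate in rates_per_100m_miles.items():
    annual_deaths = rate * US_ANNUAL_MILES / MILES_PER_UNIT
    print(f"{label}: ~{annual_deaths:,.0f} deaths/year")
```

Under these assumptions that works out to a few hundred deaths a year for a perfect driver, a few thousand for the hypothetical self-driving fleet, and tens of thousands at today's human rate, which is the scale of the comparison above.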

> It's possible that self-driving cars are already safer than humans

I don't think we're there yet. Some days are great, but I often have to intervene to stop Tesla's FSD from committing moving violations. One time I had to stop it from turning into a cyclist. That rattled me enough that I stopped using it, and I wasn't even using it that much.

Now, I don't always follow the speed limit, and I'm not always 100% attentive, but I don't make mistakes like that. I wouldn't even say I'm an especially attentive driver. Tesla's FSD is at once the most amazing AI driving system ever built and still not good enough.

Waymo's driving system seems much safer than FSD (I've ridden in a Waymo but not in an FSD Tesla), but Waymo has not yet released highway driving to the public.

The problem with highways is that "stop and wait for a remote driver to take over" is not a safe option. Tesla sells a product that reserves the right to hand control back to the in-car human driver on short notice; Waymo (correctly, according to everyone who has looked at this) doesn't think that is safe.