Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
In 2010, the vibe (to me at least) was that self-driving cars had effectively been developed; by 2025, substantially all the cars on the roads of American cities would be fully autonomous (without requiring a human at the ready to take over in a crisis).
So what happened? Why, at least outside Silicon Valley, is my Uber cab still driven by a human? Do technical challenges remain in developing autonomous technologies? Is regulation / liability the major obstacle to adoption?
Can confirm; I actually knew a guy who wanted to start a consulting service for soon-to-be-ex truckers, helping them retrain for new jobs.
Futurists are some of the worst people to ask about the future, perhaps second only to CEOs who rely on hyping up investors to pay the bills.
Tesla were (and probably still are) lying about how much progress they had already made, and bullshitting about the likely speed of future progress, and were widely believed by fanbois and the low end of the tech media because Tesla.
As far as I am aware, there has never been a time when people who had worked on autonomous vehicles thought Tesla was ahead of Waymo, but Tesla have always been publicly over-optimistic (including shipping a non-working product for cash upfront) in a way Waymo don't need to be.
I think it's because we see quick progress from "hey, this ride didn't cause an immediate horrific car wreck" to "90% of the time the car arrives at the end of the obstacle course, amazing" and believe getting to 100% is in the bag. But in truth the reliability rate has to be something absurdly high. Even if a Tesla on FSD only needs a few interventions a week, that is still a very, very long way from full autonomy. We need something that goes roughly 500 years between safety interventions, not something that needs fewer than about one intervention a day.
As we know in reliability engineering, every 9 you add next to "99% reliability" adds an order of magnitude of complexity or cost. Full self-drive might fall under similar development burdens.
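To put rough numbers on that, here's a minimal back-of-the-envelope sketch. The 40 miles of driving per day and the one-intervention-per-day starting point are my own assumptions for illustration, not figures from Tesla or Waymo.

```python
import math

# Back-of-the-envelope only; every number here is an assumption, not measured data.
MILES_PER_DAY = 40                                         # assumed average daily driving
CURRENT_MILES_PER_INTERVENTION = MILES_PER_DAY * 1         # roughly one intervention per day
TARGET_MILES_PER_INTERVENTION = MILES_PER_DAY * 365 * 500  # one per ~500 years of driving

improvement = TARGET_MILES_PER_INTERVENTION / CURRENT_MILES_PER_INTERVENTION
print(f"Now:    ~1 intervention per {CURRENT_MILES_PER_INTERVENTION} miles")
print(f"Target: ~1 intervention per {TARGET_MILES_PER_INTERVENTION:,} miles")
print(f"Gap:    ~{improvement:,.0f}x, i.e. roughly {math.log10(improvement):.0f} more nines of reliability")
```

Under those assumptions the gap is on the order of 180,000x, which is about five extra nines; if each nine really does cost an order of magnitude in complexity or cost, that's the whole problem in one line.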
I guess it depends on how society approaches the problem.
It's possible that self-driving cars are already safer than humans. So let's say that, in 5 years, the death rate per 100 million miles looks like this:
Perfect human driver: 0.01
Self-driving vehicle: 0.2
Actual humans: 1.2
Are we willing to accept a huge reduction in auto deaths with the caveat that self-driving cars will still be killing a couple thousand people annually due to errors? Probably not, unless there's money to be made, in which case... maybe.
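As a quick sanity check on those hypothetical rates, here's a sketch that scales them to annual totals. The ~3 trillion US vehicle-miles per year and the 100% adoption share are my assumptions, not the commenter's.

```python
# Scales the hypothetical per-100-million-mile death rates above to annual totals.
# Assumes ~3 trillion US vehicle-miles per year and full self-driving adoption.
US_VEHICLE_MILES_PER_YEAR = 3.0e12

hypothetical_rates = {          # deaths per 100 million miles, from the comment above
    "perfect human driver": 0.01,
    "self-driving vehicle": 0.2,
    "actual humans": 1.2,
}

for driver, rate in hypothetical_rates.items():
    deaths_per_year = rate * US_VEHICLE_MILES_PER_YEAR / 1e8
    print(f"{driver:>22}: ~{deaths_per_year:,.0f} deaths/year")
```

Even under the optimistic hypothetical rate, a fully adopted self-driving fleet would still be implicated in thousands of deaths a year; the exact headcount scales with mileage and adoption share, but the acceptability question stays the same.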
I don't think we're there yet. Some days are great, but I often have to intervene and stop it from committing moving violations. One time I had to stop it from turning into a cyclist. That rattled me enough that I stopped using it, and I wasn't even using it that much.
Now, I don't always follow the speed limit, nor am I 100% attentive, but I don't make mistakes like that. I wouldn't even say I'm an especially attentive driver. Tesla's FSD may be the most amazing AI driving system ever built, but it's still not good enough.
Waymo's driving system seems much safer than FSD (I've ridden in a Waymo but not in an FSD Tesla). But they have not yet released highway driving to the public.
The problem with highways is that "stop and wait for a remote driver to take over" is not a safe option. Tesla are selling a product which reserves the right to hand control back to the in-car human driver on short notice. Waymo (correctly, according to everyone who has looked at this) don't think that is safe.
I think any such "vibe" came from clueless journalists and tech "influencers" over-predicting things, either out of simple naivety or to make deliberately out-there proclamations for attention.
I think the main issue is more fear of unfriendly regulation and legal liability, and of bad publicity leading to the same. This manifests in several ways. First, there will probably never be a personally-owned self-driving car. They're so complex, and so dependent on constant maintenance, updates, and being in perfect condition for the self-driving stack to work right, that nobody would ever be willing to sell one to an individual. Second, all the companies that run them are being very conservative about expanding their programs; going too fast into new situations carries too much risk of something bad happening. Third, the companies have a huge amount of control over who they let use them and where they let them go. They pretty clearly use this carefully to keep the cars out of any situation they think might be risky, and they keep all of the details of how they do that behind closed doors.
There's also the physical plant issue. Even if we had a perfect self-driving car ready for mass production today, it would still take decades to replace substantially all of the cars on the road in the entire country.
What happened is that the real world is several orders of magnitude more complex than a closed and instrumented course, and that the hype-men vastly overstated the maturity of the technology.
Case in point: my old car had lane guard that couldn't be permanently disabled. There were certain spots where it would try to guide me off the road or toward oncoming traffic every single time I drove through them. The road wasn't even in particularly bad shape at those spots.
There's a common pattern which goes something like this:
"People overestimate how much can get done in a year, but underestimate how much can get done in a decade".
We've been making steady advances in self-driving over the last 15 years, and Waymo cars are already operating as taxis in Phoenix, LA, and SF (although I think they still have some sort of human element for edge cases).
Self-driving cars will continue to advance until, 15 years from now, they're everywhere, and it will all have happened gradually. The regulatory framework will evolve over time, much like it did when cars first came out.
Most legacy automakers are doomed, of course.
I get the sense that in 2010, self-driving cars were 50% ready. They had the easy cases down: highway driving in good conditions with clear signage.
By 2020 they had more of the edge cases down; it's like they were at 90%: city or neighborhood driving, sometimes-unclear signage, unexpected obstacles like pedestrians, weather, very unclear signage or strange roads, and the disconnect between the written and the actual rules of the road.
They are now over 95% ready, maybe even 99%, but there remain basically impossible use cases: times when human drivers literally just muddle along and guess at what the correct behavior is, or make it up as they go. This works fine at an individual level; the times when you have to do something truly strange and out of the ordinary are not that common. A construction crew is out early without their signage guys, and I'm expected to just carefully drive around them, or take a totally different route through the neighborhood. Or I'm doing Christmas shopping and a bunch of shoppers treat the parking lot like an extra-wide sidewalk, and I just need to weave in and out of them slowly to make any progress. Or a pedestrian is waiting at a crosswalk, looking like they're about to cross, but they wave me through because they're waiting for someone to catch up with them further back on the sidewalk.
It's minor stuff, and it's easy for a thinking, reasoning adult to figure out most of these situations. But you can't exactly codify it as rules of the road; no one writes laws about it. Accidents related to it are rare for an individual, but probably become guaranteed with enough mileage.
I've said for a long time that American roads are easy mode for self driving cars. If they start testing things in a place like India with its insane traffic then you should start paying close attention.
Or even a pedestrian who lurks on the sidewalk and deliberately runs in front of the car as an insurance scam.
Some technical challenges still exist, but more importantly they are legible. Regulators (and voters) can point to specific deficiencies and failures, which are used to justify stronger regulations and delays.
If we said that a self-driving system would be treated the same as a human driver (mandatory insurance, pass a driving test, etc.), then I bet we would have several competing companies, all of which would pay lower insurance rates than the average human because they would crash less.
Instead, we have ridiculous debates like "should it swerve to hit one pedestrian, or brake to hit five jaywalkers" holding up the process.
For my anecdotal two cents, it feels like it's a mix of legal liability and the great PR obstacle. Normies won't accept autonomous vehicles until the accident rate is zero, not merely slightly better than the average human driver.
Elon's chud arc may also be a factor in reduced enthusiasm.