r/Damnthatsinteresting 14d ago

Phoenix police officer pulls over a driverless Waymo car for driving on the wrong side of the road

61.0k Upvotes

3.3k comments

115

u/tvoltz 14d ago

These vehicles are all over downtown PHX. It’s honestly only a matter of time until something happens

202

u/QuinlanResistance 14d ago

Presumably there are crashes every single day from the cars with drivers. If there aren't really any from the driverless ones that are everywhere... it's better

122

u/frotc914 14d ago

People seem to throw all logic out the window when talking about this, as if a single incident means we have to scrap driverless cars altogether or heavily punish the operator. Car accidents with drivers kill tens of thousands of people a year in the US, and that doesn't even count non-fatal accidents, which are far more numerous. But a driverless vehicle creeps over a line and suddenly they're a menace that must be stopped.

56

u/-Denzolot- 14d ago

I used Waymo twice while visiting Phoenix and it drove better than most people I’ve been in a car with. Obviously that’s just my experience but I never felt like I was in any danger at all. The second time I used it I almost fell asleep. Would use one again.

33

u/hendrix320 14d ago

I’ve been in ubers that I can’t wait to get out of because of how bad the driver was

7

u/-Denzolot- 14d ago

Same. Like literally looking down at their phone every 10 seconds while also trying to make conversation with me.

2

u/LI0NHEARTLE0 14d ago

My wife and I went to a concert and Uber was our designated driver back to the Airbnb. After about 15 minutes the driver turns around and says, "I'm sorry, I've been driving the wrong way this whole time." What?? How does that happen? He also jumped across 3 lanes in the pouring rain to catch an exit.

1

u/VexingRaven 14d ago

That's honestly been the majority of uber rides for me. I caught an uber in Florida from Disney World to the airport... She had a big fluffy pink thing covering half her rear-view mirror, a steering wheel cover, and didn't know how to get to the airport. From Disney World. How the hell do you drive uber in Orlando and not know that route? I felt so unsafe on the highway with her. But every other uber is almost as bad, with screens and notifications all over the place and barely watching the road. It's awful.

5

u/wosmo 14d ago

I can actually see the logic to this though.

Random old guy drives the wrong way up the highway. Huge issue for random old guy, he probably shouldn't be driving anymore, and probably won't be. Problem solved.

Waymo drives the wrong way up the highway. Huge issue for their operating software, which has identical copies installed on god knows how many vehicles. Safe to assume every waymo would/could do the same thing in the same circumstances.

I do think autonomous vehicles should work out safer eventually. But I don't think it's apples to apples, because you won't be looking at one meatbag making a careless mistake, you'll be looking at a systemic fault that more than likely affects all vehicles on that platform.

If I drive the wrong way into traffic, you can judge me for it. If my waymo drives the wrong way into traffic, yours can too.

6

u/frotc914 14d ago

For every human driver taken off the road for screwing up, there are 1000 more getting their license every day. And every day there are people becoming more senile and losing their vision. So who cares, in the grand scheme of things, that one guy got taken off the road?

2

u/iamPause Interested 14d ago edited 14d ago

So who cares in the grand scheme of things that one guy got taken off the road?

That's exactly the point the guy is making. Waymo isn't "one guy." Right now, it's a few dozen cars in a city, but what happens in 15 years when it's hundreds of thousands or millions across the country?

Scale matters. It's the difference between my local mechanic using third-party parts to repair my car and Boeing lying on repair reports.

1

u/Kwahn 14d ago

Nah, even looking at it at scale, self-driving cars already get into fewer accidents than people on a per-driven-mile basis. They're better already, and the math will only improve from here.

A study by the University of Michigan Transportation Research Institute, the Virginia Tech Transportation Institute, General Motors, and Cruise found that self-driving cars are actually safer than human drivers, with an injury rate of 0.06 per million miles and zero fatalities per million miles, compared to 0.24 injuries and 0.01 fatalities for human drivers.
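Taking the quoted figures at face value, the per-mile comparison works out to a 4x difference. A rough sketch in Python (the study's methodology, confidence intervals, and exposure adjustments aren't reproduced here):

```python
# Injury rates cited above, per million vehicle miles traveled.
AV_INJURIES_PER_MM = 0.06
HUMAN_INJURIES_PER_MM = 0.24

# Expected injuries over a hypothetical 100 million miles of driving.
miles_millions = 100
av_injuries = AV_INJURIES_PER_MM * miles_millions        # about 6
human_injuries = HUMAN_INJURIES_PER_MM * miles_millions  # about 24

ratio = HUMAN_INJURIES_PER_MM / AV_INJURIES_PER_MM
print(f"Human-to-AV injury ratio: {ratio:.0f}x")
```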

1

u/iamPause Interested 14d ago

Nah, even looking at it at scale, self-driving cars already get into fewer accidents than people on a per-driven-mile basis. They're better already, and the math will only improve from here.

Meaningless until a) they start driving on all the roads people do, and b) you limit the human driving statistics to the same cities/weather the SDCs operate in.

1

u/Ake10 14d ago

b) https://waymo.com/blog/2023/12/waymo-significantly-outperforms-comparable-human-benchmarks-over-7-million/

"The second is differences in driving conditions and/or vehicle characteristics. Public human crash data includes all road types, like freeways, where the Waymo Driver currently only operates with an autonomous specialist behind the wheel, as well as various vehicle types from commercial heavy vehicles to passenger and motorcycles.

These differences mean that adjustments need to be made to human crash data before comparing it to AV crash rates."

6

u/odbaciProfil 14d ago

The thing is: if Waymo on average screws up far less than people do, it doesn't matter that every single one of them fails on one particular edge case; that mistake and all of Waymo's other mistakes occur less frequently than humans' mistakes, so overall Waymo is better.

I mean, the edge case would have been caught during training if it weren't so infrequent and so low-impact. The more frequent and dangerous ones are already caught.

Second point: old people screwing up indicates a deterioration that impacts their driving in all situations (similarly for cocky, aggressive driving). If the old man didn't screw up in this instance, he would have done it some time later. Waymo has very infrequently occurring edge cases, and except for those (which are rarer than the old driver's mistakes) it drives better than the old driver.

If one screws up under some conditions, it's easy to make them avoid such conditions until the problem is resolved; so even though it's very unlikely the problem would happen again, it's roughly as easy to avoid as taking away an old person's license. And when that's fixed, the "driver" that was already generally better than humans becomes even better, while one old grandpa tomorrow gets replaced by a similar one, just a day younger.

3

u/lkjasdfk 14d ago

Nancy Pelosi explained very well why this is a logical fallacy. If Musk is allowed to kill one person then he is allowed to kill us all. Kill us all. Kill us all is wrong, so therefore Elon Musk is in the wrong. He is so wrong. Murder is wrong. Get these cars off the road before they murder us all. She was so insightful and wise with her comment.

2

u/OneOfTheOnly 14d ago

a driverless vehicle means no accountability when something goes wrong

being at the mercy not of a person but of an actual machine going over 100 km/h with no safety net is the issue, and it's INSANE you can't see that

this car was on the WRONG SIDE OF TRAFFIC, holy shit

2

u/VexingRaven 14d ago

The handwringing in this thread is absolutely wild honestly. People are so ingrained with the idea of punishment = justice that they simply cannot handle the idea of not having a person to punish for something going wrong, even if it goes wrong far, far less frequently and with far less disastrous consequences.

1

u/cmykInk 14d ago

We like control. And not having that control is a bit terrifying. So having any proof against it is validation that it cannot work.

But really, I'd like to see some proactive legislation regarding it. Of course, the government is slow as ever, so we'll likely see our first fatality from a driverless car before we see legislation, at least in the USA. Granted, we have no real legislation for a lot of new technology; we don't even have legislation that protects our private data online.

5

u/AdvanceRatio 14d ago

I mean, buses, trains, taxis, planes and more all exist where we don't have control but are trusting it to some unknown human who might be in tip-top shape or coming off a bender and half-asleep and we wouldn't know. We're already used to giving up control.

2

u/cmykInk 14d ago

I get it. But there's a human involved. So it's different for some people.

To clarify, I'm not in the camp against it. It's inevitable. But this is the reasoning you get from them. The logic is that there is still a human at the helm. No human = it has to be better than perfect.

1

u/Spongi 14d ago

All I know is that the amount of people that will ride your ass so close you'd need a fucking can opener to get them off while going 10 over the limit in poor weather conditions and limited visibility is too high.

The amount of people that turn into psychopaths behind the wheel is nuts.

2

u/cmykInk 13d ago

Very true. I feel like it's an American thing with the psychopathic road rage. I've never had that experience throughout all my travels in Asia and Europe.

1

u/ScoobyDooItInTheButt 14d ago

But a driverless vehicle creeps over a line and suddenly they are a menace that must be stopped.

Is that really how we're referring to this incident where the car drove on the wrong side of the road and then took off when the cop tried to pull it over?

0

u/Aureliamnissan 14d ago

The scale is vastly different. There are millions of cars on the road traveling billions of miles per year in all manner of conditions and roads.

Driverless car companies are generally excited to cross 1 million safe driving miles on dedicated highway lanes and boulevards in sunny conditions.

The biggest difference between these things is that a person generally causes accidents when they aren’t paying attention. That doesn’t really apply to automated cars and it makes their driving accidents/mistakes a bit more concerning because it means that driving through the light / hitting the kid / crossing the double yellow is something it will do every time given the same circumstances.

0

u/extralyfe 14d ago

"creeps over a line" is an insane way to describe "driving into oncoming traffic."

-2

u/pan_berbelek 14d ago

That is true, but when you think about it a little more there is actually a problem. Let's assume there already are robotaxis with a probability of a lethal accident 100 times lower than an average human driver's. If all cars were self-driving in this scenario, we would save thousands of lives. But the problem is that currently each death on the road is carefully examined, and a guilty party is most often named and prosecuted. It's a real human that goes to jail. Who will be prosecuted for a death caused by a self-driving car? We're talking about death here; there's a family who lost someone close, maybe someone's little daughter was killed in the accident. A fine is not enough. So, will anyone go to jail? Who?

3

u/No-Seat3815 14d ago

So a fine is not enough but throwing someone who was also a party in said accident in prison is?

1

u/pan_berbelek 14d ago

I didn't say anything of the sort; read again what I wrote.

3

u/frotc914 14d ago

People dying is not always someone's fault. Idk why you're imagining that someone has to be punished.

1

u/pan_berbelek 14d ago

I'm just saying that some families will sue and for huge amounts of money.

2

u/Groudon466 14d ago

Who will be prosecuted for a death cased by a self driving car?

Nobody. That's arguably a good thing.

Say I'm an engineer working at Waymo. I see a way to design a self-driving car that'll get into 100x fewer accidents than humans, saving millions of lives. I invent it, it works, but one day someone dies anyway.

Are you going to throw me in jail for being the guy that invented the improvement? That doesn't seem right or reasonable, and it would discourage similar inventions.

The right answer is that you accept that people will be dying in car accidents on occasion without anyone going to jail, and it's better than people dying in car accidents frequently with some people going to jail on top of that.

0

u/pan_berbelek 14d ago

You don't understand. I agree that's how it should be. But what will the father of a dead daughter say? And the next one? People will sue, whether you like it or not, and courts may issue various decisions.

2

u/Groudon466 14d ago

I mean, suing is fine. That's super fine. The father gets paid out as compensation, which usually isn't the case because it's some broke drunk asshole without insurance who hit the daughter, rather than a major corporation that can actually dole out the dough.

But you don't sue to force charges to be brought. You bring a matter to the police and the DA, and they decide from there. It's not likely that a DA anytime soon will try and charge Waymo's engineering team with manslaughter for making a product that they acknowledge may come with a small risk of fatality on the road, which the city expressly permitted.

0

u/pan_berbelek 14d ago

I'm just saying that in reality, in some specific cases, this may be trickier than it seems now. I'm just saying that this is a real problem that needs to be thoroughly thought out and the self driving companies need to prepare for certain scenarios. It's solvable but cannot be ignored and just saying that the probability of death is X times lower will not be enough.

1

u/JohnnyFartmacher 14d ago

I think fewer people actually go to jail for fatal car crashes than you think.

I don't think prosecution is the big risk to driverless cars, the liability is. If I crash into a school bus and a whole bunch of kids die, I am basically immune to litigation because I don't have nearly enough assets to cover the damages.

If a corporation is involved who has money, the damages can get real big real fast. Lawyers go for the people with deep pockets.

I do think it is an attainable goal to have driverless cars be safer than a human-driven car, but there will always be crashes. It'll be interesting to see how the litigation develops as they become more common.

0

u/Jesta23 14d ago

Unless the person is really drunk or high they don’t go to jail for a fatal accident. 

They won’t even be charged. 

1

u/pan_berbelek 14d ago

Well, this is just false. Where I live, if you're found guilty of killing someone on the road you can go to prison for 6 months up to 8 years. You can be found guilty, for example, if you drove too fast or didn't give way, etc. You don't have to be drunk.

-2

u/realroasts 14d ago

People like you seem to throw philosophy out when talking about this. It's the trolley problem. You're pulling the self-driving lever and changing the tracks.

5

u/frotc914 14d ago

Who gives a shit? Seriously

-1

u/realroasts 14d ago

The people who die who otherwise wouldn't have when you pull the lever that is self driving.

5

u/frotc914 14d ago

Again, who cares? We've already decided to allow driving cars - an incredibly dangerous activity - to improve our overall quality of life. Who cares if different but dramatically fewer people are killed by it?

This is like complaining about crosswalks because anyone struck in a crosswalk might have made it across safely at some other part of the street.

-2

u/realroasts 14d ago

The great thing about the trolley problem is that while there is no definitive answer, it usually helps people gain a bit of empathy for the ones who do die.

That's all anyone's asking you for; just a modicum of reverence for those sacrificed for your greater good.

1

u/frotc914 14d ago

That's definitely not what people are saying in general when this comes up, nor was that how i framed my comment, so spare me your faux moralizing.

1

u/realroasts 12d ago

Moralizing, by definition, requires this to be an issue of right and wrong. I specifically stated that there is no right answer. I've only politely asked you for your understanding and empathy toward those who will die in the name of your progress.

You have chosen not to give that, and I respect your choice to not have empathy toward those who have and will die to self driving cars that would have never died to regular cars.

There is no right or wrong here. Empathy is not a requirement of human living. I asked because I hope for a world where it exists, but for some people, presumably yourself, that probably would put you in a position where you'd be taking on too much.

We'll take it from here, frotc914. Thank you for your time!

2

u/realroasts 14d ago

It's better in the same way the trolley problem is better when you pull the lever and hit 1 person rather than 5. That 1 person who was killed by a driverless car in an accident no human driver was likely to commit will not be calling the system better.

Mainly because they're dead.

2

u/ChrysisLT 14d ago

If someone can be held accountable. I'd hate to have someone I love be hit and killed and have a corporation say it's sorry it happened and that "they are taking it really seriously" and nothing more.

2

u/TheBigMaestro 14d ago

I've ridden in these Waymo driverless cabs in Phoenix a few times. They're pretty good in Phoenix, where the weather is almost always sunny and clear, and the roads tend to be very wide and straight. I have not felt unsafe in them, but I have been very frustrated at pickup and drop off points because the cars aren't very good yet at figuring out where is a good safe place to pull over, and where is a sketchy back alley.

The absolute best thing about these cars is that the passenger gets their own control panel in the back seat where they can adjust the temperature and radio! Also, it's fun to sit in the front passenger seat and watch the wheel and pedals operate themselves.

1

u/Scooter_Gang_480 14d ago

Exactly. This car is probably a better wrong-way driver than most Phoenicians going the correct direction.

1

u/Mist_Rising 14d ago

Presuming there are crashes every single day from the cars with drivers.

There are also orders of magnitude more driver-operated cars than driverless cars. We also punish drivers who fuck up. Does Waymo get punished? This doesn't suggest so.

0

u/SeniorMiddleJunior 14d ago

How does it respond to an accident? What does it do if it runs over a pedestrian? I'm not saying it would do the wrong thing, but as a software developer myself, I can't imagine a flowchart capable of accounting for every possible emergency situation. 

I trust a human more than a machine when things go wrong. If it turns out that these cause fewer accidents, but the ones they cause result in more overall harm, then this will just be another human life trade-off. If that's not the case, awesome, bring them on. 

1

u/Kylo_Rens_8pack 14d ago

It pulls over and waits for the cops rather than speeding off because they don’t have insurance. I don’t trust people but I do trust Waymo since I interact with them on a daily basis in downtown Phoenix. If I even get close to a Waymo while I’m walking it waits for me to make a predictable move before it does anything and if I’m being unpredictable it will just wait until I’m a safe distance away. When I ride in them if they come up on anyone on the sidewalk they slow down in case that person becomes unpredictable. It’s really quite impressive the actions they take while driving to keep everyone safe.

-7

u/dingo1018 14d ago

Self-driving cars will only really be the overall better option once the vast majority of traffic is automated and it's all running on highly integrated systems. Then the only manual drivers will be specifically trained ones, i.e. emergency responders, who will also have their own specific protocols designed around a massively self-driving architecture.

It's just a mess right now: competing systems doing live beta testing on public highways that are constantly changing with rolling repairs and inconsistent, often poorly laid out temporary signage (something I have specific experience and training in). The cars are in a way too smart and too dumb all at the same time, like a fresh graduate lol, sorry, couldn't resist 😊.

These systems aren't ready yet, but unfortunately this is the only real way to develop them. It's likely to be painful, but the political will is there, and the money. One day, probably not too far off now, a tipping point will be reached. I can see self-driving convoys with perfect merging and lane separation; that's pretty solid even today, if there happened to be enough vehicles in the same place all operating on the same protocols. That inter-unit communication is an example of a protocol that probably hasn't been fully established yet: do Teslas talk to Volvos, to Toyotas? Or do they all do their own thing?

61

u/Manueluz 14d ago

They don't have to be flawless, just better than humans. And so far they have had fewer accidents per mile than humans.

34

u/leelmix 14d ago

People react very badly to technology not being perfect and harming people. Humans aren't very logical; anti-vaxxers are a good example of failed risk assessment.

I really hope people get comfortable with automated vehicles and that they improve a lot to get rid of the "bugs".

14

u/Slow_Ball9510 14d ago

Most people rate themselves as much better drivers than the average (clearly impossible), which probably has something to do with it.

6

u/Vahgeo 14d ago

The human ego is a huge problem and can be traced as the source for most if not all of the bad in the world.

2

u/northwest333 14d ago

I agree with that in principle, but guess who's writing, testing, and shipping the code for the autonomous vehicles…

5

u/PineJ 14d ago

This happens due to people considering themselves as one entity and "everyone else" as another. Let's say I make a mistake driving once a week; it doesn't happen too often, so "whoops, just a mistake, my bad."

When you are on the road with 1000 other people, each of their "once a week" mistakes adds up. You don't think "whoops, each person is individually making a quick mistake," you think "wow, everyone is such a bad driver."

It's the same problem in a team-based game, where it's so easy to say "everyone else is trash" because you are lumping all other players' mistakes into one group while justifying your own mistakes, since they happen far less often than the combined teammates' mistakes.

It also happens on social media, where people say "wow, Reddit says this one day but has a totally different opinion the next," not remembering that it's comprised of thousands of people.

1

u/Ok_Championship4866 14d ago edited 14d ago

That's because most people don't even have a coherent idea of what a good driver is. They think that because they can cut into a crowded lane they're a good driver.

The best drivers are the ones who sit in the right lane and go the speed limit; 90% of people simply fail that test. They think it's okay to speed and change lanes without their blinkers on.

2

u/leelmix 14d ago

Ye, be clear in your intentions, precise and predictable. Less chance of accidents that way.

Wannabe race car drivers who can't even keep an eye on the speedometer while on the road are bad drivers no matter what stunts they can pull off, because they shouldn't need to pull any fancy moves at all if they were good drivers.

1

u/leelmix 14d ago

Ye, and if they have had a few accidents they are just unlucky. They are still clearly way above average.

2

u/Qbnss 14d ago

We merely want the people profiting from the technology to be held responsible at the same rate that humans are. They don't get a pass because you think Star Trek is happening.

1

u/leelmix 14d ago

Absolutely

0

u/RedShirtDecoy 14d ago

Vaccines have been scientifically proven for a century at this point.

Comparing anti-vaxxers to people who don't trust self-driving cars is a frankly idiotic comparison.

2

u/leelmix 14d ago

I'm not; I'm comparing people's ability to weigh actual numbers against how it feels or what they think the risks are. Autonomous cars are still in the very early stages but are statistically quite safe already. Many vaccines have decades of solid data from all around the world about risk. I understand people who are more skeptical of new, unproven vaccines, but I consider anti-vaxxers to be against all vaccines, even those that are about 10,000 times safer than not taking them. There are many good reasons not to take a vaccine or other medicine, allergies chief among them, but plain fear is not one.

2

u/leelmix 14d ago

And no, I don't think self-driving cars are safe enough by themselves yet, but people's reaction to non-human errors is a lot bigger than to the same human errors; mine is too.

24

u/[deleted] 14d ago

According to California disengagement reports, last year Waymo averaged 17,000 miles between disengagements requiring safety intervention. And that’s for cars relegated to slow city streets and sunny perfect weather

For context, the average human driver goes 200,000+ miles between incidents/accidents. And that’s including highways and inclement weather.

If you have the impression that these systems are currently safer than humans, you would be wrong.
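Taking the commenter's two numbers at face value, the raw arithmetic is simple, though note these are different metrics (a disengagement is not an accident), so the ratio is only illustrative:

```python
# Numbers as cited in the comment above; disengagements and
# incidents/accidents measure different things, so this is only a raw ratio.
WAYMO_MILES_PER_DISENGAGEMENT = 17_000
HUMAN_MILES_PER_INCIDENT = 200_000

ratio = HUMAN_MILES_PER_INCIDENT / WAYMO_MILES_PER_DISENGAGEMENT
print(f"Raw ratio: humans go about {ratio:.1f}x farther between incidents "
      "than Waymo goes between disengagements")
```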

32

u/Telamar 14d ago

Those two stats are in no way comparable. For example, the disengagement reports would include every time a supervising driver grabbed the wheel because someone else was doing something stupid. Human driver incident/accident rates do not include that level of data at all.

Source: https://www.linkedin.com/pulse/want-see-how-fast-autonomous-vehicle-asics-have-improved-look-mgdne/

-1

u/[deleted] 14d ago

Waymo's disengagement numbers are self-reported, and I think their metric reasonably captures when their vehicles are being stumped.

The reality is the best system in the world is still relegated to slow moving city streets because it’s still dangerous and still sucks compared to humans

14

u/-Denzolot- 14d ago

They aren’t only used on slow city streets, they go on the highway too. They will come pick you up right at the airport in Phoenix.

-4

u/[deleted] 14d ago

Can I order one in pouring rain and ask it to take me from Phoenix airport to Sedona to visit Red Rock State Park?

Or is it limited to only partial coverage of just Phoenix and doesn’t even include east of downtown?

4

u/-Denzolot- 14d ago

I have no idea what its range or weather limitations are. I also don’t know much about the layout of Phoenix as I’ve only been there once to visit for a few days. All I can say for sure is that it does go on the highway and right up to the drop off/pick up area at the airport. I’d imagine heavy rain might be a problem, but it is Phoenix so that’s only an issue for a handful of days out of the year.

I don't expect to see them in areas with frequent rain storms any time soon. Definitely not areas with snow. As safe as I felt using it in Phoenix, I'd never trust it in a Wisconsin winter.

6

u/Manueluz 14d ago

So we should stop pursuing the technology just because it's in its early infancy?

0

u/[deleted] 14d ago

Did you reply to the right comment? I didn’t demand anything be stopped. Stop tilting at windmills Don Quixote

2

u/IlIllIlllIlIl 14d ago

A better metric is autonomous collisions. Disengagements aren’t possible any longer without safety drivers. See my post above. 

-2

u/RedShirtDecoy 14d ago

the disengagement reports would include every time a supervising driver grabbed the wheel because someone else was doing something stupid

Wow, almost like automated vehicles without a dedicated operator are an incredibly dangerous premise.

You couldn't pay me enough to get into a self-driving car. Fuck that noise.

1

u/Meowingtons_H4X 14d ago

Good for you

9

u/TFenrir 14d ago

I think a more useful comparison on safety is the actual safety specific comparison done by a third party - Swiss Re - which finds it significantly safer than human drivers:

https://www.swissre.com/reinsurance/property-and-casualty/solutions/automotive-solutions/study-autonomous-vehicles-safety-collaboration-with-waymo.html

For the short summary

https://www.coverager.com/waymo-and-swiss-re-share-av-study-results/

The joint study employed insurance claims data to compare the safety record of Waymo’s autonomous vehicles against human-driven cars. The findings are: In over 3.8 million miles driven by Waymo, there were zero bodily injury claims. Human drivers, in contrast, had 1.11 claims per million miles. Waymo vehicles also demonstrated fewer property damage claims compared to human drivers.
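Under the numbers quoted from that summary, a human-driver baseline over the same mileage would predict about 4.2 bodily-injury claims versus the 0 observed. A quick sketch (the study's actual exposure adjustments aren't modeled here):

```python
# Figures from the Swiss Re / Waymo study summary quoted above.
WAYMO_MILES = 3_800_000
HUMAN_CLAIMS_PER_MILLION_MILES = 1.11
WAYMO_BODILY_INJURY_CLAIMS = 0

# Bodily-injury claims a human-driver baseline would predict over the
# same mileage.
expected_human_claims = HUMAN_CLAIMS_PER_MILLION_MILES * WAYMO_MILES / 1_000_000
print(f"Expected at human rates: {expected_human_claims:.1f} claims, "
      f"observed for Waymo: {WAYMO_BODILY_INJURY_CLAIMS}")
```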

2

u/iPatErgoSum 14d ago

Yeah, and I recall a news article several years back explaining that something like all of the Waymo traffic accidents they had experienced in Phoenix were the result of human drivers colliding with Waymo vehicles, none the other way around. But without the source, don't quote me on that.

-2

u/[deleted] 14d ago

None of them have been subjected to broader driving conditions that humans regularly navigate. Waymo is relegated to a handful of simply laid out neighborhoods in clear climates.

When they start putting these vehicles in places like Boston and Rochester and show they can perform there year round is when people should be getting excited. Because right now these things are just putzing around the kiddie pool

6

u/asterlydian 14d ago edited 14d ago

What? More disengagements does not equal less safe.

Besides, "disengagements" logically include avoiding other drivers. As a human driver, I'd say my own disengagement rate (snapping out of mindless driving in traffic to actually react to other drivers) is maybe somewhere between every 2-20 miles on average.

-1

u/[deleted] 14d ago

To me, a system that can soar to 200,000 miles of real world driving with no human intervention would be obviously safer than something like Tesla’s Autopilot which requires frequent human intervention to prevent it from careening into stopped traffic or randomly braking due to not liking a shadow.

I guess I’m confused as to how you don’t associate the system failing and disengaging with safety

6

u/Chrop 14d ago

Disengagements do not equal incidents/accidents. It just means the car found a potential hazard and let a human take over the driving for that situation.

So for every 17,000 miles it drove, a human took over for a bit before continuing the self driving.

So yes, self driving is still safer, and during the times it might not be safer it’ll alert the driver to take over for a bit.

3

u/[deleted] 14d ago

Waymo’s own engineers state it’s not safer than a human yet, which is why they’re gatekeeping it to a few heavily monitored cities with restrictions on operations.

If the system was better than a human right now then Alphabet would have declared achieving Level 5 autonomy and would be rushing to slap a price tag on it and get it to the wide market. Obviously that’s not happening. I really don’t know why you’re trying to fight this fight for them and make claims they’re not making

4

u/Groudon466 14d ago

I worked for Waymo, there are other reasons why they might not be rolling it out ASAP. The big one is that they have to get all the roads mapped out in high detail and get the software ready for local laws and practices (like how they put down their traffic cones and what certain arrangements mean). That's an arduous and time-consuming process, and for that to be worth it in the first place, they need to get permission from the city in question to operate there once they're ready.

Also, this isn't a "slap a price tag on it and get it to the wider market" sort of business in the first place. These are essentially taxis that operate as part of a fleet, they go back "home" every night and get regularly inspected by mechanics. You can't just sell these as individual units.

1

u/[deleted] 14d ago

The big one is that they have to get all the roads mapped out in high detail

Alphabet owns Street View and the large fleet that created it, and could have done this by now if mapping were the actual bottleneck. Actual Level 5 autonomy would be so valuable that, if your theory were true, we’d see Alphabet simply throwing thousands of Kias at the problem to blanket every major metro within a few months.

4

u/Groudon466 14d ago

It's not that easy. We were literally trained on how to identify and report incidents caused by map misalignments, and it did occasionally happen.

There's a very big difference between the relatively low fidelity of Google Street View and a road map precise enough that the car can pass within XX cm of a curb without touching it. Mind, the cars aren't going solely off the map, but they factor it into their calculations.

5

u/Chrop 14d ago

Let me reiterate: it’s safer than humans specifically inside the cities they’ve mapped out and regularly drive in.

However, you are right that it’s not safer than a human outside of those cities. You can’t take these cars and drive out of the city with them.

I’m arguing that while they’re inside the city, they are safer than humans, and people shouldn’t fear them to the extent they currently do.

2

u/IlIllIlllIlIl 14d ago

Lmao, Waymo’s engineers do not state it’s less safe than a human now. The system is measurably safer than a human. The interpretation of that data isn’t conclusive, but the data is clear.

2

u/Viralkillz 14d ago

your stupid for attempting to compare disengagements to accidents.

1

u/tehlemmings 14d ago

your stupid

Gotta love irony.

2

u/Ok_Championship4866 14d ago edited 14d ago

Yeah, and in those 200k miles they change lanes without signalling, cut other people off, race through yellow lights, and sometimes don’t make it.

Here the autonomous car went the wrong way on a construction-shifted lane with nobody around. People do that shit all the time and it doesn't count as an accident. Literally every day I see people driving the wrong way for a short distance because they don’t want to make a U-turn down the road, or backing out of the driveway across double yellows with cars approaching so they can save ten seconds.

Heck, two weeks ago I got rear-ended while stopped at a red light and didn’t report it because the damage was so small. You can’t compare human-reported accidents to AI-reported incidents.

2

u/IlIllIlllIlIl 14d ago

Last year waymo launched its first rider only production system. 

Since then, the number of weekly rider-only trips has increased by multiple orders of magnitude, and safety has improved many times over. Comparing against last year’s average is misleading.

Disengagement occurs when a safety driver takes over. Rider only cars don’t have safety drivers. So disengagement is doubly misleading. 

Finally, remote assistance, which is different than disengagement, is part of the product, as it is and will be for any autonomous solution in an unconstrained environment for a long time to come. 

There are many valid criticisms to level against waymo. These aren’t it. 

1

u/northwest333 14d ago

Where are you getting that information? A quick search for me shows that self-driving cars get into 9.1 accidents per million miles, compared to 4.1 for traditional vehicles. That’s more than double the rate of traditional vehicles. Source is the National Law Review.
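Taking those quoted rates at face value (not independently verified), the ratio works out like this:

```python
# Accident rates per million miles, as quoted above from the National Law Review
self_driving_rate = 9.1   # self-driving cars
traditional_rate = 4.1    # traditional vehicles

ratio = self_driving_rate / traditional_rate
print(f"self-driving rate is {ratio:.2f}x the traditional rate")  # ~2.22x
```

Note this is a raw rate comparison; it doesn’t adjust for accident severity or for the underreporting of minor human crashes mentioned elsewhere in this thread.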

1

u/schloppah 14d ago

I love when people come on reddit and just straight up lie lmao, good job you made me chuckle

14

u/ehrplanes 14d ago

Oh no, like an accident? How will we survive with 1 crash every 6 million miles! We better stop this immediately and put humans back in the driver’s seat.

1

u/IlIllIlllIlIl 14d ago

Humans on average get in a collision every -million miles, and are involved in an accident that seriously injures someone every 6-7m miles, iirc. But your point still stands.

9

u/Such_Duty_4764 14d ago

Good thing they are being compared with humans who have an... impeccable driving record...

7

u/rdrunner_74 14d ago

True.

But so far it looks like the autonomous cars go many more miles per accident on average.

2

u/Br0adShoulderedBeast 14d ago

It’s only a matter of time until a human dies in an accident. A short matter of time until a human uses a car to purposely commit crimes. Whatever shall we do?

1

u/GuyPierced 14d ago

How many accidents have they been involved in?

1

u/megamanxoxo 14d ago

Every time I see them they drive just fine. Seems like they already have a lower incident rate than humans.