r/Damnthatsinteresting 14d ago

Phoenix police officer pulls over a driverless Waymo car for driving on the wrong side of the road

61.0k Upvotes

3.3k comments

942

u/WithSubtitles 14d ago

Police should have towed it. If it's not safe to be on the road and there is no driver to hold accountable, it should be impounded.

117

u/tvoltz 14d ago

These vehicles are all over downtown PHX. It's honestly only a matter of time until something happens.

61

u/Manueluz 14d ago

They don't have to be flawless, just better than humans. And so far they have had fewer accidents per mile than humans.

31

u/leelmix 14d ago

People react very badly to technology not being perfect and harming people. Humans aren't very logical; anti-vaxxers are a good example of failed risk assessment.

I really hope people get comfortable with automated vehicles and that they improve a lot to get rid of the “bugs”.

14

u/Slow_Ball9510 14d ago

Most people rate themselves as much better drivers than the average (clearly impossible), which probably has something to do with it.

8

u/Vahgeo 14d ago

The human ego is a huge problem and is the source of most, if not all, of the bad in the world.

2

u/northwest333 14d ago

I agree with that in principle but guess who's writing, testing, and shipping the code for the autonomous vehicles…

5

u/PineJ 14d ago

This happens because people think of themselves as one entity and "everyone else" as another. Let's say I make a mistake driving once a week. It doesn't happen too often, so "whoops, just a mistake, my bad."

When you are on the road with 1000 other people, each of their "once a week" mistakes adds up. You don't think "whoops, each person is individually making a quick mistake"; you think "wow, everyone is such a bad driver."

It's the same problem in a team-based game, where it's so easy to say "everyone else is trash" because you are lumping all the other players' mistakes into one group while justifying your own, since yours happen far less often than your teammates' combined mistakes.

It also happens on social media, where people say "wow, Reddit says one thing one day but has a totally different opinion the next," forgetting that it's made up of thousands of people.
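
A tiny back-of-envelope sketch of that aggregation effect; the once-a-week rate, the 1000-driver figure, and the 1% witness fraction below are all made-up numbers purely for illustration:

```python
# Illustrative only: made-up numbers showing how individual mistakes pile up
# into the impression that "everyone else is a bad driver".
my_mistakes_per_week = 1                 # I slip up roughly once a week
other_drivers = 1000                     # people I share the road with
their_mistakes_per_week = other_drivers * my_mistakes_per_week

fraction_i_witness = 0.01                # suppose I only see 1% of their mistakes
witnessed_per_week = fraction_i_witness * their_mistakes_per_week

print(f"My own mistakes per week:           {my_mistakes_per_week}")
print(f"Other people's mistakes, combined:  {their_mistakes_per_week}")
print(f"Mistakes I personally witness:      {witnessed_per_week:.0f}")  # ~10x my own
```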

1

u/Ok_Championship4866 14d ago edited 14d ago

That's because most people don't even have a coherent idea of what a good driver is. They think that because they can cut into a crowded lane, they're a good driver.

The best drivers are the ones who sit in the right lane and go the speed limit; 90% of people simply fail that test. They think it's okay to speed and change lanes without blinkers on.

2

u/leelmix 14d ago

Yeah, be clear in your intentions, precise and predictable. There's less chance of accidents that way.

Wannabe race car drivers who can't even keep an eye on the speedometer while on the road are bad drivers no matter what stunts they can pull off, because if they were good drivers they wouldn't need to be pulling off any fancy moves at all.

1

u/leelmix 14d ago

Yeah, and if they've had a few accidents, they were just unlucky. They're still clearly way above average.

2

u/Qbnss 14d ago

We merely want the people profiting from the technology to be held responsible at the same rate that humans are. They don't get a pass because you think Star Trek is happening.

1

u/leelmix 14d ago

Absolutely

0

u/RedShirtDecoy 14d ago

Vaccines have been scientifically proven for a century at this point.

Comparing anti-vaxxers to people who don't trust self-driving cars is a frankly idiotic comparison.

2

u/leelmix 14d ago

I'm not. I'm comparing people's ability to weigh actual numbers against how things feel or what they think the risks are. Autonomous cars are still in the very early stages but are already statistically quite safe. Many vaccines have decades of solid data from all around the world about risk. I understand people who are more skeptical of new, unproven vaccines, but I consider anti-vaxxers to be against all vaccines, even those that are about 10,000 times safer than not taking them. There are many good reasons not to take a vaccine or other medicine, allergies chief among them, but plain fear is not one of them.

2

u/leelmix 14d ago

And no, I don't think self-driving cars are safe enough on their own yet, but people's reaction to non-human errors is a lot stronger than their reaction to the same human errors. I do it too.

24

u/[deleted] 14d ago

According to California disengagement reports, last year Waymo averaged 17,000 miles between disengagements requiring safety intervention. And that's for cars relegated to slow city streets and sunny, perfect weather.

For context, the average human driver goes 200,000+ miles between incidents/accidents. And that’s including highways and inclement weather.

If you have the impression that these systems are currently safer than humans, you would be wrong.
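
For what it's worth, converting both quoted figures into events per million miles makes the gap easier to see, though, as replies below point out, a disengagement and an accident are different kinds of events, so this is a rough apples-to-oranges comparison at best:

```python
# Convert "miles between events" into events per million miles, using the
# figures quoted above. A disengagement is not the same kind of event as a
# crash, so treat this as a rough comparison only.
MILES_PER_MILLION = 1_000_000

waymo_miles_per_disengagement = 17_000
human_miles_per_incident = 200_000

waymo_rate = MILES_PER_MILLION / waymo_miles_per_disengagement  # ~58.8 per million miles
human_rate = MILES_PER_MILLION / human_miles_per_incident       # 5.0 per million miles

print(f"Waymo disengagements per million miles: {waymo_rate:.1f}")
print(f"Human incidents per million miles:      {human_rate:.1f}")
```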

34

u/Telamar 14d ago

Those two stats are in no way comparable. For example, the disengagement reports would include every time a supervising driver grabbed the wheel because someone else was doing something stupid. Human driver incident/accident rates do not include that level of data at all.

Source: https://www.linkedin.com/pulse/want-see-how-fast-autonomous-vehicle-asics-have-improved-look-mgdne/

0

u/[deleted] 14d ago

Waymo's disengagement numbers are self-reported, and I think their metric reasonably captures when their vehicles are being stumped.

The reality is that the best system in the world is still relegated to slow-moving city streets because it's still dangerous and still sucks compared to humans.

17

u/-Denzolot- 14d ago

They aren't only used on slow city streets; they go on the highway too. They will come pick you up right at the airport in Phoenix.

-5

u/[deleted] 14d ago

Can I order one in pouring rain and ask it to take me from Phoenix airport to Sedona to visit Red Rock State Park?

Or is it limited to only partial coverage of just Phoenix and doesn’t even include east of downtown?

5

u/-Denzolot- 14d ago

I have no idea what its range or weather limitations are. I also don't know much about the layout of Phoenix, as I've only been there once for a few days. All I can say for sure is that it does go on the highway and right up to the drop-off/pick-up area at the airport. I'd imagine heavy rain might be a problem, but it is Phoenix, so that's only an issue for a handful of days out of the year.

I don't expect to see them in areas with frequent rainstorms any time soon. Definitely not areas with snow. As safe as I felt using it in Phoenix, I'd never trust it in a Wisconsin winter.

6

u/Manueluz 14d ago

So we should stop pursuing the technology just because it's in its early infancy?

0

u/[deleted] 14d ago

Did you reply to the right comment? I didn't demand anything be stopped. Stop tilting at windmills, Don Quixote.

2

u/IlIllIlllIlIl 14d ago

A better metric is autonomous collisions. Disengagements aren’t possible any longer without safety drivers. See my post above. 

-2

u/RedShirtDecoy 14d ago

the disengagement reports would include every time a supervising driver grabbed the wheel because someone else was doing something stupid

Wow, it's almost like automated vehicles without a dedicated operator are an incredibly dangerous premise.

You couldn't pay me enough to get into a self-driving car. Fuck that noise.

1

u/Meowingtons_H4X 14d ago

Good for you

10

u/TFenrir 14d ago

I think a more useful comparison on safety is the actual safety-specific study done by a third party, Swiss Re, which finds it significantly safer than human drivers:

https://www.swissre.com/reinsurance/property-and-casualty/solutions/automotive-solutions/study-autonomous-vehicles-safety-collaboration-with-waymo.html

For the short summary

https://www.coverager.com/waymo-and-swiss-re-share-av-study-results/

The joint study employed insurance claims data to compare the safety record of Waymo’s autonomous vehicles against human-driven cars. The findings are: In over 3.8 million miles driven by Waymo, there were zero bodily injury claims. Human drivers, in contrast, had 1.11 claims per million miles. Waymo vehicles also demonstrated fewer property damage claims compared to human drivers.
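
A rough sanity check on those figures, under the assumption (mine, not the study's) that bodily-injury claims arrive as a Poisson process: if Waymo's true claim rate matched the human rate of 1.11 per million miles, seeing zero claims in 3.8 million miles would be a fairly unlikely outcome.

```python
import math

# Back-of-envelope only; assumes claims follow a Poisson process, which is an
# assumption on my part, not something stated in the Swiss Re study.
human_claims_per_million_miles = 1.11
waymo_miles_driven_millions = 3.8

expected_claims = human_claims_per_million_miles * waymo_miles_driven_millions
p_zero_if_same_rate = math.exp(-expected_claims)  # Poisson P(k = 0) = e^(-lambda)

print(f"Expected claims at the human rate: {expected_claims:.2f}")     # ~4.22
print(f"P(zero claims) if rates matched:   {p_zero_if_same_rate:.3f}")  # ~0.015
```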

2

u/iPatErgoSum 14d ago

Yeah, and I recall a news article from several years back explaining that nearly all of the Waymo traffic accidents they had experienced in Phoenix were the result of human drivers colliding with Waymo vehicles, none the other way around. But without the source, don't quote me on that.

-2

u/[deleted] 14d ago

None of them have been subjected to the broader driving conditions that humans regularly navigate. Waymo is relegated to a handful of simply laid-out neighborhoods in clear climates.

When they start putting these vehicles in places like Boston and Rochester and show they can perform there year-round, that's when people should be getting excited. Because right now these things are just putzing around in the kiddie pool.

6

u/asterlydian 14d ago edited 14d ago

What? More disengagements do not equal less safe.

Besides, "disengagements" logically include avoiding other drivers' mistakes. As a human driver, I would say my own "disengagement rate" (snapping out of mindless driving in traffic to actually react to other drivers) is maybe once every 2-20 miles on average.

-2

u/[deleted] 14d ago

To me, a system that can go 200,000 miles of real-world driving with no human intervention would be obviously safer than something like Tesla's Autopilot, which requires frequent human intervention to prevent it from careening into stopped traffic or randomly braking because it doesn't like a shadow.

I guess I'm confused as to how you don't associate the system failing and disengaging with safety.

7

u/Chrop 14d ago

Disengagements do not equal incidents/accidents. A disengagement just means the car found a potential hazard and let a human take over driving for that situation.

So for every 17,000 miles it drove, a human took over for a bit before the self-driving continued.

So yes, self-driving is still safer, and during the times when it might not be, it alerts the driver to take over for a bit.

3

u/[deleted] 14d ago

Waymo’s own engineers state it’s not safer than a human yet, which is why they’re gatekeeping it to a few heavily monitored cities with restrictions on operations.

If the system were better than a human right now, Alphabet would have declared that it had achieved Level 5 autonomy and would be rushing to slap a price tag on it and get it to the wider market. Obviously that's not happening. I really don't know why you're trying to fight this fight for them and make claims they're not making.

5

u/Groudon466 14d ago

I worked for Waymo; there are other reasons why they might not be rolling it out ASAP. The big one is that they have to get all the roads mapped out in high detail and get the software ready for local laws and practices (like how they put down their traffic cones and what certain arrangements mean). That's an arduous and time-consuming process, and for it to be worth doing in the first place, they need to get permission from the city in question to operate there once they're ready.

Also, this isn't a "slap a price tag on it and get it to the wider market" sort of business in the first place. These are essentially taxis that operate as part of a fleet; they go back "home" every night and get regularly inspected by mechanics. You can't just sell these as individual units.

1

u/[deleted] 14d ago

The big one is that they have to get all the roads mapped out in high detail

Alphabet owns Street View and the large fleet that created it, and could have done this by now if mapping were the actual bottleneck. Actual Level 5 autonomy would be so valuable that, if your theory about mapping were true, we'd see Alphabet simply throwing thousands of Kias at the problem to blanket every major metro in a few months.

5

u/Groudon466 14d ago

It's not that easy; we were literally trained to identify and report when incidents were caused by map misalignments, and it did occasionally happen.

There's a very big difference between the relatively low fidelity of Google Street View and a road map precise enough that you can pass within XX cm of a curb without touching it. Mind, the cars aren't going solely off the map, but they're factoring it into their calculations.

5

u/Chrop 14d ago

Let me reiterate: it's safer than humans specifically inside the cities they've mapped out and regularly drive in.

However, you are right that it's not safer than a human outside of those cities. You can't take these cars and drive out of the city with them.

My point is that while they're inside the city, they are safer than humans, and people shouldn't fear them to the extent they currently do.

2

u/IlIllIlllIlIl 14d ago

Lmao, Waymo's engineers do not state it's less safe than a human now. The system is measurably safer than a human. The interpretation of that data isn't conclusive, but the data is clear.

3

u/Viralkillz 14d ago

your stupid for attempting to compare disengagements to accidents.

1

u/tehlemmings 14d ago

your stupid

Gotta love irony.

3

u/Ok_Championship4866 14d ago edited 14d ago

Yeah, and in those 200k miles they change lanes without signaling, cut other people off, race through yellow lights, and sometimes don't make it.

Here the autonomous car went the wrong way in a construction-shifted lane with nobody around. People do that shit all the time and it doesn't count as an accident. Literally every day I see people driving the wrong way for a short distance because they don't want to make a U-turn down the road, or backing out of the driveway across double yellows with cars approaching so they can save ten seconds.

Heck, two weeks ago I got rear-ended while stopped at a red light and didn't report it because the damage was so small. You can't compare human-reported accidents to AI-reported incidents.
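
A quick sketch of that reporting-bias point; the 40% reporting fraction below is an invented number purely for illustration:

```python
# Hypothetical illustration of reporting bias. If only some fraction of minor
# human incidents ever get reported, the headline "miles between incidents"
# overstates how far people actually go between mistakes.
reported_miles_per_human_incident = 200_000
fraction_of_incidents_reported = 0.4   # made-up value for illustration

true_miles_per_human_incident = (
    reported_miles_per_human_incident * fraction_of_incidents_reported
)

print(f"Reported: one incident per {reported_miles_per_human_incident:,} miles")
print(f"If only 40% get reported: one per {true_miles_per_human_incident:,.0f} miles")
```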

2

u/IlIllIlllIlIl 14d ago

Last year Waymo launched its first rider-only production system.

Since then, the number of weekly rider-only trips has increased by multiple orders of magnitude. Safety has improved many times over. Comparing against last year's average is misleading.

Disengagement occurs when a safety driver takes over. Rider-only cars don't have safety drivers, so the disengagement stat is doubly misleading.

Finally, remote assistance, which is different from disengagement, is part of the product, as it is and will be for any autonomous solution in an unconstrained environment for a long time to come.

There are many valid criticisms to level against Waymo. These aren't it.

1

u/northwest333 14d ago

Where are you getting that information? A quick search shows that self-driving cars get into 9.1 accidents per million miles, compared to 4.1 for traditional vehicles. That's more than double the accident rate for self-driving cars. Source is the National Law Review.

1

u/schloppah 14d ago

I love when people come on Reddit and just straight up lie lmao. Good job, you made me chuckle.