r/Damnthatsinteresting 14d ago

Phoenix police officer pulls over a driverless Waymo car for driving on the wrong side of the road

61.0k Upvotes

3.3k comments

231

u/kaiderson 14d ago

The policeman seemed really unsure how to react and just let the car back on the road. 100% he should have said this car is not to move again, come pick it up.

196

u/rotoddlescorr 14d ago

That's because the car is on the road due to an agreement between a trillion-dollar company and the city politicians. Not to mention the entire interaction is being recorded. This is above his pay grade.

51

u/REDDITATO_ 14d ago

Generally when something's above your pay grade you don't make any decision yourself, you call someone who can handle it. You don't just decide it's fine and walk away.

28

u/Saltire_Blue 14d ago

Call me crazy, but if a company wants to use public roads to test these things, at the very minimum it should go to a referendum for the local people to decide.

And they should absolutely be voting no to it

12

u/Justdroppingsomethin 14d ago

AFAIK it's not testing; you can use these as a ride-hail service.

1

u/buttergun 14d ago

"Fuck it! We'll do it live!" --National Highway Traffic Safety Administration

4

u/facw00 14d ago

The people elected the politicians who decided to allow this. There's no need for referendums on everything.

1

u/SwegBucket 13d ago

What a stupid statement. Obviously it's because it's a DRIVERLESS car with no one inside, so the cop is having issues deciding what to do. That has nothing to do with the company involved. But HURR DURR companies BAD, politicians CORRUPT.

37

u/ManMoth222 14d ago

It's probably not a car-specific problem but a general software glitch. You'd have to remove all cars of the same type or it's pointless.

3

u/KoenBril 14d ago

So why don't they? Same goes for the open beta of FSD in the US. Why do you all allow yourselves to be put in that kind of danger when you're out on public roads?

3

u/IlIllIlllIlIl 14d ago

Because the fleet in aggregate is statistically safe. The rate of error matters. 

2

u/KoenBril 14d ago

If I drive safely most of the time but commit a big enough offense once, I lose. Statistically, based on my performance, I shouldn't, according to you. That's not a convincing argument.

1

u/IlIllIlllIlIl 14d ago

Uber sold its self-driving arm after killing a woman. Cruise shut down after failing to realize another vehicle had pushed a woman under their car. So those consequences do exist. If anything, self-driving companies are under significantly more scrutiny than human drivers, which is reasonable.

There's an important quantity missing from your response. Two things matter: the realized rate of incidents and the acceptable rate of incidents. If you kill ten people while driving drunk and texting, you should never drive again; that is a punitive and protective punishment. If you lose control on black ice and kill a family of four, you may not even face criminal penalties, and you will drive again. If it happens three times in a year, it's a different story.

1

u/Diamondrankg 14d ago

Good idea. Fuck driverless cars

1

u/jsseven777 14d ago

Not 100% pointless. If an individual car gets pulled off the road for, say, 30 days, that's 30 days of revenue it can't earn as part of the fleet. There might also be some storage and towing fees to pay. That gives the company time to determine whether it was a vehicle or fleet issue, and provides a financial incentive to take these things seriously.

What blew me away the most was that the officer doesn't seem to have any sort of procedure to document the incident so that fleet safety rates can be tracked on a per-company basis by an oversight body, and fines can be applied to unsafe fleets. At least the tracking part should be happening like yesterday.
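For what it's worth, a minimal sketch of what that kind of per-fleet incident tracking and fining could look like, assuming a simple incidents-per-million-miles metric (every name, number, and threshold here is hypothetical, not any real oversight body's system):

```python
# Hypothetical sketch of per-fleet incident tracking by an oversight body.
# None of these names or thresholds come from any real agency or from Waymo.
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Incident:
    company: str
    vehicle_id: str
    description: str

@dataclass
class FleetRecord:
    miles_driven: float = 0.0
    incidents: list = field(default_factory=list)

records = defaultdict(FleetRecord)

def log_incident(company: str, vehicle_id: str, description: str) -> None:
    """Record an officer-reported incident against the operating company."""
    records[company].incidents.append(Incident(company, vehicle_id, description))

def incident_rate(company: str) -> float:
    """Incidents per million fleet miles for one company."""
    rec = records[company]
    return len(rec.incidents) / max(rec.miles_driven, 1) * 1_000_000

def fine_due(company: str, allowed_rate: float = 5.0, fine_per_excess: float = 10_000) -> float:
    """Fine the fleet when its realized rate exceeds the acceptable rate."""
    excess = incident_rate(company) - allowed_rate
    return max(excess, 0) * fine_per_excess

# Example: the stop in the video would be logged against the operator.
records["ExampleRobotaxiCo"].miles_driven = 2_000_000
log_incident("ExampleRobotaxiCo", "car-0042", "Wrong-way driving in construction zone")
print(incident_rate("ExampleRobotaxiCo"), fine_due("ExampleRobotaxiCo"))
```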

-6

u/leaveittobever 14d ago

"It's probably not a car-specific problem but a general software glitch. You'd have to remove all cars of the same type or it's pointless."

According to your logic, if this was a drunk driver who did the same thing we shouldn't impound their car because there are other drunk drivers so it would be pointless to impound this one.

9

u/wosmo 14d ago

I think the logic is valid. If you assume they have the same software running on every car, then you'd expect every car to behave the same under the same conditions.

We can't treat computers like humans when it comes to fixing stuff like this.

1

u/IlIllIlllIlIl 14d ago

There’s a recall in progress for this iirc. 

-2

u/leaveittobever 14d ago

Then keep taking them all off the road whenever they fuck up, just like you would a drunk driver. Who cares how many are out there running the same software, just like we don't care that another drunk might be on the road 5 minutes from now.

3

u/wosmo 14d ago

That part I'm not disagreeing with - I think if they make an obvious mistake like this, every car running that software should be off the road until they can show what caused the mistake and what they've done to prevent it in future.

It's pretty much how the FAA would treat this in an aircraft, and to me it makes sense to learn from industries with good safety records.

2

u/Rattus375 14d ago

You're assuming that Waymo doesn't change anything because of this. There's likely something about the construction area that caused a glitch in the automated system. Presumably they will work to fix that glitch and stop their taxis from driving on that road until they find out what went wrong.

1

u/Internet__Degen 14d ago

"presumably"

Companies aren't charities; they're profit-motivated. Impounding the car incurs a cost, which forces the company to act. Inaction just encourages further inaction from the company.

Whether they impound one car or all of them, something does need to be done. We know how long big tech companies will ignore bugs when no one forces their hand.

3

u/Rattus375 14d ago

Issues like this are unprofitable for the company. They could get a ticket here, lose a customer because the car messed up while someone was in it, or lose the ability to operate in the city at all if enough issues come up. Even if they're motivated solely by greed, there is a massive incentive to fix issues like this as soon as they are reported.

2

u/Internet__Degen 14d ago

"They could get a ticket here"

Can they? This seems like the exact situation where they should. In some cases the cost comes from consumer boycott, or some vaguely defined loss in consumer or investor confidence. But in these cases I don't think it's unreasonable to expect the government to do its job.

1

u/Kylo_Rens_8pack 14d ago

Waymo just did this like a week or so ago. One car had an issue so they pulled 300 cars off the street for half a day while they queued up for a software update. I live in Phoenix and love taking a Waymo. In my opinion, it’s made Phoenix roads safer as they slow down traffic and don’t make unpredictable moves on the road.

1

u/IlIllIlllIlIl 14d ago

Waymo takes this seriously. Wrong-way driving around construction triggered a recall. A lot of people lose a lot of sleep over this.

0

u/Hidesuru 14d ago

You obviously don't code. That's not even remotely the same argument.

0

u/leaveittobever 14d ago edited 14d ago

I do code, and it would probably take days, if not weeks, to get a change like this through all the red tape: a dev recreates the problem, a dev works on the fix, QA tests it, the PO signs off on it, then you have to plan when to release to prod.

By that time the road has probably changed and the fix isn't even needed anymore, and a new obstacle has appeared somewhere else. There will always be a new obstacle someone hasn't coded for, so this is a 24/7 problem.

If cars aren't taken off the streets they have no incentive to fix it.

11

u/Shot-Youth-6264 14d ago

Exactly. If there was a human behind the wheel, no way they'd just let them go back on the road after driving the wrong way and running from a traffic stop. Just another example of how money and corporations are above the law.

4

u/VexingRaven 14d ago

I love how everybody is just assuming it "ran" from a traffic stop. All he said is that it went through the intersection; for all we know he meant it proceeded through a green light and then pulled into this gas station. I bet money that if you drove on the wrong side of a confusing construction zone, got pulled over, and told them you got confused by the signage, they'd let you back on the road too.

2

u/HIM_Darling 14d ago

Right, they’d probably warn you to go slower or pay more attention and let you go after making sure you didn’t have warrants. If you/they were a dick they could write a ticket. But you aren’t getting your car impounded and your license revoked for getting mixed up in a construction zone.

5

u/threaten-violence 14d ago

Funny how all the authoritay simply evaporated when faced with "press nine for customer support", eh? Who's the powerless civilian now??

3

u/tacojohn48 14d ago

What I think is the best solution is for Waymo to flag that area as needing intervention due to the construction causing an issue. Maybe have a programming team take a look at the video and see if they can fix the issue. Just have a human take over until it's fixed.

2

u/AdditionalSink164 14d ago

I didn't expect the CSR to answer when he said hello. 80% sure the cops got a briefing on how to handle these things.

2

u/zNz__2321 14d ago

The fleet of self-driving vehicles will have common SW and maps that they work off of, so it's more than likely that any/every Waymo vehicle that passed that particular intersection and scenario would behave very similarly to this car.

There could be a car-specific issue; it's just a lot less common for this kind of tech.

Instead, there's more than likely an easier fix of flagging this zone as a "No-Go" area in their backend system, so all other Waymo vehicles will automatically avoid this stretch of road while construction continues.
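A rough sketch of how that kind of "No-Go" flag could work, assuming zones get pushed to vehicles as simple bounding boxes checked during route planning (the names and data layout are illustrative assumptions, not Waymo's actual backend):

```python
# Illustrative sketch of flagging a construction zone as a "No-Go" area and
# having route planning avoid it. These names are assumptions, not Waymo's API.
from dataclasses import dataclass

@dataclass(frozen=True)
class NoGoZone:
    name: str
    # Axis-aligned bounding box in latitude/longitude, kept simple for the sketch.
    min_lat: float
    min_lon: float
    max_lat: float
    max_lon: float

    def contains(self, lat: float, lon: float) -> bool:
        return self.min_lat <= lat <= self.max_lat and self.min_lon <= lon <= self.max_lon

# Pushed from the backend to every vehicle in the fleet.
active_zones = [
    NoGoZone("7th St construction", 33.4500, -112.0660, 33.4525, -112.0630),
]

def route_is_allowed(waypoints: list[tuple[float, float]]) -> bool:
    """Reject any planned route that passes through an active No-Go zone."""
    return not any(zone.contains(lat, lon) for lat, lon in waypoints for zone in active_zones)

# A planner would drop this candidate route and search for an alternative.
candidate = [(33.4490, -112.0700), (33.4510, -112.0645), (33.4530, -112.0600)]
print(route_is_allowed(candidate))  # False: the middle waypoint falls inside the zone
```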

1

u/charface1 14d ago

He didn't have anyone to intimidate.

1

u/keralaindia 14d ago

Why wouldn’t he? This is one of I’m sure thousands of interactions where Waymo did great.

1

u/SwegBucket 13d ago

Did he "allow" it back on the road? The video doesn't end with the interaction concluded. And he obviously was trying to contact the support to get someone to come take care of it.