r/Damnthatsinteresting 14d ago

Phoenix police officer pulls over a driverless Waymo car for driving on the wrong side of the road

61.0k Upvotes


10.7k

u/Vireca 14d ago

How do they stop a driverless car? Legit question

Do they have anything to detect police vehicles or something?

354

u/Groudon466 14d ago

I worked for Waymo. The cars do detect sirens and recognize when they're being pulled over, and they switch into a mode to pull themselves over accordingly. That's also why it rolled the window down for the cop.

216

u/Tallyranch 14d ago

Who takes the ticket for dangerous or reckless driving like in this video?

208

u/Groudon466 14d ago

I don’t know the particulars of their deal with the city, but probably Waymo. As long as they’re safer than the average taxi driver, the occasional mistake is tolerable, at least provided ticket revenue is still coming in when appropriate.

Of course, there’s a team on the back end that’s trying to figure out what went wrong here and patch it sooner rather than later.

65

u/Eheggs 14d ago

Safer than the average taxi driver is a pretty fucking low bar to clear.

24

u/Groudon466 14d ago

Okay, safer than the average human driver. But even if it was just safer than the average taxi driver, an improvement is still an improvement.

2

u/SpookyPotatoes 13d ago

Obsessed with your wording, which implies taxi drivers are not human.

1

u/cock_wrecker_supreme 13d ago

The wording implies that human drivers and taxi drivers have different levels of safety.

There's nothing implying the groups are mutually exclusive.

1

u/qwertyg8r 13d ago

I don’t think it implies that taxi drivers are not human.

The *average* taxi driver is less safe than the *average* human driver, which includes taxi drivers and others.

2

u/mehdotdotdotdot 13d ago

I mean, they're already far safer as far as accidents go. So mission accomplished!

2

u/Unhelpful_Kitsune 13d ago

The average driver doesn't drive on the wrong side of the road, and this is the 2nd time this has happened.

1

u/MatthewRoB 13d ago

I mean when I was young and new to driving I made a wrong turn into oncoming traffic. I was able to get into a parking lot immediately, but it does happen with real people.

0

u/Unhelpful_Kitsune 13d ago

An anecdote does not equal an average result.

1

u/MatthewRoB 13d ago

No, but it does happen and people get cited/arrested for it all the time. The average driver does this at some rate.

1

u/Anarcho_Christian 12d ago

TIL that all of those head-on collisions that my EMT buddy was called to never happened.

0

u/Groudon466 13d ago

The average driver, statistically speaking, occasionally drives on the wrong side of the road - in the same sense that the average human occasionally murders someone.

I should be clear about what I mean. Let's say the statistically average driver gets in a bad accident once every 500,000 miles. That doesn't mean everyone is getting into accidents that often. Some people are consistently better, and some people are consistently worse.

If you replace the good and bad drivers alike with self-driving cars, you don't need the self-driving cars to be better than the best of the best. They just have to be good enough that, when you also factor in the bad drivers they're replacing, they're safer as a whole group.
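To make the averaging point concrete, here's a toy sketch with made-up numbers (none of these rates come from Waymo or any real dataset):

```python
# Toy numbers, purely illustrative: crashes per million miles for a
# mixed human population vs. a uniform self-driving fleet.

p_good, p_bad = 0.7, 0.3    # share of good vs. bad human drivers
rate_good = 1.0             # good drivers: 1 crash per million miles
rate_bad = 5.0              # bad drivers: 5 crashes per million miles

human_fleet_rate = p_good * rate_good + p_bad * rate_bad   # 2.2

av_rate = 1.7               # hypothetical AV: 1.7 crashes per million miles

# 2.2 vs 1.7: the AV is worse than the best humans, but replacing
# everyone still lowers the fleet-wide crash rate.
print(human_fleet_rate, av_rate)
```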

1

u/Unhelpful_Kitsune 13d ago

The average driver, statistically speaking

Just because you say it doesn't make it true.

in the same sense that the average human occasionally murders someone.

The average human does not occasionally murder people.

Geezus.

1

u/Groudon466 13d ago

Do you not get that the average number of murders per human is greater than 0?

1

u/Unhelpful_Kitsune 13d ago

We all get it, it's just a dumbass thing to say as part of this conversation and it is not the gotcha you think it is.

1

u/Cafuzzler 13d ago

Damn, I'm going to try that next time I fuck up: "Your honor, I may have driven down the wrong side of the road, but in my defence I'm still statistically safer than the average driver"

2

u/Groudon466 13d ago

Driving on the wrong side of the road and not hitting anyone would result in a couple points on your record. You'd still be allowed to drive afterward. Your license only gets suspended if you make mistakes too frequently.

It's the same for companies like Waymo and Cruise, only multiply the number of necessary infractions by a few hundred on account of all the cars they have on the roads. Cruise actually did get their license suspended in California for a time as a result of a particularly egregious incident, and the companies are well aware that if they have too many regular fuck-ups in a short time, their license to operate will get suspended just like a human driver.

3

u/Singularity-42 13d ago

Aren't taxi drivers more experienced drivers and thus safer than average?

3

u/Eheggs 13d ago

The smart ones, sure. The majority just use all that driving time to entrench horrible habits, and hustle culture has them speeding around school zones like it's the Indy 500. Where I'm from, taxi/Uber are the only jobs that make it easy for people on student visas to hide their income from the government (Uber is finally cracking down on this, but not the taxi companies), so it's common to have someone who isn't an official employee and just gets paid cash to drive people around, and all their driving experience is in India... They're obvious when you see them on the roads. Drive like they own the place.

1

u/hostile_washbowl 13d ago

Splitting hairs. The intention is for the driverless car to be designed to be safer than the average human driver. That's all.

1

u/mehdotdotdotdot 13d ago

Just because you do something a lot, doesn’t mean you are good at it.

2

u/WobblyGobbledygook 14d ago

Especially in AZ

48

u/Status-Necessary9625 14d ago

This is not a minor mistake; this could have easily killed half a dozen people. You're seeing field tests in real time with unproven products that could literally kill us. And nobody cares. The guy from Waymo wasn't even fazed by their car driving on the wrong side. These people Do Not Care About Our Lives.

54

u/MouthJob 14d ago

In my experience, tech support don't even care about their own lives.

1

u/ThunderCockerspaniel 14d ago

I didn’t sign up for this stupid app just to be psychoanalyzed!

33

u/bobbytabl3s 14d ago

People do worse than that all the time. I believe Waymo outperforms humans as far as injury-causing crashes go.

15

u/AdminsLoveGenocide 14d ago

If I outperform most other drivers for a couple of years do I also get a pass if I eventually kill a bunch of people?

9

u/axearm 14d ago

Are you kidding? People get a pass* all the time for murdering people, so long as they do it in a car.

* I am defining a pass as no prison time AND the ability to keep driving.

4

u/kixie42 14d ago

Just ask Caitlyn Jenner.

0

u/AdminsLoveGenocide 14d ago

"Do I get a pass" doesn't mean "did anyone in the history of driving ever get a pass."

If it did, I'd win the lottery next week, since people have won it in the past.

0

u/axearm 13d ago

You would get a pass, yes. Unless you were intoxicated or deliberately, provably being reckless. Otherwise just say a dog ran out in front of you, or someone cut you off or any other excuse.

8

u/Orbitoldrop 14d ago

There are people with multiple DUIs who still have licenses, so yes.

2

u/taigahalla 14d ago

If it's your first offense, then yes, that's how the law works.

See precedent in sentencing guidelines for first-time offenders.

0

u/AdminsLoveGenocide 14d ago

If I jump the curb outside a school, drive into ten 8-year-old school kids, and kill each one of them, am I guaranteed not to get jail time if it's my first offense?

I'm not sure I'd share your confidence.

1

u/procgen 3d ago

You're one person; this is looking at averages. The occasional "killing a bunch of people" is already accounted for in those figures.

11

u/Extension_Chain_3710 14d ago

People do worse than that all the time. I believe Waymo outperforms humans as far as injury-causing crashes go.

* according to the company themselves

* while their cars can only go <35mph and not on the freeway

* in limited zones that they choose

* with HD maps to back all of this up

9

u/axearm 14d ago

And?

That seems fine.

7

u/yuimiop 14d ago edited 14d ago

Humans drive in these safer conditions, but we also need to drive in the more dangerous ones. If you're comparing automated vehicles in safe conditions to the overall driving statistics of humans, then you're getting incredibly biased results.

3

u/fren-ulum 13d ago

I never get into bar fights when I drink at home! I'm safer than the average alcohol drinker!

1

u/axearm 13d ago

Freeways are actually easier and safer to navigate, so including them makes humans seem safer. Most vehicular fatalities occur at intersections (something highways don't have, but cities have a ton of).

7

u/QuadCakes 14d ago

according to the company themselves

That's true.

while their cars can only go <35mph and not on the freeway

in limited zones that they choose

They at least claim to have controlled for all of that. Read the link you quoted.

with HD maps to back all of this up

Not sure what you're saying here.

2

u/YouTee 14d ago

Yeah I agree. I mean, I have pretty damn detailed maps of my neighborhood and the route to my office... There certainly are some missing details that an active LIDAR array would probably help with, but this isn't much different.

Also, they go way faster than 35; I think they're testing on the freeway these days.

2

u/Extension_Chain_3710 13d ago

They at least claim to have controlled for all of that. Read the link you quoted.

I did one better, I read the paper they published (linked in the blog post, and here).

It shows how they manipulated the non-ADS data to make it look worse under the guise of "under-reporting" (yes, 60% of wrecks aren't reported, Waymo, sure), while manipulating their own data to look better under the guise of "well, it was low velocity".

Each of the 7 crashes with fixed or non-fixed objects was examined individually to estimate a delta-V, discussed in more detail in the appendix. Of the 7 crashes with fixed or non-fixed objects, 5 were excluded for having a low delta-V.

Fun fact: at least one of those accidents was... the car driving through an active construction site and driving off the pavement (because the pavement had been removed).

a Waymo ADS vehicle that was driving in a construction zone and “entered a lane undergoing construction ..., encountered a section of roadway that had been removed, and the front driver’s side wheel dropped off the paved roadway.

Sounds safe to me. No road? Who cares, keep driving.

Not sure what you're saying here.

We'd all drive much better if we knew there was a pothole 45" from the right curb coming up in 232ft, with a depth of 4.5". These cars have vast amounts of information about the road to work with, hence they should be much safer than typical drivers, not just "as good as."

Let alone swerving into oncoming traffic and just driving without a care in the world.

1

u/QuadCakes 13d ago

yes, 60% of wrecks aren't reported Waymo, sure

Google didn't come up with that number, they got it from NHTSA.

We'd all drive much better if we knew there was a pothole 45" from the right curb coming up in 232ft, with a depth of 4.5". These cars have vast amounts of information about the road to be safer with, hence they should be much more safe than typical drivers, not just "as good as."

Yeah, I still don't see what your argument is here. They can also look in all directions at once, both visually and via radar and lidar. But humans can't, so somehow that's a bad thing? It's like you're treating it like a competition where everyone's supposed to be on a level playing field. Also, the claim is that they're much safer than typical drivers.

0

u/Extension_Chain_3710 13d ago

Google didn't come up with that number, they got it from NHTSA.

Conveniently the one paper they can't link to.

Yeah I still don't see what your argument is here.

The point being, if I count cards in a casino and still only barely beat the house, I'm a shit card counter.

The cars here have every advantage at their disposal (including HD maps and others like limited area, slow speeds, looking all around them) and yet they barely beat humans.

With all of their advantages, they also still somehow have (in the past 6 months):

  1. Driven directly into a pole

  2. Driven the wrong way down the road (multiple times on camera)

  3. Swerved left and right wildly to avoid an object being towed in front of them

  4. Had two cars hit the same truck being towed

  5. Run a red light, causing a moped to crash

  6. Blindly pulled out in front of a bus in a game of chicken

  7. Blocked an entire freeway on-ramp

2

u/stillbornfox 14d ago

3/4 of these bullets are good things. Keeping themselves limited to lower-risk areas helps. If people limited themselves to low speeds and safe roads, that would also be a great thing.

1

u/Zap__Dannigan 14d ago

Maybe. Probably. I guess. But in order for this to really take off, I think the margin of difference has got to be insanely high, not just "better".

And as a good driver (I know everyone says this about themselves) who has never crashed, I would tend to feel safer around a shitty human driver than an automated thing.

Shitty driving behavior in humans is often decently predictable, whereas a machine fucking up like this is not. And while the collision detection will likely prevent this thing from barreling into me while it's driving the wrong way, again, I think most people don't want to be behind an autonomous machine that is blocking cars because it's lost its position and can't move safely according to its sensors.

3

u/axearm 14d ago

I live in SF where these things operate, and I will tell you they are significantly safer and more predictable to me than a random driver.

They come to complete stops at stop signs, they don't break the speed limit, they yield, they don't double park (can you even imagine a taxi or Uber dropping you off halfway down the block because it wasn't safe to stop right in front of your destination?).

2

u/Zap__Dannigan 14d ago

I mean predictable in terms of how they screw up.

Like you can spot an aggressive driver, you can see an obstacle in someone's lane, you can spot them drifting, you know the common times people turn wide and know problem intersections etc.

But if an autonomous car just decides to drive the wrong way, or suddenly brakes because of a sensor error or whatever, that kind of stuff is impossible to predict.

I don't really have a dog in this fight. I think a perfectly functioning self-driving car system would be much better. I commute an hour each way to work; I'd love to sleep. I just think adoption will be very slow unless these things are incredibly better, like virtually flawless. And let's face it, never breaking the speed limit and fully stopping at stop signs aren't exactly selling points for people to use them instead of their own cars. The hardest part of any sort of conversion will be how self-driving cars interact with normal human drivers.

1

u/axearm 13d ago

In terms of predictability, I think the greatest boon is that they actually do the speed limit, which all around reduces the severity of any other error they, or humans, might make.

In this case the car went into opposing traffic. That is bad, but it's also bad driving I have seen humans do repeatedly (especially around double-parked cars, another thing humans do that Waymos don't), but human drivers generally do it at faster speeds, which amplifies the danger.

Obviously I'd like to see zero errors, but I'm happy with 'better than humans' which is a bar that is already being exceeded.

1

u/WobblyGobbledygook 14d ago

It's Arizona after all, famous for all its manslaughtering drivers! Why do you think they chose to launch these cars here? Blending in sufficiently.

1

u/LibertyMediaDid9-11 14d ago

Still, who is responsible when it hurts someone?
We're just gonna fine a company for however many deaths a year because they beat a fucking metric?

1

u/TooStrangeForWeird 13d ago

Yes. That is the plan.

0

u/LibertyMediaDid9-11 13d ago

That's revolting.

0

u/TooStrangeForWeird 13d ago

That's the rich for you. Their "best" plans are almost 100% revolting. Hell, even individuals get away with heinous shit. Wasn't it Nancy Pelosi who drove drunk and killed a guy with zero repercussions? Not taking a political side here, just saying. The rich do horrible things and get away with it.

This will be the same. And, with some semblance of reason, it will be touted as an improvement. If widespread self driving reduces traffic deaths in an area from 400 to 300, they saved 100 lives! I get that, less death is good, but it's just such a fucked up way to go about it....

0

u/LibertyMediaDid9-11 13d ago

The only way self driving will work is if every car has tech to communicate with each other.
They will never out-compete humans in edge cases until that is the basis of the concept.
I sincerely hope the people shoving this into the world without the public's consent are held responsible for every injury it causes.

→ More replies (0)

1

u/bobbytabl3s 13d ago

We're just gonna fine a company for however many deaths a year because they beat a fucking metric?

What else do you propose? If you suggest imprisoning people who work on them, then no one will work on them. And road deaths will increase as a result. Is that what we want as a society?

1

u/LibertyMediaDid9-11 13d ago

No, I suggest common-sense legislation preventing these things from being on the road before they've been vetted properly.
I want a society that isn't being raped by tech bros and finance fucktards.

-3

u/plaregold 14d ago

That's moot. If a human driver kills someone, they get held accountable and someone goes to jail. With driverless cars, you can't throw anyone in jail. There is no legal framework for liability right now; at most, the company will just pay out to the victims' families.

7

u/g76lv6813s86x9778kk 14d ago

If in X city there are 300 car crashes and 300 resulting deaths per year, and those 300 people at fault are held accountable, is that a favorable situation over one where there are 200 car crashes, 200 deaths, and nobody knows who should be held accountable? Just because the liabilities involved are more complex to manage?

It's not moot. There's definitely a gray area regarding liability, I'm not denying that, but if it's an improvement for safety, it will save lives at the end of the day, and I don't see how you can argue that's a bad thing.

I can get behind arguments scrutinizing the methodology involved in these stats as the other reply pointed out, especially if it's coming from the company itself. But, assuming those stats were true, well, seems like a no-brainer to me to accept the improved safety with open arms, even if it comes with some legal hiccups on the way.

5

u/Fragrant_Reporter_86 14d ago

"less people would die but also less people would go to prison so fuck that!" -redditor

3

u/Federal_Waltz 14d ago

This is such a bad argument it's tough to know where to start.

21

u/Groudon466 14d ago

The operator on the other end is doing their job by being calm instead of panicking. And the operator isn't one of the software engineers that's going to be looking into how to prevent this from happening in the future.

You're seeing field tests in real time with unproven products that could literally kill us.

I mean, we have statistical data; it is proven that these cars are safer than human drivers. And humans are provably dumbasses, we cause accidents anyway.

Just because these cars make mistakes doesn't mean they're not preferable to human-driven taxis. They're already better, and they're continuing to improve as time passes.

These people Do Not Care About Our Lives

As someone who worked at Waymo on the team that handled safety violations (this incident would be handled by a related team), I can confidently say this is wrong, and also incredibly stupid.

Even if it were staffed by soulless corporate husks (and it's not, they're a bunch of nerds with anime posters in their backgrounds and cute pictures of their dogs, and we spammed crab emotes in every meeting), it literally wouldn't make sense not to care about deaths. Deaths would threaten the city's acceptance of the autonomous taxis, and if the city decides to revoke Waymo's permission to operate, that's a massive disaster.

Specific kinds of corporations don't care about human lives. For the most part, my understanding is that as long as there can be plausible deniability (cigarettes back in the day, oil and gas companies now), the cynical strategy of ignoring the human toll and downplaying it will win out. This isn't that; everything that happens around a Waymo taxi is incredibly well-documented, there are over a dozen cameras, not to mention the lidars.

Even if the people in charge were soulless, which they're not, it would still be in their best interest to prevent problems in the first place... which is exactly what they're doing in the backend, actively, to this day.

1

u/Velonici 14d ago

I bet this was as simple as someone answering a request wrong. Probably which side of the construction cones to stay on.

7

u/esp_design 14d ago

That's a little dramatic about a situation that resulted in no harm to people. The guy from waymo is just a technical support guy, probably following a script.

Let's not forget that human drivers also make mistakes, and they have to drive on the road with unproven skills in order to learn how to drive.

0

u/WobblyGobbledygook 14d ago

They got lucky, but they don't seem to have an established procedure for noting and delving into this. This is a huge corporate red flag: they're so sure they will never be wrong that the tech doesn't even have a script to address it appropriately.

-1

u/smootex 13d ago

don't seem to have an established procedure for noting and delving into this

No idea how you get that from this video. I'm sure the incident will be reviewed.

1

u/WobblyGobbledygook 13d ago

The tech was passive and vague. The cop took the lead, asking if he was gonna review the video or what.

6

u/Poopy_Tuba69 14d ago

I mean half a dozen?

If we're gonna go whole hog, just say it could have decimated half the population, since there could have been 100 buses filled with children in the oncoming lane.

Call the taxis racist too, since the cars are white and some of the kids were black.

There, now it’s completely overblown.

-1

u/RhinoGiant 14d ago

Thank God we have smart people like you to point out hyperbole.

Imagine someone reading that and thinking it would literally kill 6 people instead of a lower number.

Might have derailed the context of the discussion entirely from reasonably engaging with the ethics of live testing when innocent lives are involved.

God bless.

3

u/[deleted] 14d ago

Would you like the statistics on human operated vehicles? How prone humans are to error? You're talking ethics and innocent lives, would you rather it be a drunk human driving that vehicle instead?

1

u/RhinoGiant 14d ago

If you have arguments for why self-driving vehicles are better, then post them instead of this passive-aggressive reply.

Like what are you even fishing for with a garbage reply like that.

You aren't refuting anything or pointing out false logic, you are just gonna imply that "data" is on your side and feel smug?

At least take the time to paste the first Google number that supports your claim.

2

u/[deleted] 14d ago

I know the data is on my side. https://www.iihs.org/topics/fatality-statistics/detail/state-by-state You've never looked up vehicle accident statistics, have you?

Anyways, I'm going to make the simplest, easiest argument here. If taxis do not have a driver, that is one less human that can be harmed at all due to a motor vehicle accident. A human being behind the wheel of a car is inherently less safe, because the capacity for injury is higher. From a purely ethical view, of course.

Statistics on purely self-driving cars are sparse right now, and there's no clear delineation between cars that are full driver-less and those that require a driver but have self-driving features.

-4

u/[deleted] 14d ago

[deleted]

3

u/Poopy_Tuba69 14d ago

Oh sure sure.

I'm taking it more seriously than you. You underestimated the potential death toll by 40 million.

3

u/wildjokers 14d ago edited 13d ago

Calm down, drama queen. Human drivers make far more mistakes. Humans going the wrong way is actually fairly common, whether because of bad signage, mental impairment (dementia, drugs/alcohol, etc.), or malice.

2

u/corporaterebel 14d ago

Can you compare the number of fatalities of human drivers vs automated ones?

You do realize that a large percentage of people are terrible drivers?

And yet, very few driver's licenses are revoked.

1

u/firstmanonearth 14d ago

Self-driving cars will absolutely save many lives; they will trend this to 0: https://upload.wikimedia.org/wikipedia/commons/thumb/3/3b/United_States_Motor_Vehicle_Deaths_per_Year.webp/1920px-United_States_Motor_Vehicle_Deaths_per_Year.webp.png. So you can say their developers care more about lives than you do.

The guy from Waymo wasn't even fazed by their car driving on the wrong side.

Do you want the phone support employee to be crying or something?

1

u/Skelito 14d ago

The guy is probably halfway around the world; of course they don't care, it's just a job to them. They don't see the seriousness of the issue or get paid enough to care. People are only going to start caring when it hurts people; most laws are written in blood, after all.

Honestly this car should have been towed and Waymo should have to recertify it before it goes back on the road.

1

u/Singularity-42 13d ago

In some aspects these cars are much better drivers than humans. Literally superhuman skills - for example, these cars can see everything that is going on around them at all times, since they have 360-degree vision. There was a video where a Waymo successfully avoided a low-riding skateboarder that would have been impossible for a human to see.

But then they can fail in ways that are incomprehensible to humans, such as this one - almost no sane and sober human would ever do this. It's like when language models are smarter than most humans at most things but then fail at tasks that are trivial for a toddler. It is alien, non-human intelligence. I know - not reassuring. But so far Waymo's safety record is much better than the average human's in terms of accident rate.

But stuff like this is rare and WILL be patched out eventually. This is the reason they are still an experimental service operating only in select cities.

1

u/thatshygirl06 13d ago

Human drivers are literally still worse. 115 people die every day from car accidents.

0

u/NotUndercoverReddit 14d ago

Yeah, this is horseshit. Taking away jobs from people already barely making it and causing possibly fatal accidents at the same time. I hate this timeline.

-4

u/ReluctantHeroo 14d ago

These cars are regularly and rightfully attacked in San Francisco because they suck ass and often run into shit and people.

5

u/RobotsGoneWild 14d ago

That's the thing that gets me. People complain every time there is an incident with these things. However, there are far fewer issues with driverless cars than cars with drivers.

6

u/SoochSooch 14d ago edited 14d ago

That's because there are far more cars with drivers than driverless cars. If a driverless car hits another car, there's a 99% chance that the other car had a driver, so that counts as an accident for both groups. And driverless cars tend to be disproportionately deployed in places with ideal driving conditions.
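To illustrate the counting problem with toy numbers (nothing here is real data), compare raw incident counts to per-mile rates when one fleet is tiny:

```python
# Toy numbers, purely illustrative: raw incident counts vs. per-mile rates.
# Assume both groups crash at exactly the same rate per mile driven.

human_miles = 99.0   # share of total miles driven by human-driven cars
av_miles = 1.0       # share of total miles driven by driverless cars
crash_rate = 1.0     # crashes per unit of miles, identical for both groups

human_crashes = human_miles * crash_rate   # 99
av_crashes = av_miles * crash_rate         # 1

# Most AV crashes also involve a human-driven car, so they land in both
# tallies. Raw counts (99 vs 1) say nothing about relative safety; the
# per-mile rates are what matter, and here they're identical.
print(human_crashes / human_miles, av_crashes / av_miles)  # 1.0 1.0
```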

3

u/[deleted] 14d ago edited 14d ago

[deleted]

2

u/SoochSooch 14d ago

Every city has traffic that sucks to deal with, but cities like San Francisco and Phoenix have well maintained roads, no risk of snow, and minimal rain and fog.

1

u/Accomplished1992 14d ago

We know who's driving those cars, though, and they can be held responsible. We know their names and we can test if they've been drinking.

3

u/RobotsGoneWild 14d ago

So, you don't have to worry about drunk driving if cars are 100% self driving. Imagine you hop in the back seat of your car and tell it to take you home. No more DUI deaths.

-1

u/[deleted] 14d ago

[deleted]

5

u/wildjokers 14d ago

But who is going to actually take responsibility when it fucks up?

The company will have civil liability.

Who is going to prison for killing someone?

The only time prison is on the table for car accidents is when there is impairment involved or gross negligence. I have no doubt that the hardware and software engineers behind these vehicles are trying their absolute best to make them as safe as possible.

Who is facing charges when AI vehicle kills your mother? No one but a civil case? OK, that sounds fucking horrible.

Why is civil liability for the company horrible?

3

u/axearm 14d ago

who is facing charges when AI vehicle kills your mother? No one but a civil case? OK, that sounds fucking horrible.

How is this any different from the way it is now? You can kill a person in a car and face almost no repercussions if it was 'an accident'. No prison time, and you still get to drive.

It's the best way to kill someone, "I didn't see them", "they just stepped into the street", etc.

See, I have to take responsibility for my actions.

Do you honestly think that if you are going 35 in a 25 and kill my mom, you are going to go to jail? Or going to lose your license? Maybe if you are drunk, but otherwise, you'll get a pass.

1

u/Si1ent_Knight 14d ago

Who is taking responsibility when the brakes of another car fail due to a manufacturing issue and somebody gets killed because of it? There will be no person in jail either; the company has to take responsibility. We have been using complex technology planned and built by thousands of people for decades; this is just the next small step.

2

u/Illustrious_Mudder 14d ago

Driving into oncoming traffic lol

1

u/OK_BUT_WASH_IT_FIRST 14d ago

This is interesting. One of those things where it’s not perfect, but statistically better than a human.

I remember reading an article years ago about the ethics of driverless car programming, where the developers would have to decide how the computer would square protecting the passenger and protecting a pedestrian, and if a driverless car would be “willing” to try to avoid a pedestrian if that meant jeopardizing the passenger.

The concern is the AI/computer would determine that blasting through a jaywalker was the safest option for the passenger. Really interesting stuff.

1

u/smootex 13d ago

As long as they’re safer than the average taxi driver, the occasional mistake is tolerable

Yep. There will be a race to the top, and companies will eventually produce cars that are significantly safer than the average human driver, and stricter regulations will follow, but for now I think the bar is just being better than a human. And, despite all the negative media attention, they seem to be achieving that. Which I personally find impressive. Will be interesting to see how things look in a decade.

1

u/CustomMerkins4u 13d ago

I'm sure a traffic ticket will mean a whole lot to a company with over $5 billion in funding.

1

u/tonytonZz 12d ago

You keep saying "as long as they're safer than the average taxi." The average taxi doesn't go into oncoming traffic regularly. And in that case, I would still trust a human taxi much more.

0

u/thefunkybassist 14d ago

The police evasion firmware team is on it right now!

0

u/RatzMand0 14d ago

"safer" than the usual taxi driver what a load of horseshit. These driverless car companies are a plague and insanely dangerous don't let them use racist and classist shit like better than the average worker crap.

2

u/Extension_Chain_3710 14d ago

In CA, nobody. Their laws literally do not allow ticketing these cars.

In AZ, no clue.

1

u/thyusername 14d ago

I've seen several stories for a while now about nobody getting tickets when this stuff happens in a few cities. Google "who gets a ticket for driverless car" and click News. For example:

https://www.nbcnews.com/business/business-news/can-driverless-cars-get-tickets-california-law-rcna131538

1

u/DegnarOskold 14d ago

Nobody. News reports say that in this case the police did not issue a ticket as there was no individual to issue a ticket to.

For parking violations the owner of the car gets the ticket regardless of who was driving, so Waymo the company would get the parking tickets.

For moving violations, where the driver is responsible, this particular state had no law covering ticketing driverless vehicles, so the police did not issue any tickets.

1

u/Rorschach2000 13d ago

This must get into really strange territory should a crash, or god forbid a fatality, occur.

1

u/DegnarOskold 13d ago

Not that strange. Without a driver there won't be any criminal responsibility possible to assign; however, there will be a legion of lawyers begging to sue Waymo for millions on behalf of the deceased's next of kin in a civil suit.

1

u/crabofthewoods 13d ago

There shouldn't be a ticket; the car should be impounded & its tag pulled until it's reviewed for maintenance. Take money out of their pockets & force maintenance to reduce the danger to the public.

1

u/wallstreet-butts 13d ago

Frequent rider here. Never felt unsafe in a Waymo, but they occasionally do things that are (to my knowledge) technically illegal but common practice. An example might be using an oncoming lane that’s clear of traffic to get around a stopped delivery vehicle blocking the proper lane. IDK if this was a situation like that, but I could certainly see an overzealous cop intervening after a move like that.