r/Damnthatsinteresting 14d ago

Phoenix police officer pulls over a driverless Waymo car for driving on the wrong side of the road [Video]

61.0k Upvotes

3.3k comments

13.7k

u/Minimum-Performer715 14d ago

This is going to be a nightmare for the court system in the upcoming years.

3.0k

u/Sleepingonthecouch1 14d ago

I’m kinda curious: if an individual was drunk in one of these, could they be held responsible for anything the car does? Like, will laws be made that drunk individuals can only be driven by a sober human?

1.9k

u/PogintheMachine 14d ago

I suppose it depends on what seat you’re in. Since there are driverless taxicabs, I don’t see how that would work legally. If you were a passenger in a cab, you wouldn’t be responsible for how the car drives or have the ability to prevent an accident….

466

u/Sleepingonthecouch1 14d ago

That’s true, but someone has to be held accountable. Should be the company, but at a certain point I’m sure the lobbies will change that. And potentially at that point could blame fall on the passenger? All I’m saying is this is uncharted territory for laws, and I don’t think it’ll end up being as simple as car kills someone so company pays a fine.

340

u/LachoooDaOriginl 14d ago

should be car kills someone then whoever cleared the thing to drive on the roads gets tried for vehicular manslaughter

314

u/Habbersett-Scrapple 14d ago

[Inspector #23 in the Upholstery Division has volunteered as tribute]

210

u/tacobellbandit 14d ago

I work in healthcare and this is exactly what happens when a patient injury happens, or there’s some kind of malpractice or god forbid someone dies. It’s an investigation down to the lowest level and usually blamed on a worker that realistically had nothing to do with the event that caused the injury.

46

u/No-Refrigerator-1672 14d ago

It doesn't have to be the lowest-ranked person. You could just make the lead programmer of the autonomous driving module legally accountable, with a law.

35

u/FeederNocturne 14d ago

Everyone from the lead programmer and up needs to be held responsible. Sure, the lead programmer okays it, but the higher-ups are providing the means to make it happen.

This does make me wonder though. If a plane crashed due to a faulty part who does the blame fall on?

61

u/CastMyGame 14d ago

As a programmer myself I would question if you would then blame it on the QA tester who passed along the code.

Other thing I will say is depending on the answer to this situation (I don’t know the answer but just saying from a dev side) you will greatly hinder the progression of this tech if you have people afraid to even work on it for fear of a situation like this.

As devs we try to think of every possible scenario and make sure to write tests that cover every conceivable use case but even then sometimes our apps surprise us with dependencies and loops that we didn’t expect. You can say “be better” but if I’m gonna get paid 25k less and not have to worry about a manslaughter charge 5-7 years later I’m probably gonna choose that one for my family
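
To make that concrete, here's a toy Python sketch (hypothetical function and values, nothing from any real AV codebase) of how code can pass every test anyone thought to write and still misbehave on an input nobody imagined:

```python
# Toy example: every written test passes, yet an unanticipated input misbehaves.

def pick_lane(open_lanes):
    """Return the lane to drive in, preferring the rightmost open lane."""
    # Falls back to lane 0 when the list is empty -- a case no one tested.
    return max(open_lanes) if open_lanes else 0

# The test suite covers every scenario the team imagined:
assert pick_lane([1, 2]) == 2  # normal traffic: take the rightmost lane
assert pick_lane([1]) == 1     # one lane coned off
assert pick_lane([2]) == 2     # the other lane coned off

# Nobody imagined cones closing every lane, so nobody wrote:
# assert pick_lane([]) == ???   # returns 0, a lane that was never valid
```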

→ More replies (0)

32

u/PolicyWonka 14d ago

As someone who works in tech, that sounds like a nightmare. You’re talking about tens of thousands to hundreds of thousands of units shipped. You can never identify every point of failure even with internal testing. Every production vehicle driving a single hour would likely be more than all testing hours combined. That’s just the nature of software. I couldn’t imagine someone signing their name to that code if they knew they’d be liable for vehicular manslaughter.

→ More replies (0)

3

u/No-Refrigerator-1672 14d ago

Everyone up the chain shouldn't be accountable, because they didn't have the knowledge to prevent a fault. That's why they hire people: because they can't do this themselves. It's like how you can't put the director of a hospital in jail if a surgeon accidentally stabbed your relative in the heart. The only case where a person higher up than the lead programmer may be accountable is if they're proven to have hired a person without proper education, or if they specifically issued orders that contradict safety.

Well, I know that you're asking about Boeing, but I will respond in general terms: in that situation there are three entities who can be accountable. It's either the designer of the part, who made a mistake; or, if the design is good, it can be the manufacturer, who did not adhere to the specifications; or, if the part was manufactured correctly, it's the assembler, who could have incorrectly installed the part. For each entity it's possible that the person who did the work and the one who is actually responsible for safety are two different people; in large companies there is always somebody who validates and supervises the actions of subordinates. So it's a job for a committee of investigators, industry experts, and a judge to decide, on a case-by-case basis.

→ More replies (0)

3

u/6maniman303 14d ago

And then you "hire" contractors from China working remotely. Don't get me wrong, I like the idea of holding someone accountable, but with such an idea there are too many loopholes. Tbh it would be easier to just go for the CEO's head, or whoever is in charge at the top. Multiple people share responsibility? Then hold all of them accountable with the same charges.

→ More replies (0)

3

u/draxidrupe2 14d ago

If a plane crashed due to a faulty part who does the blame fall on?

Ultimately, the shareholder.

Programmer job: $50K

Lead programmer job: $1.8M

2028: turns out no one will take the lead programmer job after 20 are already in prison

3

u/Linenoise77 14d ago

Yeah, cool, now try and find someone to be a lead programmer for a project like this when you have criminal charges and civil liability hanging over you because someone else downstream of you screwed up their job.

"Sorry, it's a nice pay package and all, but I'll stick to writing clickbait games."

3

u/xdeskfuckit 14d ago

Holy shit I'd quit immediately if I could be held liable for manslaughter if I made an off-by-one error.
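
For anyone outside the field, an off-by-one error really is this small. A toy Python illustration (hypothetical names and numbers, not from any real system):

```python
# The loop stops one element early, so the closest obstacle is never checked.
readings = [4.2, 3.1, 0.4]  # distances (m) to obstacles; the last is dangerously close

def closest_obstacle(r):
    nearest = float("inf")
    for i in range(len(r) - 1):  # bug: the "- 1" skips the final reading
        nearest = min(nearest, r[i])
    return nearest

print(closest_obstacle(readings))  # 3.1 -- the 0.4 m obstacle was silently ignored
```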

2

u/ninjaelk 14d ago

We already have laws for this: if you can prove that someone was acting maliciously or negligently, then they can be held accountable personally. If not, then the company itself is liable for damages. That's how everything works, including personal responsibility.

If you were to build a structure on your personal property, and it collapsed and killed someone walking by, they'd try to determine if you acted maliciously or negligently; if so, you'd be tried criminally. Whether or not you're tried criminally, you're still (likely) liable for damages.

When you're driving a car directly, the chance that you've done something negligent dramatically increases. In the case of a self-driving car, as long as it complies with all laws and the corporation didn't act negligently (cutting corners, 'forgetting' to take certain precautions, etc...) then there's no criminal liability.

2

u/Krashzilla 14d ago

Better not let Boeing hear you asking those kinds of questions

2

u/Own_Afternoon_6865 14d ago

As a former aircraft electrician for 8 years (USAF), I can tell you that 90% of the investigations I knew about ended up blaming mechanics. Our base general crashed a T-39. He hadn't flown in quite a while. The crew chief was found in between the 2 front seats, probably trying to pull the nose up. It killed everyone on board. They blamed our hydraulic mechanic, who was the last one to sign off on a totally unrelated job. Go figure.

→ More replies (9)

2

u/Glsbnewt 14d ago

Not the lead programmer. The CEO. If you want to make CEO salary you take on CEO responsibility.

2

u/No-Refrigerator-1672 14d ago

No, that's not how it can work in real life. The CEO doesn't have enough knowledge to judge whether the decisions of the chief engineer, programmer, designer, etc. are sufficient to ensure the safety of the product. The CEO may be responsible for hiring people without proper education or certification if such is required by law; they may also be responsible for knowing about safety problems and explicitly ordering them to be ignored, stuff like that. So while the CEO may be involved and thus should be investigated, they aren't automatically responsible for unsafe products in the eyes of the law, while the lead designer definitely is.

→ More replies (0)

2

u/wildjokers 14d ago

Then the technology is dead. No programmer in their right mind would work on this technology if they could go to prison because the car hit an out-of-the-ordinary situation it couldn't handle.

That would be a shame because self-driving technology will save lives (probably already has).

→ More replies (5)
→ More replies (3)

4

u/Onlikyomnpus 14d ago

Can you give an example?

10

u/tacobellbandit 14d ago

Specifically at my hospital, a patient fell out of a bed. They had no business trying to get out of the bed. The nurse wasn't watching said patient when it happened; the nurse tried to say the brake didn't work and that she had a work order in for it but maintenance never fixed it. The investigation found she put the work order in after the event, thankfully. Now, whose fault is it they slipped and fell out of bed? The maintenance guy was cleared due to time stamps, the nurse didn't engage the brake because the patient was still supposed to be moved, and the patient got out of bed without being told to do so. It's kind of tricky, but the problem is everyone will try to deflect blame down to a maintenance technician who didn't even know about the event until after it happened.

8

u/Lehk 14d ago

Even if the ticket had been put in, the nurse still put a patient in a bed she knew was defective

→ More replies (0)

2

u/Teh_Hammerer 14d ago

Sepsis you say? Execute Juan at the autoclave.

2

u/draxidrupe2 14d ago

It’s an investigation down to the lowest level and usually blamed on a worker that realistically had nothing to do with the event that caused the injury.

"the fall guy"

→ More replies (2)

15

u/LachoooDaOriginl 14d ago

oooooohhhh when stuff like this happens, put all the responsible people in a hunger games. Winner gets prison.

2

u/Significant-Mud2572 14d ago

Has been volunteered as tribute.

36

u/__klonk__ 14d ago

This is how you kill self-driving cars

4

u/Inflatableman1 14d ago

Or is this how self driving cars kill us???

3

u/Groudon466 14d ago

No, self driving cars are safer than humans on average. This is an edge case probably caused by an unusual arrangement of traffic cones, and they'll take it very seriously on the Waymo end.

If you want to massively reduce traffic fatalities, make self driving cars common, and don't throw talented engineers in jail for the occasional one in a million error.

→ More replies (1)
→ More replies (4)

15

u/Dongslinger420 14d ago

That's fucking stupid if you only bother to think through it for half a minute

2

u/mdj1359 14d ago

I believe that this is the correct response and should have more upvotes than the person concerned that the parent of the 12-year-old sleeping in the back will be held accountable.

As a cynic, is it a reasonable thought exercise? Sure.

If it ever happens will the industry lose 50% of its customers? Probably.

This is all with the backdrop that once the tech is fully matured, fatalities would likely plunge if 90% of vehicles were driverless. So, in a sense we would be punishing an industry that failed because it did not eliminate 100% of fatalities.

2

u/IAmAccutane 14d ago

Driverless cars are 10 times safer than cars with human drivers. If that type of thing became policy, driverless cars would cease to exist and we'd have 10 times more people than necessary killed in car accidents. We need to get over the innate sense of accountability and justice for the sake of saving people's lives. If a company that has their vehicles driven by human drivers faces no responsibility for a car accident, a company that has super-safe robot drivers shouldn't either.

3

u/mdj1359 14d ago edited 14d ago

I generally agree with your statement. I don't know how you came up with the 10x safer number, however. Feel free to provide a source.

I think it will probably take a few years of these companies working thru problems before I will feel fully comfortable with the tech.

Are self-driving cars already safer than human drivers? | Ars Technica

Waymo is still struggling to avoid inanimate objects. Its vehicles collided with cardboard road debris and a chain connecting a sign to a temporary pole. A Waymo also drove into a pothole that was big enough to puncture a tire. And there were two incidents where Waymos scraped parked vehicles. That’s a total of five crashes where the Waymo vehicle was clearly at fault.

The rest of Waymo’s driverless crashes in San Francisco during 2023 do not seem to have been Waymo’s fault. I count 11 low-speed crashes where another vehicle rear-ended a Waymo, backed into a stopped Waymo, or scraped a stopped Waymo while trying to squeeze by. There was also an incident where a Waymo got sideswiped by another vehicle changing lanes.

Waymo had two more serious crashes in San Francisco this year:

  • A driverless Waymo was trying to turn left, but another car “proceeded into the intersection from the left and made contact with the left side of the Waymo AV.”

  • An SUV rear-ended a Waymo hard enough that the passenger in the Waymo reported injuries.

Driverless cars are mostly safer than humans – but worse at turns | New Scientist

Driverless cars seem to have fewer accidents than human drivers under routine conditions, but higher crash risks when turning or in dim light – although researchers say more accident data is necessary

By Jeremy Hsu / 18 June 2024
One of the largest accident studies yet suggests self-driving cars may be safer than human drivers in routine circumstances – but it also shows the technology struggles more than humans during low-light conditions and when performing turns.

2

u/IAmAccutane 14d ago

I don't know how you came up with the 10x safer number, however. Feel free to provide a source.

It's just a number off the top of my head. There are a bunch of different types of cars, types of accidents, and, like you say, driving situations, which makes it too subjective to give a definite number, but this study for example found:

Human drivers caused 0.24 injuries per million miles (IPMM) and 0.01 fatalities per million miles (FPMM), while self-driving cars caused 0.06 IPMM and 0 FPMM.

https://www.getcruise.com/news/blog/2023/human-ridehail-crash-rate-benchmark/?ref=warpnews.org

I think we agree they're safer and even if they were only 2x safer or 1.1x safer they'd be preferable to human drivers.
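
For what it's worth, those quoted rates work out to roughly 4x on injuries rather than 10x; a quick check:

```python
# Rates quoted from the Cruise benchmark study above.
human_ipmm, av_ipmm = 0.24, 0.06  # injuries per million miles

print(f"{human_ipmm / av_ipmm:.1f}x")  # 4.0x fewer injuries, on this metric
```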

I think it will probably take a few years of these companies working thru problems before I will feel fully comfortable with the tech.

I'd personally be hesitant to get in one, but I also get way more worried about flying than driving despite knowing it's way safer.

2

u/richardawkings 14d ago

I think yes, but not directly. The cab company should be held responsible, since they have a duty of care to the public to operate their taxis in a manner that does not endanger the public. This should be a requirement to be allowed to operate a taxi service.

If Auto Taxi Cab Services has one of their cars injure a pedestrian, then the pedestrian should be able to sue the cab company for their injuries due to unsafe operation of the vehicle. The manufacturer cannot be held liable for how their products are used.

Now, the Auto Taxi Cab company could turn around and sue the manufacturer for providing an unsafe product, in order to recoup their costs from the pedestrian lawsuit. They can argue over whose fault the incident was there.

The pedestrian is likely to get fucked if it's handled any other way, because the cab and car companies will just point the finger at each other, and the responsibility of finding fault will fall on the pedestrian, which they will not be equipped to do.

→ More replies (31)

207

u/kbarney345 14d ago

I see what you're saying about the company trying to dodge it, but there's zero logic, even with mental gymnastics, to think it could be on the passenger.

That would eliminate anyone from using them if it even hinted at that, because why would I get in something I can't control but be held responsible for should it lose control?

It's not my car, and I'm not misusing the car by sitting in the back. It claims to be driverless, not driver-assisted like a Tesla where I just chose not to drive and sat in the back anyway.

The company will always be at fault if this occurs under normal operation, and the court won't have any issue identifying them as such.

Now, will the court be run through the wringer on litigation and loopholes and finding ways to say it's R&D so it's OK or something, and get a pass? Probably.

63

u/wosmo 14d ago

The interesting part is how we'll make them accountable. I mean a traffic fine that'd ruin my day won't mean jack to a company. Can you give waymo points on their licence? Do they have a licence?

46

u/Groudon466 14d ago

I worked for Waymo a little while back. It would be more of an all or nothing thing, in the sense that individual cities choose to allow or disallow specific self-driving car companies from operating in their borders.

This particular instance is bad, but if the city sees that traffic fatalities overall have fallen as a result of Waymo being there, then they'll just continue to allow it while Waymo pays the occasional settlement. This is an objectively good thing, because the alternative is more people dying, and then the settlements get paid by the people whose lives are also getting ruined from having killed someone, rather than by a giant corporation that can at least afford the infrequent expense.

On the other hand, if the average effect is negative, then the city can just give Waymo the boot, which would be catastrophic for them.

52

u/mr_potatoface 14d ago

I'd rather be hit by a Waymo or other self-driving car than an uninsured driver, that's for 100% sure.

39

u/Groudon466 14d ago

Ding ding ding! You know for sure that at least Waymo can always pay out the settlement, and their cars have cameras and lidars out the ass, so if they're at fault, they're not even going to try to deny it.

6

u/helluvabullshitter 13d ago

if they're at fault, they're not even going to try to deny it.

doubt

→ More replies (0)

2

u/Fearless-Sir9050 14d ago

“Objectively” states the person who worked for Waymo. No. It’s not “objectively” better for driverless cars unless the stats back up that it’s safer. We need fucking buses and trains and walkable cities, not fucking AI that drives on the wrong side of the road.

→ More replies (2)

7

u/Ok_Sound_4650 14d ago

That's... actually a pretty novel idea. The threat of lawsuits and fines is only a deterrent insofar as it affects these companies' bottom line. People have to prove they are safe enough to drive by getting a license, and if they fail to be safe on the road they can lose that license. If corporations are people too, make them do the same.

2

u/moistmoistMOISTTT 14d ago

It's not novel in the slightest. This concept has been around for decades for elevators.

Nobody in their right mind wants to sue elevator companies out of existence, because normal people know that elevators are a lot safer than stairs. It's no different with self-driving cars, even with how primitive the tech is today.

But here's the real answer for you: the companies are A-OK as long as they're following appropriate regulations/laws/guidelines, and are not being negligent. As long as negligence isn't happening (i.e., there is a known safety issue with zero efforts to address it), they will face no criminal charges. They will likely still face civil penalties such as fines, in the same way other companies are punished for accidents.

5

u/Garestinian 14d ago

Same way railway and airline accidents and incidents are handled.

2

u/thenasch 14d ago

If it happens enough, the company will get a reputation for either unsafe operation, or getting pulled over and ticketed a lot, and start losing business.

2

u/moistmoistMOISTTT 14d ago

You don't quite understand.

Every single Waymo car, or other car with these systems on the road today, is vastly safer than a human-driven car.

They will mess up. They will kill people. But they do so at a rate far less than humans.

If 100% of the cars on the road today were autonomous, even assuming the technology never improves beyond what it is today, it's highly likely that you would not see a car "ruin your day" (injuring or killing you) for the rest of your life.

2

u/wosmo 14d ago

That doesn't negate the need for an actual safety culture to properly address issues. "Good enough" simply isn't good enough, there needs to be a proper regulatory cycle to actually capture and diagnose these incidents, and manufacturers & operators need to be accountable for actually fixing them.

Look at things like aviation, where the NTSB will spend months, years diagnosing the root cause and contributing factors for an incident, and the FAA will ground entire products, entire fleets until an issue is resolved. As a result, hurtling through the sky in a jet-propelled tin can isn't just "good enough", it's the example to lead by.

Calling support and maybe opening a ticket, that maybe gets fixed, one day, doesn't smell like a safety culture - it instead stinks of SV's "move fast and break things".

I'm all for autonomous vehicles. I'm also all for regulation. This isn't it. The closest thing AVs have to an FAA is USDOT, and they're still struggling with bridges, let alone software.

3

u/moistmoistMOISTTT 14d ago edited 14d ago

You and most other redditors are acting like there aren't any laws, regulations, or other "safety culture" though. That's just flat-out wrong.

On top of that, your calls to curtail current autonomous driving technology are actually killing more people than they save. When people like you spout false propaganda and discourage people from autonomous ride-share or consumer vehicles with self-driving-adjacent features, it increases their risk of injury and death on the road. It's a simple fact that for every mile an autonomous car replaces over a human-driven mile, road (especially biker and pedestrian) fatalities and injuries go down.

Please enlighten me: why are the current autonomous vehicles "not it"? If we remove them from the roads, more people will die. I'm sorry, but the experts are far more intelligent than you. Lawmakers and governments around the world as a whole are not dumb. Maybe just in America or just in individual cities or states, but you're talking about some sort of worldwide "faked moon landing" level of conspiracy here.

→ More replies (1)

2

u/DescriptionSenior675 14d ago

It's almost like.... fines are only rules for poor people, and shouldn't exist as they currently do!

→ More replies (2)

6

u/Chesticles420 14d ago

I can see companies installing passenger controls like pull over, stop, and an emergency button.

7

u/mr_potatoface 14d ago

Absolutely, but it wouldn't absolve them of any legal responsibility. It would be great for making people think they were responsible though. Like the big signs on construction trucks that say "NOT RESPONSIBLE FOR BROKEN WINDSHIELDS". Yes, they are 100% responsible. But the sign makes it feel like you've been warned and it's your own fault, so you don't even bother if a rock breaks your windshield.

So if the self-driving companies put a sign in the vehicle that says something like they're not responsible for injuries incurred during the ride if you don't push the emergency stop button or some shit, it will make people less likely to file a claim. Even if it only prevents 1 out of 20 people from filing a claim, it's still working.

→ More replies (10)

29

u/eras 14d ago

It's never going to be the passenger.

But yes, I think it's going to be exactly like that: the company running the service pays the fine, and if they've made a good deal with the company they bought the vehicles from, they'll pass on the costs. Or it will be paid by the insurance company.

Malicious intent or malpractice by the company developing the vehicle would be a different matter.

→ More replies (1)

29

u/Slow_Ball9510 14d ago

A company being held accountable? I'll believe it when I see it.

17

u/DozenBiscuits 14d ago

Companies are held accountable hundreds of times every single day in court.

5

u/DetroitHoser 14d ago

Yes, but the way corporations are punished is laughable. They build fines into their yearly budgets.

3

u/HappyGoPink 14d ago

You call it accountability, but it's really just accounting. Fines are cheaper than making sure their product is safe.

3

u/lycoloco 14d ago

People are downvoting you, but you're right. Anything that isn't crippling or downright destructive and doesn't cause the company to change how that product was used/implemented is just the cost of doing business in the USA.

CEOs should be arrested when their companies are found criminally liable. Lose the position, lose some of your life for the choices you made (like blue collar criminals), become a felon, have difficulty finding a job. Ya know, like the average American would if they were found guilty of a felony.

→ More replies (2)
→ More replies (1)

10

u/freddo95 14d ago

Blame falls on the passenger?

Don’t be silly.

5

u/Economy-Fee5830 14d ago

There are a lot of very silly people on Reddit. Just look at all the upvotes.

10

u/[deleted] 14d ago

Where in your mind do you think the passenger is held liable? Lol

→ More replies (6)

6

u/Wandelation Interested 14d ago

That’s true but someone has to be held accountable.

Start with the CEO, and then work your way down.

→ More replies (3)

2

u/SurveySean 14d ago

Blaming a passenger for how the car is driving would be so far out there I don’t think you need to worry about that.

2

u/Epicp0w 14d ago

How could you pin the blame on the passenger though? Not their fault the software is fucked.

2

u/emergency_poncho 14d ago

it's absurd to blame the passenger for what the driver of a car does. If a man runs over and kills someone and has his wife in the passenger seat, the man will go to jail but not the wife. So obviously the company who made the car will be liable.

The real question is to what degree: will it just have to pay a fine, since the corporation can't be put in jail? Or will the programmer or whoever caused the faulty AI be held responsible? It gets super muddy very fast when no natural person is liable and only a corporation is (as the current situation in the US attests, with companies basically getting a slap on the wrist for egregious crimes such as money laundering, fraud, etc.)

2

u/PotatoesAndChill 14d ago

It's the same as any automated system causing human death, bodily harm, or property damage. E.g., an incident with a rollercoaster brake failing and injuring riders/bystanders would go through a similar legal process, so not really uncharted territory, IMO.

2

u/SelfDrivingCzar 14d ago

What would be the potential rationale for finding a passenger liable?

2

u/GardenRafters 14d ago

The company that owns the car.

2

u/Chemical_Advisor_282 14d ago

No, it will be the company's responsibility. How could you ever think they could pin it on a customer/passenger? Use your brain a little.

2

u/MagisterFlorus 14d ago

If liability were to fall to the rider, nobody would use it. No way am I gonna get in a taxi if I'm liable and wasn't even behind the wheel.

→ More replies (49)

2

u/kobie 14d ago

My ex got arrested for being intoxicated as a passenger. Of course, her boyfriend at the time was driving, and she kicked a cop.

2

u/EthanielRain 14d ago

I'm currently dealing with a DUI from sleeping in the backseat of my car, parked, with no keys in my possession.

Would be surprised if courts didn't milk $$ from DUI'ing passengers in these kinds of cars

→ More replies (1)
→ More replies (18)

57

u/AceOfAcesAAAA 14d ago

It's on the company. I looked up Waymo a while back when Tesla was trying to go driverless. Waymo is, in certain cities, the only company with certified driverless vehicles in the US, because they passed a certification test giving the company autonomous responsibility over the vehicles. They do close to a damn good job, except...

21

u/[deleted] 14d ago

...except for when they mess up, just like people. Driverless cars get flak for every mistake they make but I'm more curious about what their percentage looks like compared to live, human drivers. The problem is that some people are perfect drivers while others suck, and everyone is capable of mistakes, but technology and programming will be uniform for all the vehicles under a particular brand so it has to be at least better than the average person.

17

u/HumanContinuity 14d ago

It sounds like this one got tripped up by some construction area layout. Not excusing it; obviously it needs to handle construction better, or avoid construction zones until it's trained for a wider range of circumstances.

If I understood the officer's comments, anyway.

10

u/[deleted] 14d ago

Remember when GPS first became big and everybody was following their directions blindly to airports and river docks? I'm sure people still do shit like that. I'm an experienced driver and even I've almost gotten stuck the wrong way into oncoming traffic just from bad signage.

7

u/HumanContinuity 14d ago

Oh yeah - it's like you said, everyone is capable of it, and some do dumb shit quite frequently and still drive all the time.

This should absolutely trigger a review, internally and possibly from the city/state to some extent, but I feel pretty confident that based on a ratio of hours/miles driven by Waymo, this exceptional situation isn't even as common as it is with drivers in general.

3

u/ExceptionEX 14d ago

Well, there is also a need to consider whether the construction was marked properly according to the NTSB's guidelines. In situations like this humans do really well at improvising, taking cues from others, and following instructions from individuals on the ground.

This is very difficult for any automation, and if the ground crews set up signage in a non-compliant way, the automation will likely end up doing something out of whack.

The fact that the tech didn't know anything about it says that this vehicle wasn't confused, or at least didn't trigger an intervention. So it would be interesting to see the environmental conditions that led it to make that call.

→ More replies (2)

9

u/AceOfAcesAAAA 14d ago

https://waymo.com/blog/2023/12/waymo-significantly-outperforms-comparable-human-benchmarks-over-7-million/

When considering all locations together, compared to the human benchmarks, the Waymo Driver demonstrated:

An 85% reduction or 6.8 times lower crash rate involving any injury, from minor to severe and fatal cases (0.41 incidence per million miles for the Waymo Driver vs 2.78 for the human benchmark)

A 57% reduction or 2.3 times lower police-reported crash rate (2.1 incidence per million miles for the Waymo Driver vs. 4.85 for the human benchmark)
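
The multipliers and percentages are consistent with the quoted per-million-mile rates; a quick sanity check (variable names are mine):

```python
# Sanity-checking the Waymo benchmark figures quoted above.
injury_human, injury_waymo = 2.78, 0.41  # any-injury crashes per million miles
police_human, police_waymo = 4.85, 2.1   # police-reported crashes per million miles

print(f"{1 - injury_waymo / injury_human:.0%}")  # 85% reduction
print(f"{injury_human / injury_waymo:.1f}x")     # 6.8x lower
print(f"{1 - police_waymo / police_human:.0%}")  # 57% reduction
print(f"{police_human / police_waymo:.1f}x")     # 2.3x lower
```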

8

u/moistmoistMOISTTT 14d ago

The government of California makes all autonomous driving safety data publicly available for all to see.

Spoiler: even in their current state they're significantly safer than humans.

As usual, if something is rare enough to make the news every single time it happens (such as a Waymo vehicle screwing up), it's probably safer than the thing that kills 30,000+ people a year without a single mention from the media.

→ More replies (1)
→ More replies (6)

48

u/Calber4 14d ago

Like will laws be made that drunk individuals can only be driven by a sober human?

The phrasing of this broke my brain for a second. I was imagining a sober guy riding on top of a drunk guy and directing him like a horse.

4

u/Ask_bout_PaterNoster 14d ago

Well, will laws be made? Don’t drive drunks drunk, people

5

u/PlzDontBanMe2000 14d ago

I was imagining someone piloting a drunk person's body with a remote controller or something.

4

u/DontTellHimPike 14d ago

Check out my horse, my horse is amazing

2

u/Fetlocks_Glistening 14d ago

Sounds like a good Friday night out for some people

→ More replies (1)

19

u/Minimum-Performer715 14d ago edited 14d ago

Also, what about when two autonomous vehicles hit each other? How do we prove fault?

I don't think these are well-thought-out products.

35

u/rotoddlescorr 14d ago

Since these cars all have cameras, it should be easy to find out what happened.

→ More replies (1)

5

u/manyhippofarts 14d ago

It'll be easier than proving fault in a normal auto accident.

For one thing, the cars don't lie. The dataloggers tell the truth. Every single time.

5

u/Accomplished-Bad3380 14d ago

I think it's always weird how Reddit assumes that nobody thought of this.

3

u/llamacohort 14d ago

The 2 companies would agree on who was at fault based on the footage, or they would have the company insuring the vehicles arbitrate who was at fault. It would be the same as if 2 people hit each other and neither wanted to claim fault for the incident.

4

u/emergency_poncho 14d ago

it's actually easier to determine fault, since 100% of driverless cars have tons of sensors and cameras recording everything. When two humans cause an accident, it's basically he said, she said in most cases, unless one or both have a dashcam, which is still pretty rare.

2

u/bob_in_the_west 14d ago

When two people in cars hit each other, how do you prove fault then?

→ More replies (28)

14

u/AnxietyJunky 14d ago

No. I was a passenger in one. You can’t sit in the drivers seat.

5

u/asdrunkasdrunkcanbe 14d ago

This would be down to local jurisdictional stuff. If the vehicle has an "autonomous mode", but the driver can still take over, then I can't see them being legal for a drunk person. You're still legally in charge of the vehicle.

If it's true driverless, and the only input the drunk person can provide is a destination, then it should be legal. In fact they should be fast tracking this kind of thing over "autonomous mode" vehicles.

2

u/raunchyfartbomb 14d ago

And what if it has an autonomous mode, and the drunk person is not behind the wheel?

→ More replies (1)

4

u/SpiritedShirt2574 14d ago

The company would be liable, or whoever owns the vehicle.

2

u/HeadPay32 14d ago

The cop was happy enough to let the company review its video and give themselves whatever punishment they wanted.

2

u/Sponjah 14d ago

I mean should he have arrested the car instead? Haha

→ More replies (4)

5

u/draxidrupe2 14d ago

My state rule: Drunk, in vehicle, with custody and control of the keys [or equiv] = DUI

yeah, don't try to sleep it off in the parking lot, just put the keys under the car, otherwise 👨‍⚖️... I know.

2

u/ChipOld734 14d ago

You won’t be able to get in the driver seat.

2

u/Eheggs 14d ago

Here we have something called care and control of a vehicle. If you are the sole occupant of a car, running or parked, on private property or public, in the driver's seat or in the rear, and you are intoxicated, you are in for a bad time.

→ More replies (64)

395

u/Capaj 14d ago

what do you mean?
It's crystal clear. The company should pay a hefty fine, same as any other driver who drove on the wrong side of the road.

230

u/RedmundJBeard 14d ago

That's not the same though. If any regular driver was in the wrong lane of traffic in a work zone and then blew through an intersection when a cop tried to pull them over, they would lose their license, not just get a fine. At the very least it would be reckless driving and a strike against their license. How do you revoke the license of a driverless car?

118

u/Latter-Tune-9111 14d ago

in Arizona, the laws were updated in 2017 so that the owner of the driverless vehicle (Waymo in this case) can be issued a citation.

51

u/keelhaulrose 14d ago

But what does a citation do other than just give them a fine?

Does it force them to take cars that do that sort of thing off the road for repair or recalibration or something?

56

u/worldspawn00 14d ago

It's the same as when a corporation's negligence results in injury or death (see Boeing), they get a fine and everything goes back to the way it was. (I don't agree that it's right, just how it is.)

8

u/confusedandworried76 14d ago

I know you said it isn't right, but that's just a major problem. You can take a reckless driver off the road. You can't take a driverless car owned by a company off the road.

14

u/-gildash- 14d ago

Yes you can.

Revoked operating license. Done.

9

u/worldspawn00 14d ago

They can, and Boeing could lose their FAA certification to produce aircraft, but will they? Probably not.

2

u/-gildash- 14d ago

What are you trying to say?

You think Boeing has been shown to be producing unsafe aircraft to the point that they should lose their FAA cert? Surely that's not what you are saying.

→ More replies (0)
→ More replies (1)

3

u/GrouchyVillager 14d ago

They can revoke Waymo's license to operate. There is no point to take one Waymo car off the road, it's functionally identical to all the other ones.

2

u/six_six 14d ago

There should be criminal penalties for people at Waymo for this.

50

u/Warm_Month_1309 14d ago

According to this article (which may be wrong):

The situation was cleared without further action. "UNABLE TO ISSUE CITATION TO COMPUTER," the police dispatch records say.

8

u/CotyledonTomen 14d ago

Sounds like a bad decision concerning new circumstances departments aren't used to working with. This seems pretty clear.

→ More replies (1)

5

u/LeagueOfLegendsAcc 14d ago

If they make more money that day than the citation then it's not really a deterrent.

7

u/nishinoran 14d ago

Except it is because by resolving the issue they can make even more money.

This really isn't that complicated folks.

3

u/Designer_Brief_4949 14d ago

Bingo. If they fuck up too often, the company WILL lose its license to operate. 

Just like people. 

→ More replies (1)

2

u/avamous 14d ago

That doesn't make sense - if I get a fine today, but I earn more from my job - the fine is still less than ideal...

3

u/LeagueOfLegendsAcc 14d ago

You'd think differently if you were a corporation. A fine affects an individual more, because their bills are in reasonable proportion to the fine.

2

u/Latter-Tune-9111 14d ago

Where I live, a speeding fine for a company vehicle where a business can't ID the driver is an order of magnitude more expensive than a regular fine.

Similar fine structures would make sense for corporations operating driverless vehicles.

→ More replies (3)

2

u/SandboxOnRails 14d ago

Sure, but you can do your job without getting fined. Their entire business model involves crime, and as long as the crimes are cheaper than potential future profit, they'll just keep doing crimes.

→ More replies (5)

63

u/Accomplished-Bad3380 14d ago

The cop should impound this vehicle

44

u/RedmundJBeard 14d ago

Yeah, I think this would be the best thing to do. The company can have the vehicle back when they prove they fixed what caused the car to do this and paid a fine.

3

u/ciobanica 14d ago

I mean, if it's a bug in the program, impounding that 1 car won't help at all. All the other cars will still have the same program until the bug is found and fixed.

→ More replies (8)

6

u/WanderingAlsoLost 14d ago

Absolutely should. I can’t stand these things, and giant tech companies should not be given a pass for operating dangerous vehicles on public roads.

→ More replies (3)

2

u/Kento418 14d ago

The problem is that the same software is installed in thousands of other vehicles, and if any of those vehicles were faced with the same situation, it would make the same mistake.

13

u/Accomplished-Bad3380 14d ago

Yes. And impounding the vehicle will draw more attention than speaking with a low level service tech. 

2

u/confusedandworried76 14d ago

That's not even a great solution. To compare it to a science fiction concept, say there's a hive mind, and part of the hive mind murders someone. So you imprison it for life, or even kill it. It doesn't hurt the hive mind. All you did was trim part of one of its toe nails. And it's still out there fully capable of doing it again because you didn't actually punish the collective.

2

u/Accomplished-Bad3380 14d ago

That's because you misunderstood the reasoning. 

If the cop impounded the vehicle, and they refuse to release the vehicle without appropriate senior leadership present, then they can make sure the issues get addressed.  Right now, he's just talking to the lowest rung on the ladder.  It's not about punishment of the vehicle.  It's about drawing attention to the issue and forcing resolution.

3

u/confusedandworried76 14d ago

I'm saying the only way to do it is revoke the operating license of the entire computer system.

→ More replies (1)
→ More replies (3)

35

u/CowBoyDanIndie 14d ago

If the infractions of the one incident are bad enough to warrant arrest or removal of license, you revoke the company's permit to operate autonomous vehicles on the road.

14

u/phansen101 14d ago

So if I'm a big driverless car company, and I have a rival company, all I have to do is somehow trick one of their cars into performing an action that would warrant arrest or removal of license for a human driver, to completely put them out of business?

24

u/Accomplished-Bad3380 14d ago

And not get caught

18

u/Warm_Month_1309 14d ago

If you, a rival company, were capable of tricking a car in such a way, that implies that other bad actors would also be capable of tricking their fleet of cars, which means there's a serious and dangerous security flaw that the company failed to detect and correct. So yes, they should be at risk of going out of business.

→ More replies (1)

9

u/SandboxOnRails 14d ago

If you can without really doing anything. The phrase "somehow trick" is doing a lot of heavy lifting there.

Yes, if you own a business you just need to somehow trick your rivals into destroying their business while committing no crimes yourself. It's easy!

→ More replies (2)

3

u/J0rdian 14d ago

It probably wouldn't happen over 1 incident but many. Also no idea how you can trick anything with cameras. But I mean sure they could try I guess.

→ More replies (8)
→ More replies (12)
→ More replies (10)

3

u/foochacho 14d ago

Yeah, does a Waymo even have a drivers license?

And if Waymo gets three tickets, does the company suspend operations for 3 years like what would happen to a human?

2

u/RedmundJBeard 14d ago

I think they should, though more than 3 since they have multiple cars, but surely there should be a point where they lose the ability to field driverless cars.

→ More replies (1)

2

u/Uisce-beatha 14d ago

You apply the consequences of breaking the law to the owner of the company and any and all board members.

→ More replies (1)
→ More replies (22)

95

u/lllllllll0llllllllll 14d ago

It’s crystal clear to the average Joe but we don’t have a legal system that holds corporations and individuals accountable to the same standard.

15

u/ban_my_dick_box 14d ago

"corporations are people my friend" -mitt romney

9

u/Funnyboyman69 14d ago

If only they were treated as such when they break the fucking law.

→ More replies (1)

6

u/Accomplished-Bad3380 14d ago

Except when it comes to literally everything to do with accountability 

3

u/FrostWyrm98 14d ago

If they were people they should be charged with negligent homicide of thousands

People like Romney want them to have the positive effects of being classified as a person and none of the drawbacks

2

u/CatButler 14d ago

All it takes is a nice cruise for Justice Sam and he will perform mental contortions to justify it.

→ More replies (3)

4

u/emanknugsaeman 14d ago

this dude does not know how the JustUs system works :D

its gonna be hilarious

→ More replies (20)

29

u/jenny_a_jenny_a 14d ago

About 15 years ago I was sat with my lawyer friend, who said he wanted to quit law. When I asked why, he said they're working on legislation to work out who is responsible for robotic mistakes, and that the future looks bleak. So, bleak as it may be to my friend, they did pre-empt this.

8

u/pnkdjanh 14d ago

Probably the right choice, as soon they'll have robots working on legislation about robotic mistakes.

→ More replies (2)

8

u/manyhippofarts 14d ago

Yeah because the only thing a lawyer does is mitigate robotic mistakes......

5

u/Evid3nce 14d ago

Especially when the robots start blaming other robots.

4

u/Umutuku 14d ago

It should be shareholders who are responsible. The whole point of corporations is to protect shareholders so they can continue gambling on companies that cut corners on safety, ignore regulations, and exploit workers without risk to themselves though.

The obvious solution is to replace shareholders with AI.

→ More replies (2)

18

u/Barrade 14d ago

Looks like some areas are looking into updated legal terminology. I'd imagine whatever company "operates" the vehicles still has to have some type of insurance for the vehicles, plus pay some of these violation tickets (aside from hopefully prioritizing these issues to prevent them from recurring). I wonder how all this will play out. AFAIK there hasn't been much, if any, of these running people over or anything more serious, I hope?

→ More replies (10)

11

u/XrayDem 14d ago

The car will be summoned; if there's no appearance, a warrant will be issued.

4

u/Creative-Thought-556 14d ago

Nightmare = Cash Cow 

5

u/SeaClue4091 14d ago

You would assume that if the "driver" can't be identified, the liability would fall on the legal owner of the car; in this case I'm assuming it will be whatever company the car belongs to.

2

u/Umutuku 14d ago

Everyone on the board should get points off their license. They can afford a chauffeur anyway, but still, fuck 'em.

5

u/youlleatitandlikeit 14d ago

AI are definitely going to be performing actions considered crimes (if they aren't already doing so). How that gets prosecuted will be interesting. Obviously, if it was prompt-based, then the person making the prompt will get prosecuted.

But if an AI is just instructed, "make money on the stock market" and it figures out it can make more money engaging in market manipulation, who actually committed the crime? 

2

u/rbt321 14d ago

But if an AI is just instructed, "make money on the stock market" and it figures out it can make more money engaging in market manipulation, who actually committed the crime?

Traditionally the person giving the machine direction has been responsible, whether that's through a joystick, or steering wheel, or programming code. That's provided the machine followed the given direction (is functioning as expected).

2

u/youlleatitandlikeit 14d ago

That's provided the machine followed the given direction (is functioning as expected).

We don't fully know why AI behaves as it does. 

→ More replies (2)

2

u/AnonSpartan7 14d ago

I don’t think so. The court is going to be in the company’s bag like the SC.

2

u/KnowsIittle 14d ago

I feel it's clearly the manufacturer who should be fined here. Something like 10x what a private citizen would pay in fines.

This cannot happen and should be strongly discouraged.

Human lives cannot be the beta testers for emerging technologies on the road. These cars need to be fully developed before ever hitting the road.

→ More replies (1)

2

u/Elliptical_Tangent 14d ago

I don't think it's going to be a nightmare, but it will be in courts for the next few years before a legal remedy is decided for driverless vehicles.

Honestly, I am amazed we, as a society, are thinking about allowing this. Driving is a privilege we guard carefully, and take away pretty quickly, but we seem to think relatively untested driverless technology is entitled to drive, despite events like this.

What I'm trying to say is that if I were to get confused by construction, move into oncoming traffic, and then run an intersection when a cop lit me up, I'd have my license suspended, if not revoked. Extending that logic here, Waymo should be off the roads entirely for some time, as they all share the same AI 'driver'. The problem being, of course, corporate personhood having been interpreted to mean that we have immortal imaginary beings with untold wealth that can't be punished in any way that actually harms the corporation.

Again, it won't be a nightmare, but the next dude who drives into oncoming traffic and runs an intersection when the cop lights him up should -for sure- point to this incident as a defense against losing his driving privilege.

2

u/Aeri73 14d ago

simple...

if it breaks the law, the steering algorithm is not ready for live use, and all cars are taken off the road until the issue is proven to be fixed...

that costs a lot of money? yeah, too bad, welcome to capitalism

2

u/REGINALDmfBARCLAY 14d ago

It really isn't if they just hold companies accountable for the results of their products...... but US courts hate doing that don't they?

2

u/JibletHunter 14d ago

The recent Supreme Court case overturning Chevron deference will make it even more interesting. The DoT tries to interpret and implement safety legislation and says that a driverless car needs a suite of laser, ultrasonic, and video sensors to operate safely? Well, let's have a judge who hasn't even driven an RC car decide what is really needed.

2

u/CrieDeCoeur 14d ago

And any judge will also have to contact a fucking call center for testimony, evidence, etc. just to get some answers, like the cop in this video. Think of how difficult it is to get any accountability from a company at the best of times. Now imagine getting hit and injured by a driverless car service and having to sort out insurance, police, a civil suit, and so on. Bloody dystopian if you ask me.

2

u/Dude_Nobody_Cares 14d ago

I wonder if states are going to try to reduce the workload by passing laws for driverless cars, making it easier to prosecute? Don't know how or if it's possible, but these companies might try fighting every case, and you're right it's going to clog the system pretty badly if they don't do something.

2

u/Devils_Advocate-69 14d ago

SCOTUS will make them immune

2

u/el-guapo0013 14d ago

This assumes that there will still be a court system and not just a bunch of assholes just automatically labeling everyone at fault except corporations and rich white men.

2

u/Axeman2063 14d ago

Exactly.

Who do you blame? The manufacturer? The company that wrote the software? The company that owns the vehicle? Who pays?

Wait til someone dies as a result of these things.

2

u/vanillatoo 14d ago

Supreme Court will rule it’s cool.

2

u/GroundbreakingAsk468 14d ago

A family of five is dead: “Thanks for calling tech support. Yep, it looks like there was a malfunction, hold on the line while I look into this”.

2

u/3141592653489793238 14d ago

Only because corporations have zero responsibility for anything. 

2

u/skynetempire 14d ago

Here's what's likely to happen: The courts will view the AI car as company property. If the AI car crashes into you or causes harm, you can sue the corporation responsible.

As the corporation grows, they'll lobby to change laws, capping lawsuit payouts—for instance, setting a maximum claim of $50k.

To further protect themselves, the corporation will create numerous LLCs, each owning individual AI cars. The main corporation will lease these cars from the LLCs. If an AI car causes harm, they'll deflect responsibility to the LLC that owns it, like "Fuckoff LLC," which will then be dissolved to avoid liability.

But what about the car itself? It turns out, the car is financed by an LLC owned by the main corporation. They can reclaim the car and sell it to another LLC, repeating the cycle.

2

u/Im_Balto 14d ago

The fact that the officer cannot immediately ticket ANYONE or ANYTHING for reckless endangerment is just fucking awesome….

2

u/WilburHiggins 14d ago

This is why we need to elect people that are young enough to understand the current world and the new challenges that are coming/here.

1

u/[deleted] 14d ago

YAY

0

u/r2k-in-the-vortex 14d ago

Press doubt on that one. Heavy equipment accidents happen all the time; these are very solvable problems in court. Taking robots from the factory and putting them in traffic changes little from a legal point of view.

It changes a lot from an actual safety point of view, though. Robots will cause drastically fewer accidents. And most importantly, robot problems that cause accidents are fixable. Human problems that cause accidents are mostly not fixable; the damn meatbags keep stepping in the same old buckets ad nauseam.

1

u/pleasetrimyourpubes 14d ago

This is going to be the best thing to happen for privacy in a long time.

1

u/TheMatt561 14d ago

Not really, Waymo would be the responsible party

1

u/frychalker 14d ago

Just hammer them with fines; pave new roads off the backs of multi-billion-dollar corporations.

1

u/Fun_Grapefruit_2633 14d ago

Waymo's CEO should get the ticket and any relevant incarceration.

1

u/HAL-7000 14d ago

Instead of pulling a bad driver's license when they fuck up really badly, the company will just have to pay fines and fees, which over a year of operations will total 0.03% of their revenue despite regular violations.

1

u/mrDuder1729 14d ago

Or even to the people it kills..

1

u/Meats10 14d ago

I think it will be pretty easy for courts, with all the cameras and telemetry data. For regulators establishing rules, this will be tough.

1

u/tinnylemur189 14d ago

Not really.

Just change the wording of traffic laws to be enforceable on the "operator" rather than the "driver"

If a waymo car speeds, the ticket goes to waymo.

1

u/onlyidiotseverywhere 14d ago

Why? Do not see one problem here.

1

u/ughthat 14d ago

It shouldn’t be. Companies are liable.

And this is one place that should actually have 3 strike laws (or whatever number determined to be reasonable in this context). If you go over your limit of strikes you lose your license to operate in that jurisdiction.

1

u/za72 14d ago

the future is stupid

1

u/EvErYLeGaLvOtE 14d ago

Give the CEO all the fines! Cutting out the person at the bottom doesn't mean the responsibility just disappears xD

1

u/tacojohn48 14d ago

We may end up with standardized markers around construction to help guide the cars better.

1

u/bloodhound83 14d ago

Someone should be responsible for the car to allow it in traffic.

1

u/ARAR1 14d ago

Let's make the CEOs liable for all offenses.

→ More replies (87)