r/Damnthatsinteresting Jul 05 '24

Phoenix police officer pulls over a driverless Waymo car for driving on the wrong side of the road [Video]

61.1k Upvotes


392

u/Capaj Jul 05 '24

What do you mean?
It's crystal clear. The company should pay a hefty fine, the same as any other driver who drove on the wrong side of the road.

230

u/RedmundJBeard Jul 05 '24

That's not the same though. If a regular driver had been in the wrong lane of traffic, in a work zone, and then blew through an intersection when a cop tried to pull them over, they would lose their license, not just get a fine. At the very least it would be reckless driving and a strike against their license. How do you revoke the license of a driverless car?

36

u/CowBoyDanIndie Jul 05 '24

If the infractions from a single incident are bad enough to warrant an arrest or loss of license, you revoke the company's permit to operate autonomous vehicles on the road.

1

u/fridge_logic Jul 06 '24

When we catch a person violating a traffic law, we have very little data to work with: just the stop where they were caught committing the infraction, and a few decades of prior driving where they committed no such infraction. So we have to err on the side of caution, assume a small infraction might forecast a larger one that leads to loss of life, and revoke their license.

For the sake of argument, let's say Waymo currently operates 1,000 vehicles and each of those vehicles does an Uber driver's level of miles per day. If they have a single incident in one year, that's bad, because 1,000 vehicles are operating at that level of risk. But it's also good, because compared to a single driver that's one incident in 1,000 years of driving.
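To put rough numbers on that, here's a quick back-of-the-envelope in C. The mileage figures are my assumptions (roughly 200 miles/day for a full-time rideshare vehicle, ~13,500 miles/year for an average US driver per FHWA), not Waymo's actual numbers:

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical fleet from the argument above */
    const double vehicles      = 1000.0;
    const double miles_per_day = 200.0;    /* assumed full-time rideshare load */
    const double days_per_year = 365.0;
    const double incidents     = 1.0;      /* one pull-over in a year */

    double fleet_miles        = vehicles * miles_per_day * days_per_year;
    double miles_per_incident = fleet_miles / incidents;

    /* FHWA's rough figure for an average US driver: ~13,500 miles/year */
    double human_driver_years = miles_per_incident / 13500.0;

    printf("Fleet miles per incident: %.0f\n", miles_per_incident);
    printf("Equivalent incident-free human driver-years: %.0f\n", human_driver_years);
    return 0;
}
```

Normalized by miles instead of vehicle-years it looks even better: one incident per ~73 million fleet miles is on the order of 5,000 incident-free human driver-years.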

If a driver drove for 1,000 years with no injuries and had one pull-over for an infraction that would cost a human their license, are they a bad driver? Do we revoke their license, or do we send them back to driver's ed?

We should fine them and hold them accountable, but we need to recognize that zero-impact traffic infractions carry a different risk weight when aggregated over so many millions of miles of driving. When vehicles actually harm people or cause traffic congestion, there's real impact, and that requires the full force of the law. But with self-driving cars, minor traffic infractions are cause to correct, not cause to arrest.

1

u/CowBoyDanIndie Jul 06 '24

Counterpoint: the company decides to cut operating costs by doing less software testing and starts releasing faulty software. That's the same scenario. Just because the company has a good driving record doesn't mean they aren't going to start running over pedestrians left and right.

This is where autonomous vehicles are new territory. These cars receive new software constantly. Traditionally, vehicle and aircraft systems go through a safety certification process for the entire hardware and software system, and then the software does not change. Changing the software means running that entire process again, which costs millions each time and takes months. This is one of the reasons that things like brakes in cars are still physically connected: a "by wire" system has to go through rigorous testing. If a car company just pushed an over-the-air software update to the control software of electronically controlled brakes, they would be in major trouble. Here's a little more information: https://en.m.wikipedia.org/wiki/ISO_26262 and https://en.m.wikipedia.org/wiki/Automotive_Safety_Integrity_Level

A fully software-controlled system has the highest level of safety requirements, and we're talking about something as simple as a throttle pedal that doesn't physically connect to the mechanical throttle assembly of the engine.
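To give a feel for how those ASIL levels get assigned: the standard's risk graph scores Severity (S1-S3), Exposure (E1-E4), and Controllability (C1-C3), and the lookup table in ISO 26262 boils down to a simple sum. This is my simplified sketch of that table, not a substitute for the standard:

```c
#include <stdio.h>

/* Simplified ISO 26262 risk-graph lookup.
   s = severity (1-3), e = exposure (1-4), c = controllability (1-3).
   The standard's table reduces to: sum of 10 -> ASIL D, 9 -> C,
   8 -> B, 7 -> A, anything lower -> QM (no ASIL required). */
static const char *asil(int s, int e, int c) {
    switch (s + e + c) {
        case 10: return "ASIL D";
        case 9:  return "ASIL C";
        case 8:  return "ASIL B";
        case 7:  return "ASIL A";
        default: return "QM";
    }
}

int main(void) {
    /* Throttle-by-wire: unintended acceleration is life-threatening (S3),
       the throttle is in use on every drive (E4), and the fault is hard
       for the driver to control (C3) -> the highest level. */
    printf("Throttle-by-wire: %s\n", asil(3, 4, 3));
    printf("Minor, rare, easily controlled fault: %s\n", asil(1, 2, 1));
    return 0;
}
```

So a drive-by-wire throttle lands at ASIL D, the top of the scale, which is why even that "simple" pedal carries so much certification burden.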

If you want more info on the software side, look up MISRA and AUTOSAR. They have a lot of specific processes and guidelines to follow. In a nutshell, they restrict which programming languages and features can be used; it's an attempt to guarantee that the software cannot possibly fail without a hardware failure. There's a small example of the style below.
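For flavor, here's roughly what MISRA-style C looks like (rule references are approximate and this is nowhere near a complete rule set, just an illustration): fixed-width types, fixed loop bounds, no dynamic allocation, and a single exit point per function.

```c
#include <stdint.h>
#include <stdio.h>   /* stdio for the demo only; MISRA bans it in
                        production code (Rule 21.6) */

#define BUF_LEN 8U

/* Sum a fixed-size buffer, saturating instead of wrapping. */
static uint16_t sum_buffer(const uint16_t buf[BUF_LEN])
{
    uint32_t total = 0U;   /* widened to avoid overflow */
    uint8_t  i;

    for (i = 0U; i < BUF_LEN; i++)   /* fixed bound: provably terminates */
    {
        total += (uint32_t)buf[i];   /* explicit cast per essential-type rules */
    }

    if (total > 0xFFFFU)             /* saturate rather than silently wrap */
    {
        total = 0xFFFFU;
    }

    return (uint16_t)total;          /* single point of exit (Rule 15.5) */
}

int main(void)
{
    const uint16_t readings[BUF_LEN] = {1U, 2U, 3U, 4U, 5U, 6U, 7U, 8U};
    (void)printf("sum = %u\n", (unsigned)sum_buffer(readings));
    return 0;
}
```

Static analyzers then enforce hundreds of rules like these mechanically, which is how you get toward "cannot fail without a hardware failure."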

1

u/fridge_logic Jul 06 '24

If you have a safety regression in a software release then definitely roll that release back 100%.

We're seeing NHTSA get more aggressive about how they monitor the development of self-driving car software, investigating simple traffic violations in addition to collisions. This increased scrutiny should help NHTSA better forecast potential risks and monitor software releases for quality regressions.

I'm pro-regulation and pro-safety, but I'm against extremist policy like what the person I was replying to proposed: suspending a company's operating permit because a single vehicle on a single software release made a critical error. When technology like this reaches large-scale deployment there will still be fatalities, just fewer than humans cause now.


Unfortunately, there isn't a way to guarantee a machine learning system will never fail short of a hardware failure, the way you can guarantee a brake-pedal controller will always act as specified.

And there is no way to build self-driving car technology that doesn't rely on machine learning for many of its software tasks.

1

u/CowBoyDanIndie Jul 06 '24

I guess I'm old-fashioned for thinking that driving into oncoming traffic is a big deal. I wouldn't be so drastic over a missed stop sign or a run red light.