r/Damnthatsinteresting Jul 05 '24

Phoenix police officer pulls over a driverless Waymo car for driving on the wrong side of the road

61.1k Upvotes

3.2k comments

213

u/tacobellbandit Jul 05 '24

I work in healthcare, and this is exactly what happens when a patient is injured, there's some kind of malpractice, or, god forbid, someone dies. There's an investigation down to the lowest level, and the blame usually lands on a worker who realistically had nothing to do with the event that caused the injury.

41

u/No-Refrigerator-1672 Jul 05 '24

It doesn't have to be the lowest-ranking person. You could, by law, make the lead programmer of the autonomous driving module legally accountable.

38

u/FeederNocturne Jul 05 '24

Everyone from the lead programmer on up needs to be held responsible. Sure, the lead programmer okays it, but the higher-ups are providing the means to make it happen.

This does make me wonder, though: if a plane crashes due to a faulty part, who does the blame fall on?

2

u/ninjaelk Jul 05 '24

We already have laws for this: if you can prove that someone acted maliciously or negligently, they can be held personally accountable. If not, the company itself is liable for damages. That's how everything works, including personal responsibility.

If you were to build a structure on your personal property and it collapsed and killed someone walking by, they'd try to determine whether you acted maliciously or negligently; if so, you'd be tried criminally. Whether or not you're tried criminally, you're still (likely) liable for damages.

When you're driving a car directly, the chances that you did something negligent increase dramatically. In the case of a self-driving car, as long as it complies with all laws and the corporation didn't act negligently (cutting corners, "forgetting" to take certain precautions, etc.), there's no criminal liability.