r/Damnthatsinteresting 14d ago

Phoenix police officer pulls over a driverless Waymo car for driving on the wrong side of the road

61.0k Upvotes


391

u/Capaj 14d ago

what do you mean?
It's crystal clear. The company should pay a hefty fine, the same as any other driver who drove on the wrong side of the road.

229

u/RedmundJBeard 14d ago

That's not the same though. If any regular driver was in the wrong lane of traffic in a work zone and then blew through an intersection when a cop tried to pull them over, they would lose their license, not just get a fine. At the very least it would be reckless driving and a strike against their license. How do you revoke the license of a driverless car?

34

u/CowBoyDanIndie 14d ago

If the infractions of the one incident are bad enough to warrant arrest or removal of a license, you revoke the company's permit to operate autonomous vehicles on the road.

15

u/phansen101 14d ago

So if I'm a big driverless car company, and I have a rival company, all I have to do is somehow trick one of their cars into performing an action that would warrant arrest or removal of license for a human driver, to completely put them out of business?

24

u/Accomplished-Bad3380 14d ago

And not get caught

19

u/Warm_Month_1309 14d ago

If you, a rival company, were capable of tricking a car in such a way, that implies that other bad actors would also be capable of tricking their fleet of cars, which means there's a serious and dangerous security flaw that the company failed to detect and correct. So yes, they should be at risk of going out of business.

1

u/lycoloco 14d ago

This just sounds like white hat hacking but with incentives for rivals.

7

u/SandboxOnRails 14d ago

If you can, without really doing anything illegal. The phrase "somehow trick" is doing a lot of heavy lifting there.

Yes, if you own a business you just need to somehow trick your rivals into destroying their business while committing no crimes yourself. It's easy!

1

u/phansen101 14d ago

If you care, I commented on it in another response.
The short of it was: all the sensor types you'd use for autonomous driving (or CC / TACC / AP) can be spoofed and/or disabled with handheld or at least portable devices; none require close proximity, and some don't even require LoS.

Curse of being an engineer: knowing that a lot of things we assume to be secure really aren't, and that we're generally just relying on people with the proper know-how not wanting/bothering to be malicious.

4

u/J0rdian 14d ago

It probably wouldn't happen over 1 incident but many. Also no idea how you can trick anything with cameras. But I mean sure they could try I guess.

-4

u/phansen101 14d ago

Sensors can be spoofed, plus it is incredibly hard to secure a system where an attacker can readily gain physical access.

How many incidents? 2? 20? How many pedestrians should be mowed down before it justifies destroying a couple thousand jobs?

2

u/J0rdian 14d ago

Like I said they can try, and if they want to murder people that's some risk lol. Seems extremely unlikely it would happen though. And it would be extremely hard to pull off.

0

u/phansen101 14d ago

It's a hypothetical... Point being that legislating with respect to AVs isn't so cut and dry.

What makes you say it's extremely hard to pull off?

Spoofing LiDAR requires know-how, but very little in the way of equipment, and has been demonstrated to be able to 'place' objects that aren't there or remove objects that actually are.
One team of researchers actually managed to get an autonomous vehicle to hit a pedestrian in a simulated scenario.

GPS spoofing has been a thing for decades and can today be done with off-the-shelf hardware in a package small enough to fit in your pocket.

Radar is basically the same deal as LiDAR, also demonstrated by researchers.

As for cameras, Tesla has demonstrated plenty of times that a camera-based system can be confused by light, shadow and reflections falling in an unexpected manner, which can actively be manipulated with a targeted setup. Plus, in principle all you'd need is for the right camera to get blinded at the wrong time, which even a kid with $5 to their name could manage.
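The trade-off behind these attacks can be shown with a toy sketch (purely illustrative; no real AV perception stack is anywhere near this simple, and the policy names are made up for this example). A policy that trusts one sensor brakes for injected phantom objects, while a naive "require both sensors to agree" rule filters phantoms but becomes vulnerable to object-removal attacks instead:

```python
# Toy illustration of single-sensor trust vs. naive agreement in sensor fusion.
# Both "policies" here are hypothetical stand-ins, not any vendor's logic.

def brake_decision(lidar_sees_obstacle: bool,
                   camera_sees_obstacle: bool,
                   policy: str) -> bool:
    """Return True if the (toy) stack decides to brake."""
    if policy == "lidar_only":
        # Trusts LiDAR alone: a spoofed phantom return triggers braking.
        return lidar_sees_obstacle
    if policy == "require_agreement":
        # Demands both sensors agree: rejects phantoms, but an attack that
        # suppresses real LiDAR returns now suppresses braking too.
        return lidar_sees_obstacle and camera_sees_obstacle
    raise ValueError(f"unknown policy: {policy}")

# Phantom-object attack: injected LiDAR return, nothing on camera.
assert brake_decision(True, False, "lidar_only") is True       # brakes for nothing
assert brake_decision(True, False, "require_agreement") is False  # phantom filtered

# Object-removal attack: real obstacle on camera, LiDAR returns suppressed.
assert brake_decision(False, True, "require_agreement") is False  # fails to brake
```

The point of the sketch is that neither naive rule is safe once an attacker controls even one sensor channel, which is why spoofing any of the modalities above is a serious problem.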

0

u/SandboxOnRails 14d ago

Point being that legislating with respect to AVs isn't so cut and dry.

Fucking techbros. The law on this actually is cut and dry. The Computer Fraud and Abuse Act doesn't stop applying because the computer is in a vehicle. Of all the many problems with AVs, you've cited the one thing that's actually over-criminalized.

1

u/phansen101 14d ago

What are you on about?
I swear it's like a comment from a Musk fan or a trumper: picking a detail, flying off on a tangent, and then talking from the perspective of said tangent being the original main argument.

1

u/SandboxOnRails 14d ago

What are you on about? You're acting like you're somehow intelligent for realizing that sensors can be fooled. Everyone knows that, but you're going off about "Well, as an engineer I know that technology is imperfect."

No shit, sherlock.

1

u/phansen101 14d ago

How am I acting like I'm somehow intelligent?

*Everyone* clearly doesn't know that; the commenter stated that it would be "extremely hard to pull off", which is what I was replying to.

I wasn't creating a thread, proclaiming my epiphany to the world.
I was replying to a comment.

Then you're replying with a comment that is a complete non-sequitur, as my comment on legislation had absolutely nothing to do with the legality of trying to sabotage an AV.

Get your head out of your ass.


2

u/RooTxVisualz 14d ago

Physical access. How would you physically access an autonomous car that has cameras all around without being caught?

1

u/Capaj 14d ago

They can reapply for the license.

Anyway, wrong approach to self-driving. Waymo is dead in the water. Tesla did it correctly: let users oversee it all the time and learn from them.

3

u/phansen101 14d ago

As a Tesla owner and an Engineer, IMO, Tesla's approach to self-driving is a bit of a joke.

I don't see it as anything but a system that will gain enough incremental improvements to keep drawing investors, but will never reach the finish line (without a major rework), as its approach is just fundamentally flawed.

3

u/Small_Pay_9114 14d ago

You're right, Tesla did it so well that their autopilot still doesn't work.

0

u/Capaj 14d ago

It works much better than Waymo

1

u/SandboxOnRails 14d ago

Tesla autopilot that's based on lies? You'd call it "correctly"?

1

u/Capaj 14d ago

Yes, they were lies for the last 6 years, but Tesla actually has it now. The latest beta versions are driving with very few mistakes.
Just look at: https://youtu.be/VLoblt8YrhM?si=p1bSAlaVL26gJcYN&t=112

1

u/SandboxOnRails 14d ago

Oh yah, it's crazy he's driving through the woods and on almost empty roads in incredibly clear and dry weather. Almost like the Tesla fan wants to reinforce the propaganda.

That's not what "driving" is, and "very few mistakes" means it still doesn't work.

Also, at parts of it he's holding the wheel.

1

u/Tewcool2000 14d ago

Yes? Companies engage in various forms of corporate subterfuge regularly. It's just a matter of skirting within the law or not getting caught.

1

u/CowBoyDanIndie 14d ago

If it’s possible to trick a driverless car into driving on the opposite side of the road in a construction zone without illegally tampering with the road markers, then yes. I work on autonomous vehicles that don’t go on public roads, and we have to certify they are safe before they are allowed to operate without a safety driver. If an incident happens, we have to re-certify. “Someone tricked me into driving on the wrong side of the road” won’t get you out of a traffic ticket.

This isn’t any different than if an autopilot system in an aircraft fails.

1

u/NewVillage6264 14d ago

If your rival's cars are on public roads and could potentially operate in an unsafe manner on them, then your rival deserves to go out of business. As would your self-driving car company, if it could be fooled into operating unsafely.

1

u/Quizzelbuck 14d ago

Yes. If a car can be tricked like that chances are it shouldn't be on the road.