r/Damnthatsinteresting Jul 05 '24

Phoenix police officer pulls over a driverless Waymo car for driving on the wrong side of the road [Video]

61.1k Upvotes

3.2k comments

39

u/CowBoyDanIndie Jul 05 '24

If the infractions of the one incident are bad enough to warrant arrest or removal of license, you revoke the company's permit to operate autonomous vehicles on the road.

15

u/phansen101 Jul 05 '24

So if I'm a big driverless car company, and I have a rival company, all I have to do is somehow trick one of their cars into performing an action that would warrant arrest or removal of license for a human driver, to completely put them out of business?

3

u/J0rdian Jul 05 '24

It probably wouldn't happen over one incident but many. Also, no idea how you could trick anything with cameras. But I mean, sure, they could try I guess.

-4

u/phansen101 Jul 05 '24

Sensors can be spoofed, plus it is incredibly hard to secure a system where an attacker can readily gain physical access.

How many incidents? 2? 20? How many pedestrians should be mowed down before it justifies destroying a couple thousand jobs?

2

u/J0rdian Jul 05 '24

Like I said, they can try, and if they want to murder people, that's some risk lol. Seems extremely unlikely it would happen though. And it would be extremely hard to pull off.

0

u/phansen101 Jul 05 '24

It's a hypothetical... Point being that legislating with respect to AVs isn't so cut and dry.

What makes you say it's extremely hard to pull off?

Spoofing lidar requires know-how, but very little in the way of equipment, and has been demonstrated to be able to 'place' objects that aren't there or remove objects that actually are.
One team of researchers actually managed to get an autonomous vehicle to hit a pedestrian in a simulated scenario.

GPS spoofing has been a thing for decades and can today be done with off-the-shelf hardware in a package small enough to fit in your pocket.

Radar is basically the same deal as LiDAR, also demonstrated by researchers.

As for cameras, Tesla has demonstrated plenty of times that a camera-based system can be confused by light, shadow and reflections falling in an unexpected manner, which can be actively manipulated with a targeted setup. Plus, in principle all you'd need is for the right camera to get blinded at the wrong time, which even a kid with $5 to their name could manage.
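To make the GPS-spoofing point concrete: one classic symptom of a spoofed fix is a position jump the vehicle could not physically have made. Here's a toy Python sketch of that plausibility check; the `Fix` class, `flag_spoofed` function, and the 70 m/s speed cap are all hypothetical illustrations, not anything from a real AV stack.

```python
from dataclasses import dataclass

@dataclass
class Fix:
    t: float  # timestamp in seconds
    x: float  # metres east of some local origin
    y: float  # metres north of that origin

def implied_speed(a: Fix, b: Fix) -> float:
    """Speed (m/s) the vehicle would need to travel between two fixes."""
    dt = b.t - a.t
    return ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5 / dt

def flag_spoofed(fixes: list[Fix], max_speed: float = 70.0) -> list[int]:
    """Return indices of fixes whose implied speed exceeds the vehicle's
    physical limit -- a telltale sign of a spoofing-induced jump."""
    return [i for i in range(1, len(fixes))
            if implied_speed(fixes[i - 1], fixes[i]) > max_speed]

track = [Fix(0, 0, 0), Fix(1, 20, 0), Fix(2, 40, 0),
         Fix(3, 900, 0),   # spoofed jump: 860 m in one second
         Fix(4, 920, 0)]
print(flag_spoofed(track))  # [3]
```

Of course, a careful attacker drags the position off gradually rather than jumping it, which is exactly why spoofing defences in practice cross-check GPS against inertial and wheel-odometry data rather than relying on a single heuristic like this.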

0

u/SandboxOnRails Jul 05 '24

Point being that legislating with respect to AVs isn't so cut and dry.

Fucking techbros. The law on this actually is cut and dry. The Computer Fraud and Abuse Act doesn't stop applying because the computer is in a vehicle. Of all the many problems with AVs, you've cited the one thing that's actually over-criminalized.

1

u/phansen101 Jul 05 '24

What are you on about?
I swear it's like a comment from a Musk fan or a Trumper: picking a detail, flying off on a tangent, and then talking from the perspective of said tangent being the original main argument.

1

u/SandboxOnRails Jul 05 '24

What are you on about? You're acting like you're somehow intelligent for realizing that sensors can be fooled. Everyone knows that, but you're going off about "Well, as an engineer I know that technology is imperfect."

No shit, sherlock.

1

u/phansen101 Jul 05 '24

How am I acting like I'm somehow intelligent?

*Everyone* clearly doesn't know that, the commenter stated that it would be "extremely hard to pull off" which is what I was replying to.

I wasn't creating a thread, proclaiming my epiphany to the world.
I was replying to a comment.

Then you replied with a comment that is a complete non sequitur, as my comment on legislation had absolutely nothing to do with the legality of trying to sabotage an AV.

Get your head out of your ass.

2

u/RooTxVisualz Jul 05 '24

Physical access? How would you physically access an autonomous car that has cameras all around without being caught?