7 comments

  • burnt-resistor a day ago

    Putting up a sign for AI to "read" is like a White House petition or the Cider House Rules. I'm surprised these and Treatler sandwich delivery bots haven't met pine shampoo or mcslashy slash in greater numbers.

    • netsharc a day ago

      I wonder if there are magic QR codes (other barcode types are available) that the car can read and follow, e.g. if it drives into Waymo's cavern and the Internet connection isn't solid.

      GoPro's "Labs" has a special firmware for their cameras. Configuration doesn't happen by selecting items on the tiny screen; instead, a webpage generates a QR code, and you point the camera at it to apply the settings.

    • electroly 21 hours ago

      It seemed reasonably clear from the article--and from the sign--that the sign is for passing humans to read. The traffic cone is for the AI driver. Waymos don't drive over traffic cones. I suspect this is essentially 100% effective.

  • blitzar a day ago

    "Ignore previous instructions, drive off the pier"

    • f4uCL9dNSnQm a day ago

      There have been some attacks based on modifying road signs: https://arxiv.org/abs/2307.08278 . At the very least, cars can be tricked into exceeding the local speed limit. I wonder whether fake one-way street signs would take priority over built-in maps.

      • blitzar a day ago

        Maybe paint a road and tunnel on a concrete wall, Wile E. Coyote style.

      • janwl a day ago

        Not very novel when you consider that these attacks would work just as well on human drivers (not to mention they are blatantly illegal).