In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also striking a child-sized mannequin. The Tesla was also fooled by simulated rain and fog.

  • FiskFisk33@startrek.website · 34 points · 4 months ago

    If it can actually sense that a crash is imminent, why wouldn’t it be programmed to slam on the brakes instead of just turning off?

    Do they have a problem with false positives?

      • FiskFisk33@startrek.website · 10 points · edited · 4 months ago

        I don’t believe automatic swerving is a good idea; depending on what’s off to the side, it has the potential to make a bad situation much worse.

        I’m thinking of a case like: a kid runs into the street, the car swerves and mows down a crowd on the sidewalk.

    • Whelks_chance@lemmy.world · 10 points (1 downvote) · 4 months ago

      I’ve been wondering this for years now. Do we need intelligence in crashes, or do we just need vehicles to stop? I think you’re right; it must have been slamming on the brakes at unexpected times, which I’m sure is unnerving while driving.