A Tesla was in its self-driving mode when it crashed into a parked patrol vehicle responding to a fatal crash in Orange County Thursday morning, police said.

The officer was on traffic control duty, blocking Orangethorpe Avenue in Fullerton for an investigation into a suspected DUI crash that left a motorcyclist dead around 9 p.m. Wednesday, when his vehicle was struck.

A Fullerton Police Department spokesperson said the officer was standing outside his vehicle around midnight when he saw a Tesla driving in his direction and not slowing down.

  • FiniteBanjo@lemmy.today · 5 months ago

    TBH, if this process worked a little faster, maybe evolution could remove all the AI tech bros from the gene pool.

  • NameTaken@lemmy.world · 5 months ago

    Ugh, I know people feel strongly about FSD and Tesla. As someone who uses it (and still pays attention, hands on the wheel, when it's activated): as soon as FSD sees anything resembling emergency lights, it beeps and clearly disengages. I'm not sure, but it's possible this person is just using Tesla as a scapegoat for their own poor driving. In my experience, though, it forces the driver to take control when emergency lights are recognized, specifically to avoid instances like this.

    • NotMyOldRedditName@lemmy.world · 5 months ago

      Assuming something was on, I'm not even convinced it was FSD; it could just as easily have been AP.

      The media and police get that wrong more often than right, and the article doesn't even specifically name either one.

    • Joelk111@lemmy.world · 5 months ago

      Doesn’t Tesla usually look at the logs for a situation like this, so we’ll know shortly?

      • Matty_r@programming.dev · 5 months ago

        “As you can see by looking at the logs, the FSD was disengaged 276ms prior to the crash, therefore the driver is at fault” /s

    • vxx@lemmy.world · 5 months ago

      Thanks for the tip, going to flash my blue flashlight at Teslas from now on.

      • NameTaken@lemmy.world · 5 months ago

        Yeah sure if that’s what makes you happy… 👍. Nothing like blinding random people in cars in your spare time.

        • vxx@lemmy.world · 5 months ago

          No, not the driver, the faulty sensors and programming that should’ve never been approved for the road.

          • NameTaken@lemmy.world · 5 months ago

            Wait, so how is it faulty sensors and bad programming if it disengages when emergency vehicles are present? You'd prefer it to stay on in emergency situations?

  • Flying Squid@lemmy.world · 5 months ago

    It really doesn’t help that the media isn’t putting “Self-Driving” Mode in quotes since it isn’t fucking self-driving.

    • Empricorn@feddit.nl · 5 months ago

      It’s “self-driving”, not “self-stopping”. Luckily the police were able to assist with cruiser-based rapid deceleration.

    • Lileath@lemmy.blahaj.zone · 5 months ago

      Technically it is self-driving but just in the sense that it doesn’t need any external power sources like horses to pull it.

      • Flying Squid@lemmy.world · 5 months ago

        Tesla calls it "Full Self Driving," and it's a lie. So capitalize it and put it in quotes, rather than calling it self-driving mode like that's an actual thing.

        • icy_mal@lemmy.world · 5 months ago

          The actual name, "Full Self Driving (Supervised)," is so shady. "Supervised" is just a less crappy-sounding way to indicate that you will have to take over and drive sometimes. So sometimes the car drives itself and sometimes you drive: partial self driving, partial human driving. I'm surprised they didn't call it "Partial Full Self Driving." That would certainly amp up the trolling factor and really separate out the true believers, who would come out defending it with Olympic-level mental gymnastics.

  • Buffalox@lemmy.world · 5 months ago

    I just heard from Enron Musk that it crashed into the patrol car way more safely than a human would have.
    Also, according to Enron Musk, Full Self Driving has been working since 2017 and is in such a refined state now that you wouldn't believe how gracefully it crashed into that patrol car. It was almost like a car ballet, ending in a small, elegant pirouette.

    As Enron Musk recently stated, in a few months we should have Tesla robotaxis in the streets, and you will be able to observe these beautiful events regularly yourself.

    Others say that's ridiculous, that he's just trying to save Enron, but it's too late for that.

    • brbposting@sh.itjust.works · 5 months ago

      All I do at night is open my garage door to let my car out. A few months later, here I am, a millionaire. Thank you, full self driving Roboenron 😍

      • Buffalox@lemmy.world · 5 months ago

        Yes, I remember that, and he has repeated it every year since: they'll be ready next year. But THIS year he changed his tune somewhat and claimed it was a matter of months.
        How is this con man not in jail?

  • Sludgehammer@lemmy.world · 5 months ago

    IMHO it's the flashing lights. I really think they overload the self-driving software somehow, and it starts ignoring changes in driving conditions (like, say, an emergency vehicle parked in the road).

    • Geyser@lemmy.world · 5 months ago

      I’ll bet you’re right that it’s the lights, but I don’t know about “overload” of anything.

      The problem with camera vision (vs. human vision or LiDAR) is poor dynamic range. Pointing a bright light at the camera, as happens with emergency vehicle lights, can cause auto-exposure to dim the whole image to compensate, so the camera no longer sees the vehicles themselves. Same thing as when you take a backlit photo and can't see the people.
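[Editor's note] Geyser's dynamic-range point can be sketched in a few lines. This is a toy illustration with invented brightness values, not a model of any real camera pipeline: one very bright source forces a naive auto-exposure to scale everything else toward black.

```python
import numpy as np

# Toy night scene in arbitrary linear brightness units:
# background (10), a dark parked car (30), a flashing emergency light (5000).
scene = np.array([10.0, 30.0, 5000.0])

def auto_expose(scene, display_max=255.0):
    """Naive auto-exposure: scale so the brightest pixel fits an 8-bit display."""
    return scene * (display_max / scene.max())

exposed = auto_expose(scene)
print(np.round(exposed, 1))
# The light pins the top of the range, while the car drops to ~1.5 —
# barely distinguishable from the ~0.5 background after 8-bit quantization.
```

Real pipelines are far more sophisticated (HDR stacking, local tone mapping), but the failure mode sketched here is the same: the exposure budget is spent on the bright source, and low-contrast objects near it vanish.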

  • BigMacHole@lemm.ee · 5 months ago

    That must have been SO scary for the cop! He wouldn’t know whether to shoot the car or the passenger!