TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • NotMyOldRedditName@lemmy.world · 8 days ago

    For what it’s worth, it really isn’t clear whether this is FSD or AP. The article constantly says “self driving” even for older collisions when it would definitely have been AP, and some crashes are even listed as AP if you click through to the links.

    So these may all be AP, or one or two might be FSD, it’s unclear.

    Every Tesla has AP as well, so the likelihood of that being the case is higher.

    • AA5B@lemmy.world · 8 days ago

      In this case, does it matter? Both are supposed to follow a vehicle at a safe distance.

      I’d be more interested in how it changes over time, as new software is pushed. While it’s important to know it had problems judging distance to a motorcycle, it’s more important to know whether it still does.

      • NotMyOldRedditName@lemmy.world · 8 days ago

        In this case, does it matter? Both are supposed to follow a vehicle at a safe distance

        I think it does matter. While both are supposed to follow at safe distances, the FSD stack does it in a completely different way. They haven’t really made any major updates to AP in years; all the focus has been on FSD. I think the only real changes it’s had for quite a while have been around making sure people are paying attention.

        AP looks at the world frame by frame, each individual camera on its own, while FSD takes the input of all cameras, turns it into a 3D vector space, and drives based on that. Doing it that way on highways as well as city streets is a pretty recent development; updates enabling it on highways only went out to all cars with FSD in the past few months. For a long time it was city streets only.

        I’d be more interested in how it changes over time, as new software is pushed.

        I think that’s why it’s important to make a real distinction between AP and FSD today (and specifically which FSD versions)

        They’re wholly different systems: one that gets older every day, and one that keeps getting better every few months. An article like this, grouping them together over a span of years, muddies the water on what progress, if any, has been made.

        • KayLeadfoot@fedia.io (OP) · 8 days ago

          Fair enough!

          At least one of the fatalities was Full Self-Driving (it was cited by name in the police reports). The remainder were Autopilot. So, both systems kill motorcyclists. Tesla requests that this data be redacted from its NHTSA reporting, which specifically makes it difficult for consumers to measure which system is safer, or whether incremental safety improvements are actually being made.

          You’re placing a lot of faith in the idea that the incremental updates are improvements without equivalent regressions. That data is specifically being concealed from you, and I think you should probably ask why. If there were good news behind those redactions, they wouldn’t be redactions.

          I didn’t publish the software version data point because I agree with AA5B: it doesn’t matter. I honestly don’t care how it works. I care that it works well enough to safely cohabit the road with my manual-transmission Cro-Magnon self.

          I’m not a “Tesla reporter,” I’m not trying to cover the incremental changes in their software versions. Plenty of Tesla fans doing that already. It only has my attention at all because it’s killing vulnerable road users, and for that analysis we don’t actually need to know which self-driving system version is killing people, just the make of car it is installed on.

          • NotMyOldRedditName@lemmy.world · 8 days ago

            I’d say it’s a pretty important distinction to know whether one or both systems have a problem, and how bad that problem is.

            Also are you referencing the one in Seattle in 2024 for FSD? The CNBC article says FSD, but the driver said AP.

            And especially back then, there’s also an important distinction in how they work.

            FSD on highways wasn’t released until November 2024, and even then not everyone got it right away. So even if FSD was enabled, the crash may have been under AP.

            Edit: Also, if it really was FSD (that 2024 crash would have had to happen on city streets, not a highway), that’s 1 motorcycle fatality in 3.6 billion miles. The other 4 happened over 10 billion miles. Is that not an improvement? (Edit again: I should say we can’t tell it’s an improvement yet, as we’d have to pass 5 billion miles, so the jury is still out, IF that crash really was on FSD.)

            Edit: I will cede though that as a motorcyclist, you can’t know what the Tesla is using, so you’d have to assume the worst.
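            A quick sketch of the per-mile comparison in the edit above, using the commenter’s rough mileage figures (estimates, not official data):

```python
# Rough figures from the comment above (commenter's estimates, not official data):
# 4 motorcycle fatalities over ~10 billion Autopilot miles, and
# 1 fatality (if that crash really was FSD) over ~3.6 billion FSD miles.
ap_rate = 4 / 10.0    # fatalities per billion miles on AP  -> 0.40
fsd_rate = 1 / 3.6    # fatalities per billion miles on FSD -> ~0.28
print(round(ap_rate, 2), round(fsd_rate, 2))  # -> 0.4 0.28
```

            With only a single FSD event, that second rate is statistically fragile, which is why the jury is still out.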

            • KayLeadfoot@fedia.io (OP) · 8 days ago

              Police report for 2024 case attached, it is also linked in the original article: https://www.opb.org/article/2025/01/15/tesla-may-face-less-accountability-for-crashes-under-trump/

              It was Full Self Driving, according to the police. They know because they downloaded the data off the vehicle’s computer. The motorcyclist was killed on a freeway merge ramp.

              All the rest is beyond my brief. Thought you might like the data to chew on, though.

              • NotMyOldRedditName@lemmy.world · 8 days ago

                The motorcyclist was killed on a freeway merge ramp.

                I’d say that means there’s a very good chance that, while FSD was enabled, the crash happened under the older AP mode of driving, as it wasn’t until November 2024 that highway driving was moved over to the new FSD neural-net code. I was wrong here: it actually was FSD at the time, it just wasn’t end-to-end neural nets then like it is now.

                Also yikes… the report says the AEB kicked in, and the driver overrode it by pressing on the accelerator!

    • psivchaz@reddthat.com · 8 days ago

      That’s not good though, right? “We have the technology to save lives, it works on all of our cars, and we have the ability to push it to every car in the fleet. But these people haven’t paid extra for it, so…”

      • NotMyOldRedditName@lemmy.world · 8 days ago

        Well, only 1 or 2 of those were in a time frame where I’d consider FSD superior to AP; it’s only recently that that’s likely the case.

        But to your point, at some point I expect Tesla to use the FSD software for AP, for the exact reasons you mentioned. My guess is they’d just disable left/right turns, so you couldn’t use it outside of straight stretches, like AP today.

    • dual_sport_dork 🐧🗡️@lemmy.world · 8 days ago

      I already do. Flip a coin: Heads, the car is operating itself and is therefore being operated by a moron. Tails, the owner is driving it manually and therefore it is being operated by a moron.

      Just be sure to carefully watch your six when you’re sitting at a stoplight. I’ve gotten out of the habit of sitting right in the center of the lane, because the odds are getting ever higher that I’ll have to scoot out of the way of some imbecile who’s coming in hot. That’s hard to do when your front tire is 24" away from the license plate of the car in front of you.

      • Lka1988@lemmy.dbzer0.com · 8 days ago

        For me it depends which bike I’m riding. If it’s my 49cc scooter, I’ll sit to the very right side of the lane for a quick escape while watching my mirrors like a hawk. On my XR500, I’ll just filter to the front (legal in Utah).

        • Korhaka@sopuli.xyz · 8 days ago

          I filter to the front on my leg powered bike, most traffic light setups here have a region for bikes at the front of the cars.

  • captainastronaut@seattlelunarsociety.org · 9 days ago

    Tesla self driving is never going to work well enough without sensors - cameras are not enough. It’s fundamentally dangerous and should not be driving unsupervised (or maybe at all).

    • ascense@lemm.ee · 9 days ago

      Most frustrating thing is, as far as I can tell, Tesla doesn’t even have binocular vision, which makes all the claims about humans being able to drive with vision only even more blatantly stupid. At least humans have depth perception. And supposedly their goal is to outperform humans?

      • TheGrandNagus@lemmy.world · 9 days ago

        Tesla’s argument of “well human eyes are like cameras therefore we shouldn’t use LiDAR” is so fucking dumb.

        Human eyes have good depth perception and absolutely exceptional dynamic range and focusing ability. They also happen to be linked up to a rapid and highly efficient super computer far outclassing anything that humanity has ever devised, certainly more so than any computer added to a car.

        And even with all those advantages humans have, we still crash from time to time and make smaller mistakes regularly.

        • NABDad@lemmy.world · 9 days ago

          They also happen to be linked up to a rapid and highly efficient super computer far outclassing anything that humanity has ever devised

          A neural network that has been in development for 650 million years.

        • bluGill@fedia.io · 9 days ago

          Anyone who has driven (or walked) into a sunrise/sunset knows that human vision is not very good. I’ve also driven in blizzards, heavy rain, and fog - all times when human vision is terrible. I’ve also not seen green lights (I’m colorblind).

          • explodicle@sh.itjust.works · 9 days ago

            Bro I’m colorblind too and if you’re not sure what color the light is, you have to stop. Don’t put that on the rest of us.

            • bluGill@fedia.io · 9 days ago

              I can see red clearly, so “not sure” means I can go.

              I’ve only noticed issues in a few situations. Once, driving at night, a weirdly aimed streetlight suddenly turned yellow; until it changed I didn’t even know there was a stoplight there. The other time I was making a left turn at sunset (sun behind me), and the green arrow came on but the red light remained on, so I couldn’t see it was time/safe to go until my wife alerted me.

          • TheGrandNagus@lemmy.world · 9 days ago

            Human vision is very, very, very good. If you think a camera installed to a car is even close to human eyesight, then you are extremely mistaken.

            Human eyes are so far beyond that it’s hard to even quantify.

            And bullshit on you not being able to see the lights. They’re specifically designed so that’s not an issue for colourblind people.

            • bluGill@fedia.io · 9 days ago

              And bullshit on you not being able to see the lights. They’re specifically designed so that’s not an issue for colour blind people

              Some lights are, but not all of them. I often say I go when the light turns blue, but not all lights have that blue tint, so I often cannot tell the difference between a white light and a green light by color (though white is not used in stoplights, and I can see red/yellow just fine). Where I live, all stoplights have green on the bottom, so that is a cheat I always use, but it only works if I can see the relative position; in an otherwise dark situation I see only a light in front of me and not the rest of the structure, so I cannot tell. I have driven where stoplights are not green-on-bottom, and I can never remember whether green is left or right.

              Even when they try, though, not all colorblindness is the same. A mitigation that works for one person may not work for another with a different form of colorblindness.

            • bluGill@fedia.io · 9 days ago

              Human vision is very, very, very good. If you think a camera installed to a car is even close to human eyesight, then you are extremely mistaken.

              Why are you trying to limit cars to just vision? That is all I have as a human, but robots have radar, lidar, radio, and other options. There is no reason they can’t use them and get information eyes cannot. Every option has limits.

    • scarabic@lemmy.world · 9 days ago

      These fatalities are a Tesla business advantage. Every one is a data point they can use to program their self-driving intelligence. No one has killed as many as Tesla, so no one knows more about what kills people than Tesla. We don’t have to turn this into a bad thing just because they’re killing people /s

    • Ledericas@lemm.ee · 8 days ago

      They originally had lidar, or radar, but Musk had them disabled in the older models.

      • NotMyOldRedditName@lemmy.world · 8 days ago

        They had radar. Tesla has never had lidar, but they do use lidar on test vehicles to ground truth their camera depth / velocity calculations.

    • KayLeadfoot@fedia.io (OP) · 9 days ago

      Accurate.

      Each fatality I found where a Tesla kills a motorcyclist is a cascade of 3 failures.

      1. The car’s cameras don’t detect the biker, or it just doesn’t stop for some reason.
      2. The driver isn’t paying attention to detect the system failure.
      3. The Tesla’s driver alertness tech fails to detect that the driver isn’t paying attention.

      Taking out the driver will make this already-unacceptably-lethal system even more lethal.

      • jonne@infosec.pub · 9 days ago
        4. Self-driving turns itself off seconds before a crash, giving the driver an impossibly short timespan to rectify the situation.
        • KayLeadfoot@fedia.io (OP) · 9 days ago

          … Also accurate.

          God, it really is a nut punch. The system detects the crash is imminent.

          Rather than automatically try to evade… the self-driving tech turns off. I assume it is to reduce liability or make the stats look better. God.

          • jonne@infosec.pub · 9 days ago

            Yep, that one was purely about hitting a certain KPI of ‘miles driven on autopilot without incident’. If it turns off before the accident, technically the driver was in control and to blame, so it won’t show up in the stats and probably also won’t be investigated by the NTSB.

              • jonne@infosec.pub · 9 days ago

                If they ever fixed it, I’m sure Musk fired whoever was keeping score. He’s going to launch the robotaxi stuff soon and it’s going to kill a bunch of people.

              • KayLeadfoot@fedia.io (OP) · 9 days ago

                NHTSA collects data if self-driving tech was active within 30 seconds of the impact.

                The companies themselves do all sorts of wildcat shit with their numbers. Tesla’s claimed safety factor right now is 8x human: driving with FSD is supposedly 8x safer than the average human driver. Of course that’s not true, at least not based on any data I’ve seen; they haven’t published data that makes it externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its 12x-safer-than-human system).

                • NotMyOldRedditName@lemmy.world · 8 days ago

                  So to drive with FSD is 8x safer than your average human driver.

                  WITH a supervising human.

                  Once it reaches a certain quality, it should be safer with a human properly supervising it, because if the car tries to do something really stupid, the human takes over. The vast, vast majority of crashes are from inattentive drivers, which is obviously a problem, and they need to keep improving the attentiveness monitoring, but supervised self-driving should be safer than a human alone because the system can also detect things the human would ultimately miss.

                  Now, if you take the human entirely out of the equation, I very much doubt that FSD is safer than a human in its current state.

        • NeoNachtwaechter@lemmy.world · 9 days ago

          Even when it is just milliseconds before the crash, the computer turns itself off.

          Later, Tesla brags that the autopilot was not in use during this (terribly, overwhelmingly) unfortunate accident.

      • br3d@lemmy.world · 9 days ago

        There are at least two steps before those three:

        -1. Society has been built around the needs of the auto industry, locking people into car dependency.

        0. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody.
        • grue@lemmy.world · 9 days ago
          0. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody

          That’s a good thing, because the alternative would be flipping the notion of property rights on its head. Making the owner not responsible for his property would be used to justify stripping him of his right to modify it.

          You’re absolutely right about point -1 though.

          • explodicle@sh.itjust.works · 9 days ago

            build, sell and drive

            You two don’t seem to strongly disagree. The driver is liable but should then sue the builder/seller for “self driving” fraud.

            • grue@lemmy.world · 9 days ago

              Maybe, if that two-step determination of liability is really what the parent commenter had in mind.

              I’m not so sure he’d agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and software running on the car in general) be forced to be Free Software and put it squarely and completely within the control of the vehicle owner.

                • grue@lemmy.world · 8 days ago

                  I mean, maybe, but previously when I’ve said that it’s typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the kool-aid that it’s somehow suddenly too dangerous to allow owners to control their property just because software is involved.

  • Gork@lemm.ee · 9 days ago

    Lidar needs to be a mandated requirement for these systems.

    • Echo Dot@feddit.uk · 9 days ago

      Or at least something other than just cameras. Even just adding ultrasonic sensors to the front would be an improvement.

    • ℍ𝕂-𝟞𝟝@sopuli.xyz · 9 days ago

      Honestly, emergency braking with LIDAR is mature and cheap enough at this point that it should be mandated for all new cars.

      • Nastybutler@lemmy.world · 9 days ago

        No, emergency braking with radar is mature and cheap. Lidar is very expensive and relatively nascent

    • TrackinDaKraken@lemmy.world · 9 days ago

      How about we disallow it completely, until it’s proven to be SAFER than a human driver. Because, why even allow it if it’s only as safe?

      • explodicle@sh.itjust.works · 9 days ago

        As an engineer, I strongly agree with requirements based on empirical results rather than requiring a specific technology. The latter never ages well. Thank you.

        • scarabic@lemmy.world · 9 days ago

          It’s hardly either / or though. What we have here is empirical data showing that cars without lidar perform worse. So it’s based in empirical results to mandate lidar. You can build a clear, robust requirement around a tech spec. You cannot build a clear, robust law around fatality statistics targets.

      • scarabic@lemmy.world · 9 days ago

        This sounds good until you realize how unsafe human drivers are. People won’t accept a self-driving system that’s only 50% safer than humans, because that will still be a self-driving car that kills 20,000 Americans a year. Look at the outrage right here, and we’re nowhere near those numbers. I also don’t see anyone comparing these numbers to human drivers on any per-mile basis. Waymos compared favorably to human drivers in their most recently released data. Does anyone even know where Teslas stand compared to human drivers?

        • NotMyOldRedditName@lemmy.world · 8 days ago

          There have been 54 reported fatalities involving their software over the years in the US.

          That’s around 10 billion AP miles (9 billion at the end of 2024), and around 3.6 billion on the various versions of FSD (beta / supervised). Most of the fatal accidents happened on AP, though, not FSD.

          Let’s just double those fatal accidents to 108 to account for the rest of the world, though that probably skews high; most of the fatal cases I’ve seen are in the US.

          That equates to 1 fatal accident every 125.9 million miles.

          The US average is 1.33 deaths per 100 million miles, so even doubling the deaths, it’s less than the current national average: the equivalent of 1.33 deaths every 167 million miles with Tesla’s software.

          Edit: I couldn’t math, fixed it. Also, FSD is only available in a few places, mainly North America and, just recently, China. I wish we had fatalities for FSD specifically.
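          The arithmetic above can be reproduced in a few lines (all figures are the commenter’s estimates, not official statistics):

```python
# Commenter's estimates: 54 reported US fatalities, doubled to ~108 to
# account for the rest of the world, spread over ~10 billion Autopilot
# miles plus ~3.6 billion FSD miles.
fatalities = 54 * 2                        # 108
miles = 10e9 + 3.6e9                       # 13.6 billion
miles_per_fatality = miles / fatalities    # ~125.9 million miles
# Rescale to the 1.33-deaths unit of the US national average
# (1.33 deaths per 100 million miles):
equiv_miles = miles_per_fatality * 1.33    # ~167 million miles
print(round(miles_per_fatality / 1e6, 1))  # -> 125.9
print(round(equiv_miles / 1e6))            # -> 167
```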

  • Buffalox@lemmy.world · 9 days ago

    Hey guys relax! It’s all part of the learning experience of Tesla FSD.
    Some of you may die, but that’s a sacrifice I’m willing to make.

    Regards
    Elon Musk
    CEO of Tesla

      • JasonDJ@lemmy.zip · 9 days ago

        News on the first mission: Meteoroid crashes into full flying SpaceX rocket, killing all aboard.

    • Gammelfisch@lemmy.world · 9 days ago

      +1 for you. However, replace “Regards” with the more appropriate words from the German language: the first starts with an S, the second with an H. I will not type that shit. Fuck Leon, and I hope the fucking Nazi-owned Tesla factory outside Berlin closes.

      • Buffalox@lemmy.world · 8 days ago

        Yes, I’m not writing that shit either, even in a sarcastic post. But I get your drift.
        On the other hand, since you are from Germany: VW Group is absolutely killing it on EVs lately, IMO.
        They totally dominate the top 10 EVs here in Denmark, with 7 of the 10 top-selling models!
        They are competitively priced, and they offer the best combination of quality and range in their price brackets.

  • Critical_Thinker@lemm.ee · 9 days ago

    Let’s get this out of the way: Felon Musk is a nazi asshole.

    Anyway, it should be criminal to publish these comparisons without showing human driver statistics for reference. I’m so sick of articles that leave out hard data. Show me deaths per billion miles driven for Tesla, for competitors, and for humans.

    Then there’s shit like the Boca Raton crash, where they mention the car going 100 in a 45 and killing a motorcyclist, then go on to say the only way to do that is to physically press the accelerator, which also disables emergency braking. Is it really a self-driving car at that point, when a user must actively intervene to disable portions of the automation? If you take an action to override stopping, it’s not self-driving; stopping is a key function of how self-driving tech self-drives. It’s not like the car swerved into another lane and nailed someone: the driver literally did this.

    Bottom line, I look at the media around self-driving tech as sensationalist. Danger drives clicks. Felon Musk is a nazi asshole, but self-driving tech isn’t made by the guy; it’s made by engineers. I wouldn’t buy a Tesla unless he had no stake in the business, but I do believe people are far more dangerous behind the wheel in basically all typical driving scenarios.

    • KayLeadfoot@fedia.io (OP) · 8 days ago

      In Boca Raton, I’ve seen no evidence that the self-driving tech was inactive. According to the government, it is reported as a self-driving accident, and according to the driver in his court filings, it was active.

      Insanely, you can slam on the gas in Tesla’s self-driving mode, accelerate to 100MPH in a 45MPH zone, and strike another vehicle, all without the vehicle’s “traffic aware” automation effectively applying a brake.

      That’s not sensationalist. That really is just insanely designed.

      • Critical_Thinker@lemm.ee · 8 days ago

        FTFA:

        Certain Tesla self-driving technologies are speed capped, but others are not. Simply pressing the accelerator will raise your speed in certain modes, and as we saw in the police filings from the Washington State case, pressing the accelerator also cancels emergency braking.

        That’s how you would strike a motorcyclist at such extreme speed, simply press the accelerator and all other inputs are apparently overridden.

        If the guy smashes the gas, then just like in cruise control I would not expect the vehicle to stop itself.

        The guy admitted to being intoxicated and held the gas down… what’s the self-driving contribution to that?

        • KayLeadfoot@fedia.io (OP) · 8 days ago

          I know what’s in the article, boss. I wrote it. No need to tell me FTFA.

          TACC stands for Traffic Aware Cruise Control. If I have a self-driving technology like TACC active, and the car’s sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input, because the gas pedal sets your maximum speed in cruise control, but the software should still function as advertised and not operate at the maximum speed.

          I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.

          Here’s the manual, if you’re curious. It doesn’t work in bright sunlight, fog, on excessively curvy roads (???), in situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that “The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control.” So it’s all of that, plus anything else: if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.

          https://www.tesla.com/ownersmanual/2012_2020_models/en_us/GUID-50331432-B914-400D-B93D-556EAD66FD0B.html#:~:text=Traffic-Aware Cruise Control determines,maintains a set driving speed.

          • Critical_Thinker@lemm.ee
            link
            fedilink
            English
            arrow-up
            0
            ·
            8 days ago

            So do you expect self-driving tech to override human action, or do you expect human action to override self-driving tech?

            I expect the human to override the system, not the other way around. Nobody claims to have a system that requires no human input, aside from limited and experimental implementations that are not road-legal nationwide. I kind of expect human input to override the robot, given the fear of robots making mistakes, even when the humans behind the wheel get in drunk and hold down the throttle until they turn motorcyclists into red mist. But that’s my assumption.

            With the Boca one specifically, the guy got in his car inebriated. That was the first mistake, and it caused a problem that should never have happened. If the car were truly automated, with no user input, this wouldn’t have happened. It wouldn’t have gone nearly 2.5x the speed limit. It would have braked long before hitting someone in the road.

            I have a ninja 650. We all know the danger comes from things we cannot control, such as others. I’d trust an actually automated car over a human driver always, even with limited modern tech. The second the user gets an input though? zero trust.

            • KayLeadfoot@fedia.ioOP
              link
              fedilink
              arrow-up
              0
              ·
              8 days ago

              The driver being drunk doesn’t mean the self-driving feature should not detect motorcycles. The human is a fallback to the tech. The tech had to fail for this fatal crash to occur.

              If the system is advertised as overriding human speed inputs (Traffic-Aware Cruise Control is supposed to brake when it detects traffic, regardless of pedal inputs), then it should function as advertised.

              Incidentally, I agree, I broadly trust automated cars to act more predictably than human drivers. In the case of specifically Teslas and specifically motorcycles, it looks like something is going wrong. That’s what the data says, anyhow. If the government were functioning how it should, the tech would be disabled during the investigation, which is ongoing.

    • Gladaed@feddit.org
      link
      fedilink
      English
      arrow-up
      0
      ·
      9 days ago

      “Critical Thinker” Yikes. Somehow the right made that a forbidden phrase in my mind, because they hide behind it as an excuse for asking terrible questions, etc.

      Anyway. Allegedly the statistics for self-driving cars are rather mediocre, but sadly I haven’t seen a good statistic on that either. The issue is that automatable tasks are lower-risk driving situations, so compiling a good statistic is near impossible. E.g., miles driven are heavily skewed when the system is only used on highways. There are no simple numbers that will tell you anything of worth.

      That being said, the title should be about the mistake that happened, without sweeping statements (i.e., self-driving is bad because motorcyclists die).

    • Nastybutler@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      9 days ago

      He may not be an engineer, but he’s the one who made the decision to use strictly cameras rather than lidar, so yes, he’s responsible for these fatalities that other companies don’t have. You may not be a fan of Musk, but it sounds like you’re a fan of Tesla

    • WanderingThoughts@europe.pub
      link
      fedilink
      English
      arrow-up
      0
      ·
      9 days ago

      That’s why Tesla’s full self driving is officially still a level 2 cruise control. But of course they promise to jump directly to level 4 soon™.

  • lnxtx (xe/xem/xyr)@feddit.nl
    link
    fedilink
    English
    arrow-up
    0
    ·
    9 days ago

    Stop dehumanizing drivers who killed people.
    The feature, wrongly called Full Self-Driving, must be supervised at all times.

    • Ulrich@feddit.org
      link
      fedilink
      English
      arrow-up
      0
      ·
      9 days ago

      I think it’s important to call out inattentive drivers while also calling out the systems and false advertising that may lead them to become less attentive.

      If these systems were marketed as “driver assistance systems” instead of “full self driving”, certainly more people would pay attention. The fact that they’ve been allowed to get away with this blatant false advertising is astonishing.

      They’re also obviously not adequately monitoring for driver attentiveness.

    • SouthEndSunset@lemm.ee
      link
      fedilink
      English
      arrow-up
      0
      ·
      9 days ago

      If you’re going to say your car has “full self driving”, it should have that, not “full self driving (but needs monitoring)” or “full self driving (but it disconnects 2 seconds before impact)”.

    • TexasDrunk@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      9 days ago

      I’m on mine far more often than I’m in a car. I think Tesla found out that I point and laugh at any cyber trucks I see at red lights while I’m out and is trying to kill me.

    • Psythik@lemm.ee
      link
      fedilink
      English
      arrow-up
      0
      ·
      8 days ago

      As someone who likes the open sky feeling, this is why I drive a convertible instead.

        • Excrubulent@slrpnk.net
          link
          fedilink
          English
          arrow-up
          0
          ·
          8 days ago

          I remember finding a motorcycle community on reddit that called themselves “squids” or “squiddies” or something like that.

          Their whole thing was putting road tyres on dirtbikes and riding urban environments like they were offroad obstacles. You know, ramping things, except on concrete.

          They loved to talk about how dumb & short-lived they were. I couldn’t ever find that group again, so maybe I misremembered the “squid” name, but I wanted to find them again, not to ever try it - fuck that - but because the bikes looked super cool. I just have a thing for gender-bent vehicles.

          • real_squids@sopuli.xyz
            link
            fedilink
            English
            arrow-up
            0
            ·
            8 days ago

            Calamari Racing Team. It’s mostly a counter-movement to r/Motorcycles, where most of the posters are seen as anti-fun. Being anti-that is their whole thing, not just a specific way to ride; they also have a legendary commenter who pays money for pics in full leather.

            • Excrubulent@slrpnk.net
              link
              fedilink
              English
              arrow-up
              0
              ·
              edit-2
              8 days ago

              That’s the one! Thanks, that was un-googleable for me.

              I guess the road-tyres-on-dirt-bikes thing was maybe a trend when I saw the sub.

        • KayLeadfoot@fedia.ioOP
          link
          fedilink
          arrow-up
          0
          ·
          8 days ago

          Bahaha, that one is new to me.

          Back when I worked on an ambulance, we called the no helmet guys organ donors.

          This comment was brought to you by PTSD, and has been redacted in a rare moment of sobriety.

          • mutual_ayed@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            0
            ·
            8 days ago

            I also rammed 10cc spikes in the back of the bus; the world needs organ donors, and motorcycles provide a great service for that. Hope your EMT career was short-lived but rewarding.