The car came to rest more than 70 metres away, on the opposite side of the road, leaving a trail of wreckage. According to witnesses, the Model S burst into flames while still airborne. Several passersby tried to open the doors and rescue the driver, but they couldn’t unlock the car. When they heard explosions and saw flames through the windows, they retreated. Even the firefighters, who arrived 20 minutes later, could do nothing but watch the Tesla burn.

At that moment, Rita Meier was unaware of the crash. She tried calling her husband, but he didn’t pick up. When he still hadn’t returned her call hours later – highly unusual for this devoted father – she attempted to track his car using Tesla’s app. It no longer worked. By the time police officers rang her doorbell late that night, Meier was already bracing for the worst.

      • romantired@shibanu.app · 2 months ago

        I am one of those who do not participate in the circus performance. I just sit in the front row and watch the clowns.

            • AreaSIX @lemmy.zip · 2 months ago

              Obviously you don’t know what a ‘quote’ is. Carlin never said such a thing. I don’t need to look it up to know he never said that, because Carlin was perhaps the best comedy writer of all time, and your ‘quote’ seems like it’s engineered to be unfunny.

    • Part4@infosec.pub · 2 months ago

      Clearly this premise, upon which your further exchanges are based, is complete bullshit.

      You are a troll, presumably one for whom any response is a win. It gives you a little dopamine hit.

      What a pathetic place to get to - there are a million ways to get a dopamine hit less pathetic than this, including all of the major addictive drugs.

    • Echo Dot@feddit.uk · 2 months ago

      The very least he could do is not sell unsafe vehicles. It’s literally the very least he can do, but he can’t be arsed to do even that because of his ego. I condemn him for that.

  • TankovayaDiviziya@lemmy.world · 2 months ago

    News of malfunctioning Tesla cars and Musk going crazy is still not enough to crash Tesla stock to zero, which I am hoping will happen not just to inflict sorrow on Musk and his wealth, but so that I can bet against the stock 😂

  • stoy@lemmy.zip · 2 months ago

    I have never ridden in a Tesla, and from now on I plan on requesting a non-Tesla car whenever I have to take a taxi.

    Cars in general, and Teslas in particular, should have a standardized black-box data recorder whose logs third parties can open and read; we have had this kind of tech on aircraft for many decades.
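
    Purely as an illustration, here is a hypothetical sketch of what one standardized, third-party-readable log record could look like (the field names, units and JSON format are my own assumptions, not any existing automotive standard):

    ```python
    import json
    from dataclasses import dataclass, asdict

    # Hypothetical black-box event record; fields are illustrative assumptions only.
    @dataclass
    class BlackBoxRecord:
        timestamp_utc: str        # ISO 8601 timestamp of the sample
        vehicle_id: str           # anonymised vehicle identifier
        speed_kmh: float
        steering_angle_deg: float
        brake_pedal_pct: float
        accelerator_pct: float
        autopilot_engaged: bool
        doors_locked: bool
        airbag_deployed: bool

    record = BlackBoxRecord(
        timestamp_utc="2025-02-11T18:32:07Z",
        vehicle_id="anon-4711",
        speed_kmh=142.0,
        steering_angle_deg=-3.5,
        brake_pedal_pct=0.0,
        accelerator_pct=87.0,
        autopilot_engaged=True,
        doors_locked=True,
        airbag_deployed=False,
    )

    # Because the format is open, any investigator's tooling could parse it
    # without the manufacturer's cooperation.
    print(json.dumps(asdict(record), indent=2))
    ```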

    It is terrifying that Tesla can just say that there was no relevant data and the investigative agency will just accept that.

    I remember watching an episode of Air Crash Investigations where a plane crashed and the investigators could not find an immediate cause. The flight data recorder could be analysed far back, well before the accident flight, and it showed that a mount for the APU turbine had broken many flights earlier; the APU then broke free during the flight, causing the crash.

    It is not Tesla’s job to tell the investigators what is relevant and what is not; it is Tesla’s job to unlock all the data they have and send it to the investigators. If they can’t or won’t, then Tesla should lose the right to sell cars in Europe.

    • atrielienz@lemmy.world · 2 months ago

      Cars do have that in what amounts to a TCU or Telematics Control Unit. The main problem here isn’t whether or not cars have that technology. It’s about the relevant government agency forcing companies like Tesla (and other automakers) to produce that data not just when there’s a crash, but as a matter of course.

      I have a lot of questions about why Teslas are allowed on public roads when some of the models haven’t been crash tested. I have a lot of questions about why a company wouldn’t hand over data in the event of a crash without the requirement of a court order. I don’t necessarily agree that cars should be able to track us (if I buy it, I own it, and nobody should have that kind of data without my say-so). But since we already have cars that do phone this data home, local, state, and federal governments should have access to it. Especially when insurance companies are happy to use it to place blame in the event of a crash so they don’t have to pay out an insurance policy.

  • Jo Miran@lemmy.ml · 2 months ago

    I drive a BMW i4, and one of the reasons I prefer it is that it still uses a number of mechanical controls, like physical buttons and an actual door handle. I never trusted that flush handle from Tesla, even back when I liked Tesla.

    • teuniac_@lemmy.world · 2 months ago

      Other road users don’t have anything to do with it, though, including those who aren’t even driving.

  • firepenny@lemmy.world · 2 months ago

    Seems like a lot of this technology is very much untested, and there are too many variables for it to be ready to be out on the roads.

      • itsprobablyfine@sh.itjust.works · 2 months ago

        It’s been a nightmare seeing tech companies move into the utility space and act like they’re the smartest people in the room and the experts who have been doing it for 100 years are morons. “Move fast and break things” isn’t viable when you’re operating power infrastructure either. There’s a reason designs require the seal of a licensed engineer before they can be constructed. Applying a software-development mentality to any kind of engineering is asking for fatalities.

  • RunawayFixer@lemmy.world · 2 months ago (edited)

    FYI, some numbers. The Guardian article is still definitely worth reading; it just has no statistics.

    *Nationally (USA), Tesla drivers had 26.67 accidents per 1,000 drivers. This was up from 23.54 last year.

    The Ram and Subaru brands were again among the most accident-prone. Ram had 23.15 per 1,000 drivers while Subaru had 22.89.

    As of October 2024, there have been hundreds of documented nonfatal incidents involving Autopilot and fifty-one reported fatalities, forty-four of which NHTSA investigations or expert testimony later verified and two that NHTSA’s Office of Defect Investigations verified as happening during the engagement of Full Self-Driving (FSD).*

    https://www.forbes.com/sites/stevebanker/2025/02/11/tesla-again-has-the-highest-accident-rate-of-any-auto-brand/
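
    Quick arithmetic on those quoted rates, just to put them side by side:

    ```python
    # Accidents per 1,000 drivers, as quoted from the Forbes article above.
    tesla_now, tesla_prev = 26.67, 23.54
    ram, subaru = 23.15, 22.89

    print(f"Tesla year-over-year increase: {(tesla_now / tesla_prev - 1) * 100:.1f}%")  # ~13.3%
    print(f"Tesla vs Ram:    {(tesla_now / ram - 1) * 100:.1f}% higher")                # ~15.2%
    print(f"Tesla vs Subaru: {(tesla_now / subaru - 1) * 100:.1f}% higher")             # ~16.5%
    ```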

  • ComradeSharkfucker@lemmy.ml · 2 months ago (edited)

    The article does not actually answer why Tesla vehicles crash as much as they do, or how their crash frequency compares to other vehicles. It’s more about how scummy Tesla is as a company and how it withholds data from the public when that data could incriminate it.

        • dickalan@lemmy.world · 2 months ago

          Yeah, it’s because they didn’t put lidar on their fucking cars, because they’re cheap. It’s not a mystery; why don’t you know this?

    • GroundedGator@lemmy.world · 2 months ago

      In some ways that is the answer. Crashes keep happening because Tesla is not being held accountable by regulators: it is not reporting these incidents, and no one is exercising oversight to make sure the reporting matches reality.

      I think that over the years manufacturers have reported accurately because they generally do not want to be known as the car company that killed a child when it could have been prevented with a 50-cent bolt. As a result, regulators have been less hawkish. Of course, there are probably political donations in the US to help keep the wheels turning.

  • shiroininja@lemmy.world · 2 months ago

    Bad code. Owners used as guinea pigs. Cars not communicating with each other. Relying on just the car’s vision and location is stupid.

    • Andy@slrpnk.net · 2 months ago

      Also, not only do they rely on “just vision”, they crucially rely on real-time processing without any memory or persistent mapping.

      This, more than anything else, is what bewilders me.

      They could map an area and, when observing a construction hazard, save that data and share it with other vehicles so they can account for it when setting a route or anticipating the object. But they don’t. If a car drives past a hazard and then goes around the block, it has to figure out how to navigate that hazard again with no familiarity at all. That’s so foolish.
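
      A minimal sketch of the kind of shared hazard memory being described here, assuming a simple tile-keyed store that each car consults before routing (all names are hypothetical, not anything Tesla actually runs):

      ```python
      import time

      # Hypothetical fleet-shared hazard map, keyed by a coarse map tile.
      # A real system would need proper geohashing and a networked store;
      # this only illustrates the "remember and share what one car saw" idea.
      hazard_map: dict[tuple[int, int], list[dict]] = {}

      def tile(lat: float, lon: float) -> tuple[int, int]:
          """Quantise coordinates to roughly 100 m tiles (crude approximation)."""
          return (int(lat * 1000), int(lon * 1000))

      def report_hazard(lat: float, lon: float, kind: str) -> None:
          """A car that observes a hazard publishes it for the rest of the fleet."""
          hazard_map.setdefault(tile(lat, lon), []).append(
              {"kind": kind, "reported_at": time.time()}
          )

      def hazards_near(lat: float, lon: float, max_age_s: float = 3600.0) -> list[dict]:
          """A car planning a route asks what has recently been seen on this tile."""
          now = time.time()
          return [h for h in hazard_map.get(tile(lat, lon), [])
                  if now - h["reported_at"] <= max_age_s]

      # One car sees a construction zone; a second car approaching the same block
      # can anticipate it instead of rediscovering it from scratch.
      report_hazard(48.137, 11.575, "construction_zone")
      print(hazards_near(48.137, 11.575))
      ```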

      • Pup Biru@aussie.zone · 2 months ago

        And what’s even more ridiculous than that (imo) is that if every Tesla mapped the area, you’d get it from loads of different angles: no more “oops, one-off computer vision edge case”.

    • GenosseFlosse@feddit.org · 2 months ago (edited)

      If they used lidar, they would get speed and distance to surrounding objects, which seems like valuable data for a moving vehicle. With cameras you get a 2D picture, and software can only guesstimate distance by comparing multiple cameras.
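
      For context, this is roughly how a stereo camera pair estimates distance: depth falls out of the disparity between the two images (Z = f · B / d), so a single pixel of matching error shifts the estimate by metres at range. A simplified sketch, not Tesla’s actual pipeline:

      ```python
      def stereo_depth_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
          """Classic pinhole stereo relation: depth Z = f * B / d."""
          if disparity_px <= 0:
              raise ValueError("object not matched between cameras or at infinity")
          return focal_length_px * baseline_m / disparity_px

      # Illustrative numbers: ~1000 px focal length, cameras 30 cm apart.
      f_px, baseline = 1000.0, 0.30
      print(stereo_depth_m(f_px, baseline, 10.0))  # 30.0 m
      print(stereo_depth_m(f_px, baseline, 9.0))   # ~33.3 m: one pixel of error moves the estimate by metres
      ```

      Lidar, by contrast, measures time of flight directly, so its range estimate does not depend on matching pixels between two images.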

  • But_my_mom_says_im_cool@lemmy.world · 2 months ago

    Tesla’s garbage quality is sadly hurting the entire EV and self-driving industry. Self-driving cars will always have accidents, but a good self-driving company will use every single accident to ensure the same thing never happens again with its system. Humans can make the same error over and over, but once self-driving has been around a while, the rate of accidents caused by self-driving will drop more and more every year.

    • Ulrich@feddit.org · 2 months ago

      We’ll never have self-driving cars en masse, because for some reason society has accepted that humans make mistakes and sometimes people die, but it won’t accept the same from robots, even if they make far fewer mistakes.

  • This is the kind of shit that makes me worried even when I just see someone else driving one of these deathtraps near me while I am driving. They could explode or decide to turn into me on the highway or something. These days I think about this more than I think about Final Destination when I see a logging truck.

    • Joeffect@lemmy.world · 2 months ago

      It’s one of those rules you make for yourself when you drive…

      Like no driving next to people with dents…

      Or

      Stay away from trucks with random shit in the back not strapped down …

      No driving near new cars; either they’re brand new or the owner got one because they were in an accident, so best just be safe…

      So

      No driving near a Tesla…

  • some_guy@lemmy.sdf.org · 2 months ago

    Wait, I might know the answer. Is it because they don’t use LIDAR and they’re made by a company headed by some piece of shit who likes to cut costs? Haha, I was just guessing, but ok.