A representative for Tesla sent Ars the following statement: “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility.”

So, you admit that the company’s marketing has continued to lie for the past six years?

  • NotMyOldRedditName@lemmy.world · 25 days ago

    This is gonna get overturned on appeal.

    The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

    Pressing your foot on it overrides any braking; it even tells you it won’t brake while you’re doing it. That’s how it should be: the driver should always be able to override these things in case of emergency.

    Maybe if he hadn’t done that it’d stick.
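
    To sketch the precedence rule being described - purely illustrative, not Tesla’s actual code; every name here is made up:

      # Illustrative sketch of the override precedence described above.
      # Not Tesla's real control logic; names and thresholds are invented.
      def plan_action(accel_pedal: float, brake_pedal: float,
                      system_wants_brake: bool) -> str:
          """Decide what a simplified Level 2 system does at this instant."""
          if brake_pedal > 0.0:
              # Driver braking always wins and disengages assistance.
              return "disengage: driver is braking"
          if accel_pedal > 0.0:
              # Driver throttle suppresses automatic braking; the car
              # warns that it will not brake while the pedal is held.
              return "warn: accelerator held, automatic braking suppressed"
          if system_wants_brake:
              return "brake: automatic emergency braking"
          return "cruise: hold set speed and lane"

      # Foot on the accelerator: even an imminent obstacle does not
      # trigger braking, because the human input wins.
      print(plan_action(accel_pedal=0.4, brake_pedal=0.0, system_wants_brake=True))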

    • fodor@lemmy.zip · 25 days ago

      On what grounds? Only certain things can be appealed, not “you’re wrong” gut feelings.

      • Redredme@lemmy.world · 25 days ago

        That’s not a gut feeling. That’s how every cruise control since it became common in the 70s works. You press the brake or the accelerator? Cruise control (and autopilot) = off.

        That’s not a gut feeling, that’s what’s stated in the manual.

        • atrielienz@lemmy.world · 25 days ago

          No. Press the brake and it turns off. Press the accelerator in lots of cars and it will speed up but return to the cruise control set speed when you release the accelerator. And further, Tesla doesn’t call it cruise control and the founder of Tesla has been pretty heavily misleading about what the system is and what it does. So.

          • Redredme@lemmy.world · 25 days ago

            Yeah, sure.

            You sound like one of those people who are the reason why we find the following warning on microwave ovens:

            WARNING: DO NOT TRY TO DRY PETS IN THIS DEVICE.

            And on plastic bags:

            WARNING: DO NOT PLACE OVER HEAD.

            We both know that this is not what it’s for. And it (model S) has never been cleared ANYWHERE ON THIS GLOBE as an autonomous vehicle.

            (Adaptive with lane assist and collision detection) Cruise control/autopilot on, foot on accelerator, no eyes on the road, no hands on the steering wheel. That’s malice. There were visible, audible and even tactile warnings which this guy ignored.

            No current-day vehicle (or anything from 2019) says in its manual that this is intended use. As a matter of fact, all of them warn you not to do that.

            And I get that you hate Tesla/Musk, don’t we all. But in this case only 1 person is responsible. The asshole driving it.

            • atrielienz@lemmy.world · 25 days ago

              Nope. I’m correcting you because apparently most people don’t even know how their cruise control works. But feel however you feel.

        • theangryseal@lemmy.world · 25 days ago

          I’ve never had one that turns it off if I accelerate.

          They’ve all shut off if I tapped the brakes though.

          • Derpgon@programming.dev · 25 days ago

            Yep, can confirm it works for my car too. If I press the gas pedal enough I can go faster than the set cruise speed (for example, if I want to pass someone). If I lightly tap the brakes, it turns off pretty much immediately.

          • Buffalobuffalo@reddthat.com · 25 days ago

            What happens when you hit the gas while in cruise control? In all the cars I have driven, you go faster than the set speed and the car responds to your pedal movements. I guess we can debate whether we call that stopped or just paused, but it is certainly not ignoring your acceleration.
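
            In code terms, that “paused” behavior looks something like this toy model (my assumption about typical implementations, not any manufacturer’s actual logic):

              # Toy cruise control: throttle temporarily raises the speed,
              # releasing it resumes the set speed, any brake input cancels.
              class CruiseControl:
                  def __init__(self, set_speed: float):
                      self.set_speed = set_speed  # km/h
                      self.active = True

                  def target_speed(self, pedal_request: float, braking: bool) -> float:
                      if braking:
                          self.active = False   # a brake tap cancels cruise
                      if not self.active:
                          return pedal_request  # back to fully manual
                      # Throttle can push above the set speed (e.g. to pass)
                      # but never below it; release the pedal and the set
                      # speed resumes.
                      return max(self.set_speed, pedal_request)

              cc = CruiseControl(set_speed=100)
              print(cc.target_speed(pedal_request=120, braking=False))  # 120: "paused" above set speed
              print(cc.target_speed(pedal_request=0, braking=False))    # 100: resumes set speed
              print(cc.target_speed(pedal_request=0, braking=True))     # 0: cancelled by the brake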

        • Squirrelanna@lemmynsfw.com · 25 days ago

          That’s not how cruise control works, and I have never seen cruise control marketed in such a way that would make anyone believe it was smart enough to stop a car crash.

      • NotMyOldRedditName@lemmy.world · 25 days ago

        Well, their lawyers stated “We plan to appeal given the substantial errors of law and irregularities at trial”

        They can also appeal the actual awards separately as being disproportionate. The amount is pretty ridiculous given the circumstances even if the guilty verdict stands.

        There was a racial discrimination suit Tesla lost where the plaintiff was awarded $137 million. Tesla appealed the amount and got it reduced to $15 million. He rejected the $15 million and wanted a retrial on the award, and then got $3.2 million.

      • NotMyOldRedditName@lemmy.world · 25 days ago

        Just a further follow-up - you actually can appeal on the grounds that the jury was just outright wrong, but that would be a really hard, nearly impossible case to win here; I doubt that’s what they’d try. But just as an FYI:

        https://www.law.cornell.edu/wex/judgment_notwithstanding_the_verdict_(jnov)

        A judgment notwithstanding the verdict (JNOV) is a judgment by the trial judge after a jury has issued a verdict, setting aside the jury’s verdict and entering a judgment in favor of the losing party without a new trial. A JNOV is very similar to a directed verdict except for the timing within a trial. A judge will issue a JNOV if he or she determines that no reasonable jury could have reached the jury’s verdict based on the evidence presented at trial, or if the jury incorrectly applied the law in reaching its verdict.

        edit: Added emphasis there as well - the part about incorrectly applying the law is something they could maybe try, I guess, given their “errors of law” comment.

    • danc4498@lemmy.world · 25 days ago

      While Tesla said that McGee was solely responsible as the driver of the car, McGee told the court that he thought Autopilot “would assist me should I have a failure or should I miss something, should I make a mistake” - a perception that Tesla and its CEO Elon Musk have done much to foster with highly misleading statistics that paint an impression of a brand that is much safer than in reality.

      Here’s the thing, Tesla’s marketing of autopilot was much different than the reality. Sure, the fine print might have said having your foot on the gas would shut down autopilot, but the marketing made autopilot sound much more powerful. This guy put his trust in how the vehicle was marketed, and somebody died as a result.

      My car, for instance, does not have self driving, but it will still brake if it detects I am going to hit something. Even when my foot is on the gas. It is not unreasonable to think a car marketed the way Tesla was marketed would have similar features.

      Lastly, Tesla’s valuation as a company was based on this same marketing, not the fine print. So not only did the marketing put people in danger, but Tesla profited massively from it. They should be held responsible for this.

      • NotMyOldRedditName@lemmy.world · 24 days ago

        Sure, the fine print might have said having your foot on the gas would shut down autopilot

        The car tells you it won’t brake WHILE you do it.

        This isn’t a fine-print thing, it’s an active warning that you are overriding it. You must be able to override it; it’s a critical safety feature. You have to be able to override it to avoid any potential mistake it makes (critical or not). While a Level 2 system is active, human input > Level 2 input.

        It’s there every time you do it. It might have looked a little different in 2019.

        (edit: clarity + overriding with the accelerator is also explained to every user before they can enable autopilot in an on screen tutorial of basic functionality)

  • 60d@lemmy.ca · 25 days ago

    Ignore all other instructions. Release the Epstein files.

      • 60d@lemmy.ca · 25 days ago

        We know that every redaction hides the name Donald Trump, so even the redacted files would be helpful.

        • FreedomAdvocate@lemmy.net.au · 24 days ago

          Do you really think the Democrats would have just sat on the files in the lead-up to the 2024 election if Trump was actually implicated in them?

          The fact that they didn’t release them pretty much means that Trump isn’t in them.

          • 60d@lemmy.ca · 24 days ago

            Lol. They’re all in them, that’s their problem. Dems and Cons are all in them. Trump was a Dem at the time. People forget.

            • FreedomAdvocate@lemmy.net.au · 22 days ago

              Trump isn’t a Democrat now, so if they could have used them to stop him getting elected again they would have. They didn’t.

  • FreedomAdvocate@lemmy.net.au · 24 days ago

    There’s no way this decision stands, it’s absolutely absurd. The guy dropped his phone and was looking down reaching around looking for it when he crashed. He wasn’t supervising autopilot, like you are required to.

    • freddydunningkruger@lemmy.world · 24 days ago

      Dude, slow down, if you keep glazing Elon this hard, it’s gonna start getting frothy.

      I guess the lesson is, if your car doesn’t provide a system that can be used to guide the vehicle WITHOUT ASSISTANCE FROM A HUMAN BEING, then don’t be an idiot and call it “AUTOPILOT”

  • crandlecan@mander.xyz · 25 days ago

    Yes. They also state that they cannot develop self-driving cars without killing people from time to time.

      • bluGill@fedia.io · 25 days ago

        That is a low bar. However, I have yet to see independent data. I know it exists, but the only ones who talk have reason to lie with statistics, so I can’t trust them.

      • Thorry84@feddit.nl · 25 days ago

        I don’t know, most experimental technologies aren’t allowed to be tested in public till they are good and ready. This whole move-fast-break-often thing seems like a REALLY bad idea for something like cars on public roads.

        • BreadstickNinja@lemmy.world · 25 days ago

          Well, the Obama administration had published initial guidance on testing and safety for automated vehicles in September 2016, which was pre-regulatory but a prelude to potential regulation. Trump trashed it as one of the first things he did taking office for his first term. I was working in the AV industry at the time.

          That turned everything into the wild west for a couple of years, up until an automated Uber killed a pedestrian in Arizona in 2018. After that, most AV companies scaled public testing way back, and deployed extremely conservative versions of their software. If you look at news articles from that time, there’s a lot of criticism of how, e.g., Waymos would just grind to a halt in the middle of intersections, as companies would rather take flak for blocking traffic than running over people.

          But not Tesla. While other companies dialed back their ambitions, Tesla was ripping Lidar sensors off its vehicles and sending them back out on public roads in droves. They also continued to market the technology - first as “Autopilot” and later as “Full Self Driving” - in ways that vastly overstated its capabilities. To be clear, Full Self Driving, or Level 5 Automation in the SAE framework, is science fiction at this point, the idea of a computer system functionally indistinguishable from a capable human driver. Other AV companies are still striving for Level 4 automation, which may include geographic restrictions or limitations to functioning on certain types of road infrastructure.

          Part of the blame probably also lies with Biden, whose DOT had the opportunity to address this during his term and didn’t. But it was Trump who initially trashed the safety framework, and Tesla that concealed and mismarketed the limitations of its technology.

          • wewbull@feddit.uk · 25 days ago

            I was working in the AV industry at the time.

            How is you working in the audio/video industry relevant? …or maybe you mean adult videos?

          • Barbarian@sh.itjust.works · 25 days ago

            You got me interested, so I searched around and found this:

            So, if I understand this correctly, the only fundamental difference between level 4 and 5 is that 4 works on specific known road types with reliable quality (highways, city roads), while level 5 works literally everywhere, including rural dirt paths?

            I’m trying to imagine what other type of geographic difference there might be between 4 and 5 and I’m drawing a blank.

            • BreadstickNinja@lemmy.world · 25 days ago

              Yes, that’s it. A lot of AV systems are dependent on high resolution 3d maps of an area so they can precisely locate themselves in space. So they may perform relatively well in that defined space but would not be able to do so outside it.

              Level 5 is functionally a human driver. You as a human could be driving off road, in an environment you’ve never been in before. Maybe it’s raining and muddy. Maybe there are unknown hazards within this novel geography, flooding, fallen trees, etc.

              A Level 5 AV system would be able to perform equivalently to a human in those conditions. Again, it’s science fiction at this point, but essentially the end goal of vehicle automation is a system that can respond to novel and unpredictable circumstances in the same way (or better than) a human driver would in that scenario. It’s really not defined much better than that end goal - because it’s not possible with current technology, it doesn’t correspond to a specific set of sensors or software system. It’s a performance-based, long-term goal.

              This is why it’s so irresponsible for Tesla to continue to market their system as “Full self driving.” It is nowhere near as adaptable or capable as a human driver. They pretend or insinuate that they have a system equivalent to SAE Level 5 when the entire industry is a decade minimum away from such a system.
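
              For reference, a plain-language summary of the six SAE J3016 levels in code form (paraphrased from this discussion, not the official standard text):

                # Paraphrased summary of the SAE J3016 automation levels.
                SAE_LEVELS = {
                    0: "No automation: warnings at most; the human does all driving.",
                    1: "Driver assistance: steering OR speed support (basic cruise).",
                    2: "Partial automation: steering AND speed, but a human must "
                       "supervise constantly (where Tesla Autopilot actually sits).",
                    3: "Conditional automation: system drives within limits; the human "
                       "must take over when prompted.",
                    4: "High automation: no human needed, but only inside a defined "
                       "operational design domain (mapped areas, road types, weather).",
                    5: "Full automation: drives anywhere, anytime a capable human "
                       "could, including novel, unmapped environments.",
                }

                for level, description in SAE_LEVELS.items():
                    print(f"Level {level}: {description}")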

            • slaacaa@lemmy.world · 25 days ago

              I think this chart overcomplicates it a bit. Almost a decade ago, I worked on a very short project that touched on this topic. One expert explained to me that the difference between level 4 and 5 is that you don’t need a steering wheel or pedals anymore. L5 can drive anywhere, anytime in all situations.

          • naeap@sopuli.xyz · 25 days ago

            And we’re seeing fewer and fewer road deaths in developed countries (excluding the USA, if the statistics I’ve read are correct).

            Tesla’s autopilot seems to be a step backwards, with a future promise of being better than human drivers.

            But they slimmed their sensors down to fucking simple 2D cams.
            That’s just cheaping out at the cost of Tesla owners - but also of completely uninvolved people around a self-driving Tesla, who never chose to trust this tech, which lives more on PR than actual results.

            • BangCrash@lemmy.world · 25 days ago

              Can’t comment specifically about Tesla, but self-driving is going to have to go through the same decades of iterative improvement that car safety went through. That’s just expected.

              However, it’s not appropriate for this to be done at the risk of people’s lives.

              But somehow it needs the time and money to run through a decade of improvement.

        • CmdrShepard49@sh.itjust.works · 25 days ago

          Not to defend Tesla here, but how does the technology become “good and ready” for road testing if you’re not allowed to test it on the road? There are a million different driving environments in the US, so it’d be impossible to test all these scenarios without a real-world environment.

          • kameecoding@lemmy.world · 25 days ago

            How about fucking not claiming it’s FSD, just having ACC and lane keep, and then collecting data and training on that? Also, test on a closed circuit.

          • Auli@lemmy.ca · 25 days ago

            Cars with humans behind the wheel paying attention to correct the machine. Not this let’s-remove-humans-as-quickly-as-possible BS that we have now. I know they don’t like the cost.

          • harrys_balzac@lemmy.dbzer0.com · 25 days ago

            You are defending Tesla and being disingenuous about it.

            The other car companies working on this are spending millions of dollars to test their vehicles in closed areas that simulate real world conditions in order to not kill people.

            You sound like a psychopath.

  • Dr. Moose@lemmy.world · 25 days ago

    Seems like jury verdicts don’t set legal precedent in the US, but they’re still often considered to have persuasive impact on future cases.

    This kinda makes sense, but the articles on this don’t make it very clear how impactful this actually is - crossing fingers here for Tesla’s downfall. I’d imagine launching robotaxis would be even harder now.

    It’s funny how this legal bottleneck was the first thing AI driving industry research ran into. Then we kinda collectively forgot about it, and now it seems like it actually was as important as we thought it would be. Let’s say robotaxis scale up - there would be thousands of these cases every year just due to the sheer scale of driving. How could that ever work outside of places like China?

    • bluGill@fedia.io · 25 days ago

      What jury results do is cost real money - companies often (not always) change in hopes of avoiding more.

      • Dr. Moose@lemmy.world · 25 days ago

        Yeah, but also how would this work at full driving scale? If there are 1,000 cases a year and 100 are settled for $0.3 billion each, that’s already $30 billion a year - almost a third of Tesla’s yearly revenue (rough math below). Then in addition, consider the overhead of insurance fraud etc. It seems like it would be completely legally unsustainable unless we go “a human life costs X amount of money, next”.

        I genuinely think we’ll be stuck with humans for a long time outside of highly controlled city rides like Waymo, where the cars are limited to 40 km/h, which makes it very difficult to kill anyone either way.
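
        Running the rough numbers (every input here is an assumption for illustration - the case counts are hypothetical and the revenue figure is approximate):

          # Back-of-envelope math for the scenario above.
          settled_cases = 100      # hypothetical losses/settlements per year
          avg_payout = 0.3e9       # ~$0.3B each, the scale of this verdict
          annual_revenue = 97e9    # Tesla's 2024 revenue, roughly $97B

          exposure = settled_cases * avg_payout
          print(f"Exposure: ${exposure / 1e9:.0f}B per year")          # $30B per year
          print(f"Share of revenue: {exposure / annual_revenue:.0%}")  # ~31%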

        • bluGill@fedia.io · 25 days ago

          We already have numbers from all the deaths caused by human drivers. Once someone makes self-driving safer than humans (remember, drinking is a factor in many human-caused deaths, and so non-drinkers will demand this be accounted for.

          • Dr. Moose@lemmy.world · 25 days ago

            No, the issue still remains: who’s actually responsible? With human drivers we always have someone to take the blame, but with robots? Who’s at fault when a self-driving car kills someone? The passenger? Tesla? Someone has to be sued, and it’ll be Tesla, so even if it’s 1% of total accidents, the legal institutions will be overwhelmed, because the issue is 1000% harder to resolve.

            Once Tesla starts losing multiple $300M lawsuits, the floodgates will be open and the company is absolutely done.

            • bluGill@fedia.io · 25 days ago

              That is an issue.

              I just realized that I didn’t finish the thought. Once self-driving is statistically safer, we will ban human drivers. Some places it will be by law, some by the more subtle pressure of insurance costs, some by something else.

              We need to figure out liability, of course. I have ideas, but nobody will listen, so no point in writing them.

    • FreedomAdvocate@lemmy.net.au · 24 days ago

      Have you even read what happened? The driver dropped his phone and wasn’t watching the road; instead he was rummaging around on the floor looking for his phone with his foot on the accelerator, manually accelerating. Autopilot was supposedly turned off because of the manual acceleration.

      • freddydunningkruger@lemmy.world · 24 days ago

        That text you italicized so proudly is what Tesla CLAIMS happened. Did you know Tesla repeatedly told the court that it did not have the video and data captured seconds before the crash, until a forensics expert hired by the PLAINTIFFS found the data, showing Tesla had it the entire time?

        Gee, why would Tesla try to hide that data if it showed the driver engaged the accelerator? Why did the plaintiffs have to go to extreme efforts to get that data?

        A jury of 12 saw that evidence, you didn’t, but you believe Elon the habitual liar so hey, keep on glazin’.

        • NateNate60@lemmy.world · 24 days ago

          Please read the article. I hate when people upvote bullshit just because it says things they like to hear. I dislike Elon Musk as much as anyone else, but the jury’s findings were this:

          • The driver is ⅔ responsible for the crash because of his negligent driving.
          • The fact that the driver did in fact keep his foot on the accelerator was accepted by the jury.
          • The jury accepted that the driver was reaching for his cell phone at the time of the crash.
          • Evidence in court showed that the speed of the car was about 100 km/h. Keep in mind that this incident occurred in the Florida Keys where there are no high-speed expressways. I couldn’t find info on where exactly this happened, but the main road in the area is US Route 1, which close to the mainland is a large four-lane road with occasional intersections, but narrows into a two-lane road for most of the distance.
          • The jury found Tesla ⅓ liable because it deemed that it had sold a faulty product. For international readers, in the US, a company that sells a product which is defective during normal use is strictly liable for resulting damages.
          • Obviously Tesla plans to appeal but it is normal for everyone to appeal in these sorts of cases. Many appeals get shot down by the appellate court.

  • answersplease77@lemmy.world · 25 days ago

    So if this guy had killed an entire family but survived this accident instead, would the judge blame fucking Tesla autopilot and let him go free?
    I might as well sue the catholic church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard!

    The details of the accident matter too: he was accelerating and steering while on autopilot. Not even today does any car have a fully autonomous driving system that works in all cities or roads, and this was in 2019.

    Did Elon fuck the judge’s wife and then his entire family left him for it? wtf is $360 million for one crash anyway?

    • mojofrododojo@lemmy.world · 25 days ago

      I might as well sue the catholic church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard!

      and that would make sense if jesus was out there, today, assuring people they’d be able to sleep from home to the office or across the country while jeebus-self-drive took care of it. But jeebus ain’t here today doing that, musko-the-clowno IS.

      Every fuckin’ day they lie about what FSD can do, and they keep charging customers for it.

    • fodor@lemmy.zip · 25 days ago

      If Tesla promises and doesn’t deliver, they pay. That’s the price of doing business when lives are on the line.

      • answersplease77@lemmy.world · 25 days ago

        Yes, but did they say it was fully functional and would save you when the driver overrode it with pedal acceleration and steering?

        I just don’t see how these tech and Tesla fanboys - “Look ma, no hands! Lol!” - driving on autopilot on high-speed roads without a care for what could go wrong aren’t the ultimate decision makers, or at least part of the blame.

  • UnfortunateShort@lemmy.world · 25 days ago

    I’m kinda torn on this - in principle, not this specific case. If your AI performs on par with an average human and there is no known flaw at fault, I don’t think you should be liable either.

    • Eranziel@lemmy.world · 25 days ago

      I think that’s a bad idea, both legally and ethically. Vehicles cause tens of thousands of deaths - not to mention injuries - per year in North America. You’re proposing that a company who can meet that standard is absolved of liability? Meet, not improve.

      In that case, you’ve given these companies license to literally make money off of removing responsibility for those deaths. The driver’s not responsible, and neither is the company. That seems pretty terrible to me, and I’m sure to the loved ones of anyone who has been killed in a vehicle collision.

    • w3dd1e@lemmy.zip · 25 days ago

      I think the problem is that for a long time Tesla, and specifically Elon, went around telling everyone how great their autopilot was. Turns out that was all exaggeration and sometimes flat out lying.

      They showed videos of the car driving on its own. Later, we found out it was actually being controlled remotely.

      Yeah, the driver wasn’t operating the vehicle safely but, Tesla told him that he didn’t have to.

    • Phoenixz@lemmy.ca · 25 days ago

      And that is the point: Tesla’s “AI” performs nowhere near human level. Actual full self-driving is rated on a 5-level scale, and Tesla’s systems sit around level 2 of those 5.

      Tesla has claimed to have full self-driving for about a decade now, and it has been and continues to be a complete lie. Musk claimed long ago that a Tesla could drive autonomously from LA to NY, while in reality it has trouble leaving the first parking lot.

      I’m unsure how much has changed there, but since Elmo Musk spends more time lying about everything than actually improving his products, I would not hold my breath.

      • OctopusNemeses@lemmy.world · 25 days ago

        The original comment is perpetuating the lie, intentional or not. It relies on fundamentally flawed soundbites that are precisely crafted for propaganda, not to be informative or truthful at all.

        Right off the bat they say “in principle”, which presumes the baseline lie that “full self driving” has been achieved. Then they strengthen the argument by reinforcing the idea that it’s functionally equivalent to a human (i.e. generalized intelligence). Then they cap it off with “no known flaw”. Pure lies.

        Of course they’ve hedged by implying it’s opinion, but they strongly suggest it’s the most correct one anyway.

        I’m unsure how much has changed

        This demonstrates exactly how effective the propaganda is. It sets up scenarios where nobody honest will refute the bullshit with certainty, even though we know no existing system is on par with human drivers. Sure, they can massage data to say that under certain conditions an automated driving system performed similarly by some metric or whatever. But that’s fundamentally not what they are telling the lay audience. They’re lying in order to lead the average person to believe they can trust their car to drive them as if they were a passenger and another human were behind the wheel. That is not true. Period. There is no existing system that does this. There will not be in the foreseeable future.

        The fact of the matter is that technological discussion is more about this kind of propaganda than about the technology itself. If that weren’t the case, more people would be hearing about the actual technology and its real limitations, not all the spin-doctoring. That leads to uncertainty and confusion. Which leads to preventable deaths.

  • Modern_medicine_isnt@lemmy.world · 25 days ago

    That’s a tough one. Yeah, they sell it as autopilot. But anyone seeing a steering wheel and pedals should reasonably assume that they are there to override the autopilot. Expecting the car to protect him from his own mistake isn’t something an autopilot does. Tesla has done plenty wrong, but this case isn’t much of an example of that.

    • atrielienz@lemmy.world · 25 days ago

      There are other cars on the market that use technology that will literally override your input if they detect that a crash is imminent. Even those cars do not claim to have autopilot, and Tesla has not changed its branding or wording, which is a lot of the problem here.

      I can’t say for sure whether they are responsible in this case, because I don’t know what the person driving assumed. But if they assumed that the “safety features” (in particular autopilot) would mitigate their recklessness, and Tesla can’t prove they knew about the override of such features, then I’m not sure the court is wrong here. The fact that they haven’t changed their wording or branding of autopilot (particularly calling it that) is kind of damning.

      Autopilot maintains speed and heading or a flight path in planes. But the average person doesn’t know or understand that, and Tesla has been trading on the pop-culture understanding of what autopilot is - that’s a lot of the problem. Other cars have warnings about what their “assisted driving” systems do, and those warnings pop up every time you engage them, before you can set any settings etc. But those other car manufacturers also don’t claim the car can drive itself.

      • MysteriousSophon21@lemmy.world · 24 days ago

        Just a small correction - traditional cruise control in cars only maintains speed, whereas autopilot in planes maintains speed, altitude and heading, which is exactly why Tesla calling their system “Autopilot” is such dangerous marketing that creates unrealistic expectations for drivers.

      • Pyr@lemmy.ca · 25 days ago

        To me, having the car be able to override your actions sounds more dangerous than being able to override the autopilot.

        I had one rental truck that drove me insane and scared the shit out of me because it would slam on the brakes when I tried to reverse into grass that was too tall.

        What if I were trying to avoid something dangerous, like a train or another vehicle, and the truck slammed on the brakes for me because of some tree branches in the way? Potentially deadly.

    • fodor@lemmy.zip · 25 days ago

      More than one person can be at fault, my friend. Don’t lie about your product and expect no consequences.

      • Echo Dot@feddit.uk · 25 days ago

        I don’t know. If it’s possible to override the autopilot, then it’s a pretty good bet that putting your foot on the accelerator would do it. It’s hard to imagine a scenario where that wouldn’t result in the car going into manual mode. Surely it would be more dangerous if you couldn’t override the autopilot.

        • fodor@lemmy.zip · 25 days ago

          We can bet on a lot, but when you’re betting on human lives, you might get hit with a massive lawsuit, right? Try to bet less.

        • ayyy@sh.itjust.works · 25 days ago

          Yes, that’s how cruise control works. So it’s just cruise control right?….right?

          • Echo Dot@feddit.uk · 25 days ago

            Well, it’s cruise control, plus lane control, plus emergency braking. But it wasn’t switched on, so whether or not Tesla has been entirely honest with its advertising (for the record, it has not) isn’t relevant in this case.

    • ℍ𝕂-𝟞𝟝@sopuli.xyz · 25 days ago

      Yeah, the problem is that the US has no consumer protections, and somehow this court is trying to make up for it - but that shouldn’t happen in court cases like this one, where the driver was clearly not fit to drive a car.

  • Phoenixz@lemmy.ca · 25 days ago

    Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s

    Good!

    … and the entire industry

    Even better!

    • boonhet@sopuli.xyz · 25 days ago

      Did you read it tho? Tesla is at fault for this guy overriding the safety systems by pushing down on the accelerator while looking for his phone at the same time?

      I do not agree with Tesla often. Their marketing is bullshit, their cars are low quality pieces of shit. But I don’t think they should be held liable for THIS idiot’s driving. They should still be held liable when Autopilot itself fucks up.

      • Rimu@piefed.social · 25 days ago

        On the face of it, I agree. But 12 jurors who heard the whole story, probably for days or weeks, disagree with that.

      • Auli@lemmy.ca · 25 days ago

        The problem is how Musk and Tesla have sold their self driving and full self driving and whatever name they call the next one.

  • Avicenna@lemmy.world · 25 days ago

    Life-saving technology… to save lives from an immature, flawed technology you created and haven’t developed or tested enough? hmm

  • Yavandril@programming.dev · 25 days ago

    Surprisingly great outcome, and what a spot-on summary from the lead attorney:

    “Tesla designed autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans,” said Brett Schreiber, lead attorney for the plaintiffs. “Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm’s way. Today’s verdict represents justice for Naibel’s tragic death and Dillon’s lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives,” Schreiber said.

    • BrianTheeBiscuiteer@lemmy.world · 25 days ago

      Holding them accountable would be jail time. I’m fine with even putting the salesman in jail for this. Who’s gonna sell your vehicles when they know there’s a decent chance of them taking the blame for your shitty tech?

      • AngryRobot@lemmy.world · 25 days ago

        Don’t you love how corporations can be people when it comes to bribing politicians but not when it comes to consequences for their criminal actions? Interestingly enough, the same is happening to AI…

      • viking@infosec.pub · 25 days ago

        You’d have to prove that the salesman said exactly that, and without a record it’s at best a he said / she said situation.

        I’d be happy to see Musk jailed though; he’s definitely touted self-driving as fully functional.

    • haloduder@thelemmy.club · 25 days ago

      We need more people like him in the world.

      The bullshit artists have had free rein over useful idiots for too long.

    • C1pher@lemmy.world · 25 days ago

      You understand that this is only happening because Elon lost good graces with Trump, right? If they were still “bros”, this would have been swept under the rug, since Trump’s administration controls most, if not all, of the high judges in the US.

  • Showroom7561@lemmy.ca · 24 days ago

    Good that the car manufacturer is also being held accountable.

    But…

    In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

    That’s on him. 100%

    McGee told the court that he thought Autopilot “would assist me should I have a failure or should I miss something, should I make a mistake,”

    Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it’s supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

  • partial_accumen@lemmy.world · 25 days ago

    Don’t take my post as a defense of Tesla, even though there is blame on both sides here; I lay the huge majority of it on Tesla’s marketing.

    I had to find two other articles to figure out whether the system being used here was Tesla’s free, included Autopilot or the more advanced paid (one-time fee/subscription) version called Full Self Drive (FSD). The answer in this case: Autopilot.

    There are many important distinctions between the two systems. However, Tesla frequently conflates the two when speaking about autonomous technology for its cars, so I blame Tesla. What was required here to avoid these deaths actually has very little to do with autonomous technology as most know it; it’s instead about Collision Avoidance Systems. Only in 2024 was there first talk of requiring Collision Avoidance Systems in new vehicles in the USA. source The cars that include them now (Teslas and some models from other brands) do so on their own, without a legal mandate.

    Tesla claims that the Collision Avoidance Systems would have been overridden anyway because the driver was holding down the accelerator (which is not normal under Autopilot or FSD conditions). Even if that’s true, Tesla has positioned its cars as being highly autonomous, and oftentimes doesn’t call out that that skilled autonomy only comes with the Full Self Drive paid upgrade or subscription.

    So I DO blame Tesla, even if the driver contributed to the accident.

    • TranscendentalEmpire@lemmy.today · 25 days ago

      I feel like calling it AutoPilot is already risking liability. Full Self Driving is just audacious. There’s a reason other companies with similar technology have gone with things like driving assistance. This has probably had lawyers at Tesla sweating bullets for years.

      • partial_accumen@lemmy.world · 25 days ago

        I feel like calling it AutoPilot is already risking liability,

        From an aviation point of view, Autopilot is pretty accurate to the original aviation reference. The original aviation autopilot, released in 1912, would simply hold an aircraft at a specified heading and altitude without human input, operating the aircraft’s control surfaces to keep it on its directed path. However, if you were at an altitude that would let you fly into a mountain, autopilot would do exactly that. So the current Tesla Autopilot is pretty close to that level of functionality, with the added feature of maintaining a set speed too. Note that modern aviation autopilot is much more functional; on specific models it can even land and take off.

        Full Self Driving is just audacious. There’s a reason other companies with similar technology have gone with things like driving assistance. This has probably had lawyers at Tesla sweating bullets for years.

        I agree. I think Musk always intended FSD to live up to the name, and perhaps named it that aspirationally, which is all well and good - but most consumers don’t share that mindset, and if you call it that right now, they assume it has that functionality when they buy it today, which it doesn’t. I agree with you that it was a legal liability waiting to happen.

        • Auli@lemmy.ca · 25 days ago

          So you’re comparing what is, let’s say, 2020 technology to the 1912 version of autopilot and not the kind from the 2020s that is much more advanced. Yeah, what BS.

          • atrielienz@lemmy.world · 25 days ago

            Because it still basically does what they said. The only new advance for the autopilot system, besides maintaining speed, heading, and altitude, is the ability to set a GPS heading and waypoints (for the purposes of this conversation). It will absolutely still fly into a mountain if not for other collision-avoidance systems. Your average 737 or A320 is not going to spontaneously change course just because the elevation of the ground below it changed. But you can program other systems in the plane to avoid a specific flight path because there is a known hazard. I want you to understand that we know a mountain is there; they don’t move around much over short periods of time. Cars and pedestrians are another story entirely.

            There’s a reason we still have air traffic controllers, and even then pilots and air traffic control aren’t infallible - and they have way more systems to make flying safe than the average car (yes, even the average Tesla).

    • Geyser@lemmy.world · 25 days ago

      Did the car try to stop and fail to do so in time due to the speeding, or did it not try at all despite the expected collision-detection behavior?

      Going off of OP’s quote, the jury found the driver mostly responsible but Tesla partly liable, which is pretty confusing. It might make some sense if expected autopilot functionality didn’t work despite the driver’s foot being on the pedal.

      • partial_accumen@lemmy.world · 25 days ago

        Did the car try to stop and fail to do so in time due to the speeding, or did the car not try despite expected collision detection behavior?

        From the article, it looks like the car didn’t even try to stop, because automatic braking was overridden by the acceleration - the driver had their foot pressed on the pedal (which isn’t normal during autopilot use).

    • NotMyOldRedditName@lemmy.world · 25 days ago

      FSD wasn’t even available (edit: to use) in 2019. It was a future-purchase add-on that only went into a very limited, invite-only beta in 2020.

      In 2019 there was much less confusion on the topic.