• daikiki@lemmy.world · 7 months ago

    According to who? Did the NTSB clear this? Are they even allowed to clear this? If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything’s cool?

    • maynarkh@feddit.nl · 7 months ago

      According to who? Did the NTSB clear this?

      Yes.

      If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything’s cool?

      Yes, the judge will let the driver off the hook, because Mercedes told them it will assume the liability instead.

    • Trollception@lemmy.world · 7 months ago (edited)

      You do realize humans kill hundreds of other humans a day in cars, right? Is it possible that autonomous vehicles may actually be safer than a human driver?

      • stoly@lemmy.world · 7 months ago

        Only on closed courses. The best AI lacks the basic heuristics of a child and you simply can’t account for all possible outcomes.

      • KredeSeraf@lemmy.world · 7 months ago

        Sure. But no system is 100% effective, and all of their questions are legitimate and important to answer. If I got hit by one of these tomorrow, I'd want to know that the processes for determining fault, compensation, and a pathway to improvement are already settled, not something my accident is going to become the landmark case for.

        But that being said, I was a licensing examiner for 2 years and quit because they kept making it easier to pass and I was forced to pass so many people who should not be on the road.

        I think this idea is sound, but that doesn’t mean there aren’t things to address around it.

        • Trollception@lemmy.world · 7 months ago

          Honestly, I’m sure there will be a lot of unfortunate mistakes until computers and self-driving systems can be relied upon. However, there needs to be an entry point for manufacturers, and this is it. Technology gets better over time; it always has. Eventually self-driving autos will be the norm.

          • KredeSeraf@lemmy.world · 7 months ago

            That still doesn’t address all the issues surrounding it. I am unsure whether you are just young and not aware of how these things work, or terribly naive. But companies will always cut corners to keep profits up. Regulation forces a certain level of quality control (ideally). Just letting them do their thing because “it’ll eventually get better” is a gateway to absurd amounts of damage. Also, not all technology always gets better; plenty of it just gets abandoned.

            But to circle back: if I get hit by a car tomorrow and all these things you think are unimportant are left unanswered, does that mean I might not get legal justice or compensation? If there isn’t clearly codified law, I might not, and you might be callous enough to say you don’t care about me. But what about you? What if you got hit by an unmonitored self-driving car tomorrow and were then told you’d have to go through a long, expensive court battle to determine fault, because no one had done it yet? So you’re in and out of a hospital recovering, draining all of your money on bills both legal and medical, to eventually, hopefully, get compensated for something that wasn’t your fault.

            That is why people here are asking these questions. Few people actually oppose progress. They just need to know that reasonable precautions are taken for predictable failures.

            • Llewellyn@lemm.ee · 7 months ago

              But then it’s good that the manufacturer states the driver isn’t obliged to watch the road, because it shifts responsibility towards the manufacturer, and is thus a great incentive to make the technology as safe as possible.

            • Trollception@lemmy.world · 7 months ago (edited)

              To be clear, I never said that I didn’t care about an individual’s safety; you inferred that somehow from my post, and frankly, that is quite disrespectful. I simply stated that autonomous vehicles are here to stay and that the technology will improve more with time.

              The legal implications of self-driving cars are still being determined, as this is literally one of the first approved technologies available. Tesla doesn’t count, as its system is not an SAE Level 3 autonomous driving system. There are some references in the liability section of the wiki.

              https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars

          • MeDuViNoX@sh.itjust.works · 7 months ago

            Can’t the entry point just be that you have to pay attention while it’s driving for you until they figure it out?

      • Adanisi@lemmy.zip · 7 months ago (edited)

        *at 40mph on a clear straight road on a sunny day in a constant stream of traffic with no unexpected happenings, Ts&Cs apply.