• NIB@lemmy.world
    6 months ago

    If a car runs over someone while going 30 km/h because it relies on cameras, a bug splattered on the lens, and that sent the car haywire, that is not acceptable, even if such cars crash “less than humans”.

    Self-driving needs to be heavily regulated by law and required to have some bare-minimum sensor suite, including radar, lidar, etc. Camera-only self-driving is beyond stupid. Cameras can't see in snow or the dark or whatever. Anyone who has a phone knows how fucky a camera can get under specific light exposures, etc.

    No one but Tesla is doing camera-only “self-driving”, and they are only doing it to cut costs. Their older cars had more sensors than their newer cars. But Musk is living in his Bioshock uber-capitalist dream. Who cares if a few people die in the process of developing vision-based self-driving.

    https://www.youtube.com/watch?v=Gm2x6CVIXiE

    • TypicalHog@lemm.ee
      6 months ago

      What are you, some kind of lidar shill? Camera-only sensing should obviously be the endgame goal for all robots. Also, this article is not even about camera-only driving.

        • TypicalHog@lemm.ee
          6 months ago

          Because that's expensive and can be done with a camera. And once you figure the camera stuff out, you're gucci. Now you can do all kinds of shit without needing a lidar on every single robot.

          • Zink@programming.dev
            6 months ago

            My eyes are decent, but if I had a sixth sense that gave me full, accurate 3D, 360° spatial awareness regardless of visibility, I would probably not turn it off just to use my eyes. I'd use both.

          • AdrianTheFrog@lemmy.world
            6 months ago

            Because that’s expensive and can be done with a camera.

            Expensive, as in probably less than $600? Compared to the $35,000 cost of a Tesla?

            (Comparing the price of the iPhone 12 (without lidar) and the iPhone 12 Pro (with lidar), we can guess that the sensor probably costs less than $200, so three of them (for left, right, and front) would probably cost less than $600.)
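
            A rough back-of-the-envelope version of that estimate, in Python (the $200-per-sensor figure is just the guess above from the iPhone price gap, not an actual part price):

            ```python
            # Back-of-the-envelope lidar cost estimate.
            # All figures are guesses inferred from the iPhone 12 vs. iPhone 12 Pro
            # price difference, not real component prices.
            sensor_cost = 200        # guessed cost of one small lidar module, USD
            sensors_needed = 3       # left, right, and front
            vehicle_cost = 35_000    # rough price of a Tesla, USD

            lidar_total = sensor_cost * sensors_needed
            print(f"Lidar add-on: ${lidar_total} "
                  f"({lidar_total / vehicle_cost:.1%} of the vehicle price)")
            # -> Lidar add-on: $600 (1.7% of the vehicle price)
            ```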

            Lidar can actually be very cheap and small. Unfortunately, Apple bought the only company that seems to make sensors like that (besides some other super-high-end models).

            There have been a lot of promising research papers on the technology lately though, so I expect more, higher-resolution, and cheaper lidar sensors to be available relatively soon (probably within the next couple of years).

            • TypicalHog@lemm.ee
              6 months ago

              Perhaps. Idk, maybe I'm wrong. But it sure seems like it would be so much better if we achieved the same shit with a cheaper, simpler, more primitive sensor.

              • BURN@lemmy.world
                6 months ago

                To get the same resolution and image quality in all lighting scenarios, cameras are actually going to be more expensive than LiDAR. Cameras suffer in low-light, low-contrast situations due to the physical limitations of bending light. More light = bigger lenses = higher cost, whereas LiDAR works better and is cheaper.

            • Grippler@feddit.dk
              6 months ago

              Yeah, that's not even remotely the same type of sensor used in robotics and autonomous cars. Yes, lidar is getting cheaper, but for high-detail, long-range detection the sensors are much more expensive than your iPhone example suggests. The iPhone “lidar” is less than useless in an automotive context.

      • mojofrododojo@lemmy.world
        6 months ago

        Camera-only sensing should obviously be the endgame goal for all robots.

        I can't tell if you're a moron or attempting sarcasm, but this is the least informed opinion I've seen in ages.

        • TypicalHog@lemm.ee
          6 months ago

          I wasn't attempting sarcasm, so maybe I'm a moron, idk. Fair, it's likely I'm uninformed. I just know my daddy Elon said something about how solving this shit with cameras only is probably the best path and will pay off.

      • howrar@lemmy.ca
        6 months ago

        I've heard Elon Musk (or was it Karpathy?) talking about how cameras should be sufficient for all scenarios because humans drive on vision alone, but that's poor reasoning IMO. Cars are not humans, so there's no reason to confine them to the same limitations. If we want them to be safer and more capable than human drivers, one way to do that is by providing them with more information.
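
        As a minimal sketch of that "more information" point (the names and thresholds here are made up for illustration, not any real vehicle API): a perception stack that sees both a camera confidence and a lidar range can react to whichever sensor trips first, while a camera-only car loses the second check entirely.

        ```python
        # Hypothetical illustration of fusing a camera detection with a lidar range.
        # All names and thresholds are invented for this sketch.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Detection:
            camera_confidence: Optional[float]  # 0..1, None if the camera is blinded (glare, dark)
            lidar_range_m: Optional[float]      # metres to nearest return, None if no lidar fitted

        def should_brake(d: Detection, min_conf: float = 0.6, min_range_m: float = 10.0) -> bool:
            # Brake if EITHER sensor reports a credible obstacle.
            camera_says_stop = d.camera_confidence is not None and d.camera_confidence >= min_conf
            lidar_says_stop = d.lidar_range_m is not None and d.lidar_range_m <= min_range_m
            return camera_says_stop or lidar_says_stop

        # Glare blinds the camera, but the lidar still reports an object 4 m ahead:
        print(should_brake(Detection(camera_confidence=None, lidar_range_m=4.0)))  # True
        ```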

        • kingthrillgore@lemmy.ml
          6 months ago

          We built things like lidar and ultrasound because we want something better than our eyes at depth perception and sight.