• mihnt@lemy.lol · 7 months ago

    Why are you using VGA when DVI-D exists? Or DisplayPort, for that matter.

        • Pantherina@feddit.de · 7 months ago

          Why should I? It's Full HD and works well, so there's no reason to. New displays are 100€+, which is freaking expensive for that improvement.

          • mihnt@lemy.lol · 7 months ago

            Because there are plenty of used monitors out there that have DVI on them in some capacity, for very reasonable prices.

            For instance, I just purchased four 24-inch Samsung monitors for $15 USD each.

    • renzev@lemmy.world · 7 months ago

      All those new video standards are pointless. VGA supports 1080p at 30Hz just fine, and anything more than that is unnecessary. Plus, VGA is easier to implement than HDMI or DisplayPort, which keeps prices down. Not to mention the connector is more durable (well, maybe DVI is comparable in terms of durability).
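
      A back-of-the-envelope pixel-clock sketch for that claim (the ~25% blanking overhead and the DAC rating are assumptions, not exact standard timings):

      ```python
      # Rough pixel-clock estimate for driving a display over analog VGA.
      # The ~25% blanking overhead is an assumption; real timing standards
      # (CVT, CVT-RB, CEA-861) differ a bit, but the order of magnitude holds.
      def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
          return width * height * refresh_hz * blanking / 1e6

      for hz in (30, 60, 75):
          mhz = approx_pixel_clock_mhz(1920, 1080, hz)
          print(f"1920x1080 @ {hz} Hz ~ {mhz:.0f} MHz pixel clock")
      # The standard 1080p60 pixel clock is 148.5 MHz, comfortably within what
      # typical VGA DACs (often rated for 300+ MHz) can generate.
      ```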

      • Fuck spez@sh.itjust.works · 7 months ago

        VGA is analog. Have you ever looked at an analog-connected display next to an identical one connected over HDMI/DP/DVI? Also, a majority of modern systems are running at around 2-4× the pixel count of 1080p, and that's hardly unnecessary for someone who spends 8+ hours in front of one or more monitors.
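
        For concreteness, a quick sketch of how "2-4× 1080p" works out in pixel counts (the specific resolutions chosen are an assumption about what modern systems typically run):

        ```python
        # Pixel counts of common resolutions relative to 1080p.
        BASE = 1920 * 1080
        for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K UHD", 3840, 2160)]:
            px = w * h
            print(f"{name}: {px / 1e6:.1f} MP ({px / BASE:.1f}x 1080p)")
        # 1080p: 2.1 MP (1.0x), 1440p: 3.7 MP (1.8x), 4K UHD: 8.3 MP (4.0x)
        ```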

        • renzev@lemmy.world · 7 months ago

          I look at my laptop’s internal display side-by-side with an external VGA monitor at my desk nearly every day. Not exactly a one-to-one comparison, but I wouldn’t say one is noticeably worse than the other. I also used to be under the impression that lack of error correction degrades the image quality, but in reality it just doesn’t seem to be perceptible, at least over short cables with no strong sources of interference.

      • mihnt@lemy.lol · 7 months ago

        I think you are talking about some very different use cases than most people have.

        • renzev@lemmy.world · 7 months ago

          Really, what “normal people” use cases are there for a resolution higher than 1080p? It's perfectly fine for writing code, editing documents, watching movies, etc. If you are able to discern the pixels, it just means you're sitting too close to your monitor and hurting your eyes. Anything higher than 1080p and, at best, you don't notice the difference; at worst, you have to use hacks like UI scaling or a non-native resolution to get UI elements to display at a reasonable size.
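
          To put rough numbers on the "sitting too close" point, here is a small sketch assuming a 24-inch 16:9 1080p panel and the ~60 pixels-per-degree figure often quoted for 20/20 vision (both assumptions, not from the thread):

          ```python
          import math

          # Approximate pixels-per-degree for a 24-inch 16:9 1080p panel at a few
          # viewing distances, versus the ~60 PPD often quoted for 20/20 acuity.
          DIAG_IN, W_PX = 24, 1920
          width_in = DIAG_IN * 16 / math.hypot(16, 9)   # physical width, ~20.9 in
          ppi = W_PX / width_in                         # ~92 pixels per inch

          def pixels_per_degree(distance_in):
              # pixels spanned by one degree of visual angle at this distance
              return ppi * distance_in * math.tan(math.radians(1))

          for d_cm in (50, 70, 100):
              print(f"{d_cm} cm: {pixels_per_degree(d_cm / 2.54):.0f} PPD")
          # ~32 PPD at 50 cm, ~44 at 70 cm, ~63 at 100 cm: on this panel the
          # pixel grid stops being resolvable to 20/20 vision at roughly a metre.
          ```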

          • mihnt@lemy.lol · 7 months ago

            Your comment said 30Hz when I read it, which is why I said what I said. Still, there's a lot of benefit to having a higher refresh rate, as far as user comfort goes.

          • everett@lemmy.ml · 7 months ago

            Sharper text for more comfortable reading, and viewing photos at nearly full resolution. You don't have to discern individual pixels to benefit from either of these. And small UI elements like thumbnails can actually show some detail.