• Blackmist@feddit.uk
    link
    fedilink
    English
    arrow-up
    0
    ·
    edit-2
    10 months ago

    So why is it rejected?

    Just because they’re still trying to use HDMI to prevent piracy? Who in fuck’s name is using HDMI capture for piracy? On a 24fps movie, that’s 237MB of data to process every second just for the video. A 2 hour movie would be 1.6TB of video alone; add audio and it would likely be over 2TB.

    I’ve got a Jellyfin server packed with 4K Blu-ray rips that suggest there are easier ways to get at that data.
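
    For the curious, here’s a rough sanity check of that data-rate arithmetic (a sketch only; it assumes 8-bit 4:2:0 video, i.e. 12 bits per pixel, since the exact assumptions behind the figures above aren’t stated):

```python
# Rough raw (uncompressed) video data-rate check.
# Assumption: 8-bit 4:2:0 chroma subsampling = 12 bits per pixel.
def raw_video_rate_bytes(width, height, fps, bits_per_pixel=12):
    """Bytes per second of uncompressed video."""
    return width * height * fps * bits_per_pixel / 8

rate = raw_video_rate_bytes(3840, 2160, 24)  # ~299 MB/s for 4K at 24fps
two_hour_total = rate * 2 * 3600             # ~2.15 TB for the video alone
```

    Under these assumptions a two-hour film is north of 2 TB of raw video, the same ballpark as the figures above; the exact numbers shift with bit depth and chroma subsampling.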

    • lengau@midwest.social

      The profiles HDMI 2.1 enables are even worse - 4k@120fps type stuff. Not exactly something needed for a movie.

    • dangblingus@lemmy.dbzer0.com

      Most people don’t pirate 4K media due to file size and internet speed constraints. Most people pirate 1080p video. There’s also the prospect of people pirating live television, which HDMI capture would be perfect for.

      • Psythik@lemmy.world

        Then most people need to get a better ISP. My crappy $60/mo fixed 5G can download an entire 4K film in under 10 minutes or start streaming it within a second. Y’all should see if there are any options beyond cable and DSL in your town. You might be pleasantly surprised what’s available these days.

        • nymwit@lemm.ee

          Is that not a compressed stream though? Genuinely asking. A 4K Blu-ray rip and a 4K stream from a service (or whatever it saves for offline viewing in an app) are pretty different. I think two things are getting conflated here: capturing live 4K television and capturing a 4K Blu-ray as it plays, both of which might use an HDMI cable.

          • Psythik@lemmy.world

            I use Stremio and only stream full 4K Blu Ray rips, with HDR and Dolby Atmos and all. So nothing is recompressed. 50-70GB files but it starts streaming almost instantly.

            I have a poor 5G signal due to a tree that’s blocking my view of the antenna, so I get anywhere between 400Mbps and 1400Mbps (I’m supposed to get a gigabit but it’s usually closer to 500). Even with a poor signal it’s still way faster than any other ISP in my town.

    • Kairos@lemmy.today

      HDMI Splitter + capture card.

      No video put on a streaming service produced in the next 40 years will need HDMI 2.1 to display.

    • sarmale@lemmy.zip

      Can’t you compress what the HDMI outputs in real time so that it would have a normal size?

      • Blackmist@feddit.uk

        Sure. But why bother when you can rip it right from the disc in higher quality than you could ever hope to capture in real time?

    • buddascrayon@lemmy.world

      The CEOs of the media companies are all fucking dinosaurs who still think VCRs should have been made illegal. You will never convince them that built-in copy protection is a dumb idea and a waste of time.

    • CCF_100@sh.itjust.works

      Setting aside that HDMI capture is simply an awful way of obtaining that data, it’s even more pathetic that this “protection” can be defeated by a $30 capture card on Amazon…

  • csolisr@hub.azkware.net

    If we had to rely exclusively on non-proprietary protocols, I doubt that GNU/Linux would have gone anywhere beyond the Commodore 64

    • tabular@lemmy.world

      Why? Most software wasn’t proprietary before companies realised they could make more money at your expense, and not all of that money went into making a better product.

      If given the choice of an uncomfortable dormitory or a comfortable jail, at least the residents can improve the living areas in the former.

      • rtxn@lemmy.world

        Parent is right though. Unix being proprietary is why the GNU project was started, and why the Linux kernel and BSDs rose above.

        • tabular@lemmy.world

          Hopefully HDMI being proprietary leads to others creating an alternative, open standard which eventually can push HDMI to open up or push it out.

            • tabular@lemmy.world

              VESA requires an annual membership fee to access the DP standard. Perhaps that’s fine, but it means regular people can’t “open” the door to the standard. VESA has in the past claimed that implementing DP can mean you owe them royalty fees, but they apparently backed down from that.

              Implementation of Content ““Protection”” isn’t in the spirit of an open standard to me, rather the opposite. Why have an open standard if not to weed out corporate anti-features for the benefit of the users?

              • Leeker@lemmy.world

                Alright, this sent me down a rabbit hole, so I’m going to try to summarize really quickly.

                1st, VESA requires a membership, but in reality you need to be a company that has a vested interest in what VESA does, and you have to pay a huge due to be a part of it. That they can do this and still claim to be an open standard is quite BS in my opinion. Source

                2nd, VESA never tried to impose a royalty based on the DisplayPort standard. The company that did that was MPEG LA, LLC, which isn’t affiliated with VESA. Rather, it’s a patent pool company that attempted to enforce licensing fees for its clients (such as Sony). They seem to have backed off of this back in 2016, as the last patent asserted covered DisplayPort 1.4. Source

                3rd, content protection was necessary for widespread adoption. Companies aren’t going to want to do business with you if you allow their IP to be ripped. As well, VESA is just a collection of companies that have voting shares, so those corporate features are par for the course.

    • Riskable@programming.dev

      Linux never ran on the Commodore 64 (1984). That was way before Linux was released by Linus Torvalds (1991).

      I’d also like to point out that we do all rely on non-proprietary protocols. Examples you used today: TCP and HTTP.

      If we didn’t have free and open source protocols we’d all still be using Prodigy and AOL. “Smart” devices couldn’t talk to each other, and the world of software would be 100-10,000x more expensive and we’d probably have about 1/1,000,000th of what we have available today.

      Every little thing we rely on every day from computers to the Internet to cars to planes only works because they’re not relying on exclusive, proprietary protocols. Weird shit like HDMI is the exception, not the rule.

    • LifeInMultipleChoice@lemmy.world

      Is there a reason or way to prevent display port from having so many connection issues specifically on port replicators (docking stations)?

      In corporate environments I find so many times that you plug them in over and over, unplug over and over, and check the connection a million times before turning everything off one final time, holding the power button on everything (kind of like an SMC reset), and then booting everything up like you originally did, and they come up. Is this a result of the devices trying to remember a previous setup, or is there an easy way to avoid it?

      I’ve hooked up dozens of them and still ran into issues when a family member brought a setup home to work when they were sick last week.

      • General_Shenanigans@lemmy.world

        We use Dell WD-19 docks. Not sure if you use similar. Updated dock firmware and laptop drivers made a difference for us with connection issues. Sometimes you gotta perform a reset on them to make them behave (disconnect dock power and USB-C and hold power button for just over 15 sec). Sometimes the laptop NVRAM needs to be reset instead (for Dell, disconnect all devices and power while off and hold button for just over 30 sec). Overall, though, no huge issues with DP specifically if the dock and laptop firmwares are up to date. Third-party docks/replicators definitely have way more issues, though.

      • zelifcam@lemmy.world

        We are all aware of that. However, there are tons of studios people have built around HDMI TVs. Those professionals will continue to be unable to use Linux professionally. That’s a huge issue to still have in 2024 with one of the major GFX options. The Linux desktop relies on more than some enthusiasts if we want to see it progress.

        • Eldritch@lemmy.world

          Linux has very little to do with DisplayPort. My Windows PCs use DisplayPort. You can get passive adapters to switch from HDMI to DisplayPort etc.

          • zelifcam@lemmy.world

            Linux has very little to do with DisplayPort. My Windows PCs use DisplayPort.

            What? I’m not sure what you’re on about. Of course DP is not a Linux specific technology. Not sure how you got that from my comment?

            I’m talking about people who would like to use the full capabilities of their HDMI TVs ( while using AMD), when using Linux.

            My understanding is the adapters do not provide all the features of the HDMI 2.1 spec. Is that no longer the case?

            • Malfeasant@lemmy.world

              The problem is those passive adapters only work because one side switches to the other’s protocol.

          • zelifcam@lemmy.world

            Are you serious? You’re commenting on an article discussing this very problem. ???

            Personally, for me it was VRR and high refresh rate at 4K+. I have since purchased an NVIDIA card to get around it. At the time the “adapters” were not providing what I’d consider a reasonable solution, essentially crippling the features of my high-end television.

        • Player2@lemm.ee

          USB C is just a connector, you might be referring to Displayport over USB C which is basically just the same standard with a different connector at the end. That or Thunderbolt I guess

        • admiralteal@kbin.social

          I love having mysterious cables that may or may not do things I expect them to when plugged into ports that may or may not support the features I think they do.

          • trafficnab@lemmy.ca

            If the implementation is so broad that I have to break out my label maker, can we even really call it a “standard”

        • trafficnab@lemmy.ca

          USB C seems like a good idea but in reality all it really did was take my 5 different, not interchangeable, but visually distinct, cables, and make them all look identical and require labeling

          • ABCDE@lemmy.world

            Can it use others, and is there a benefit? USB C makes a lot of sense; lower material usage, small, carries data, power and connects to almost everything now.

            • BetaDoggo_@lemmy.world

              I believe USB-C is the only connector supported for carrying DisplayPort signals other than DisplayPort itself.

              The biggest issue with USB-C for display in my opinion is that cable specs vary so much. A cable with a type c end could carry anywhere from 60-10000MB/s and deliver anywhere from 5-240W. What’s worse is that most aren’t labeled, so even if you know what spec you need you’re going to have a hell of a time finding it in a pile of identical black cables.

              Not that I dislike USB-C. It’s a great connector, but the branding of USB has always been a mess.
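
              The spread described above can be sketched with numbers from the published USB specs (illustrative, not exhaustive; power ratings are a separate axis set by the cable’s current rating and e-marker, not its data generation):

```python
# Data rate and power delivery are independent axes for a USB-C cable.
# Rates below are from the published USB specs, converted to MB/s.
DATA_MBPS = {
    "USB 2.0 (480 Mbit/s)": 60,
    "USB 3.2 Gen 1 (5 Gbit/s)": 625,
    "USB 3.2 Gen 2 (10 Gbit/s)": 1250,
    "USB4 (40 Gbit/s)": 5000,
    "USB4 v2 (80 Gbit/s)": 10000,
}
POWER_W = {
    "standard 3 A cable": 60,   # 3 A at 20 V
    "e-marked 5 A cable": 100,  # 5 A at 20 V
    "EPR cable (PD 3.1)": 240,  # 5 A at 48 V
}
# Identical-looking cables can differ by ~167x in data rate alone.
spread = max(DATA_MBPS.values()) / min(DATA_MBPS.values())
```

              Matching a real, unlabeled cable to one of those rows is exactly the problem the comment describes.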

              • Freestylesno@lemmy.world

                This is the big issue I have with “USB C”. USB-C is just the connector, which can be used for so many things. What’s actually supported depends on things you can’t see, like the cable construction or what the device supports.

              • jaxxed@lemmy.world

                I think that the biggest issue with dp over usbc is that people are going to try to use the same cable for 4k and large data transfers at the same time, and will then whine about weird behaviour.

              • ABCDE@lemmy.world

                Yep, very true. I didn’t understand this until I couldn’t connect my Mac to my screen via the USB C given with the computer, I had to buy another (and order it in specifically). Pick up a cable, and I have no idea which version it is.

              • strawberry@kbin.run

                would be neat to somehow have a standard color coding. kinda how USB 3 is (usually) blue, maybe there could be thin bands of color on the connector?

                better yet, maybe some raised bumps so visually impaired people could feel what type it was. for example one dot is USB 2, two could be USB 3, etc

                • Flipper@feddit.de

                  Have you looked at the naming of the USB standards? No, you haven’t, otherwise you wouldn’t be making such a sensible suggestion.

            • frezik@midwest.social

              There’s some really high bandwidth stuff that USB-C isn’t rated for. You have to really press the limits, though. Something like 4k + 240Hz + HDR.

              • ABCDE@lemmy.world

                That doesn’t even seem so unreasonable. Is that the limit though? My cable puts a gigabyte a second down it so I wouldn’t imagine that would hit the limit.

                • GeniusIsme@lemmy.world

                  It is trivial arithmetic: 4.5 × 240 × 3840 × 2160 ≈ 9 GB/s. Not even close. Even worse, that cable will struggle to get ordinary 60Hz 4k delivered.
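
                  Spelling that arithmetic out (assuming the 4.5 is bytes per pixel, i.e. 36-bit/12-bpc RGB):

```python
# 12 bits per channel RGB = 36 bits = 4.5 bytes per pixel (assumed).
bytes_per_pixel = 4.5
rate_240hz = bytes_per_pixel * 240 * 3840 * 2160  # ~8.96e9 B/s, ~9 GB/s
rate_60hz = bytes_per_pixel * 60 * 3840 * 2160    # ~2.24e9 B/s
# Both exceed a cable that moves ~1 GB/s, matching the point above.
```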

                • frezik@midwest.social

                  USB-C with Thunderbolt currently has a limit of 40Gbit/sec. Wikipedia has a table of what DisplayPort can do at that bandwidth:

                  https://en.wikipedia.org/wiki/DisplayPort

                  See the section “Resolution and refresh frequency limits”. The table there shows it’d be able to do 4k/144Hz/10bpp just fine, but can’t keep above 60Hz for 8k.

                  It’s an uncompressed video signal, and that takes a lot of bandwidth. Though there is a “visually lossless” compression mode (DSC).
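
                  A back-of-envelope version of that table (raw pixel rate only; it ignores blanking intervals, link encoding overhead, and DSC, so treat it as ballpark):

```python
# Does an uncompressed signal fit in Thunderbolt's 40 Gbit/s?
def raw_gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

fourk_144 = raw_gbps(3840, 2160, 144, 30)  # 10 bpc RGB -> ~35.8 Gbit/s
eightk_60 = raw_gbps(7680, 4320, 60, 30)   # 10 bpc RGB -> ~59.7 Gbit/s
fits_4k144 = fourk_144 < 40   # fits
fits_8k60 = eightk_60 < 40    # does not fit
```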

      • Tja@programming.dev

        We cannot have two standards, that’s ridiculous! We need to develop one universal standard that covers everyone’s use cases.

    • lolcatnip@reddthat.com

      As already mentioned, DisplayPort exists. The problem is adoption. Even getting DisplayPort adopted as the de facto standard for PC monitors hasn’t done anything to get it built into TVs.

        • Scrollone@feddit.it

          DisplayPort supports CEC.

          From Wikipedia:

          The DisplayPort AUX channel is a half-duplex (bidirectional) data channel used for miscellaneous additional data beyond video and audio, such as EDID (I2C) or CEC commands.

  • Godort@lemm.ee

    What’s the over/under that this was about preventing people getting around HDCP using a modified driver?

  • Justin@lemmy.jlh.name

    This is really frustrating. This is the only thing holding Linux gaming back for me, as someone who games with an AMD GPU and an OLED TV. On Windows 4k120 works fine, but on Linux I can only get 4k60. I’ve been trying to use an adapter, but it crashes a lot.

    AMD seemed to be really trying to bring this feature to Linux, too. Really tragic that they were trying to support us, and some anti-open source goons shot them down.

    • ichbinjasokreativ@lemmy.world

      I’m a bit confused by your comment. I have a 120Hz monitor and use an AMD GPU on Linux without issues. Connected via the display port on my GPU to the HDMI Port on my monitor (because Samsung does not enable DDC on the display port for some reason).

        • 🐍🩶🐢@lemmy.world

          I have one too. Go take a look at Cable Matters. I am able to play games at 4k120 with my mac. See if something will work for you and you can always send a message to their customer support to ask questions.

          • Justin@lemmy.jlh.name

            Yeah, that’s the one I have. Maybe I’ll ask their support. It has the latest firmware, but it’s so flaky about being able to do high bandwidth.

        • Cosmic Cleric@lemmy.world

          I’m using an LG C2 OLED TV that doesn’t have DisplayPort.

          Connected via the display port on my GPU to the HDMI Port on my monitor

          ichbinjasokreativ seems to be saying that the viewing device they connect to uses HDMI, the same as your OLED TV.

          Unless I’m missing something?

          Edit: You discuss this issue further down in the topic, so no need to reply.

          Could have saved myself the time of replying to you if I had scrolled all the way through first, then backtracked, but that’s kind of unintuitive to do, especially on a cell phone browser.

      • Celestus@lemm.ee

        The OLED TV probably only has HDMI. TVs don’t normally have DisplayPort

          • Justin@lemmy.jlh.name

            Freesync works for me with that cable, but I have some awful cable training problem or something. Every time the screen is reset, it refuses to go above 4k60 or turn on Freesync, and it’ll only work again after I’ve had the screen connected for a few minutes. So whenever I start my computer, I have to switch down to 4k60 and wait like half an hour to be able to use high refresh rate. It’s been doing this for months.

            I don’t know if I have a bad cable, weird interference, or if FRL adapters are just flaky.

    • Sudomeapizza@lemm.ee

      I’ve found that the issue in my experience is that X11 only supports a max of 4k60, but Wayland supports 4k120 and beyond. I don’t think the cable matters, as the same cable I’m using works on Windows at 4k160.

      • Justin@lemmy.jlh.name

        It’s a matter of cable bandwidth. 4k120 4:4:4 requires more bandwidth than HDMI 2.0 can provide. You can drop down to 4:2:0, but that’s a pretty bad experience and ruins the image quality.

        I’ve been using an adapter cable, but it’s really flaky; I don’t know if it’s a bad cable or what. But a normal HDMI cable just plain works on Windows, since the Windows AMD driver supports HDMI 2.1.
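
        A quick sketch of that bandwidth gap (assuming 8 bits per channel 4:4:4; HDMI 2.0’s 18 Gbit/s TMDS link carries about 14.4 Gbit/s of video data after 8b/10b encoding):

```python
# Raw pixel rate vs. what HDMI 2.0 can actually carry.
def raw_gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

hdmi20_data = 18 * 8 / 10                     # 14.4 Gbit/s usable
needed_4k120 = raw_gbps(3840, 2160, 120, 24)  # 8 bpc 4:4:4 -> ~23.9
needs_hdmi21 = needed_4k120 > hdmi20_data     # True: HDMI 2.0 can't do it
```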

  • nivenkos@lemmy.world

    This sucks, as all new TVs use HDMI 2.1 for modern features, and modern games consoles rely on it for 4K 60Hz HDR, etc.

    So now Valve can’t just make their own home console with Steam OS for TVs directly (and support high-end features at least).

    • thedirtyknapkin@lemmy.world

      Piracy being easier is the only risk. Once again they’re ruining the experience of legitimate customers to try to stop a thing they’ve had no success at even slowing down.

      • Acters@lemmy.world

        Even further, it made it more expensive to buy products from all the dumb licensing fees that all the middlemen try to shoehorn in.

  • NoLifeGaming@lemmy.world

    Always thought that DisplayPort is better anyways lol. Anything that HDMI does or has that DisplayPort doesn’t?