Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

  • FiskFisk33@startrek.website · 8 months ago

    GPUs haven’t been reasonably priced since the 1000 series.

    And now there’s no coin mining promising some money back.

    • elvith@feddit.de · 8 months ago

      I have a 2060 Super with 8 GB. The VRAM is currently enough for FHD gaming - or at least isn't the bottleneck - so 12 GB might be fine for that use case. BUT I'm also toying around with AI models, and some of the current models already ask for 12 GB of VRAM to run the complete model. It's not that I would never get a 12 GB card as an upgrade, but you can be sure I'd research all the alternatives first, and it wouldn't be my first choice so much as a compromise, since it wouldn't future-proof me in this regard.
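      To put a rough number on that (a back-of-the-envelope sketch, not a figure for any specific model): loading weights takes roughly parameter count × bytes per parameter, so a hypothetical 7B-parameter model at 16-bit precision already overshoots 12 GB once you add some runtime overhead, while a 4-bit quantized build fits comfortably.

      ```python
      # Rough VRAM estimate for loading a model's weights.
      # The parameter count, precision, and overhead factor are illustrative assumptions.

      def estimated_vram_gb(num_params: float, bytes_per_param: float = 2.0,
                            overhead: float = 1.2) -> float:
          """Weights times a fudge factor for activations/runtime buffers."""
          return num_params * bytes_per_param * overhead / 1024**3

      print(f"{estimated_vram_gb(7e9):.1f} GB")                       # ~15.6 GB at fp16 -> over 12 GB
      print(f"{estimated_vram_gb(7e9, bytes_per_param=0.5):.1f} GB")  # ~3.9 GB at 4-bit
      ```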

  • ReallyActuallyFrankenstein@lemmynsfw.com · 8 months ago

    Yep, it’s the RAM, but also just a mismatched value proposition.

    I think it’s clear at this point Nvidia is trying to have it both ways and gamers are sick of it. They used pandemic shortage prices as an excuse to inflate their entire line’s prices, thinking they could just milk the “new normal” without having to change their plans.

    But when you move the x070 series out of the mid-tier price bracket ($250-450, let's say), you'd better meet a more premium standard. Instead, they're throwing mid-tier RAM into a premium-priced product that most customers still feel should be mid-tier priced. It also doesn't help that it's at a time when people generally just have less disposable income.

  • Shirasho@lemmings.world · 8 months ago

    I don’t know about everyone else, but I still play at 1080p. It looks fine to me, and I care more about frames than fidelity. More VRAM isn’t going to help me here, so it’s not a factor when I’m looking at video cards. Ignoring the fact that I just bought a 4070, I wouldn’t skip over a 4070 Super just because it has 12GB of RAM.

    This is a card that targets 1440p. It can pull its weight at 4K, but I’m not sure that’s justification to slam it for not having the memory for 4K.

    • miss_brainfart@lemmy.ml · 8 months ago

      > It can pull its weight at 4K, but I’m not sure that’s justification to slam it for not having the memory for 4K.

      There are many games that cut it awfully close with 12GB at 1440p; for some it’s actually not enough. And when Nvidia pushes ray tracing as hard as they do, not giving us the little extra memory we need for that is just a dick move.

      Whatever this card costs, 12GB of VRAM is simply not appropriate.

    • atocci@kbin.social · 8 months ago

      My monitor is only 1440p, so it’s just what I need. I ordered the Founders Edition card from Best Buy on a whim after I stumbled across it at launch time by coincidence. I’d been mulling over the idea of getting a prebuilt PC to replace my laptop for a few weeks at that point and was on the lookout for sales on ones with a 4070. Guess I’ll be building my own instead now.

    • Deceptichum@kbin.social · 8 months ago

      I’m fine playing at 30fps; I don’t really notice much of a difference. For me, RAM is the biggest influence on a purchase due to the capabilities it opens up for local AI stuff.

      • iAmTheTot@kbin.social · 8 months ago

        If someone says they don’t notice a difference between 60 FPS and 120+ FPS, I think… okay, it is diminishing returns, 60 is pretty good. But if someone says they don’t notice a difference between 30 and 60… you need to get your eyes checked mate.

        • Deceptichum@kbin.social · 8 months ago

          I notice a difference; it’s just not enough to make it a big deal for me. It’s like going from 1080p to 1440p: you can see it, but being on 1080p isn’t really an issue.

          • Jakeroxs@sh.itjust.works · 8 months ago

            It depends on the game. In quick, action-packed stuff you can see the jumping, and in something like a shooter it can be a disadvantage.

            For something like Slay the Spire tho, totally fine.

            • Obi@sopuli.xyz · 8 months ago

              I’m at the age where if games require such quick reactions that the difference in FPS matters, I’m going to get my ass handed to me by the younguns anyway…

  • wooki@lemmynsfw.com · 8 months ago

    If they don’t drop the price by at least 50%, goodbye Nvidia.

    So no more Nvidia. Hello Intel.

    • lemmyvore@feddit.nl · 8 months ago

      I don’t think they care. In fact, I think they’re going to exit the consumer market eventually; it’s just peanuts to them, and the only reason they’re still catering to it is to use it as field testing (and you’re paying them for the privilege, which is quite ironic).

  • caseyweederman@lemmy.ca · 8 months ago

    Remember when EVGA decided they would rather leave the market entirely than spend one more day working with Nvidia?

  • Altima NEO@lemmy.zip · 8 months ago

    The RAM is so lame. It really needed more.

    Performance exceeding the 3090, but limited by 12 gigs of RAM.

  • Binthinkin@kbin.social · 8 months ago

    You all should check prices comparing dual-fan 3070s to 4070s; there’s a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry, and wound up getting blue-balled hard.

    Aren’t they taking the 4080 completely off the market too?

  • Kazumara@feddit.de · 8 months ago

    $600 for a card without 16 GB of VRAM is a big ask. I think getting an RX 7800 XT for $500 will serve you well for a longer time.

    • NIB@lemmy.world · 8 months ago (edited)

      12 GB of VRAM is not a bottleneck in any current game on reasonable settings. There is no playable game/settings combination where a 7800 XT’s 16 GB offers any advantage. Or do you think a 15 fps average is more playable than a 5 fps average (because the 4070 Super is VRAM-bottlenecked)? Is this indicative of potential future bottlenecks? Maybe, but I wouldn’t be so sure.

      The 4070 Super offers significantly better ray tracing performance, much lower power consumption, superior upscaling (and frame generation) technology, better streaming/encoding support, and even slightly better rasterization performance than the 7800 XT. Are these things worth sacrificing for €100 less and 4 GB more VRAM? For most people they aren’t.

      AMD’s offerings are competitive, not better. And the internet should stop sucking their dick, especially when most of the internet, including tech-savvy people, doesn’t even use AMD GPUs. Hell, LTT even made a series of videos about how they had to “suffer” using AMD GPUs, yet they usually join the Nvidia-shitting circlejerk.

      I have an AMD RX 580 and have bought and recommended AMD GPUs to people since the 9500/9700 Pro series. But my next GPU will almost certainly be an Nvidia one. The only reason people are complaining is that Nvidia can make a better GPU (as shown by the 4090) but chooses not to, while AMD literally can’t make better GPUs and chooses to only “competitively” price theirs instead of offering something better. Both companies suck.

  • trackcharlie@lemmynsfw.com · 8 months ago

    Less than 20 GB of VRAM in 2024?

    The entire 40-series line of cards should be used as evidence against Nvidia in a lawsuit over the intentional creation of e-waste.

    • BorgDrone@lemmy.one · 8 months ago

      The real tragedy is that PCs still have to make do with discrete graphics cards that have separate VRAM.

  • Dra@lemmy.zip · 8 months ago

    I haven’t paid attention to GPUs since I got my 3080 on release day back during Covid. Why has the acceptable level of VRAM suddenly doubled versus 4 years ago? I don’t struggle to run a single game on max settings at high frame rates at 1440p, so what’s the benefit of 20 GB of VRAM?

    • Eccitaze@yiffit.net · 8 months ago

      An actual technical answer: apparently, it’s because while the PS5 and Xbox Series X are technically regular x86-64 architecture, their design lets the GPU and CPU share a single pool of memory with no loss in performance. This makes it easy to allocate a shitload of RAM for the GPU to store textures very quickly.

      But it also means that as the games industry shifts from developing for the PS4/Xbox One X first (both of which have separate pools of memory for the CPU and GPU) to the PS5/XSX first, VRAM requirements are spiking, because it’s a lot easier to port to PC if you just keep the assumption that the GPU can hold 10-15 GB of texture data at once instead of refactoring your code to reduce VRAM usage.
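      A minimal sketch of what that split means in practice (assuming a PyTorch install purely for illustration): on a PC with a discrete card, anything the GPU needs has to be explicitly copied from system RAM into the card’s own VRAM, a step a unified-memory console simply doesn’t have.

      ```python
      # Illustrative only: data starts in system RAM and must be copied over PCIe
      # into the discrete card's separate VRAM pool before the GPU can touch it.
      import torch

      data = torch.randn(1024, 1024, 256)      # ~1 GiB of float32 in system RAM

      if torch.cuda.is_available():
          on_gpu = data.to("cuda")             # explicit host-to-device transfer into VRAM
          print(f"VRAM in use: ~{torch.cuda.memory_allocated() / 1024**3:.2f} GiB")
      else:
          print("No CUDA device found; nothing to copy.")
      ```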

    • Asafum@feddit.nl · 8 months ago

      Lmao

      We have your comment: what am I doing with 20 GB of VRAM?

      And one comment down: it’s actually criminal there’s only 20 GB of VRAM.

    • Blackmist@feddit.uk · 8 months ago

      Current gen consoles becoming the baseline is probably it.

      As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU.

      That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.

    • Hadriscus@lemm.ee · 8 months ago (edited)

      Perhaps not the biggest market, but consumer cards (especially Nvidia’s) have been the preferred hardware in the offline rendering space (i.e. animation and VFX) for a good few years now. They’re the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.

    • Obi@sopuli.xyz · 8 months ago

      Personally I need it for video editing & 3D work but I get that’s a niche case compared to the gaming market.

    • Space_Racer@lemm.ee · 8 months ago

      I’m maxed on VRAM in VR for the most part with a 3080. It’s my main bottleneck.