With the R515 driver, NVIDIA released a set of Linux GPU kernel modules in May 2022 as open source with dual GPL and MIT licensing. The initial release targeted datacenter compute GPUs…
That was mostly because the 20 series was so bad. Expensive, didn’t perform light-years better to justify the price, and raytracing wasn’t used in any games (until recently).
The 30 series was supposed to be more of a return to form, then covid + mining ruined things.
I got a 2060 Super and I must say I’m very happy. I do 3D stuff, so the ray tracing was plenty useful, and despite it getting a bit old it fares pretty great in most games. The price was okay at the time (500 €, still a bit high since it was during the bitcoin mining madness =-=")
RTX 3050 (which got a new 6 GB version less than a year ago) is similar to a 1070 Ti in terms of performance, and 1080s are of course even better. Definitely a ton of staying power, even in 2024.
I bought a secondhand 1080 a couple years ago when the crypto bubble finally burst, and it’s still serving my needs just fine. It handled Baldur’s Gate 3 on release last year, which was the last “new” game I played on it. Seems like it’ll still be good for a few years to come, so yeah.
It’s more that back then was a better time for price-to-performance value. The 3000 and 4000 series cards were basically linear upgrades in terms of price to performance.
It’s an indicator that there haven’t been major innovations in the GPU space, besides perhaps the addition of the AI and Raytracing stuff, if you want to count those as upgrades.
My ol’ 1070 doesn’t make the cut hey… ;-;
I think it works, but the performance might not be ideal. Stick with the proprietary module.
Maybe it’s just because I’m older and more jaded, but that really feels like the last truly good era for GPUs.
Those 10 series cards had a ton of staying power, and the 480/580 were such damn good value cards.
Still have a beautifully running 1070. 👌
Comrade. (☞ ͡° ͜ʖ ͡°)☞
It feels like the crypto mining goldrush really changed the way GPU manufacturers view the market.
I feel like AI has changed the game. Why sell retail when people are paying you billions to run LLMs in the cloud?