Hard to believe it’s been 24 years since Y2K (2000). It feels like we’ve come such a long way, but this decade started off very poorly with one of the worst pandemics the modern world has ever seen, and technology in general is looking very bleak in several ways.
I’m a PC gamer, and it looks like things are stagnating massively in our space. So many gaming companies are incapable of putting out a successful AAA title, because people are either too poor or don’t want to play yet another live-service AAA disaster like every single one released lately: Call of Duty, Battlefield, and almost anything Electronic Arts or Ubisoft puts out either fails outright or undersells. So many gaming studios have been shuttered and are being shuttered, and Microsoft is basically one member of an oligopoly with Sony and a couple of other companies.
Hardware is stagnating. Nvidia is putting the brakes on developing their next line of GPUs; we’re not going to see huge gains in performance anymore because AMD isn’t caught up yet, so Nvidia has no reason to innovate. They’re just going to sell the top cards of their next line for $1,500 a pop, with a 10% increase in performance rather than the 50 or 60% we really need. We still don’t have the capability to play games in full native 4K at 144 Hz. That’s at least a decade away.
Virtual reality is on the verge of collapse because Meta is basically the only real player in that space; between them and the Valve Index it’s practically a monopoly. Pico from China is on the verge of developing something incredible as well, and Apple just revealed a mixed-reality headset, but the price is so extraordinary that barely anyone has it, so use isn’t very widespread. We’re again a decade away from seeing anything really substantial in terms of performance.
Artificial intelligence is really, really fucking things up in general, and the discussions about AI look almost as bad as the news about the latest election in the USA. It’s so clowny and ridiculous and over-the-top hearing any news about AI. The latest news is that OpenAI is going to go from a non-profit to a for-profit company, after they promised they were operating for the good of humanity and broke countless laws stealing copyrighted material, supposedly for the public good. Now they’re just going to snap their fingers and morph into a for-profit company. So they can basically steal anything they want that’s copyrighted, claim it’s for the public good, and then swap to a for-profit model whenever it suits them. It doesn’t make any sense, and it just looks like they’re going to be a vessel for widespread economic poverty…
It just seems like there are a lot of bubbles about to burst all at the same time. I don’t see how things could possibly get better for a while now.
I agree with you on the GPU hardware and AI bubbles, but I’m not sure I would consider VR/AR to be a bubble right now. The hype has mostly died down by now, and I think it’s stabilized to the point where it will remain until we have new advances in hardware.
VR is on the verge of collapse in the USA thanks to the US government banning ByteDance. We can’t even order the new Pico 4 Ultra, which is one of the most anticipated VR headsets in the world right now. Meta basically has a monopoly and just announced they’re cutting funding to VR.
Sorry, but a new Pico headset wouldn’t do much of anything. A new Meta headset or a new Valve headset would give a bump.
Really needs better content. The hardware is almost there (in terms of cost and accessibility of the experience).
It’s slowly getting there. But the current population of VR users is self-selected: people willing to play the same limited experiences consistently on hardware that is often cumbersome, with loading screens that aren’t super long but become your entire existence while you wait, which gets annoying.
Meta sucks, but they have been a boon for VR development.
COVID also inflated a lot of tech stocks massively, as everybody suddenly had to rely a lot more on tech to get anything done, and the only things you could do for entertainment were gaming, streaming movies, or industrial quantities of drugs.
Then that ended, and they all wanted to hold onto that “value”.
It is a bubble, but whether it pops massively like in 2000, or just evens off to the point where everything else catches up, remains to be seen.
“The markets can remain irrational longer than you can remain solvent” are wise words for anyone thinking of shorting this kind of thing.
That shows you’re in the UK. Just want to clarify that I’m talking specifically about the USA, but I agree with everything you said. Tech stocks became so inflated! I don’t know if people are seeing it in Europe, but here in the USA there is this really toxic, cringe push from these tech companies to get people back to the office. They can force a return to office across the country; basically you have to relocate and upend your entire life, which could cost you $50,000 they’re not paying for, and if you don’t do it, you get fired. It’s an easy way to start laying people off without having to pay them anything, because refusing to return to the office can be called insubordination. Now they supposedly have cause to get rid of people or deny them promotions for more money. IBM, for example, is doing this right now, and Cisco, one of the biggest networking companies on the market, was doing it as well. Scumbag behavior.
Wait till the Y2K38 event occurs.
If only we had some way of working with a bigger integer…maybe we’d call it something like BigInteger…
Or just a u64. 64 bit computers are pretty standard nowadays.
I had heard that. Maybe I’ll get my hands on one someday. I hear Commodore makes one.
(I do wonder now if whatever variable is being used to denote time is signed or unsigned, because that would make a big difference, too.)
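For the curious, the rollover dates fall straight out of the integer limits. Here’s a minimal C sketch (my own toy example; it assumes you build it on a platform with a 64-bit time_t, which is standard on modern 64-bit systems, so both 32-bit limits can be printed as dates):

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* Unix time: seconds since 1970-01-01 00:00:00 UTC. */
    time_t max_signed_32   = (time_t)INT32_MAX;   /* 2^31 - 1 seconds */
    time_t max_unsigned_32 = (time_t)UINT32_MAX;  /* 2^32 - 1 seconds */

    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&max_signed_32));
    printf("signed 32-bit time_t ends at:   %s UTC\n", buf); /* 2038-01-19 */
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&max_unsigned_32));
    printf("unsigned 32-bit time_t ends at: %s UTC\n", buf); /* 2106-02-07 */
    return 0;
}
```

Signedness matters a lot here: a signed 32-bit counter wraps to December 1901 one second after the 2038 limit, while an unsigned one would last until 2106 but couldn’t represent any date before 1970, which is part of why most systems went signed.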
It’s a societal bubble, soon we all go pop. c/collapse
. . . with a 10% increase in performance rather than the 50 or 60% we really need
Why is this a need? The constant push for better and better has not been healthy for humanity or the planet. Exponential growth was always going to hit a ceiling. The limit on Moore’s Law has been more to the economic side than actually packing transistors in.
We still don’t have the capability to play games in full native 4K at 144 Hz. That’s at least a decade away.
Sure you can, today.
So many gaming companies are incapable of putting out a successful AAA title because . . .
Regardless of the reasons, the AAA space is going to have to pull back. Which is perfectly fine by me, because their games are trash. Even the good ones are often filled with microtransaction nonsense. None of them have innovated anything in years; that’s all been done at the indie level. Which is where the real party is at.
Would it be so bad if graphics were locked at the PS4 level? Comparable hardware can run some incredible games from 50 years of development. We’re not even close to innovating new types of games that can run on that. Planet X2 is a recent RTS game that runs on a Commodore 64. The genre didn’t really exist at the time, and the control scheme is a bit wonky, but it’s playable. If you can essentially backport a genre to the C64, what could we do with PS4 level hardware that we just haven’t thought of yet?
Yeah, there will be worse graphics because of this. Meh. You’ll have native 4K/144Hz just by nature of pulling back on pushing GPUs. Even big games like Rocket League, LoL, and CS:GO have been doing this by not pushing graphics as far as they can go. Those games all look fine for what they’re trying to do.
I want smaller games with worse graphics made by people who are paid more to work less, and I’m not kidding.
This post really nails my take on the issue. Give me original CS-level graphics or even AQ2 graphics, a decent story, more levels, and a few new little gimmicks (Rocket Arena grappling hook, anyone?!?!) and you don’t need 4K blah blah bullshit.
The #1 game for kids is literally Minecraft or Roblox… 8-bit-level gfx outselling your hi-res horse-armor bullshit.
The last game I bought was 2 days ago: MOHAA Airborne for PC, for $5 at a pawn shop. Give me 100 games of this quality instead of anything the PS5 ever made.
deleted by creator
Depending on where you draw the line, mine looks similar:
- EU4 - >800 hours
- Cities Skylines - ~180 hours
- Magic: Arena - >100 hours
- Crusader Kings 2 - ~100 hours
After that it depends on the length of the game. I normally just play through the campaign on most games once (except the above, which have lots of replay value), so looking at playtime isn’t particularly interesting IMO. The ratio of games with interesting playtime (i.e. I probably rolled credits) between indie and AAA is easily 2:1, if not something way higher like 5:1 or even 10:1, but again, that really depends on where you draw the line. If we look at 100% completion, I have 22 indie games and zero AAA games, because I rarely find AAA games to be worth going after achievements in. If I sort by achievement completion, the top two AAA games are Yakuza games (I love that series), and that’s after scrolling through dozens of indies, many of which have a fair amount of achievements (i.e. you need to do more than just roll credits).
So yeah, AAA games really don’t interest me. If you compare the amount I’ve spent on indie vs AAA games, it would be a huge difference since I pretty much only play older AAA games if I get them on sale, and that’s mostly so I can talk about them w/ friends…
I want smaller games with worse graphics made by people who are paid more to work less, and I’m not kidding.
I agree. Wholeheartedly. I think it’s just so obvious how quality dramatically takes off when the people creating it feel safe, sound, and economically stable. Financial Security (UBI) drives creativity probably more than anything else. It’s a huge win!
The limit on Moore’s Law has been more to the economic side than actually packing transistors in.
Those economic limits exist because we’re reaching the limit of what’s physically possible. Fabs are still squeezing more transistors into less space, for now, but the cost per transistor hasn’t fallen for some time; IIRC somewhere around 10nm is still the most economical node. Things just get difficult and exponentially fickle the smaller you get, and at some point there’s going to be a wall. Of note, currently we’re talking more about things like backside power delivery than actually shrinking anything. Die-on-die packaging and stuff.
Long story short: node shrinks aren’t the low-hanging fruit any more. They haven’t been since the end of planar transistors (if it had been possible to just shrink back then, they wouldn’t have engineered FinFETs), but it’s really been picking up speed with the start of the EUV era. Finer and finer pitches don’t really matter if you need more and more lithography/etching/coating steps because the structures you’re building are getting more and more involved in the z axis; every additional step costs additional machine time. On the upside, newer production lines could spit out older nodes at pretty much printing-press speed.
None of them have innovated anything in years
Well, they’ve innovated new ways to take up disk space…
There’s a reason I don’t play new release AAA games, and it’s because they’re simply not worth the price. They’re buggy at launch, take up tons of disk space (with lots of updates the first few months), and honestly aren’t even that fun even when the bugs are fixed. Indie games, on the other hand, seem to release in a better state, tend to be fairly small, and usually add something innovative to the gameplay.
The only reason to go AAA IMO is for fancy graphics (I honestly don’t care) and big media franchises (i.e. if you want Spiderman, you have to go to the license holder), and for me, those lose their novelty pretty quickly. The only games I buy near release time anymore are Nintendo titles and indie games from devs I like. AAA just isn’t worth thinking about, except the one or two each year that are actually decent (i.e. Baldur’s Gate 3).
It’s time for you to play PACMAN, as I did when I was young 😂
No AI, no GPU, no shitcoin: you just have to eat ghosts, which is very strange in fact when you think about it 🤪 Correction: the ghosts are AI, and based on how many times they’ve killed me, clearly a step above anything mainstream today (º ロ º๑).
I’m a PC gamer, and it looks like things are stagnating massively in our space.
I would like to introduce you to the indie game scene. Where AAA is faltering, indie has never been in a better place.
Overall, I don’t see things the way you see them. I recommend taking a break from social media, go for a walk, play games you like, and fuck the trajectory of tech companies.
Live your life, and take a break from the doomsaying.
My only fear with the indie gaming industry is that many studios are starting to embrace the crunch culture that has led AAA gaming down a dark path.
I would love an app like Blind that allows developers on a game to anonymously call out the crunch culture of game development, alongside practices like firing people before launch and removing workers from the credits. Review games solely on how the devs treated the workers, and we might see some cool correlations between good games and good culture.
There’s certainly room to grow with regard to workers’ rights. I think you could probably solve at least a few of these problems if developers were covered by a union, and publishers who hire them had to bargain for good development contract terms.
I love this, and I’ll even one-up it. Let the bubbles burst; this is just a transitional period that you see like a predictable cycle in tech. The dot-com burst was like a holocaust compared to this shit. Everyone who was in the tech scene before Google has an easier time with this. We can comfortably watch FAANG recede, and even be grateful for it. Let it happen.
Hello indie gamer, it’s me, you, from the future.
I’d like to introduce you to PATIENT indie gaming.
The only games I play are small-team, long-running, and well-documented; the developers are passionate, mods exist, and they run on a potato or a Steam Deck, etc.
Because I’m patient, I don’t ever get preorder, Kickstarter, prealpha disappointed.
I know exactly what I’m getting, I pay once, and boom, I own a great game forever. (You can more often fully DL indie games.)
Bruh, what do you mean “future?” That’s me right now!
Bro I’m from the future you can’t ask me stuff like that, be patient, you’ll figure it out
Genuinely wish more people understood this. I’ve mostly only been playing indie games for the past few years. By far the best fun I’ve had in gaming. A ton of unbelievably creative, unique games out there. Not to mention that 99% of them are a single-purchase experience instead of a cash treadmill.
cash treadmill
Borrowing this turn of phrase
Well said!
I would like to introduce you to the indie game scene. Where AAA is faltering, indie has never been in a better place.
Amen.
Indie games might not be flashy, but they’re often made with love and concern about giving you a fun experience. They also lack all those abusive DRM and intrusive anti-cheat systems that A³ games often have.
They also tend to have linux support. Where the AAA companies want to eat the entire mammoth and scorn the scraps, small companies can thrive off of small prey and the offal. :)
Equating Linux enthusiasts to offal is a bold move on this site
It’s a great analogy though - Linux users aren’t deemed profitable by the A³ companies, just like offal is unjustly* deemed yucky by your typical person.
*I do love offal though. And writing this comment made me crave chicken livers with garlic and rosemary over sourdough bread. Damn.
Idk, I’ve spent way more on games since Valve came to Linux. I was a Linux user first, and mostly played games on console because I didn’t like rebooting into Windows or fiddling w/ WINE, so if I played games, it’s because it had Linux support (got a ton through Humble Bundle when they were small and scrappy). When Steam came to Linux, I created an account (didn’t have one before) and bought a bunch of games. I bought Rocket League when the Steam Controller and Steam Deck launched (was part of a bundle), and when Proton launched, I bought a ton of Windows games.
So at least for me, I’ve easily spent 100x what I would’ve spent on video games due to Steam supporting Linux. That said, there are easily 50 other people spending more than me on Windows for every one of me, so I get that Linux isn’t a huge target market. But I will spend more on an indie game if it has native Linux support.
And I’ll add on to that, even if every GPU company stops innovating, we’ll still have older cards and hardware to choose from, and the games industry isn’t going to target hardware nobody is buying (effectively pricing themselves out of the market). Indie devs especially tend to have lower hardware requirements for their games, so it’s not like anyone will run out of games to play.
Plenty of good games out there; even in early access I have found some real gems. Just recently Coffee Stain released Satisfactory: a labor of love, and it shows. I recently tried Bellwright, and it’s impressive; so is Manor Lords.
And hardware stagnating also means that people get to learn what it’s all about and optimize for it. The last gen games on a console are usually also better optimized than the first series of games on a platform. So yeah…
Gaming now is more amazing than ever, in part because we have access to classic games too. If someone thinks gaming was amazing 10 years ago, cool. We still have those games! I’m playing a really old game right now myself and loving it.
I think OP confuses this whole bubble-bursting thing. When a phenomenon passes out of its early explosive growth phase and settles into more of a steady state, that’s not the “bubble bursting”: that’s maturity.
Tech as a whole is now a more mature industry. Companies are expected to make money, not revolutionize the world. OP would have us believe this means that tech is over. How does the saying go? It’s not the beginning of the end, but it is perhaps the end of the beginning.
Companies are expected to make money, not revolutionize the world
I’d like to believe that, but I don’t think investors have caught on yet. That’s where the day of reckoning will come.
AI is a field that’s gone through boom and bust cycles before. The 1960s were a boom era for the field, and it largely came from DoD money via DARPA. This was awkward for a lot of the university pre and post grads in AI at the time, as they were often part of the anti-war movement. Then the anti-war movement starts to win and the public turns against the Vietnam war. This, in turn, causes that DARPA money to dry up, and it’s not replaced with anything from elsewhere in the government. This leads to an AI winter.
Just to be clear, I like AI as a field of research. I don’t at all like what capitalism is doing with it. But what did we get from that time of huge AI investment? Some things that can be traced directly back to it are optimizing compilers, virtual memory, Unix, and virtual environments. Computing today would look entirely different without it. We may have eventually invented those things otherwise, but it would have taken much, much longer.
What would you say Capitalism is doing with AI?
Attempting to replace people in the workplace without changing society so that people can live without work.
I had a similar feeling; the trapped feeling of Windows and the enshittifying erosion of privacy and ownership tipped me. I stepped back to Linux and wow! It’s so much better now. A friend nagged me into Nix, and I have been so impressed at just how much better Linux is now compared to Windows. It used to be the polar opposite, with Windows being the easy and shiny one; now so many desktop environments are just so far ahead of Windows, it’s really impressive. And that’s before I even consider how novel Nix is. As my friend would nag: I use Nix, btw ❤️
Well, that’s the doomer take.
The rumors are that the 80-series card is 10% faster than the 90-series card from last gen. That’s not a ‘10%’ improvement: assuming the prices are the same, that’s more like a 40% improvement tier-for-tier. I think a LOT of people don’t realize how shitty the 4080 was compared to the 4090 and are vastly mis-valuing that rumor.
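To put rough numbers on it (assuming the 4090 is about 30% faster than the 4080, which is in the ballpark of most 4K benchmarks): 1.30 × 1.10 ≈ 1.43, so an 80-class card delivering “4090 plus 10%” at the same 80-class price would be a roughly 40% uplift over the 4080, not 10%.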
I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DONT BOTHER’ take is wrong. My gaming has moved almost entirely to my ROG Ally, and you know what? Shit is just as fun and way more convenient than the 7700X/3080 12GB desktop, even if it’s 1080p low and not 1440p120. If the only thing a game has going for it is ‘ooh, it’s pretty’, then it’s unlikely to be one of those games people care about in six months.
And anyways, who gives a crap about AAAAAAAAAAAAA games? Indie games are rocking it in every genre you could care to mention, and the higher budget stuff like BG 3 is, well, probably the best RPG since FO:NV (fight me!).
And yes, VR is in a shitty place because nobody gives a crap about it. I’ve got a Rift, Rift S, Quest, and a Quest 2, and you know what? It’s not interesting. It’s a fun toy, but it has zero sticking power, and that’s frankly due to two things:
- It’s not a social experience at all.
- There’s no budget for the kind of games that would drive adoption, because there’s no adoption to justify spending money on a VR version.
If you could justify spending the kind of money that would lead to having a cool VR experience, then yeah, it might be more compelling but that’s been tried and nobody bought anything. Will say that Beat Saber is great, but one stellar experience will not sell anyone on anything.
And AI is this year’s crypto, which was last year’s whatever; it’s bubbles and VC scams all the way down, and pretty much always has been. Tech hops from thing to thing that they go all in on because they can hype it and cash out. Good for them, and be skeptical of shit, but if it sticks it sticks, and if it doesn’t it doesn’t.
I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DONT BOTHER’ take is wrong.
Some of the best games I’ve played have graphics that’ll run on a midrange GPU from a decade ago, if not just integrated graphics.
Case in point, this is what I’m playing right now.
The 5080 is rumored to be 10% faster than the 4090, but also to use 90% of its power. So while performance shows a normal generational leap, power consumption has gone up to match, leaving you with a much smaller improvement in efficiency.
Power consumption numbers like that are expected, though.
One thing to keep in mind is how big the die is and how many transistors are in a GPU.
As a direct-ish comparison, there’s about 25 billion transistors in a 14900k, and 76 billion in a 4090.
Big die + lots and lots of transistors = bigly power usage.
I wouldn’t imagine that the 5000-series GPUs are going to be smaller or have fewer transistors, so I’d expect this to be in the ‘die shrink lowers power usage, but more transistors increase power usage’ zone.
Conversely, the Apple silicon products ship huge, expensive dies fabbed on leading TSMC processes which sip power relative to contemporaries. You can have excellent power efficiency on a large die at a specific frequency range, more so than a smaller die clocked more aggressively.
You’re not wrong (and those are freaking enormous dies that have to cost apple a goddamn fortune to make at scale), but like, it also isn’t an Apples-to-Apples comparison.
nVidia/Intel/AMD have gone for the maximum-performance, fuck-any-heat/noise/power-usage path. They haven’t given a shit about low-power optimizations or about investing in designs that are more suited to low-power implementations (an M3 Max will pull ~80W if you flog the crap out of it, so let’s use that number). IMO the wrong choice, but I’m just a computer janitor that uses the things, I don’t design them.
Apple picked a uarch that was already low power (fun fact: ARM was so low power that the first test chips would run off the board’s standby power and would boot BEFORE they were actually turned on) and then focused on making it as fast as possible with as little power as possible: the compute cores came from the mobile side before being turned into desktop chips.
I’m rambling, but: until nVidia and the x86 vendors prioritize power usage over raw performance (which AMD did with Zen 5, and you saw how that spiraled into a fucking PR shit mess), you’re going to get next year’s die shrink, but with more transistors using the same power for slightly better performance. It’s entirely down to design decisions, and frankly, x86 (and to some degree nVidia) have painted themselves into a corner by relying on process node improvements (which are very rapidly going to stop happening) and modest IPC uplifts to stay ahead of everyone else.
I’m hoping Qualcomm does a good job staying competitive with their ARM stuff, but it’s also Qualcomm and rooting for them feels like cheering on cancer.
This outlines several issues; a key one is outbidding Apple for wafer allocation on leading processes. Apple primarily sells such high-margin products that I suppose they can go full send on huge dies with no sweat. Similarly, the 4090’s asking price was likely directly related to its production cost. A chunky boy with a huge L2$.
I like the way Mike Clark frames challenges in semiconductor engineering as a balancing act between area, power, frequency, and performance (IPC): a chip that’s twice as fast but twice the size of its predecessor is not considered progress.
I wish ultra-efficient giga-dies were more feasible, but it’s kind of rough when TSMC has been unmatched for so long. I gather Intel’s diverting focus to 18A, and I hope that turns out well for them.
I’m not sure that ARM as an ISA (or even RISC generally) is inherently more efficient than CISC today, particularly when we look at Qualcomm’s latest efforts in notebooks; it’s more that Apple has extremely proficient designers and benefits significantly from vertical integration.
A little bit of pushback on the VR front: sure, there aren’t many massive publishers driving it forward, but I would wholeheartedly argue that it can very much be a social experience, and it offers experiences that are damn near impossible to get anywhere else. Three games immediately come to mind:
VRChat (obviously): literally entirely a social game, with a pretty large community of people making things for it, from character models to worlds, because that’s what drives the game. There is a massive scene of online parties, raves, hangouts, etc. that bring people together across the whole world, in a medium more real than any flat game because of the custom models, worlds, and the relative abundance of people using full-body tracking to show off, dance, and interact with each other.
VTOL VR: This is still fairly social in that you can either play with friends or people online, but the main draw for me is the level of immersion in flying you can get. You have full interactable cockpits that you basically just use your real hands to interact with (depending on your controller/hand tracking) and it’s all pretty realistic. It’s just impossible to have the same level of experience without VR.
Walkabout Mini Golf: I was pretty skeptical of this game when my friends wanted to play it; it’s literally just a mini golf sim. The thing is, the ability to play mini golf with friends who live across the country/world is amazing, and the physics of just swinging your controller/hands the same way as in real mini golf is so special.
It is still quite expensive to get really good gear, and that is definitely the current biggest hurdle. It may forever be a smaller community due to the space/tech/cost requirements of making the experience truly incredible, but for me, even just on a Quest 2 in my room without a lot of fancy stuff, it is still interesting and something special. A lot of people really do care a lot about VR, and even if it is far fewer than conventional gaming, it should not be entirely discounted. And I personally think that while it probably won’t ever replace flat-screen gaming, it is an entirely different kind of experience and has at least a decent future ahead.
Fair points on VR games being fairly social. I was thinking more of the in-person social experience, which still involves some portion of people sitting around stuffing their faces into a headset and wandering off into their own world.
IMO, this is something that AR/MR could do a great job of making more social by adding the game to the world rather than taking the person out of the world and into the game. But of course, this also restricts what kinds of games you can do, so it’s probably only a partial solution and/or improvement on the current state of affairs.
I also agree that it’s way too expensive still, and probably always will be because the market is, as you mentioned, small.
PCVR is pretty much dead despite its proponents running around declaring that it’s just fine like it’s a Monty Python skit. And the tech for truly untethered headsets is really only owned by a single (awful) company, and only because the god-CEO thinks it’s a fun thing to dump money on, which means it’s subject to sudden death if he retires/dies/is ousted/has to take time off to molt/has enough shareholder pressure put on him.
Even then, it’s only on a second generation (the original Quest was… beta, at best) and is expensive enough that you have to really have a reason to be interested rather than it being something you could just add to your gaming options.
I’d like VR to take off and the experiences to more closely resemble some of the sci-fi worlds that feature or take place in a virtual reality, but honestly, I’ve thought that would be cool for like 20 years now, and we’re only very slightly closer than we were then; we just have smaller headsets and somewhat improved graphics.
Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.
Nvidia is just playing it conservative because it was massively overvalued by the market. The GPU’s use for AI is a stopgap hack until hardware can be developed from scratch; the real life cycle of hardware is 10 years from initial idea to first consumer availability.

The issue with the CPU in AI is quite simple, and it will be solved in a future iteration; this means the GPU will get relegated back to graphics, or it might even become redundant entirely. Once upon a time the CPU needed a math coprocessor to handle floating point. That experiment failed: it proved that a general monolithic solution is far more successful. No data center operator wants two types of processors for dedicated workloads when one type can accomplish nearly the same task.

The CPU must be restructured for a wider-bandwidth memory cache. This will likely require slower thread speeds overall, but it is the most likely solution in the long term. Solving this issue will likely come with more threading parallelism, and therefore has the potential to render the GPU redundant in favor of a broader range of CPU scaling.
Human persistence of vision is not capable of keeping up with the higher speeds, which are ultimately only marketing. The hardware will likely never support this stuff, because no billionaire is putting up the funding to back the marketing with tangible hardware investments. … IMO.
Neo-feudalism is well worth abandoning. Most of us are entirely uninterested in this business model. I have zero faith in the present market. I have AAA-capable hardware for AI. I play and mod open source games. I could easily be a customer in this space, but there are no game manufacturers. I do not make compromises in ownership. If I buy a product, my terms of purchase are full ownership with no strings attached whatsoever. I don’t care about what everyone else does. I am not for sale, and I will not sell myself for anyone’s legalese nonsense or pay ownership costs to rent from some neo-feudal overlord.
Hell yeah!
I do not make compromises in ownership.
preach!
At the end of the day, though, proper change will only come once a critical mass aligns on this issue, along with a few others.
The political process is too captured for peasants to effect any change; we have more power voting with our money as customers, at least for now.
AI still needs a lot of parallelism but is tolerant of latency. That makes it ideal for a large expansion card instead of putting it directly on the CPU die.
Multithreading is parallelism too, and is poised to scale by a similar factor; the primary issue is simply getting tensors in and out of the ALUs. Good enough is the engineering game, and having massive chunks of silicon lying around unused is a much more serious problem. At present, the choke point is not the parallelism of the math but the L2-to-L1 bus width and cycle timing. The ALU can handle the load; the AVX-512 instruction set can already move 512-bit-wide words in a single instruction. The problem is just getting those words in and out in larger volume.
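To make the 512-bit point concrete, here’s a minimal C sketch using AVX-512 intrinsics (illustrative only: the arrays and the fused multiply-add are my own toy example, and it assumes a CPU and compiler with AVX-512F support, e.g. gcc -mavx512f):

```c
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    /* 16 floats = 512 bits, the width of one ZMM register. */
    float a[16], b[16], c[16], out[16];
    for (int i = 0; i < 16; i++) { a[i] = (float)i; b[i] = 2.0f; c[i] = 1.0f; }

    /* One instruction each to pull a full 512-bit word from memory... */
    __m512 va = _mm512_loadu_ps(a);
    __m512 vb = _mm512_loadu_ps(b);
    __m512 vc = _mm512_loadu_ps(c);

    /* ...and one fused multiply-add across all 16 lanes at once.
       The math itself is the easy part; keeping loads like these fed
       through the L1/L2 hierarchy is the choke point described above. */
    __m512 vr = _mm512_fmadd_ps(va, vb, vc);

    _mm512_storeu_ps(out, vr);
    printf("lane 15: %.1f\n", out[15]); /* 15*2 + 1 = 31.0 */
    return 0;
}
```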
I speculate that the only reason this has not been done already is pretty much the marketability of single-thread speeds. Present thread speeds are insane and well into the radio realm of black-magic bearded-nude-virgin wizardry. I don’t think it is possible to make these bus widths wider and maintain the thread speeds, because it has too many LCR consequences. I mean, at around 5 GHz the concept of wire connections and gaps as insulators is a fallacy, when capacitive coupling can make connections across all small gaps.
Personally, I think this is a problem that will take a whole new architectural solution. It is anyone’s game, unlike at any other time since the late 1970s. It will likely be the beginning of the real RISC-V age and the death of x86. We are presently in the age of the 20+ thread CPU. If a redesign can deliver a 50-500 logical core CPU that is slower in single-thread speed but capable of all workloads, I think it will dominate easily. Choosing the appropriate CPU model will become much more relevant.
Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.
I’m a die-hard open source fan, but that still feels like a stretch. I remember 10 years ago we were theorizing that Windows would get out of the OS business and just become a shell over a Unix kernel, and that never went anywhere.
It remained in the OS business to the extent that is required for the malware business.
Also, NT is not a bad OS (except for being closed, proprietary, and probably messy by now). The Windows subsystem over it would suck just as badly if it ran on something Unix.
Windows is malware now.
Yeah, I guess in my fantasy I was assuming that Windows would do a full rewrite and adopt the Unix ABI, but I know that wouldn’t happen.
I don’t think that is necessarily out of the running yet. OS development is expensive and low-profit, and commoditization may be inevitable. Control of the shell and GUI, where they can push advertisements and shovelware and telemetry on you: that is what’s profitable.
So in 20 years? 50? I predict proprietary OSes will die out eventually, on the balance of probability.
I’m with you in the long term.
I am curious what kernel is backing the computers on the stuff SpaceX is doing. I’ve never seen their consoles, but I am guessing we are closer to modern reusable hardware and software than we were before. As niche applications like that keep getting more diverse, I bet we will get more open specifications so everything can work together.
But again I am more pessimistic and think 50 years would be relatively early for something like that.
I agree. But also add in the movie industry that’s been complete trash for a while now. Not to mention books. I’m not sure if we’ll ever see another Harry Potter level book again, at least in our lifetimes.
My take is we’ve already left the golden ages of movies, music, and books and probably won’t get another for an extremely long time.
Video games are going through the same downfall that streaming services brought. Physical media left the movie scene as a standard a while ago, but video games took longer. Now it’s going to be all streaming and subscriptions, where you can never own anything.
Once that happens, enshittification will peak, companies won’t be incentivized to make the games good anymore, standards tank, and people will forget how good things once were.
deleted by creator
As with video games, the real gems imo for movies and music are from the indie scenes.
Not to mention books. I’m not sure if we’ll ever see another Harry Potter level book again, at least in our lifetimes.
Are you talking quality or popularity? Because there are many, many books that are just as good or better than Harry Potter.
movie industry that’s been complete trash for a while now.
This is not a callout of you in particular so don’t get offended, but that’s really only true if you look at the trash coming out of Hollywood.
There’s some spectacularly good shit coming out of like France and South Korea (depending on what genres you’re a fan of, anyways), as well as like, everywhere else.
Shitty movies that are just shitty sequels to something that wasn’t very good (or yet another fucking Marvel movie) are a self-inflicted wound, and not really a sign that you can’t possibly do better.
Interesting! Anything you’d recommend?
Train to Busan, Parasite, Unlocked, Wonderland, Anatomy of a Fall and Close have been ones I’ve seen recently that I liked.
I think some of those are available on Netflix, but as I don’t use Netflix I can’t say which ones for certain.
Edit: I just realized some of those titles are vague and will lead to a billion other movies, lol. The first four are South Korean and the last two are French, and they’re all from 2020 or newer, so anything not from there or older than that isn’t the right one.
Not to mention an ungodly amount of animated content of all varieties: anime, cartoons, indie stuff (Helluva Boss is hilarious and (un?)surprisingly dark). I recall seeing a screenshot of something French with an amazing art style that I want to look into watching.
One Piece is gearing up for a re-animation from the beginning using its new style from the Wano arc IIRC, and that is a hell of a long epic story.
We are sorry. So sorry indeed, man! We are sorry that because of a pandemic, many people in the industry had to move to safe locations and realized how much better those places were, so they’re not going back. We’re sorry to have inconvenienced your game play. But we’re working hard to get you to pay another salary’s worth on the next Tomb Raider! We promised so many more transistors that the boob wobble will be endless! Thru AI, anything is possible!
Also, the movie industry is struggling for many reasons. Movies are getting too expensive, the safe formulas big studios relied on aren’t working anymore, and customer habits are changing, with people going to movie theaters less.
At the same time, just like with video games, the indie world is in a golden age. You can get amazing cameras and equipment for quite a small budget. What free software like Blender can achieve is amazing. And learning is easier than ever, there are so many free educational resources online.
The entire entertainment industry is floundering. Wages are lagging inflation in many sectors, and people are paying significantly more to eat, so they’re going to cut back on streaming services and they’re going to cut back on going out to the movies. We’re right at the crossroads where the only thing that makes sense is to give people a little more value for their money; instead, the industry is going to pull every fast trick it can to make more from advertising and gambling.
Or you had several companies trying to start their own streaming services from scratch who thought they needed a ton of new shows to fill them. Disney+ could have easily gotten away with archived Disney Channel shows, all the animated Disney cartoons, the old Star Wars & Marvel movies, and The Simpsons. It didn’t need a lot of the new shows, no matter how cool they looked.
I wouldn’t say the movie industry is struggling; I would say that people who work for a living are struggling. Actors are still getting paid huge sums of money, and so are directors and producers. They are getting their pound of flesh one way or another. They are just not producing anything that people want to watch. For example, all this post-Infinity War Marvel bullshit: no one wants to see that. No one cares about Marvel or Disney anything right now; it’s low-quality drivel. But Beetlejuice, Barbie, Oppenheimer… these are proof that people do still want to see movies; the studios just don’t want to produce anything meaningful.
The struggling people I’m talking about, however, are in the supporting roles: the people doing the filming, set dressing, makeup, and special effects. Lots of these lower-level supporting roles pay almost nothing compared to the cost of living in California, while some of the main actors can get tens of millions.
Just like AAA game studios, movie studios don’t want to take risks, so they go with productions they consider “safe”: aim for the lowest common denominator, play into nostalgia, and don’t make anyone upset by touching subjects like politics or religion. And you end up with the garbage they are making right now.
I’ve just been watching older movies. There’s this amazing sweet spot, when CGI had just become a thing, where the visual effects are passable but not so prevalent that the entire plot gets replaced with pointless explosions.
Meta isn’t the only VR space. Resonite plays like VRChat meets Garry’s Mod, and supports most equipment you can hook up. It is not perfect, but there are frequent updates trying to address the issues, same as any platform.
Meta is the only VR space for anyone not able or willing to spend as much on a VR headset as on a mid-range desktop (especially when that VR headset might not even function without the addition of a mid-range desktop), and for people who want VR and a vague semblance of privacy, there isn’t really any affordable headset at all.
Resonite doesn’t require a headset. It runs on desktop just as well… you just don’t get body tracking.
OK. So first of all, while Nvidia is absolutely a scummy company, the reason they are able to be this scummy is that they do generally deliver unreasonable performance improvements (at an unreasonable price, though), and this time is unlikely to be any different: the 50xx series is expected to be monstrous as far as performance goes. So far they didn’t make the same mistake Intel did with CPUs.
Second: you can’t collapse something that hasn’t risen. Virtual reality never gained enough traction for it to collapse. I personally blame PlayStation for this; if there is anyone that could make a difference, it would be them.
Third: if that’s true, that’s actually fucked up. Although to be fair, OpenAI is a very strange company, and a very closed one for something supposedly called OpenAI. Also, I don’t think going from a non-profit to a for-profit company changes much, since it requires a thing they don’t have: profit.
So far they didn’t make the same mistake Intel did with CPUs.
Exactly. Think of where Intel would be if they hadn’t sat on their hands. AMD completely ate their lunch, and now Intel is scrambling to retain some amount of their core business while expanding into other businesses. If they had kept their CPUs solid, they would be able to devote more resources to the GPU division, and probably be eating AMD’s lunch and cutting a bit into Nvidia’s market share.