Like I’m not one of THOSE. I know higher = better with framerates.
BUT. I’m also old. And depending on when you ask me, I’ll name The Legend of Zelda: Majora’s Mask as my favourite game of all time.
The original release of that game ran at a glorious twenty frames per second. No, not thirty. No, not even twenty-four like cinema. Twenty. And sometimes it’d choke on those too!
… And yet. It never felt bad to play. Sure, it’s better at 30FPS on the 3DS remake. Or at 60FPS in the fanmade recomp port. But the 20FPS original is still absolutely playable.
Yet like.
I was playing Fallout 4, right? And when I got to Boston it started lagging in places, because, well, it’s Fallout 4. It always lags in places. The lag felt awful, like it really messed with the gamefeel. But checking the FPS counter it was at… 45.
And I’m like – Why does THIS game, at forty-five frames a second, FEEL so much more stuttery and choked up than ye olde video games felt at twenty?
Stuttering, but mostly it’s the FPS changing.
Lock the FPS to below the lowest point where it lags, and suddenly it won't feel as bad, since it's consistent.
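Something like this is all a frame cap is doing under the hood – a minimal sketch, with the 30 FPS target and the timing calls picked purely for illustration:

```python
import time

TARGET_FPS = 30                      # cap below your worst dips
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~33.3 ms per frame

def run_frame():
    pass                             # placeholder: game update + render

while True:
    start = time.perf_counter()
    run_frame()
    # Sleep off whatever is left of the budget, so every frame lands on
    # the same ~33 ms cadence, in fast scenes and slow scenes alike.
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```

(Real limiters usually busy-wait the last millisecond or so, since sleep isn't that precise, but the idea is the same.)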
If you're not playing on a VRR display, 45fps *will* be horrible and stuttery. A locked 30fps would feel significantly better.
It depends on what you're running, but if the frame rate is rock solid and consistent, it often feels a lot less stuttery. Fallout games are not exactly known for their stability, unfortunately.
For comparison, Deltarune came out a few days ago, and that's locked to 30 fps. Sure, it's not a full 3D game or anything, but there's a lot of complex motion in the battles and it's not an issue at all. Compare that to something like Bloodborne or the recent Zeldas: even after getting used to the frame rate, they feel awful, because they're stuttering all the damn time.
Game design is a big part of this too. First-person games, or anything else with fine camera control, feel very bad when mouse movement lags.
I agree with what the other commenters are saying too: if it feels awful at 45 fps, your 0.1% low frame rate is probably more like 10 fps.
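For anyone wondering what a "0.1% low" actually is: roughly, the average FPS of your slowest handful of frames. A sketch of one common way to compute it from captured frame times (overlay tools differ slightly in their exact definitions):

```python
# One common definition of "1% low" / "0.1% low" FPS: the average FPS
# of the slowest N% of frames in a capture of per-frame times (ms).

def low_percentile_fps(frame_times_ms, percent):
    worst = sorted(frame_times_ms, reverse=True)       # slowest first
    count = max(1, int(len(worst) * percent / 100))
    avg_ms = sum(worst[:count]) / count
    return 1000.0 / avg_ms

# A capture that averages ~44 FPS, with occasional 100 ms hitches:
frame_times = [22.0] * 990 + [100.0] * 10
avg_fps = len(frame_times) / (sum(frame_times) / 1000)
print(f"average:  {avg_fps:.1f} FPS")                               # ~43.9
print(f"1% low:   {low_percentile_fps(frame_times, 1):.1f} FPS")    # 10.0
print(f"0.1% low: {low_percentile_fps(frame_times, 0.1):.1f} FPS")  # 10.0
```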
Well a good chunk of it is that older games only had ONE way they were played. ONE framerate, ONE resolution. They optimized for that.
Nowadays they plan for 60 or 120, and if your hardware gets less, too bad. Upgrade for better results.
I think it has something to do with frame times.
I'm not an expert, but I feel like I remember hearing that a low framerate with spiky frame times feels worse than a low framerate with steady frame times. Something to do with the delay before each frame actually gets displayed?
As some of the other comments mentioned, it’s probably also the framerate dropping in general too.
Probably consistency.
Zelda was 20 fps, but it was 20 fps all the time so your brain adjusted to it. You could get lost in the world and story and forget you were playing a game.
Modern games fluctuate so much that it pulls you right out.
@VinesNFluff Most people can’t honestly perceive any change in their visual field in less than 1/60th of a second except perhaps at the very periphery (for some reason rods are faster than cones and there are more rods in your peripheral vision) and even then not in much detail. So honestly, frame rates above 60 fps don’t really buy you anything except bragging rights.
This is such bullshit; even 60 vs 90 fps is such a noticeable difference. Maybe 144 vs 165 is hard to distinguish though…
@FlembleFabber Do you have LED lights in your house? Can you see 60Hz flicker?
led lights almost always run off rectified, smoothed supplies, so no flicker. i have one cheap old crappy bulb that doesn’t, and i can definitely perceive the flicker. it’s almost nauseating.
@binom If you film with a camera synced to the NTSC vertical rate of 59.94 Hz, you'll see a beat note between the camera and the LED lighting, indicating it's not well filtered, if at all. Newer HiDef cameras mostly record at 24 fps; that IS slow enough that you see judder in movement, and they'll also pick up a beat note when recording under most LED lights. Many cheap LED lights just have a capacitive current limiter and that's it. If you power them off 50 Hz you will see the flicker, and if you get dimmable LED lights they will NOT have a filter. But I don't want to interfere with anyone's bragging rights.
Some old games are still pretty rough at their original frame rate. I recently played four-player GoldenEye on an N64, and that frame rate was pretty tough to deal with. I had to retrain my brain to process it.
Out of curiosity, did you have an actual N64 hooked up to a modern TV? A lot of those old games meant to be played on a CRT will look like absolute dog shit on a modern LCD panel. Text is harder to read, it is harder to tell what a shape is supposed to be, it’s really bad.
Trinitron baby
old games' animations were sometimes made frame by frame. like the guy who drew the character pixel by pixel went "and in the next frame of this attack the sword will be here"
It’s the framerate jumping around that causes it. A consistent 30fps feels better than 30, 50, 30, 60, 45, etc. Many games will have a frame cap feature, which is helpful here. Cap the game off at whatever you can consistently hit in the game that your monitor can display. If you have a 60hz monitor, start with the cap at 60.
Also, many games add motion blur, AI generated frames, TAA, and other things that really just fuck up everything. You can normally turn those off, but you have to know to go do it.
If you are on console, good fucking luck. Developers rarely include such options on console releases.
30, 50, 30, 60, 30… that's FPS. Frametime is the time between one frame and the next – so it's measured per frame, not averaged over a second.
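In numbers (just the reciprocal):

```python
# FPS and frametime are two views of the same thing:
# frames per second vs milliseconds per frame.
for fps in (20, 30, 45, 60, 120):
    print(f"{fps:3d} FPS = {1000 / fps:6.2f} ms per frame")

# 20 FPS = 50.00 ms, 30 = 33.33, 45 = 22.22, 60 = 16.67, 120 = 8.33.
# Old 20 FPS games hit that 50 ms cadence on every single frame;
# a fluctuating "45 FPS" average mixes 14 ms frames with 50 ms ones.
```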
My favorite game of all time is Descent, PC version to be specific, I didn’t have a PlayStation when I first played it.
The first time I tried it, I had a 386sx 20MHz, and Descent, with the graphics configured at absolute lowest size and quality, would run at a whopping 3 frames per second!
I knew it was basically unplayable on my home PC, but did that stop me? Fuck no, I took the 3-floppy installer to school and installed it on their 486dx 66MHz computers!
I knew it would just be a matter of time before I got a chance to upgrade my own computer at home.
I still enjoy playing the game even to this day, and have even successfully cross compiled the source code to run natively on Linux.
But yeah I feel you on a variety of levels regarding the framerate thing. Descent at 3 frames per second is absolutely unplayable, but 20 frames per second is acceptable. But in the world of Descent, especially with modern upgraded ports, the more frames the better 👍
Great games. FreeSpace was almost mind-blowing when I first played it as well.
I haven't actually played FreeSpace before, but I did manage to get a copy and archive it a few years ago.
I also got a copy of Overload and briefly tried that, but on my current hardware it only runs at about 3 frames per second…
The Descent developers were really ahead of their time and pushing gaming to the extreme!
Definitely give it a shot. It's obviously different, but I loved it. My mom actually banned me from playing Descent 3: Vertigo, because she had vertigo and it made her sick.
Vertigo was actually an expansion for Descent 2. I made the NoCD patch for it via a carefully hex-edited mod based on another NoCD patch for the original Descent 2.
Any which way, yeah – anyone with vertigo wouldn't be comfortable or oriented in any way watching or playing the game, no matter what version.
Shit you’re right. It’s been too long
Descent is pretty fun. Not as big of a fan as you are, but I definitely dig it.
Descent broke my brain. I'm pretty good at navigating in FPSes, but Descent's 4 axes of movement just didn't click for me. I kept getting lost. Recently I tried it again after many years, and I still can't wrap my head around it.
Same with space sims. I'm dog awful in dogfights.
Indeed, it’s not quite a game for everyone, especially if you’re prone to motion sickness. Initially it only took me about a half hour to get a feel for the game, but configuring the controls can still be a headache.
Every time I set the game up on a new or different system, I usually go for loosely the same sort of controls, but each new setup I might change them up a bit – an endless guess-and-test game to find what's ideal, at least for me.
By the way, Descent is considered a 6 Degrees Of Freedom game, not 4. But hey, at least they have a map feature, I’d go insane without the map sometimes…
I meant 6, not sure why I typed 4.
Stability of the FPS is even more important. Modern games have problems with that.
also latency! playing with the high input latency that's typical of a choking modern game engine is awful.
Because they all fuck with the frame timing in order to try to make the fps higher (on paper)
Bro when Majora's Mask came out nothing was 60fps lol. We weren't used to it like we are today. I'm used to 80fps, so 60 to me feels like trash sometimes.
Ackshuli – By late 2000 there were a couple games on PC that could get there.
… If you were playing on high-end hardware. Which most PC gamers were not. (despite what Reddit PCMR weirdos will tell you, PC gaming has always been the home for janky hand-built shitboxes that are pushed to their crying limits trying to run games they were never meant to)
Regardless that’s beside the point – The original MM still doesn’t feel bad to go back to (it’s an annual tradition for me, and I alternate which port I play) even though it never changed from its 20FPSy roots.
> Bro when Majora's Mask came out nothing was 60fps lol
Huh? 60fps was the standard, at least in Japan and North America, because TVs were at 60Hz/fps.
Actually, it's 60.0988fps according to speedrunners.
FPS and alternating current frequency are not at all the same thing
I was looking it up, and games like Super Mario World are allegedly at 60fps according to some random things on the internet
Because CRTs (and maybe other displays) are slaved to the input and insensitive to exact timing, and console chipset designers used convenient line counts/clock frequencies, consoles often have frame rates slightly different from standard NTSC (which is 60000/1001 or ~59.94 fields per second).
The 262 AND A HALF lines per field that NTSC uses to get the dumb oscillator in a CRT to produce interlacing is not convenient. "240p" moves the VSYNC pulse, shortening the frame duration.
So NESes run at ~60.1 FPS.
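For the curious, that figure falls straight out of the NES's clock math – these are the published NTSC constants:

```python
# NES (NTSC) frame rate from first principles.
master_clock = 236_250_000 / 11        # NTSC master clock ≈ 21.477 MHz
ppu_clock = master_clock / 4           # PPU dot clock ≈ 5.369 MHz

dots_per_scanline = 341
scanlines_per_frame = 262              # progressive "240p": no half-line

print(ppu_clock / (dots_per_scanline * scanlines_per_frame))
# ≈ 60.0985 fps, vs standard NTSC's 59.94. (With rendering on, the PPU
# skips one dot on odd frames, nudging this to the 60.0988 figure
# speedrunners cite.)
```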
The TV might refresh the screen 60 times per second (or actually refresh half the screen 60 times per second, or 50 times per second in Europe), but that's irrelevant if the game only throws 20 new frames per second at the TV. The effective frame rate will still be 20Hz.
That’s just a possible explanation. I don’t know what the refresh rate of Majora’s Mask was.
I'm pretty sure 16-bit era games were generally 60FPS
Framerates weren’t really a
Thing.
Before consoles had framebuffers, that is – because framebuffers are what allow the machine to build a frame of animation over several VBlank intervals before presenting it to the viewer.
The first console with a framebuffer was the 3DO. The first console people cared about with a framebuffer was the PSX.
Before that, you were in beam-racing town.
If your processing wasn’t enough to keep up with the TV’s refresh rate (60i/30p in NTSC territories, 50i/25p in PAL) – Things didn’t get stuttery or drop frames like modern games. They’d either literally run in slow-motion, or not display stuff (often both, as anyone who’s ever played a Shmup on NES can tell you)
You had the brief window of the HBlank and VBlank intervals of the television to calc stuff and get the next frame ready.
Buuuut, as of the PSX/N64/Saturn, most games were running anywhere between 15 and 60 FPS, with most sitting in the 20s.
PC is a whole different beast, as usual.
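A toy model of that slow-motion behavior, assuming (as those games did) that game logic ticks exactly once per displayed frame:

```python
# When logic is tied to the display refresh, an overloaded frame doesn't
# drop: it arrives late, and the whole game world moves late with it.

VBLANK_INTERVAL = 1 / 60                 # NTSC refresh

def game_time_after(seconds, work_per_frame):
    game_time = wall_time = 0.0
    while wall_time < seconds:
        wall_time += max(VBLANK_INTERVAL, work_per_frame)  # frame lands late if slow
        game_time += VBLANK_INTERVAL     # world advances one fixed tick
    return game_time

print(game_time_after(1.0, 1 / 60))      # ~1.0 – keeping up, real-time
print(game_time_after(1.0, 1 / 30))      # ~0.5 – overloaded: half-speed slow-mo
```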
> The first console with a framebuffer was the 3DO. The first console people cared about with a framebuffer was the PSX.
I cared about the 3DO…
Thanks for the info though!
i think you're mixing up a few different things here. beam-racing was really only a thing with the 2600, and stopped once consoles had VRAM, which is essentially a frame-buffer. but even then, many games would build the frame in a buffer in regular RAM and then copy everything into VRAM at the vblank. in other cases you had two frames in VRAM and would just swap between them with a pointer every other frame.

if it took longer than one frame to build the image, you could write your interrupt handler to just skip vblank interrupts – every other one for 30 FPS, or three out of four for 15 – which is how a game like super hang-on on the megadrive runs at 15 FPS even though the VDP is chucking out 60 frames a second. you could also disable interrupts while the buffer was still being filled, which is how you end up with slowdown on certain games when too many objects were on the screen. too many objects could also push past the limit on how many sprites you can have on a scanline, which is why things would vanish – but that is its own separate issue. if you don't touch VRAM between interrupts, then the image shown last frame will show this frame as well.
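a toy sketch of that interrupt-skipping idea in Python (illustrative only, obviously not how real console code looks):

```python
# Vblank-skip pacing: the display refreshes 60 times a second no matter
# what; the game only flips buffers on every Nth vblank and leaves VRAM
# untouched in between, so the TV just re-scans the old image.

PRESENT_EVERY = 4                    # flip on every 4th vblank -> 15 FPS

vblank_count = 0
front_buffer, back_buffer = "frame A", "frame B"

def on_vblank():
    global vblank_count, front_buffer, back_buffer
    vblank_count += 1
    if vblank_count % PRESENT_EVERY == 0:
        # the frame we spent four vblank intervals building goes live
        front_buffer, back_buffer = back_buffer, front_buffer

for _ in range(60):                  # simulate one second of refreshes
    on_vblank()
print(vblank_count // PRESENT_EVERY, "new frames shown per second")  # 15
```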
F-Zero X ran at 60 fps. Also Yoshi's Story, Mischief Makers, and probably a few others.
Also the PS1 has many games that ran at 60 fps, too many to list here in a comment.
Yeah, but even now you can go back and play Majora's Mask, and it doesn't feel bad.
But as mentioned, the real thing is consistency, along with the scale of the action, the pace of the game, etc. Zelda games weren't sharp, pinpoint-control games like, say, a modern FPS; gameplay was fairly slow. And the second factor is simply that games that were 20 FPS were made to run at a 100% consistent 20 FPS. A game locked at 20 will feel way smoother than one that alternates between 60 and 45.
Games just aren't optimized like they used to be. The shortfall has to be compensated for with computing power, i.e. by the end user. It comes down to cost: the scope of games has become much larger, making optimization more time-consuming and therefore more expensive. With consoles there's also the fact that optimizations have to target one specific hardware configuration, unlike on PC, where the range of available components keeps growing.
FPS counters in games usually display an average across multiple frames. That makes the number legible when the FPS fluctuates, but if it fluctuates hard frame-by-frame, the number can seem inaccurate. If a few frames here were output at 20 FPS and a few there at 70 FPS, the average would be 45 FPS – yet you could still very much tell that the framerate was swinging between very low and very high, which is perceived as stutter. Your aforementioned old games were probably frame-capped at 20 while still having lots of processing headroom to spare for the more intensive scenes.
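To make that concrete, here's roughly what averaging does to those numbers (illustrative values only):

```python
# Why a mid-40s FPS reading can hide a stutter-fest: the counter
# averages over a window, while your eye sees each frame individually.

# Alternate frames at 70 FPS (~14.3 ms) and 20 FPS (50 ms):
frame_times_ms = [1000 / 70, 1000 / 20] * 30

# Naively averaging each frame's instantaneous FPS gives the misleading
# midpoint described above:
naive = sum(1000 / ft for ft in frame_times_ms) / len(frame_times_ms)
print(f"naive average: {naive:.0f} FPS")       # 45 FPS

# Actual throughput (frames / elapsed time) is lower, and either way
# every other frame is a visible 50 ms hitch:
true_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000)
print(f"true average:  {true_fps:.0f} FPS")    # ~31 FPS
```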