Like I’m not one of THOSE. I know higher = better with framerates.
BUT. I’m also old. And depending on when you ask me, I’ll name The Legend of Zelda: Majora’s Mask as my favourite game of all time.
The original release of that game ran at a glorious twenty frames per second. No, not thirty. No, not even twenty-four like cinema. Twenty. And sometimes it’d choke on those too!
… And yet. It never felt bad to play. Sure, it’s better at 30FPS on the 3DS remake. Or at 60FPS in the fanmade recomp port. But the 20FPS original is still absolutely playable.
Yet like.
I was playing Fallout 4, right? And when I got to Boston it started lagging in places, because, well, it’s Fallout 4. It always lags in places. The lag felt awful, like it really messed with the gamefeel. But checking the FPS counter it was at… 45.
And I’m like – Why does THIS game, at forty-five frames a second, FEEL so much more stuttery and choked up than ye olde video games felt at twenty?
Bro when Majora’s Mask came out nothing was 60fps lol. We weren’t used to it like we are today. I’m used to 80fps, so 60 to me feels like trash sometimes.
Ackshuli – By late 2000 there were a couple games on PC that could get there.
… If you were playing on high-end hardware. Which most PC gamers were not. (despite what Reddit PCMR weirdos will tell you, PC gaming has always been the home for janky hand-built shitboxes that are pushed to their crying limits trying to run games they were never meant to)
Regardless, that’s beside the point – The original MM still doesn’t feel bad to go back to (it’s an annual tradition for me, and I alternate which port I play) even though it never changed from its 20FPSy roots.
Yeah, but even now you can go back and play Majora’s Mask and it doesn’t feel bad.
But as mentioned, the real thing is consistency, along with the scale of the action, the pace of the game, etc. Zelda games weren’t sharp, pinpoint-control games like, say, a modern FPS; gameplay was fairly slow. And the second factor is simply that games that ran at 20FPS were made to run at a 100% consistent 20FPS. A game locked at 20 will feel way smoother than one that alternates between 60 and 45.
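To put toy numbers on that (a minimal sketch in C – the frame times are illustrative, not measured from any real game):

```c
/* Toy comparison: a locked 20 FPS stream vs. 45 FPS vsynced on a
 * 60 Hz screen. Since 45 doesn't divide 60, each frame gets held
 * for either 1 or 2 refreshes (16.7 ms or 33.3 ms), so delivery is
 * uneven even though the average rate is much higher. */
#include <stdio.h>
#include <math.h>

static void stats(const char *label, const double *ms, int n) {
    double mean = 0.0, var = 0.0;
    for (int i = 0; i < n; i++) mean += ms[i];
    mean /= n;
    for (int i = 0; i < n; i++) var += (ms[i] - mean) * (ms[i] - mean);
    printf("%s: avg %4.1f ms/frame (%4.1f FPS), jitter %4.1f ms\n",
           label, mean, 1000.0 / mean, sqrt(var / n));
}

int main(void) {
    /* Locked 20 FPS: every frame is exactly 3 refreshes (50 ms). */
    double locked[6] = {50.0, 50.0, 50.0, 50.0, 50.0, 50.0};
    /* 45 FPS on 60 Hz: frames held for 1, 1, 2 refreshes, repeating. */
    double wobble[6] = {16.7, 16.7, 33.3, 16.7, 16.7, 33.3};
    stats("locked 20", locked, 6);
    stats("45 on 60 ", wobble, 6);
    return 0;
}
```

The 45-on-60 stream averages over twice the frames, but every third frame lingers for two refreshes, and that lurch is exactly what reads as stutter.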
Optimization just doesn’t happen any more; the shortfall gets compensated with computing power, i.e. by the end user. It’s a cost decision. On top of that, the scope of games has grown enormously, which makes optimization more time-consuming and therefore more expensive. With consoles there’s the added wrinkle that optimizations target one specific hardware configuration, unlike PCs, where the range of available components keeps growing. Either way, the aim is to cut costs while maximizing profits.
Huh? 60fps was the standard, at least in Japan and North America, because TVs were at 60Hz/fps.
Actually, 60.0988fps, according to speedrunners.
The TV might refresh the screen 60 times per second (or actually refresh half the screen 60 times per second, or actually 50 times per second in Europe), but that’s irrelevant if the game only throws 20 new frames per second at the TV. The effective refresh rate will still be 20Hz.
That’s just a possible explanation. I don’t know what the refresh rate of Majora’s Mask was.
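For what it’s worth, the arithmetic is tidy if you take the commonly cited 20FPS figure for the N64 original: the console still sends the TV a ~60Hz signal, it just repeats frames, so 60 ÷ 20 means each rendered frame is held for exactly 3 refreshes, about 50ms apiece. Perfectly even, which is probably part of why it never felt broken.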
I’m pretty sure 16-bit era games were generally 60FPS
Framerates weren’t really a
Thing.
Before consoles had framebuffers – Because framebuffers are what allow the machine to build a frame of animation over several VBlank intervals before presenting it to the viewer.
The first console with a framebuffer was the 3DO. The first console people cared about with a framebuffer was the PSX.
Before that, you were in beam-racing town.
If your processing wasn’t enough to keep up with the TV’s refresh rate (60i/30p in NTSC territories, 50i/25p in PAL) – Things didn’t get stuttery or drop frames like modern games. They’d either literally run in slow-motion, or not display stuff (often both, as anyone who’s ever played a Shmup on NES can tell you)
You had the brief window of the HBlank and VBlank intervals of the television to calc stuff and get the next frame ready.
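Roughly what that main loop looked like, as a sketch (all three function names are hypothetical stand-ins, not any real console’s API):

```c
/* Minimal sketch of a pre-framebuffer main loop. Real versions would
 * poll hardware status registers and write tile/sprite data to VRAM. */
#include <stdio.h>

static void wait_for_vblank(void)  { /* spin on the video chip's status bit */ }
static void push_sprites_to_vram(void) { /* only safe during the blank */ }
static void update_game(void)      { /* one tick of logic + build next frame */ }

int main(void) {
    for (int frame = 0; frame < 60; frame++) { /* a real game loops forever */
        wait_for_vblank();
        push_sprites_to_vram();
        /* Logic is tied 1:1 to displayed frames: if update_game() blows
         * the ~16.7 ms budget, the loop misses the next vblank and the
         * whole game runs in slow motion instead of dropping frames. */
        update_game();
    }
    puts("simulated one second of NTSC frames");
    return 0;
}
```

The key point is that there’s no queue of frames anywhere: the game either makes the blank, or the whole world slows down.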
Buuuut, as of the PSX/N64/Saturn, most games were running anywhere between 15 and 60 FPS, with the bulk sitting in the 20s.
PC is a whole different beast, as usual.
i think you’re mixing up a few different things here. beam-racing was really only a thing on the 2600, and it stopped once consoles had VRAM, which is essentially a framebuffer. but even then, many games would build the frame in a buffer in regular RAM and then copy everything into VRAM at the vblank. in other cases you had two frames in VRAM and would just swap between them with a pointer every other frame.

if it took longer than one frame to build the image, you could write your interrupt handler to just skip every other vblank interrupt, or three out of every four, which is how a game like super hang-on on the megadrive runs at 15 FPS even though the VDP is chucking out 60 frames a second. you could also disable interrupts while the buffer was still being filled, which is how you end up with slowdown in certain games when too many objects were on screen.

too many objects could also blow past the limit on how many sprites can share a scanline, which is why things would vanish, but that is its own separate issue. if you don’t touch VRAM between interrupts, the image shown last frame will just get shown this frame as well.
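for what it’s worth, that divider trick looks something like this in C (a toy simulation with made-up names, not code from any real game – N=3 gives a rock-solid 20 FPS, N=4 gives the 15 FPS super hang-on case):

```c
#include <stdio.h>

#define DIVIDER 3  /* act on every 3rd vblank: 60 / 3 = 20 FPS */

static int vblanks_seen = 0;
static int frames_shown = 0;

/* On real hardware this would be the vblank interrupt handler;
 * here we just call it in a loop to simulate one second. */
static void on_vblank(void) {
    if (++vblanks_seen % DIVIDER != 0)
        return;        /* skipped vblank: VRAM untouched, so the TV
                          simply re-shows the previous frame */
    /* every DIVIDER-th vblank: swap buffers / copy the finished
     * frame into VRAM, then start building the next one */
    frames_shown++;
}

int main(void) {
    for (int i = 0; i < 60; i++)   /* one second of NTSC vblanks */
        on_vblank();
    printf("%d vblanks -> %d new frames (%d FPS)\n",
           60, frames_shown, frames_shown);
    return 0;
}
```

and a locked divider is why those games feel even: every new frame lands exactly N refreshes apart, which is the consistency point from upthread.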
I cared about the 3DO…
Thanks for the info though!
FPS and alternating current frequency are not at all the same thing
I was looking it up, and games like Super Mario World are allegedly at 60fps according to some random things on the internet
Wrong but also not completely wrong.
F-Zero X ran at 60 fps. Also Yoshi’s Story, Mischief Makers, and probably a few others.
Also the PS1 had many games that ran at 60 fps, too many to list here in a comment.