Mau1wurf1977 wrote: Well this boils down to the subjective matter of "playing smooth". Wolfenstein 3D on that 16 MHz 286 is skipping a lot of frames. Smooth for me is at 72+ fps 😀
I am so sick and tired of this FPS obsession. 24-30 fps is more than enough for me personally. A 60 Hz monitor cannot show more than 60 fps anyways. Before you link me to some "you can see the difference" website: that is not a valid method for evaluating anything. Try a double blind test. The same goes for people who trash 128 kbps mp3s and swear they can hear the difference, even though modern encoders produce files that are basically transparent compared to the original. Most of these people fail the double blind test badly, often even preferring the 128 kbps version over the original.
I remember when I was FPS obsessed and always turned down settings in Bioshock because in my delusion it was "laggy as fuck". It turned out my framerate was actually around 200 fps. Never compare things "side by side", because you KNOW the right side is 30 fps, so your mind perceives it as "unplayable" even though you might actually confuse it with the 120 fps side in a true blind test. If you want to know what you really perceive, have someone show you things without telling you which is which.
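If anyone actually wants to test themselves instead of arguing: run a proper ABX session (a friend or a script picks the hidden clip, you say which one it was, repeat a bunch of times) and then check whether your score beats coin-flipping. Here is a quick throwaway Python sketch of the scoring math; the trial counts in the comments are just example numbers I picked, not some official threshold:

from math import comb

def p_value(correct, trials):
    # Chance of getting at least `correct` answers out of `trials`
    # right by pure 50/50 guessing (one-sided binomial test).
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(p_value(12, 16))  # ~0.04 -> 12/16 is probably a real difference
print(p_value(9, 16))   # ~0.40 -> 9/16 is exactly what guessing looks like

Most "I can totally see/hear it, man" claims I have watched people test land in that second category.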
I am also infuriated by people who claim "man, after 1080p and 120 fps everything else feels/looks shitty man, you can't go back". Yes I can, I came back, and I wasn't impressed by HD or stupidly high framerates. Most people who whine about framerates are sociopathic shooter players who blame their lack of skill on "lag". Plenty of people became Quake pros on shitty 100 MHz Pentiums, and plenty of people with quad core machines suck at gaming.
Honestly, ultra high FPS and graphics with way too many effects fall into the uncanny valley for me. Water shimmering like mercury and stuff moving so fast and smoothly that it defies physics are NOT realistic. I am used to 25 fps TV and graphics that don't pretend to be realistic, and that is what I am comfortable with. Sure, gamers always say that real video contains blur and rendered scenes do not, but that is not true in the case of retro gaming: most old CRT monitors have quite a bit of blur, and the output of my PSone on my CRT TV is very blurry. I play games for fun, not to admire "kickass graphics" that will be considered "zomg shitty" by "gamer kids" in half a decade. What's next? Will gamers in 2030 claim "any les than 550 fps is liek total unplayable lololol"?