My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.
Everyone's perception is different. I've met someone who couldn't tell the UFOs apart past 30 fps. They also didn't like shooters/action games much, probably because following fast movement was difficult for them.
But I think the vast majority of people easily notice the difference between 30 and 60, and 60 to 120 should also be noticeable for most. As for me, I used to play Quake 3 a lot, and back then it was played on 120 Hz or even 240 Hz CRTs. The first flat screens, with their slow response times and 60 Hz max, were quite a step down.
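A quick way to see why 30 vs 60 is so obvious on the UFO test: at a fixed scroll speed, halving the frame rate doubles how far the object jumps between frames. Tiny Python sketch below; the 960 px/s speed is just an example value I picked, not TestUFO's actual setting:

```python
# Quick arithmetic on why 30 vs 60 fps is easy to spot on something like the
# TestUFO demo: at a fixed speed, lower fps means bigger jumps between frames.

speed_px_per_s = 960  # assumed horizontal speed of the moving object (example value)

for fps in (30, 60, 120):
    step_px = speed_px_per_s / fps  # distance covered between consecutive frames
    print(f"{fps:>3} fps: ~{step_px:.0f} px jump per frame")
```

Those ~32 px jumps at 30 fps are what reads as stutter; by 120 fps the steps are small enough that the motion looks much closer to continuous.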
While I don't really like Linus Tech Tips, they did some nice testing on the topic and concluded that more fps is measurably better for shooters, even if the monitor's refresh rate can't keep up.
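The intuition behind that result is basically frame-time arithmetic. Here's a rough Python sketch; the 60 Hz display and the simplified one-frame-plus-one-refresh pipeline are my own assumptions for illustration, not anything they measured:

```python
# Back-of-the-envelope sketch (not a measurement) of why rendering above the
# monitor's refresh rate still helps: the frame on screen is based on newer input.
# Assumes a simplified pipeline where the displayed frame is at most one render
# frame plus one refresh interval old; real setups add more stages.

DISPLAY_HZ = 60  # hypothetical 60 Hz monitor

for render_fps in (30, 60, 120, 240):
    frame_time_ms = 1000.0 / render_fps         # time between rendered frames
    refresh_ms = 1000.0 / DISPLAY_HZ            # time between screen refreshes
    worst_case_ms = frame_time_ms + refresh_ms  # input sampled -> rendered -> shown at next refresh
    print(f"{render_fps:>3} fps on a {DISPLAY_HZ} Hz screen: "
          f"worst-case input-to-screen ~{worst_case_ms:.0f} ms")
```

So even capped at 60 Hz output, going from 30 to 240 fps cuts the worst-case staleness of what you see from roughly 50 ms to roughly 21 ms in this toy model, which lines up with why it's measurably better for shooters.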
I’ve gotten a lot of helpful answers, but yours was the only one that included a visual aid! Thanks!
What’s interesting is that when I focused on the UFOs, I didn’t notice a difference between the 30 fps and the 60 fps stars. When I let my eyes go out of focus, though, I was able to see a clear difference between them.
That's because the central part of your vision sees detail much better but is less sensitive to movement, while the outer part of your vision picks up motion much better, so it notices the stutter more easily.
It's also why low refresh rates are more noticeable on larger screens.
afaik the edges of your vision are better at picking up movement too (the "seeing something out of the corner of your eye" kind of thing), so it's possible that while you're trying to make out specific details by looking directly at them, you're not using the parts of your eye that can actually pick up the higher FPS?