
How many FPS can humans perceive? Turns out we knew the answer 30 years ago



Don't like reading? Scroll to the end for the TLDR conclusion, but you'll miss all the steps and reasons.

 

I started gaming in the 80s, so I've seen the whole development history of computer displays, from monochrome green screens to today. The only two display types I have no personal experience with are old-school arcade vector displays and current prototype holo-screens. Very early on I started to appreciate motion and smoothness over detail and fidelity. In other words: I care less about realism and more about "being in the zone" and there being no distracting artifacts. So basically I'm the perfect target audience for high-refresh-rate displays (and antialiasing, but that's another story).

 

Here's something most of you are probably too young to know: Back in the 90s, gamers were happy with 25 FPS, but they ran 75 Hz displays, or even 90 Hz if they were chads. Why? Flicker. Screens back then weren't TFTs but beam-based CRTs. In simple terms, the monitor would flash a picture every few milliseconds, and in between those flashes (the "vertical blank") the screen was dark. So it was literal flicker-tech: the higher the refresh rate, the faster the flicker. It turned out 60 Hz gave people headaches, 75 Hz was "okay", and 90 Hz was "comfortable" enough to make the flicker imperceptible.
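To put rough numbers on that, here's a quick back-of-the-envelope sketch (plain Python, just arithmetic) of how long one refresh cycle lasts at the rates mentioned above. The actual flash is only a fraction of each cycle; the rest is the dark gap your eyes register as flicker:

```python
# Refresh period for the rates discussed above: the whole picture is redrawn once
# per cycle, and the shorter the cycle, the less perceptible the dark gap becomes.
for hz in (60, 75, 90):
    period_ms = 1000.0 / hz
    print(f"{hz} Hz -> one full redraw every {period_ms:.1f} ms")
# 60 Hz -> 16.7 ms, 75 Hz -> 13.3 ms, 90 Hz -> 11.1 ms
```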

 

Now, flicker isn't the same as motion, just like refresh rate isn't the same as FPS. But here's a question nobody asked back then, and still hasn't asked today: If 90-100 Hz is the human limit for detecting flicker, shouldn't the limit for detecting motion be at or below that number? If this doesn't sound intuitive and obvious to you, let me rephrase the question: Can humans perceive local motion faster than outright flicker of the entire field of view? No, that makes no sense. So 90-100 Hz is the theoretical upper bound, and the only question is: How much lower is the practical limit for motion detection?

 

(At this point some people are bound to already be typing that they can detect over 100 Hz. You're wrong but also right: What you perceive above 100 Hz is not the screen itself, but artifacts in the I/O and render pipeline. For example, a mouse that only samples at 100 Hz is kinda bad. I could go into more detail, but that's not the topic here. This article is only about human perception of display technology.)
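To illustrate what I mean by pipeline artifacts, here's a rough, purely hypothetical calculation (the 240 Hz display is just an example, not a claim about any particular setup): a mouse sampling at 100 Hz hands the game a new position only every 10 ms, while a fast display draws frames far more often, so several frames in a row reuse stale input and the motion looks uneven no matter what your eyes can resolve.

```python
# Hypothetical numbers: a 100 Hz mouse feeding a 240 Hz display.
mouse_poll_hz = 100            # sampling rate from the example above
display_hz = 240               # example high-refresh display

mouse_interval_ms = 1000 / mouse_poll_hz   # 10.0 ms between position samples
frame_interval_ms = 1000 / display_hz      # ~4.2 ms between displayed frames

# On average, this many consecutive frames are drawn from the same mouse sample,
# which is an I/O artifact, not a limit of human vision.
frames_per_sample = mouse_interval_ms / frame_interval_ms
print(f"~{frames_per_sample:.1f} frames per mouse sample")   # ~2.4
```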

 

So, did they test this in the 80s and 90s, when screen flicker was on everyone's mind? Nope. Or rather: Companies did in fact test motion perception internally, and then only marketed their preferred conclusions. Here's how this all went down...

 

While gamers on home computers were fine with 25 FPS, arcade machines had to offer a better experience to attract coins. They settled on a 60 FPS standard - there's that number again. And they loved 60 FPS so much, they hardcoded entire games to that framerate. This is where the whole 60 FPS thing that survives to this day comes from: It's the golden rule arcades established in the 80s and 90s.

 

But why 60 FPS? Well, like I said: Companies actually tested motion perception back then. And they found that 60 FPS was the lowest they could get away with and still have humans perceive motion as "fluid". 60 Hz monitors were also the cheapest they could put in arcades. Who cares about headaches: Gamers were supposed to play for minutes on those machines - not hours - so comfort wasn't important. Would higher FPS look even better according to their tests? Yep. In fact they found 70 FPS was the sweet spot, with no measurable benefit above 80 FPS. But 80 Hz/FPS hardware would have cost double the price, and at a time when home computers ran 25 FPS or less, a 60 FPS arcade was gamer heaven. So that's how it was marketed: 60 FPS is god.

 

Now aren't those three-decade-old findings interesting? 60 FPS as the minimum, 70 as the sweet spot, and 80 for maximum comfort. That's not too different from 60/75/90 Hz for flicker. In fact I'd bet the true numbers are identical, and the test subjects were simply more forgiving of stutter than of flicker. Now why would that be? In fact, where does this preconception that humans are less sensitive to motion than to flicker come from? I have a few ideas:

 

1. Flicker is worse than motion stutter. It's literally painful (headaches), rather than just "annoying". So we're "pain-conditioned" to judge flicker more harshly than stutter. Problem is, those are two very different questions: "What is comfortable" vs. "What is perceivable".

 

2. Personal cope. Whenever you cannot have something, you convince yourself that yours is "just as good", or at least "good enough".

 

3. Marketing cope. Every product sold is the best ever - or at least totally "good enough". Why tell customers that you're not producing the best possible, but rather what makes economic sense - the most bang for the buck?

 

Notice that the problem with #2 and #3 isn't the justification itself. You may be fine with 60 FPS, and it might be the best compromise for your personal preferences and budget. Likewise, I totally agree with arcade manufacturers in the 80s making 60 Hz / 60 FPS machines: All things considered, it was the optimal and most efficient solution.

 

The problem with cope is, instead, the lies and the state of denial. It poisons public discourse and individual learning by memoryholing the reasons behind decisions. Case in point: Today we are once again debating "how many FPS can humans perceive", and we act like it's a new question at the frontier of science. All the research from decades ago? Memoryholed. It never happened.

 

"But wait", i hear you say: That old research - it's old. And newer is always better. So how do i know they were right? Have i actually tested those claims myself?

 

Yeah well: I've got a 240 Hz monitor, a decent GPU setup for low latency, and a mouse with stupidly high DPI that polls at 1000 Hz. Answer: Yes, the guys from the 80s were almost right. I found 60 FPS to be the bare minimum, just like them. 75 FPS is my sweet spot, and above 85 FPS I can't tell the difference. Back in the old days, I also couldn't tell 85 Hz flicker from 90 Hz flicker, so the numbers are identical.
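For reference, here's what those rates work out to in per-frame time (simple arithmetic anyone can verify, the 120/240 entries are just there for comparison):

```python
# Frame time in milliseconds for the rates mentioned in this thread.
for fps in (60, 75, 85, 120, 240):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
# 60 -> 16.7 ms, 75 -> 13.3 ms, 85 -> 11.8 ms, 120 -> 8.3 ms, 240 -> 4.2 ms
```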

 

TLDR: Flicker sensitivity == motion sensitivity.

 

Edited by libertyordeath

None of this is accurate or true.

 

The human brain can actively observe up to about 1000mhz, and there is no baseline, because there are people who literally have different eyes, or a different amount of signal that can be received from the optic nerves, and/or who can perceive almost double the distance and amount of motion of the '20/20 baseline'.

 

There is no average other than a derived one.

1 hour ago, 27X said:

The human brain can actively observe up to about 1000mhz

Which is relevant how exactly? No, just throwing technobabble into a discussion doesn't impress me. In this day and age, it actually makes me suspicious.

 

EDIT: Just found the paper. It's not even true, never mind relevant. What they tested was NOT how fast the brain as a whole can observe anything. They literally just tested dancing neurons. Completely useless for drawing conclusions at the mind level about conscious perception.

 

Construct an actual logical argument chain, or I don't care about your opinion.

Edited by libertyordeath
  • 3 weeks later...

The US Navy did some testing, and pilots could not only perceive but also accurately identify aircraft shown in a single frame, even as the frames were presented at increasingly higher rates. I forget what the ceiling was for the participants, but it was a lot higher than 30.

Edited by Cyrodiil_Sword
On 6/6/2023 at 7:04 PM, Cyrodiil_Sword said:

The US Navy did some testing, and pilots could not only perceive but also accurately identify aircraft shown in a single frame, even as the frames were presented at increasingly higher rates. I forget what the ceiling was for the participants, but it was a lot higher than 30.

 

If I remember correctly, it was a flash of light lasting 1/600 of a second, and it was the Air Force. So you can't really translate it to what you can see on a screen, or else you'd be able to see the flickering of the LEDs at the back of your screen. Like the first post mentioned, flicker is not the same as motion sensitivity.

 

 

Also, don't forget that the first 120 Hz screens to hit the market had HORRIBLE input delay. Some were even above 100 ms, in the same class as the worst LED TVs at the time. But back then only computer nerds were talking about input delay, while most hardware forums similar to Tom's Hardware, and even Linus (back in the NCIX days), didn't even know what it was.

All those people were talking about the screen's Hz and the super-fast gray-to-gray times, and how much better the experience was compared to a 60 Hz screen, even if the image on the 60 Hz screen actually arrived faster.

 

So this two-decades-old discussion really boils down to nothing. We are still in the same spot we were in when Quake came out. It all depends on personal preference, not science or facts.

 

Btw, go watch The Hobbit in 48 FPS on a big screen.

 

  • 2 months later...

Because people try to categorize the brain's "auto-complete motion" feature the same as its actual perception.

You can extrapolate and interpolate motion from very choppy images, because that is how your brain is built to work (not losing sight of something when you blink, or when it passes behind a cover of trees).

But in terms of perceiving actual images, it DOES go up to the equivalent of hundreds of FPS; it just means nothing for the average person.

The actual difference between how you see things on a screen and in real life is depth perception, plus the photography version of the uncertainty principle.

Not sure what it's actually called.

Because in real life things don't just pop into existence beyond the quantum scale, you don't really need the "perfect FPS" to see things. But a screen does need it: things on a screen do pop into existence.

So we need those 144 Hz+ screens, or the 1000+ FPS bs, because we can see it, even without the "truly seeing 1000 FPS" people keep arguing about, because our eyes don't work the same way a screen displays things.

 

This is why old FPS games with low-LOD backgrounds and high-LOD models had such good visibility: it plays into how our depth perception works.

 

If you want to argue "at what point is it good enough?", I would say that it depends.

Basically, the point where the eye strain stops is the "good enough" point IMO, but some competitive FPS players playing at 3 ms latency on a 144 Hz screen may still complain and want better response times and FPS, because they want the physical latency to be as close to 0 as possible, so that all that is left is their own reaction time.

  • 2 weeks later...
On 5/18/2023 at 10:01 PM, libertyordeath said:

Yeah well: I've got a 240 Hz monitor, a decent GPU setup for low latency, and a mouse with stupidly high DPI that polls at 1000 Hz. Answer: Yes, the guys from the 80s were almost right. I found 60 FPS to be the bare minimum, just like them. 75 FPS is my sweet spot, and above 85 FPS I can't tell the difference. Back in the old days, I also couldn't tell 85 Hz flicker from 90 Hz flicker, so the numbers are identical.

This is broadly my experience as well, although I will say that 90 vs 120 is noticeable in VR headsets (mostly through nausea... I couldn't tell you how it looks different, but you can feel it).

  • 3 weeks later...

The question is not whether you see the difference, it's whether you understand the difference. The lower the frame rate, the more your brain has to fill in the gaps; the higher it is, the less it has to work on that. And that also depends on the conditions and on how used a person is to it.

 

Also, what one sees is not universal, and things can get muddled by the placebo effect. Besides, very few things outside of games have a framerate over 90.

But yes, 80 to 240 is noticeable, while 120 to 240 is not as noticeable.

 

Eyes don't really see frame by frame; they sense continuously, and the rest is up to the brain.

So you probably don't see the difference past 100 Hz most of the time, unless you're focusing on trying to see it and know how to do it.

 

VR is an exception, as you are focusing whether you want to or not.

  • 2 weeks later...
On 5/18/2023 at 8:33 PM, libertyordeath said:

Which is relevant how exactly? No, just throwing technobabble into a discussion doesn't impress me. In this day and age, it actually makes me suspicious.

 

EDIT: Just found the paper. It's not even true, never mind relevant. What they tested was NOT how fast the brain as a whole can observe anything. They literally just tested dancing neurons. Completely useless for drawing conclusions at the mind level about conscious perception.

 

Construct an actual logical argument chain, or I don't care about your opinion.

 

Wrong, and you obviously didn't find the actual study. There is no hard minimum baseline, only broad ranges, because of genetic and biological variance. 20/20 is used because the largest number of people fall under it as a measurement.

 

Perception > vision, and you obviously haven't a fucking clue what the difference is, and you sure as hell don't speak for anyone but yourself.
