Higher frame rates have been topping the wish lists of camera buyers, smartphone users and – perhaps most of all – gamers.
With viewfinders on cameras like the Canon EOS R3 hitting 120fps, Apple's ProMotion clocking in at 120Hz, and a growing market for 144Hz monitors (or higher) and matching GPUs, it's hard not to fetishize the tech – but can people actually perceive the difference? Some, like FilmmakerIQ, say not.
Gamers also love to cite a test given to potential USAF fighter pilots: an image of an aircraft is flashed on a screen for 1/220 sec, and the pilots can identify the aircraft they've seen. They'll tell you that this means the human eye can see at least 220fps (and some invest accordingly). Any photographer can see the flaw in this logic, though. After all, you can see a xenon flash fire, and that's just a fraction of a millisecond.
Looked at differently, if you were shooting video at 24fps and fired a xenon flash, the frame during which the flash fired would be brightened, even though the flash is 'on' for only a fraction of the time the shutter is open. So even if the human eye were working at just 10fps, it could still register a plane flashed up for 1/220 sec.
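To put rough numbers on that integration idea, here's a minimal sketch; all the values are illustrative rather than measured. A sub-millisecond flash dumps all of its light inside a single 1/24 sec window, so that one 'frame' ends up far brighter than its neighbours, whatever the frame rate.

```python
# Minimal sketch of the integration argument (all numbers illustrative).
# A sub-millisecond flash falls entirely inside one integration window,
# whether that window is a camera shutter or the eye's temporal summation.

frame_duration = 1 / 24       # seconds the window is "open"
flash_duration = 0.0005       # xenon flash: roughly half a millisecond
flash_intensity = 2000.0      # arbitrary brightness units
ambient_intensity = 1.0       # arbitrary brightness units

with_flash = ambient_intensity * frame_duration + flash_intensity * flash_duration
without_flash = ambient_intensity * frame_duration

print(f"window with flash:    {with_flash:.3f}")     # ~1.042
print(f"window without flash: {without_flash:.3f}")  # ~0.042
# The flash window comes out roughly 25x brighter, so a sub-millisecond event
# is obvious even to a "slow" integrator; seeing it proves nothing about frame rate.
```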
Alright, but a faster refresh rate like ProMotion does look smoother, so doesn't that prove the eye can keep up? No. Motion on a phone screen, an old 30Hz monitor, or a super-fast gaming display is still a succession of still images. When things 'move' on them, scientists call this 'Apparent Motion', the basis of all animation.
Move a mouse around fast and you'll see small gaps and multiple mouse pointers on the display. The faster the refresh rate, the more pointers and the smaller the gaps – but you'll still likely see multiple instances. Which actually supports the idea that the eye's 'refresh rate' is lower than the monitor's.
Okay, so where it really gets interesting is with research from Adolphe-Moïse Bloch in 1885, which found that, below a certain duration of exposure, the eye perceives a light as dimmer the shorter it is shown; above that duration, perceived brightness no longer depends on how long the light is seen. Bloch and other scientists put that critical duration – drum roll – at around 100 milliseconds. Or a tenth of a second.
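As a rough worked example of that law (a simplified model; every number here except the ~100ms figure is illustrative), the eye effectively multiplies intensity by duration up to the critical duration, and then stops counting:

```python
# Simplified model of Bloch's law (illustrative numbers, except the ~100ms figure).

CRITICAL_DURATION = 0.1   # seconds: roughly the eye's temporal integration limit

def perceived_exposure(intensity, duration):
    """Effective exposure the eye responds to: intensity x duration, capped at ~100ms."""
    return intensity * min(duration, CRITICAL_DURATION)

# Below the critical duration, a dimmer-but-longer light can match a brighter-but-shorter one:
print(perceived_exposure(intensity=10, duration=0.08))   # 0.8
print(perceived_exposure(intensity=20, duration=0.04))   # 0.8
# Above it, showing the light for longer no longer makes it look brighter:
print(perceived_exposure(intensity=10, duration=0.1))    # 1.0
print(perceived_exposure(intensity=10, duration=0.5))    # 1.0
```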
Unlike a camera, the eye has no digital clock driving capture and readout. It is always active, so there is no need for – and there isn't – an actual frame rate. The human eye also has different areas of perception: the high-resolution fovea, in the middle, sees color better but responds more slowly, while peripheral vision is better adapted to detecting movement, for evolutionary reasons.
Even peripheral vision, though, can't usually detect the flicker of, say, a low-energy lightbulb somewhere around 60-90Hz. Video makers, however, will be well aware that strobing is something cameras can easily pick up if the shutter speed is wrong.
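The usual rule of thumb is that the exposure should span a whole number of the lamp's flicker cycles. Here's a quick sketch of that rule, assuming 50Hz mains (so 100Hz flicker) and an arbitrarily chosen tolerance:

```python
# Sketch of the anti-flicker rule of thumb: an exposure that spans a whole
# number of flicker cycles catches the same amount of light in every frame.
# Assumes 50Hz mains (100Hz flicker); the tolerance value is an arbitrary choice.

def risks_flicker(shutter_denominator, flicker_hz=100, tolerance=0.05):
    """True if a 1/shutter_denominator sec exposure covers a non-integer number of cycles."""
    cycles_per_exposure = flicker_hz / shutter_denominator
    return abs(cycles_per_exposure - round(cycles_per_exposure)) > tolerance

for denom in (50, 60, 100, 250):
    verdict = "risk of banding/strobing" if risks_flicker(denom) else "safe"
    print(f"1/{denom} sec: {verdict}")
# 1/50 and 1/100 sec line up with 100Hz flicker; 1/60 and 1/250 sec do not.
```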
A stroboscopic effect can, however, be perceived by the eye. You'll know it best from videos of a wheel turning in which, at a certain point, the wheel appears to turn the other way. JF Schouten, in 1967, showed that humans viewing a rotating subject in continuous light (no flicker) nevertheless saw 'subjective stroboscopy', with the first illusory reversal appearing at 8-12 cycles per second (so, yes, around 10Hz again).
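For the on-camera version, at least, the mechanism is temporal aliasing, and a short sketch shows why a wheel can look like it's turning the wrong way (the numbers are illustrative, and this models sampled video rather than Schouten's continuous-light experiment):

```python
# Sketch of the wagon-wheel effect as temporal aliasing in sampled video
# (illustrative numbers; not a model of Schouten's continuous-light result).

def apparent_step_deg(rotation_hz, frame_rate_hz, spokes=1):
    """Apparent rotation per frame, folded onto the nearest-looking motion."""
    symmetry = 360.0 / spokes                  # a spoked wheel repeats every 360/spokes degrees
    true_step = rotation_hz * 360.0 / frame_rate_hz
    folded = true_step % symmetry
    if folded > symmetry / 2:
        folded -= symmetry                     # the shorter path reads as backwards motion
    return folded

# A wheel spinning at 23Hz filmed at 24fps seems to creep backwards;
# the same wheel at 25Hz seems to creep forwards.
print(apparent_step_deg(23, 24))   # -15.0 degrees per frame
print(apparent_step_deg(25, 24))   #  15.0 degrees per frame
```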
Since then, different researchers have pursued the idea that this reveals a frame rate (some basing their conclusions on LSD users' perceptions of their experiences). The most recent research seems clear, though: there is no frame rate. Biology is just more complex.
All of which is a very long way of explaining why Peter Jackson might have been wrong to choose HFR (High Frame Rate) for The Hobbit!
If you want to keep diving into frame rates, we can also answer "What is a variable frame rate (VFR)?" And if you're interested in capturing slow motion, definitely check out our best slow-motion camera guide.