What exactly is the point of all those extra Hz? I get that in this case it’s just a “because we can” kind of situation, but in general… I’ve never paid any attention to refresh rate and it has never affected me. Is higher really better?
After a certain point, no, not really. 30 FPS is good for basic video. 60 is good for fine motion (sports, fast video games). 120 is good enough for basically every regular viewing use case. Beyond 144, it’s really diminishing returns. You know how when something starts spinning really fast, it just turns into a blur? Yeah.
60 isn’t fine
I think they’re mostly talking about regular video, in which case 60 is generally fine. Heck, 30 is usually fine. But I agree that in video games anything below 120 is downright painful
I definitely don’t play with anything near 120 and it doesn’t bother me. I suppose it’s something you only notice once you start paying attention to it haha.
When you regularly play at >120 Hz you definitely notice when stuff is running lower than 60 Hz
It sounds snobby, but I can’t play anything much below 100 Hz-ish, otherwise I somehow get motion sick from it
You would honestly probably be fine after a short while with lower frame rates. Guaranteed you used to game at those slower frame/refresh rates and never knew better.
I absolutely agree there’s still benefit to be had above 60, but 60 is still mostly fine. Unless, I guess, all you do is ultra-competitive gaming where twitch reactions are necessary.
Yeah, you’re just used to it.
And my wallet is no doubt thankful for it. As long as my old GTX1660 keeps chugging on I’ll keep gaming at ?Hz on my ???p monitor lol
Depends on the human. There was an article many years ago about a proper scientific study: some people’s internal visual “refresh clock” simply doesn’t pick up more info at the super high refresh rates.
I can tell that 90 is slightly smoother than 60, but when it involves large motion across the screen, like at the movie theatre, my brain doesn’t process the spots in between and I end up seeing static snapshots. It becomes nauseating, so for a scene I know will have speedy side-to-side motion I end up looking down. And it is not the saccade phenomenon, because it happens even if I have a focal point on screen so I don’t move my eyes off of it.
Yes, this… panning shots at 24 fps literally make me nauseous.
Then why has it been the standard for almost a century?
60 is fine, and it’s because we used the 60 Hz wall power as a clock, since it was extremely stable and free.
Because more means more costs which means people won’t buy as many?
Because we didn’t have as good technology for higher framerates
Not only do you need better screens, but also faster processing speeds
I played Half Life at 15 fps back then, and I can tell you that 60 fps is mostly fine.
My next monitor will still be 144 or more though.
I don’t know where the limit is but I’m willing to keep trying. My previous monitor was 165 Hz and it was good. My new 480 Hz monitor is glorious when I can run the game at that speed. I played Boltgun and there were areas where it “only” ran at 360 Hz and others where it ran at the full 480 Hz, and the difference was noticeable and very satisfying.
On a not-so-technical level, being able to see something update on your screen before your opponents can give you an advantage. Provided you can also react fast enough for it to matter.
That’s bull, your reaction time is always leagues slower. Not to mention input lag; although it has gotten much better, it still adds the equivalent of 5 to 10 frames (at 145 fps) to the reaction time.
It’s more of a moar = better thing, because most gamers are male teens/young adults, and no one in the industry fights the claims, because they make more money from expensive gaming hardware.
I’m pretty sure reaction time doesn’t matter, as long as both players have the same reaction time, right? Like, reaction time could be 10 minutes and if one player sees the stimuli 1ms faster than the other, then they will react first and (assuming their decision making is correct) “win” the interaction.
The next test of usefulness would be real world variance of reaction time between people. For high level players, I would expect it to be very similar, and thus potentially a few ms improvement could take you from slower to faster than an opponent. But “very similar” is doing a lot of heavy lifting here since I don’t have exact numbers to look at.
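If you want to see how that plays out, here’s a toy Monte Carlo sketch in Python. The 250 ms reaction time and the “event lands at a random point in the refresh cycle” latency model are simplifying assumptions of mine, not numbers from any study:

```python
import random

def time_to_react(fps: float, reaction_ms: float) -> float:
    # Assumption: the event lands at a random point in the refresh cycle,
    # so the display adds anywhere from 0 to one full frame time of latency.
    frame_ms = 1000 / fps
    return random.uniform(0, frame_ms) + reaction_ms

trials = 100_000
wins = sum(
    time_to_react(240, 250) < time_to_react(60, 250)  # identical reflexes
    for _ in range(trials)
)
print(f"240 Hz player reacts first in {wins / trials:.1%} of duels")
```

With identical 250 ms reaction times, the 240 Hz player sees the stimulus about 6 ms earlier on average and comes out ahead in roughly 7 out of 8 duels, which fits the point above: a few ms only decide things when everything else is equal.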
Physically, the eye tops out at about 100 Hz; cones can’t refresh their photochemicals faster than ~70 Hz, though some effects of LCDs (vs. CRTs drawing line by line) can increase sensitivity.
But apparently you can train your sensitivity with computer work where you have to follow the mouse with your eye (CAD, artists, etc.). I guess the preprocessing neuron layer in the eye gets optimized for smaller areas of sensitivity that way. People trained like that notice stuttering in animations even when their focus is elsewhere, which is annoying.
At least, I’m not affected and can’t tell the difference between 60 Hz and 30 Hz.
So in short, it depends. If you aren’t bothered, look for other specs.
While individual cones can only refresh at ~70 Hz, your cones aren’t synchronized. You can “see” a lot higher.
That’s the point of the neuron layer around the eye. It “compresses” the data; the optic nerve is a limited-bandwidth bus and the brain eats enough calories already. But like everything neuronal, it’s adaptable.
Damn, reading this from a CS POV really puts into perspective how efficient our brain is.
A faster refresh rate also means the image on screen is more up to date with what the computer is actually processing. Basically, even if the difference isn’t perceptible in terms of image smoothness, it still matters, because the gap between your inputs and the response on the screen narrows significantly.
On the other hand… there was this registry tweak for the Windows 7 taskbar’s autohide feature, to make it snappier. The default was 400 ms for the animation, and setting it to 300 made it noticeably faster. But setting it to 200 ms made much less of a difference; you could have set it to 0 with the same result. Others might be more sensitive to this, but it taught me a lesson in how fast 0.1 seconds really is.
Now, 100 Hz aka 100 frames per second is 0.01 seconds per frame; reaction time is somewhere in the range of 250-350 ms aka 0.25-0.35 seconds; and reflexes, which pro gamers use extensively, are somewhere around 100 ms aka 0.1 seconds.
I don’t think you miss much within a frame, neurons are slow.
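Putting those numbers side by side (just the figures from the comment above, nothing new):

```python
# Frame time as a slice of a typical reaction, using the numbers above.
reaction_ms = 250  # low end of the quoted 250-350 ms range
for hz in (60, 100, 144, 240):
    frame_ms = 1000 / hz
    print(f"{hz:>3} Hz -> {frame_ms:5.2f} ms/frame "
          f"({frame_ms / reaction_ms:.1%} of a {reaction_ms} ms reaction)")
```

Even at 60 Hz a single frame is under 7% of the low-end reaction time, which is the “neurons are slow” point in numbers.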
Up to a certain extent, yeah. The faster an image appears on the screen, the sooner you can perceive it and start to react. But it’s diminishing returns. The difference between 30 FPS and 60 FPS is perceptibly much bigger than the difference between 60 and 90. Beyond about 180-240, frames arrive faster than any human alive can perceive. Even 144 is fast enough for nearly everyone to reach their theoretical peak performance.
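The diminishing returns fall straight out of the arithmetic: each upgrade saves less latency than the last. A minimal illustration (the ladder of rates is just an example):

```python
# Latency saved per frame by each refresh-rate upgrade; the gains shrink fast.
rates = [30, 60, 90, 144, 240, 480]
for lo, hi in zip(rates, rates[1:]):
    saved_ms = 1000 / lo - 1000 / hi
    print(f"{lo:>3} -> {hi:>3} Hz saves {saved_ms:5.2f} ms per frame")
```

Going from 30 to 60 saves about 16.7 ms per frame; going from 240 to 480 saves barely 2 ms.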
BlurBusters have nice articles about this.
TL;DR: less motion blur and fewer artifacts (like stroboscopic effects, which can still be visible even at 480 Hz).
Higher refresh rate is definitely better, yes, but 700 Hz is well past the point of diminishing returns lol