
Why Higher Refresh Rates Matter

winjer

Gold Member



Higher refresh rates in gaming are superior for three primary reasons: input latency, smoothness, and visual clarity. We'll mainly discuss the last reason in this article, but first, let's briefly address input latency and smoothness, as they are also significant benefits of high refresh rate monitors.
[attached image: motion clarity comparison stills]
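To put rough numbers on the clarity part: on a sample-and-hold panel with no strobing, the perceived blur trail is roughly the on-screen motion speed multiplied by how long each frame stays lit. A minimal sketch of that arithmetic (the 960 px/s scroll speed is just an illustrative figure, and this ignores pixel response time and eye-tracking accuracy):

```python
# Rough sample-and-hold blur estimate: blur ≈ scroll speed × frame persistence.
# Illustrative numbers only; real perceived blur also depends on pixel response,
# strobing/BFI, and how well the eye tracks the motion.

SCROLL_SPEED_PX_S = 960  # assumed horizontal scroll speed in pixels per second

for hz in (60, 120, 240, 540):
    frame_time_ms = 1000 / hz          # how long each frame is held on screen
    blur_px = SCROLL_SPEED_PX_S / hz   # approximate width of the blur trail in pixels
    print(f"{hz:>3} Hz: frame held {frame_time_ms:5.2f} ms -> ~{blur_px:5.1f} px of blur")
```

Even 240 Hz still leaves a roughly 4 px trail at that speed, which is why an impulse-type display like a CRT, where each frame is flashed for only a fraction of the refresh, looks sharper in motion at the same frame rate.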
 

nkarafo

Member




[attached image: motion clarity comparison stills]


Very true.

The sad thing is that if we were talking about CRTs, all frame rates would display perfectly sharp images regardless. It's the modern panel tech that needs stupidly high frame rates to stay sharp. A huge regression.
 

NeoIkaruGAF

Gold Member
Honest question: is there a way for 60+ fps content to be displayed so that the viewer doesn’t perceive the infamous soap-opera effect?

I gave away my beloved plasma TV 3 years ago for a series of reasons, and since then my vision never fully adapted to the poor movement resolution of LED-based TVs. 30fps CG scenes and in-engine cutscenes in games are invariably a pain to watch due to sample-and-hold, and TV and movies absolutely require some motion interpolation if I want to watch more than a few minutes of footage.
On the other hand though, on modern screens the soap-opera effect is very obvious and, to use a very abused expression, “not cinematic”. Things feel too smooth, and it seems everything is moving too fast. That is perfect for actual gaming, but in cutscenes it’s a bit jarring. Would there be some way to make it “feel right” while still maintaining the smoothness?
 
Very true.

The sad thing is that if we were talking about CRTs, all frame rates would display perfectly sharp images regardless. It's the modern panel tech that needs stupidly high frame rates to stay sharp. A huge regression.
TVs are for more than just gaming. Do you want to watch the Super Bowl on a 32" CRT or a 77" OLED?
 

Hugare

Member
Sure, higher refresh rates are better. This should be a consensus.

But if games were only made with 60 FPS in mind, we wouldn't have some of the best games ever made by now:

BOTW/TOTK, The Last of Us, Ocarina of Time, Mario 64, all of the GTA games, RDR, RDR 2, Bioshock, Mass Effect, etc.

They all launched at 30 FPS, 'cause they were pushing the hardware to its limits.

We would have had to wait at least one more generation to be able to play them at 60 FPS.
 

winjer

Gold Member
Very true.

The sad thing is that if we were talking about CRTs, all frame rates would display perfectly sharp images regardless. It's the modern panel tech that needs stupidly high frame rates to stay sharp. A huge regression.

Yes, we traded image quality for size and convenience.
I don't think we knew at the time exactly what we were losing.
 

NeoIkaruGAF

Gold Member
Also, those stills are brutal in showing the poor motion resolution of even 240Hz. It reminds me of digital photography around 15 years ago, when there was a huge debate with some professionals arguing that 12 megapixels would be nearly indistinguishable from 35mm film, while comparisons with high-res scans of actual films showed that to achieve parity you’d need around 200 megapixels.

In the end, unfortunately, convenience trumps all. Digital has huge advantages, but it needs to be pushed way beyond the bare minimum to achieve something analog had for a long time.
 
High refresh rate monitors only matter if your hardware can actually support those frame rates, right?

Like, say you have a 540Hz monitor but your hardware can only support gaming at 60 fps.

Is there any difference in blur, latency, etc. between a 540Hz monitor at 60 fps versus a 60Hz monitor at 60 fps?
 

nkarafo

Member
Honest question: is there a way for 60+ fps content to be displayed so that the viewer doesn’t perceive the infamous soap-opera effect?

I gave away my beloved plasma TV 3 years ago for a series of reasons, and since then my vision never fully adapted to the poor movement resolution of LED-based TVs. 30fps CG scenes and in-engine cutscenes in games are invariably a pain to watch due to sample-and-hold, and TV and movies absolutely require some motion interpolation if I want to watch more than a few minutes of footage.
On the other hand though, on modern screens the soap-opera effect is very obvious and, to use a very abused expression, “not cinematic”. Things feel too smooth, and it seems everything is moving too fast. That is perfect for actual gaming, but in cutscenes it’s a bit jarring. Would there be some way to make it “feel right” while still maintaining the smoothness?
I think it's better to just get used to it.

The only reason you see the "soap opera effect" as a negative is because you got used to the low standards of film motion and 30fps.
 

nkarafo

Member
High refresh rate monitors only matter if your hardware can actually support those frame rates, right?

Like, say you have a 540Hz monitor but your hardware can only support gaming at 60 fps.

Is there any difference in blur, latency, etc. between a 540Hz monitor at 60 fps versus a 60Hz monitor at 60 fps?
Yes.

You also need the frame rate to be high, not just the refresh rate. If you lock a game at 60 fps, it will be blurry even on a 360Hz LCD panel.

However, you can make use of the "unused Hz" for tricks like black frame insertion, which will reduce the blurriness but can also create artifacts like duller/darker colors or other annoying things.
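A rough sketch of how the BFI trick uses those unused refreshes (purely illustrative, not any specific monitor's implementation): with 60 fps content on a 240 Hz panel, each game frame gets four refresh slots, and the panel can light only the first slot and blank the other three.

```python
# Toy model of black frame insertion (BFI): 60 fps content on a 240 Hz panel.
# Each game frame spans 4 refresh slots; only the first slot shows the image.

REFRESH_HZ = 240
CONTENT_FPS = 60

slots_per_frame = REFRESH_HZ // CONTENT_FPS    # 4 refresh slots per game frame
lit_slots = 1                                  # slots that actually show the image
refresh_ms = 1000 / REFRESH_HZ

persistence_ms = lit_slots * refresh_ms        # how long the image is visible per frame
duty_cycle = lit_slots / slots_per_frame       # fraction of time the panel is lit

print(f"Persistence: {persistence_ms:.1f} ms per frame "
      f"(vs {1000 / CONTENT_FPS:.1f} ms without BFI)")
print(f"Duty cycle: {duty_cycle:.0%} of refreshes are lit -> brightness drops roughly in step")

# Refresh schedule for one game frame: True = show image, False = black refresh
schedule = [slot < lit_slots for slot in range(slots_per_frame)]
print("Schedule:", schedule)                   # [True, False, False, False]
```

The shorter persistence is where the extra sharpness comes from, and the reduced duty cycle is where the dimmer image comes from.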
 

STARSBarry

Gold Member
Good to see someone else watches the Hammer on Box videos (or at least their second channel).

Probably the best channel for anything involving monitors.
 

RickSanchez

Member
Higher fps is usually better, yes, but there are limits and nuances to this. If someone is playing purely non-competitive single player games (like me), where super-fast reaction times are not critical to the experience, then 60 or 120 fps seems more than good enough. I personally can barely tell the difference between 60 and 120, and am happy with 60 fps for all the games I play, because once I hit 60, I would rather crank up all settings and push the resolution as long as 60 is maintained.
 

Gaiff

SBI’s Resident Gaslighter
Yes, we traded image quality for size and convenience.
I don't think we knew at the time exactly what we were losing.
I most certainly wasn't aware at the time. Went from a CRT to a plasma that I kept until 2016, then moved to a Samsung LCD LED TV (KS8000) that was receiving glowing praise for being a great gaming TV. Little did I know it was a great "gaming" TV only for consoles, because it was limited to 60Hz, and the dogshit motion clarity and response time of LCD actually made it all-around worse than my old plasma. I thought something was wrong with my TV, but no, consumers were just generally ignorant of what good picture quality and motion for gaming looked like and got suckered into buying into the LCD hype. The only thing it had over the plasma was 4K, but good luck driving that with a 980 Ti, which was more suited to 1080p/1440p. Now I have an OLED monitor and just don't bother with my TV anymore. I just use it to watch movies and occasionally to play some FIFA or 2K when I have friends over. I'll probably upgrade to an OLED too when it craps out on me.

I do appreciate how much lighter and thinner LCD is, though. I was able to set up my 55" TV on my own. My mom's old 55" CRT needed three grown men to move around lol.
 

winjer

Gold Member
I most certainly wasn't aware at the time. Went from a CRT to a plasma that I kept until 2016, then moved to a Samsung LCD LED TV (KS8000) that was receiving glowing praise for being a great gaming TV. Little did I know it was a great "gaming" TV only for consoles, because it was limited to 60Hz, and the dogshit motion clarity and response time of LCD actually made it all-around worse than my old plasma. I thought something was wrong with my TV, but no, consumers were just generally ignorant of what good picture quality and motion for gaming looked like and got suckered into buying into the LCD hype. The only thing it had over the plasma was 4K, but good luck driving that with a 980 Ti, which was more suited to 1080p/1440p. Now I have an OLED monitor and just don't bother with my TV anymore. I just use it to watch movies and occasionally to play some FIFA or 2K when I have friends over. I'll probably upgrade to an OLED too when it craps out on me.

I do appreciate how much lighter and thinner LCD is, though. I was able to set up my 55" TV on my own. My mom's old 55" CRT needed three grown men to move around lol.

I think that another part of the reason was hardware reviewers. Almost no one pointed out the huge drawbacks of moving away from CRTs.
 
I think that another part of the reason was hardware reviewers. Almost no one pointed out the huge drawbacks of moving away from CRTs.

Well there was a serious lack of maturity in the technology market back when we transitioned from CRT to other technologies.

The amount of quality analysis and content/benchmarks we can get now on monitor technology is fantastic.
 

RoboFu

One of the green rats
Meh, 60 works just fine for me. It's more about control lag than any slight ghosting.
 

Bernoulli

M2 slut
Since trying PC games with a 240Hz screen, I can't play on PS5 anymore unless there is a 120 fps mode for online games.
I can still play 30 fps solo games if they don't have a 60 fps mode.
 

Spukc

always chasing the next thrill
Console makers push the importance of fps in games… so they can push the same gfx, but faster, on newer hardware.

The PS3 supported 60fps games.
It's a lie.
 

Whitecrow

Banned
Sure, higher refresh rates are better. This should be a consensus.

But if games were only made with 60 FPS in mind, we wouldn't have some of the best games ever made by now:

BOTW/TOTK, The Last of Us, Ocarina of Time, Mario 64, all of the GTA games, RDR, RDR 2, Bioshock, Mass Effect, etc.

They all launched at 30 FPS, 'cause they were pushing the hardware to its limits.

We would have had to wait at least one more generation to be able to play them at 60 FPS.
Thread can be closed already.

FPS matters, but so do a lot of other things.
We could have 540 fps mediocre and awful games, but nobody wants that... right?
 

King Dazzar

Member
I've tried playing 120fps games with BFI engaged, and it's excellent. The BFI makes a very noticeable additional improvement, but the frame rate needs to be locked, with no VRR, for BFI to work. But 60fps is my sweet spot, especially as I spend most of my time on a PS5 or XSX these days. I've given up tolerating 30fps.
 

Hugare

Member
We are focusing a lot on image clarity, but there are other advantages to having a higher frame rate and refresh rate. Latency is a big one as well.
Not really. We're getting so many games with internal resolutions so low they sometimes go back to the PS3/360 era.

The catch is that they put "4K resolution" on the box, but it's being upscaled from what is sometimes sub-1080p.

Alan Wake II in Performance Mode runs at 872p. It's ridiculous. And it still runs at sub-60 FPS. Same for the Quality mode, which runs at 1272p and sub-30.

So what do we do? Sacrifice game design in order to have a stable game at a higher frame rate and resolution? Maybe many features in the game wouldn't be possible if they had to make it run decently at 60 FPS (higher than 1080p) on that hardware.

How many years would we have to wait until a Nintendo console could run TOTK at 60 FPS/4K?

It's a tough topic
 

twilo99

Member
I went from 60Hz to 144Hz and it was massive, but going from 144Hz to 165Hz wasn't noticeable. I'll probably go for a 4K/120/144 display when the RTX 5090 releases.

Diminishing returns past 125Hz for the average gamer, I think.

4K/144 with low latency would be great.

Currently, 1440p/165Hz is the best experience in terms of price/performance ratio.

If you think about it, a very low percentage of gamers have the necessary hardware for high refresh rate gaming, unfortunately.
 

damidu

Member
Around 120+ it's definitely diminishing returns once you factor in the GPU time sacrificed to work on each frame.
Going from 60 to 240 fps, the per-frame budget drops from about 16ms to 4ms to add all the eye candy you've come to expect.
I agree 60 should be the minimum to target, though.
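A quick sketch of that frame-budget arithmetic (just 1000 ms divided by the target frame rate):

```python
# Per-frame time budget at various target frame rates: budget = 1000 ms / fps.
for fps in (30, 60, 120, 240, 540):
    budget_ms = 1000 / fps
    print(f"{fps:>3} fps -> {budget_ms:5.2f} ms to simulate, render and present each frame")
```

At 240 fps the whole frame has to fit in about 4 ms, which is where the sacrificed eye candy comes from.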
 

winjer

Gold Member
Not really. We're getting so many games with internal resolutions so low they sometimes go back to the PS3/360 era.

The catch is that they put "4K resolution" on the box, but it's being upscaled from what is sometimes sub-1080p.

Alan Wake II in Performance Mode runs at 872p. It's ridiculous. And it still runs at sub-60 FPS. Same for the Quality mode, which runs at 1272p and sub-30.

So what do we do? Sacrifice game design in order to have a stable game at a higher frame rate and resolution? Maybe many features in the game wouldn't be possible if they had to make it run decently at 60 FPS (higher than 1080p) on that hardware.

How many years would we have to wait until a Nintendo console could run TOTK at 60 FPS/4K?

It's a tough topic

The world doesn't revolve just around consoles.
 

Topher

Gold Member
Not really. We're getting so many games with internal resolutions so low they sometimes go back to the PS3/360 era.

The catch is that they put "4K resolution" on the box, but it's being upscaled from what is sometimes sub-1080p.

Alan Wake II in Performance Mode runs at 872p. It's ridiculous. And it still runs at sub-60 FPS. Same for the Quality mode, which runs at 1272p and sub-30.

So what do we do? Sacrifice game design in order to have a stable game at a higher frame rate and resolution? Maybe many features in the game wouldn't be possible if they had to make it run decently at 60 FPS (higher than 1080p) on that hardware.

How many years would we have to wait until a Nintendo console could run TOTK at 60 FPS/4K?

It's a tough topic

It's a tough topic for consoles perhaps, as hardware updates aren't as frequent, but that doesn't change the facts that are being stated in the video.
 

HeisenbergFX4

Gold Member
45 inch monitor??????
How do you even use that? I have a 27 and it's too big.
At 2500 you can get an OLED TV and an OLED monitor.
I love the size for single player games, it's super immersive honestly.

It has black bars on each side when using consoles, but I am OK with that. The PS5 is where I play my shooters, and the monitor still works great at 1440p on the consoles.
 

Xdrive05

Member
Or just get a CRT monitor and get perfect motion clarity at any framerate.



Seriously though, high framerate is very nice on a flat panel too. My 65" Sony X90H becomes pretty great for PC couch gaming at 120Hz. Almost not worth bothering at 60Hz.
 

Akuji

Member
Every 3 months we get this topic. Everyone that's interested in it knows that CRTs had a big motion advantage. But the honest truth is that OLEDs get very fucking close nowadays. The new panels, at least the ones with gaming-focused software driving them, hit 0.03ms g2g, and basically every other important stat for motion clarity is also improving, so this topic will soon become irrelevant, after being relevant for like 20 years when nobody gave a fuck.
 

Hugare

Member
The world doesn't revolve just around consoles.
Oh, but it does

What was the last relevant PC exclusive?

Cyberpunk and Alan Wake II are currently the best-looking PC games on the market, and they were made with consoles in mind first and foremost.

Consoles are the baseline, always. So game development's world does revolve around consoles and their limitations.

It's a tough topic for consoles perhaps, as hardware updates aren't as frequent, but that doesn't change the facts that are being stated in the video.

My first sentence was "Sure, higher refresh rates are better. This should be a consensus."

But the reality is that game devs have to develop for consoles. So it's all about them.

If you have a PC, surely you'll go for higher framerates when possible, that's a given
 
Ugh, yeah. I've been on a 120Hz monitor for 3 years or so now. Nothing fancy like >300 fps monitors, but it was rough to play Zelda TOTK with those frames.
Oh mate, I'm just poking fun at another thread. TOTK wasn't perfect, but it could have been a lot worse with regard to performance, I think. The game itself is great.
 

winjer

Gold Member
Oh, but it does

What was the last relevant PC exclusive?

Cyberpunk and Alan Wake II are currently the best-looking PC games on the market, and they were made with consoles in mind first and foremost.

Consoles are the baseline, always. So game development's world does revolve around consoles and their limitations.

Consoles are the lowest common denominator. For some games they are the base, but not for others.
Curious that you picked 2 games that are first and foremost PC tech showcases.
Cyberpunk, from launch, was always better on PC, by a huge margin. Not only did it have fewer bugs and performance issues, it also looked better.
And it has the most advanced PC tech, such as DLSS 3.5, RTX, XeSS and soon FSR3.
Alan Wake II is using PC-specific tech that the PS5 doesn't have, mesh shaders. And then there is the DLSS 3.5 and RTX tech.
These are 2 games that were made with the high specs of PC in mind, and then cut back to fit into consoles.
You are talking about 2 games that have the biggest sponsorship from Nvidia.
 

Akuji

Member
Oh, but it does

What was the last relevant PC exclusive?

Cyberpunk and Alan Wake II are currently the best-looking PC games on the market, and they were made with consoles in mind first and foremost.

Consoles are the baseline, always. So game development's world does revolve around consoles and their limitations.



My first sentence was "Sure, higher refresh rates are better. This should be a consensus."

But the reality is that game devs have to develop for consoles. So it's all about them.

If you have a PC, surely you'll go for higher framerates when possible, that's a given
Cyberpunk was 100% PC-first. You should read the release reviews and user reviews from that time...
 

dave_d

Member
Every 3 months we get this topic. Everyone that's interested in it knows that CRTs had a big motion advantage. But the honest truth is that OLEDs get very fucking close nowadays. The new panels, at least the ones with gaming-focused software driving them, hit 0.03ms g2g, and basically every other important stat for motion clarity is also improving, so this topic will soon become irrelevant, after being relevant for like 20 years when nobody gave a fuck.
True, we get this every couple of months, don't we? (And OLEDs are pretty good.) Of course, then there's the downplaying of all the downsides of CRTs they forget, including:
  • Burn-in (not sure which is more susceptible to that, OLEDs or CRTs)
  • Weight
  • Energy usage
  • Flicker at 60i (admittedly you could get CRT monitors that ran at 75Hz progressive, which was way better; the flicker at 60i was kind of noticeable)
  • Geometry issues (a straight line in your game would be curved on screen; I'm guessing it's because the screen was curved)
  • Color accuracy issues (I never could get the colors right on my 20" Sony Trinitron, and every screen was a little different)
  • Humming (which admittedly these days at my age I can't hear; in my youth it was very noticeable)
 

rofif

Can’t Git Gud
Obviously higher is better. 240Hz was the point where I stopped needing to enable motion blur in games. My brain had enough frames to think it's full motion.

That said, I don't care. 30 is fine.
 