
Why I decided to limit my 144Hz monitor to 60Hz... Even when hardware is more than enough.

buenoblue

Member
So many inane PC master race posts in one thread, good grief. You're all spoiled rotten, honestly. Y'all are so used to pork and wine you've forgotten what bread and water taste like. It's disgusting. ._.

Yes, some people consider 1080p@60 a perfectly fine standard to hold to, since it's far easier on the wallet and perfectly good visually. This is not even counting all the people who are happy to game at 30 fps or 720p, because that's all they have access to or don't feel the need to upgrade. I'm perfectly happy staying where I am, and not feeding into the vicious cycle of constantly increasing visual fidelity demands and game development times and costs.

Would I use better hardware to play at higher fidelity if I could get it for free? Sure. But I would not chase higher fidelity and high framerates, as they're not things that make a game. Because I consider myself a gamer. Anyone who cares for resolution and framerate over a game's substance is just a graphics junkie.

All good points, but buying an expensive 144Hz monitor only to use it at 60Hz just doesn't make sense. If OP doesn't see the point of high refresh, then that's fine, but to say that it's pointless for all is a bit much. Just buy a 60Hz monitor and live life lol.
 

Dream-Knife

Banned
"Blow up" was hyperbole, as in it stops working. I had it happen to multiple cards: they last 2 years and then either artifact on screen or the PC won't POST with the card in. Happened with 8800 GTS, 9800 GT and 7870 cards. In the current scalper/miner/asshole market landscape, I'm not taking any risks on not being able to play, as I can't get a console now either (been trying since launch).

I downclock my 3060 Ti, which I lucked out on getting at MSRP in the first place. Had to wait 8 months for it. I use the X1 software and set the temp limit at 73 and the power limit down to 95%. Otherwise games like Cyberpunk, or uncapped old games running at 500+ fps, will keep raising the temps. Sure, it will eventually downclock at 83°C... that's what they set it for. Meanwhile my office room temp rises when the card gets hot too. It's factory overclocked and uses 30 to 40 watts over stock for this model. So my downclock brings it closer to the stock temps and wattage of the reference cards. Gauging fps with Afterburner, I don't notice much difference in performance at all.
Right click on the desktop and select NVIDIA Control Panel. Under Manage 3D Settings, set Maximum Frame Rate to 3 under your monitor's max refresh rate.
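The "cap a few fps under max refresh" advice above is a common VRR rule of thumb (keep the framerate inside the G-Sync/FreeSync range so VSync never kicks in); the headroom of 3 is the poster's chosen value, not an NVIDIA requirement. A trivial sketch:

```python
# Rule-of-thumb frame cap for VRR monitors: stay a few fps below the panel's
# max refresh so the framerate never leaves the variable-refresh range.
# The headroom value of 3 mirrors the post above; it is a convention, not a spec.

def vrr_frame_cap(max_refresh_hz: int, headroom: int = 3) -> int:
    """Return a frame cap slightly below the panel's max refresh rate."""
    return max_refresh_hz - headroom

for hz in (60, 120, 144, 165, 240):
    print(f"{hz} Hz panel -> cap at {vrr_frame_cap(hz)} fps")
```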
[screenshot of the setting]
 

rofif

Can’t Git Gud
All good points, but buying an expensive 144Hz monitor only to use it at 60Hz just doesn't make sense. If OP doesn't see the point of high refresh, then that's fine, but to say that it's pointless for all is a bit much. Just buy a 60Hz monitor and live life lol.
Yeah I play a lot of souls and console games so that 240hz monitor was not doing much and souls looked like trash on it
 

01011001

Banned
"Blow up" was hyperbole, as in it stops working. I had it happen to multiple cards: they last 2 years and then either artifact on screen or the PC won't POST with the card in. Happened with 8800 GTS, 9800 GT and 7870 cards. In the current scalper/miner/asshole market landscape, I'm not taking any risks on not being able to play, as I can't get a console now either (been trying since launch).

I downclock my 3060 Ti, which I lucked out on getting at MSRP in the first place. Had to wait 8 months for it. I use the X1 software and set the temp limit at 73 and the power limit down to 95%. Otherwise games like Cyberpunk, or uncapped old games running at 500+ fps, will keep raising the temps. Sure, it will eventually downclock at 83°C... that's what they set it for. Meanwhile my office room temp rises when the card gets hot too. It's factory overclocked and uses 30 to 40 watts over stock for this model. So my downclock brings it closer to the stock temps and wattage of the reference cards. Gauging fps with Afterburner, I don't notice much difference in performance at all.

wtf? I ran a GTX 1070 with a 2.1 GHz overclock; the normal boost clock for that chip is rated at 1.8 GHz.
this one to be exact:
[image: ASUS ROG STRIX GTX 1070]


that thing is now 4 years old and worked without a single issue until I got the new PC a few months ago. and it still works. I always tried to max out the GPU load when playing games. I always set things up to reach the framerate I wanted, and nothing was left on the table... that baby ran 99% maxed out at all times lol, usually at 1440p, sometimes dynamic or ~80% to ~90% scale if the game was too demanding at settings that didn't look like shit

I even played a bit of Control with raytraced reflections enabled at 30fps for a while, just for the lols... it ran better than on last-gen consoles, funnily enough, even though RT shouldn't really work that well on this card xD but you can be damn sure that was massive stress on the card: not only the GPU cores but also the VRAM gets hammered like a motherfucker when running raytracing

edit: I remembered that I even posted screenshots of it running on this card because people on here didn't quite believe me
[Control screenshots]

yeah, that is running on a GTX 1070 clocked at 2.1 GHz. the phone room gives it a beating, not gonna lie, but cap that at 30fps and you have a better experience than on any last-gen console, even after all the patches they got.


so TL;DR: I put this card through hell and back for 4 years and it works like a charm to this day
 
Last edited:
I unironically agree with OP. I have a 170Hz monitor and I'd rather game at 60Hz on my CRT. It's so much easier to hit 60 fps than 144+, and 60 fps on a 60Hz screen looks great. 60 fps on a 144Hz screen is NOT the same. Check it out at the Test UFO site from Blur Busters and see for yourself.
 
I unironically agree with OP. I have a 170Hz monitor and I'd rather game at 60Hz on my CRT. It's so much easier to hit 60 fps than 144+, and 60 fps on a 60Hz screen looks great. 60 fps on a 144Hz screen is NOT the same. Check it out at the Test UFO site from Blur Busters and see for yourself.
Agreed.

I'd also like to add gaming at 120hz in Halo Infinite on my LG C1 OLED is just...

 

01011001

Banned
I unironically agree with OP. I have a 170Hz monitor and I'd rather game at 60Hz on my CRT. It's so much easier to hit 60 fps than 144+, and 60 fps on a 60Hz screen looks great. 60 fps on a 144Hz screen is NOT the same. Check it out at the Test UFO site from Blur Busters and see for yourself.

it's called gsync my dude, if it works, that doesn't happen in the first place
 

mclaren777

Member
I own a 144 Hz monitor but I cap all of my games to 80fps because I can't perceive any difference going above it.

Different strokes for different folks, I guess.
 

Soodanim

Gold Member
it's called gsync my dude, if it works, that doesn't happen in the first place
To that end, the idea of a >60Hz display without G-Sync/FreeSync/VRR is a terrible one, but I bet there's some poor sod out there trying to get by with that unfortunate omission. At least you can do half-refresh VSync locks, I suppose.

On topic, I do love going over 60 but 60 is a happy baseline most of the time. I look forward to the time when I can comfortably buy a TV and know all of the 120hz VRR issues are worked out though, because then it really will be a best of both worlds situation. Anything with some speed feels so much better.

The drop from 120 to 60 is quite literally half as bad as 60 to 30, so that adjustment isn’t as jarring. Don’t get me wrong, I’ve had the “I see the 60fps slideshow” moment, but it doesn’t last half as long.
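The "half as bad" claim holds up in frame-time terms: each halving of the framerate doubles the extra per-frame delay. A quick check:

```python
# Frame-time deltas show why 120 -> 60 feels "half as bad" as 60 -> 30:
# the added per-frame delay doubles with each halving of the framerate.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

drop_120_to_60 = frame_time_ms(60) - frame_time_ms(120)  # ~8.33 ms added per frame
drop_60_to_30 = frame_time_ms(30) - frame_time_ms(60)    # ~16.67 ms added per frame

print(f"120->60 adds {drop_120_to_60:.2f} ms per frame")
print(f"60->30 adds {drop_60_to_30:.2f} ms per frame")
```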
 

Kenpachii

Member
"Blow up" was hyperbole, as in it stops working. I had it happen to multiple cards: they last 2 years and then either artifact on screen or the PC won't POST with the card in. Happened with 8800 GTS, 9800 GT and 7870 cards. In the current scalper/miner/asshole market landscape, I'm not taking any risks on not being able to play, as I can't get a console now either (been trying since launch).

I downclock my 3060 Ti, which I lucked out on getting at MSRP in the first place. Had to wait 8 months for it. I use the X1 software and set the temp limit at 73 and the power limit down to 95%. Otherwise games like Cyberpunk, or uncapped old games running at 500+ fps, will keep raising the temps. Sure, it will eventually downclock at 83°C... that's what they set it for. Meanwhile my office room temp rises when the card gets hot too. It's factory overclocked and uses 30 to 40 watts over stock for this model. So my downclock brings it closer to the stock temps and wattage of the reference cards. Gauging fps with Afterburner, I don't notice much difference in performance at all.

If your card keeps raising temps, your card's cooling is shit, your airflow is shit, or you've got a whole pile of dust on the card.

Here, I'll show you how Ampere works.

A 3080 TUF OC, which currently goes for 1.8 grand in my country.
Disable the fans entirely while slamming a modded-to-the-brim Witcher 3 at 3440x1440, which puts the GPU core under maximum load.

3 minutes after disabling the fan.

[temperature screenshot]


10 minutes after disabling the fan:

[temperature screenshot]


Now after re-enabling the cooler, it blasts to a full 100% for about 2 minutes, then goes down again in minute 3.

[temperature screenshot]


It downclocks all the way to 330 MHz and will go even lower if the temp rises for whatever reason.

The card still isn't dead, because it will shut down before it gets to critical values.

This is what I mean: you can't overvolt your GPU to bricking levels anymore, because NVIDIA doesn't allow it, unless you hard-mod your card, which requires actual physical hardware changes. So blowing up or damaging your card isn't something that can happen unless the card was borked to start with.

Now, if you want to save energy or limit the temperature in the room, yeah, I get that. Lots of people undervolt the 3080 for exactly that reason.
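The protective behavior described above can be sketched as a toy model: clocks step down as temperature approaches the throttle point, and the card powers off long before anything damaging. The 83°C throttle target and 330 MHz floor come from the posts above; the shutdown temperature and clock curve here are illustrative, not NVIDIA's actual tables.

```python
# Toy sketch of protective GPU downclocking: full clocks below the throttle
# point, a linear step-down toward a minimum clock as temps rise, and a hard
# shutdown well before damage. Thresholds are illustrative placeholders.

THROTTLE_C = 83    # downclocking starts here (the target mentioned above)
SHUTDOWN_C = 105   # emergency power-off, never normally reached (assumed value)
BASE_MHZ, MIN_MHZ = 1890, 330

def clock_for_temp(temp_c: float) -> int:
    if temp_c >= SHUTDOWN_C:
        return 0  # hard shutdown protects the silicon
    if temp_c < THROTTLE_C:
        return BASE_MHZ  # full clocks while cool enough
    # step clocks down linearly between the throttle and shutdown temps
    frac = (temp_c - THROTTLE_C) / (SHUTDOWN_C - THROTTLE_C)
    return max(MIN_MHZ, int(BASE_MHZ - frac * (BASE_MHZ - MIN_MHZ)))

for t in (70, 83, 95, 105):
    print(f"{t}°C -> {clock_for_temp(t)} MHz")
```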
 
Last edited:

T4keD0wN

Member
It's just that gaming is so low on my list of priorities in life that I want to pay as little as possible for it.
If it's so low on the priorities list, then why start threads on a video game forum where you talk about framerates and games? Something does not add up here. Also, buying a 144Hz monitor and limiting it to 60Hz is like buying a brand new 2022 Ferrari and then putting the engine from a 20-year-old Honda Civic into it.
 
it's called gsync my dude, if it works, that doesn't happen in the first place
I have it and honestly I'd rather run without it. Some of my games don't get G-Sync if I set it to "fullscreen only" applications, so I have to set it to windowed and fullscreen apps. The problem then is that stupid Windows apps start lagging out hardcore from G-Sync trying to synchronize to apps which internally run at like 30 fps or less. This creates a ton of desktop mouse lag and choppiness. Disabling G-Sync instantly solves the problem. I've tried manually adding profiles for these apps and disabling G-Sync for them, but it never works, so I'd just rather run with it off.

Also, unless you have a high-end real G-Sync monitor with the module, 60Hz FreeSync is NOT the same as a 60Hz fixed refresh rate. FreeSync is garbage at synchronizing the framerate to the exact refresh rate, so instead of 60Hz you'll be getting 90-120Hz, as the cheapo FreeSync monitor has to double the refresh rate for the fps. This results in a double-image ghosting effect that looks significantly worse than a true 60Hz monitor. Here's the link I was talking about so you can see this for yourself:

If you have a 144Hz or higher monitor, look at how bad the half-fps option looks. 72Hz should NOT look this stuttery and blurry. That's the double-scanout problem, and FreeSync/G-Sync Compatible can only solve it at very high framerates, like 100+, where these crappy panels can truly refresh in a single cycle. For 60Hz and lower, I have yet to find a single FreeSync monitor that can truly do it.
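The refresh doubling being argued about is low-framerate compensation (LFC): when fps falls below the panel's minimum VRR refresh, the driver repeats each frame so the panel can run at a multiple of the framerate. The exact heuristics are vendor-specific; this sketch just illustrates the multiplier idea, with a hypothetical 48-144Hz panel.

```python
import math

# Sketch of low-framerate compensation (LFC): below the panel's minimum VRR
# refresh, each frame is shown n times so the panel refreshes at fps * n
# inside its VRR window. Real drivers use more elaborate heuristics.

def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    if fps >= vrr_min:
        return fps  # inside the VRR window: refresh tracks fps 1:1
    n = math.ceil(vrr_min / fps)  # smallest frame-repeat count that fits
    return min(fps * n, vrr_max)

# Hypothetical 48-144 Hz FreeSync panel:
print(lfc_refresh(60, 48, 144))  # 60 fps is in range: 1:1
print(lfc_refresh(30, 48, 144))  # 30 fps is doubled to 60 Hz
print(lfc_refresh(20, 48, 144))  # 20 fps is tripled to 60 Hz
```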
 

01011001

Banned
I have it and honestly I'd rather run without it. Some of my games don't get G-Sync if I set it to "fullscreen only" applications, so I have to set it to windowed and fullscreen apps. The problem then is that stupid Windows apps start lagging out hardcore from G-Sync trying to synchronize to apps which internally run at like 30 fps or less. This creates a ton of desktop mouse lag and choppiness. Disabling G-Sync instantly solves the problem. I've tried manually adding profiles for these apps and disabling G-Sync for them, but it never works, so I'd just rather run with it off.

Also, unless you have a high-end real G-Sync monitor with the module, 60Hz FreeSync is NOT the same as a 60Hz fixed refresh rate. FreeSync is garbage at synchronizing the framerate to the exact refresh rate, so instead of 60Hz you'll be getting 90-120Hz, as the cheapo FreeSync monitor has to double the refresh rate for the fps. This results in a double-image ghosting effect that looks significantly worse than a true 60Hz monitor. Here's the link I was talking about so you can see this for yourself:

If you have a 144Hz or higher monitor, look at how bad the half-fps option looks. 72Hz should NOT look this stuttery and blurry. That's the double-scanout problem, and FreeSync/G-Sync Compatible can only solve it at very high framerates, like 100+, where these crappy panels can truly refresh in a single cycle. For 60Hz and lower, I have yet to find a single FreeSync monitor that can truly do it.

on my end the half refresh ufo looks absolutely fine, no stutter, not overly blurry 🤷‍♂️
 

PerfectDark

Banned
Lol @ this thread. I was like, yeah, in a few poorly optimized games like Last Epoch and Phasmophobia I limit my FPS to 60 instead of 165, just to keep the fans silent and keep my PC cool.
 
on my end the half refresh ufo looks absolutely fine, no stutter, not overly blurry 🤷‍♂️
Are you comparing it to the native refresh equivalent? E.g. if you have a 120Hz monitor, are you comparing 60fps/120Hz to 60fps/60Hz? If not, you don't understand what it should look like.
 

01011001

Banned
Are you comparing it to the native refresh equivalent? E.g. if you have a 120Hz monitor, are you comparing 60fps/120Hz to 60fps/60Hz? If not, you don't understand what it should look like.

Just did; looks identical. 144Hz/72fps or 72Hz/72fps.
 
Last edited:

GloveSlap

Member
60hz for graphics heavy single player games

120hz+ for competitive multiplayer

Not interested in 30hz anymore except in rare cases like the Matrix demo or the new Flight Simulator.
 

93xfan

Banned
OP seems to just not want to adjust between such varying frame rates.

Heard others say they can’t go back to 30FPS after 120FPS.

Feels like some people really want to be dicks about others' opinions and choices.
 
- 60fps is smooth enough

- that feeling of smoothness is even more enhanced when you watch low-fps stuff like movies, series, many YT videos, or the occasional 30fps game

- anything above 60fps is noticeably smoother, but that extra smoothness also ruins standard framerate in games

- 60fps being the standard for most people makes you more connected to your fellow gamers

- most people in the world will rarely, if ever, experience more than 60fps

- more fps requires more power, which means more money

- it's just a game; Metal Gear Solid, many Final Fantasy titles, the Silent Hill games (plural, not Kojima's thing), Pokémon, etc. are some of the most beloved titles in the history of gaming, and they ran at 30fps or lower

- fast paced multiplayer games suck anyway

- in life you learn that it is often necessary to lower your standards for happiness
These are some bizarre statements. It seems to me that your PC is struggling and you are trying to justify the drop to 60fps.

After playing games at 165+hz, 60 just looks choppy to me now. I’m dead serious.

Use Gsync and enjoy your games.
 

NinjaBoiX

Member
Well, at the very least you understood what I was talking about.

Still, I don't want to spend more than 100 euro for gaming per decade; since I have more important things in life.
You only want to spend 10 euros PER YEAR on gaming? I mean, that doesn’t even buy one game, do you even enjoy this hobby?

WTF…
 

Petopia

Banned
Unless you play competitive FPS with a mouse, it honestly doesn't matter. I notice a difference playing MH Rise on PC at 1440p/170fps vs 540p/30fps on Switch, but that's an extreme example. For third-person single-player games I'm fine with 60.

I do notice I drive better in Forza at 170fps vs 60 on my TV though.
So you're one of those guys huh.
 

StreetsofBeige

Gold Member
I've gotten so used to gaming above 60 that now even 60 appears slightly choppy.
In 2 years when I get a new tv with 120 fps mode, I will finally be able to witness what it's like above 60. I don't have a 120 fps tv and I've never had a PC or laptop with a screen refresh rate above 60 hz.
 

ethomaz

Banned
It actually makes sense.
Not having to deal with eye adaptation when shifting between 144Hz and 60Hz is a godsend, and as a result it makes adapting to 30Hz even easier.

Plus, 144fps is really hard to hit even with super high-end machines… 60fps is a very cost-effective target in hardware terms.
 
When I had a 144hz monitor, some games wouldn’t even work correctly unless the screen was set to 60Hz

What games? I've had a 144Hz monitor for like 7 years and have never had to change the Hz of the screen itself for a game... some games are capped to 60fps, which is fine.

Some seriously deluded people in here doing mental gymnastics to justify the fact they cba. It's objectively a much better experience. If people can't be bothered or don't want to pay for that, that's cool. But for me it makes games much more enjoyable; input and responsiveness are key mechanics which I value in games. But there will always be people with a backwards mentality. It's like trying to get a dog to listen to classical music, reading some of the posts in here haha.
 
What games? I've had a 144Hz monitor for like 7 years and have never had to change the Hz of the screen itself for a game... some games are capped to 60fps, which is fine.

Some seriously deluded people in here doing mental gymnastics to justify the fact they cba. It's objectively a much better experience. If people can't be bothered or don't want to pay for that, that's cool. But for me it makes games much more enjoyable; input and responsiveness are key mechanics which I value in games. But there will always be people with a backwards mentality. It's like trying to get a dog to listen to classical music, reading some of the posts in here haha.
Layers of Fear: Stepping side to side while moving the mouse would cause a weird jitter that wasn’t in the frame rate but rather some physics problem when running higher than 60Hz. I couldn’t fix it no matter what I did unless of course I set my refresh to 60Hz before I played the game.

Resident Evil 2 remake: this game changed the knife damage with higher frame rates than 60 for whatever reason.

That’s 2 examples.

Edit: Rayman Origins would change the speed of the game based on the frame rate. Anything over 60 wouldn’t play right.
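The Rayman Origins behavior is the classic symptom of game logic stepped "per frame" instead of scaled by elapsed time. A minimal sketch of the difference (hypothetical numbers, not the game's actual code):

```python
# Why some games misbehave above 60 Hz: movement stepped "per frame" speeds
# up with the framerate, while delta-time movement stays constant.

def frame_locked_distance(units_per_frame: float, fps: int, seconds: float) -> float:
    # Bug pattern: distance scales with fps because each frame moves a fixed step.
    return units_per_frame * fps * seconds

def delta_time_distance(units_per_second: float, fps: int, seconds: float) -> float:
    # Correct pattern: each frame moves speed * dt, so fps cancels out.
    dt = 1.0 / fps
    return sum(units_per_second * dt for _ in range(int(fps * seconds)))

# At 5 units/frame, a character moves twice as far per second at 120 fps:
print(frame_locked_distance(5, 60, 1.0), frame_locked_distance(5, 120, 1.0))
# With delta time, 300 units/s covers the same ground at any framerate:
print(delta_time_distance(300, 60, 1.0), delta_time_distance(300, 120, 1.0))
```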
 
Last edited:

rofif

Can’t Git Gud
In 2 years when I get a new tv with 120 fps mode, I will finally be able to witness what it's like above 60. I don't have a 120 fps tv and I've never had a PC or laptop with a screen refresh rate above 60 hz.
120 is barely noticeable, and not enough to make 60 look choppy. But 240 destroyed 60 and 120 for me… at least for some time
 

Mercador

Member
Stupid question, but I have to ask: could a higher refresh rate demand more of the brain? Or maybe it's the other way around, and the brain has fewer gaps to fill between images? I'm curious.
 

Dream-Knife

Banned
120 is barely noticeable or enough to make 60 look choppy. But 240 destroyed 60 and 120 for me…. At least for some time
It's noticeable in racing games (if using hood cam) and FPS. From 110-170 I don't notice as big of a difference.
 

Trunim

Member
I like to sometimes go back to my older consoles, suffer through some horrendous framerates, and just wonder how the hell I even played through this. Killzone 2 comes to mind.
 

LOLCats

Banned
Dude's not totally wrong. Though my desktop runs at 120Hz, I cap all my games at 60fps.

With G-Sync, my monitor runs at the game's frame rate.

So I mostly agree: 60fps is perfect, and anything more is just wasting power (for single-player games).
 
Last edited:
just did, looks identical. 144hz/72fps or 72hz/72fps
Then your display's response times must be so slow that it doesn't even matter and blurs it all the same. The sharper the motion clarity, the more obvious this phenomenon is. It's still there on a blurry LCD, just harder to notice. OLED and strobed-backlight displays make it painfully obvious. That's why some people say playing 30fps games on an OLED hurts their eyes vs an LCD.
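The motion-clarity point can be put in numbers: on a sample-and-hold display, an eye tracking a moving object smears each frame across the retina for as long as it stays on screen, so perceived blur width is roughly speed times persistence. The 960 px/s speed below is an illustrative value in the spirit of the UFO test, not a measured one.

```python
# Sample-and-hold motion blur: perceived smear ~ tracking speed * frame
# persistence. Higher framerates (or strobed backlights) shorten persistence
# and thus the smear, which is why low fps looks worse on sharp displays.

def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    return speed_px_per_s * persistence_ms / 1000.0

SPEED = 960  # px/s, illustrative tracking speed

print(blur_px(SPEED, 1000 / 30))   # 30 fps sample-and-hold: ~32 px smear
print(blur_px(SPEED, 1000 / 60))   # 60 fps: ~16 px
print(blur_px(SPEED, 1000 / 120))  # 120 fps: ~8 px
print(blur_px(SPEED, 1.0))         # 1 ms strobe: ~1 px, CRT-like clarity
```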
 