
1440p is overrated by people who can't go up to 4k

BlackTron

Member
I never did that comparison with a 1440p 27" monitor, but I had a few and they were barely any better than 1080p.

I had a 1080p 27" monitor running 4K supersampling and a 4K 27" monitor running native. These were the results...
[image: wUghV0k.jpg]


Now. This was on 27". It's even more important on bigger screens

Cogent post. I will concede that I was about to order a 27" but did 24" instead partially for this reason. There was a great price on the 27" but I realized I was just getting the extra size because I could, not because I wanted it. It's too big for me for playing FPS games a few feet from the screen, and I feared it would serve no purpose but make 1080p start to look bad at that close distance. I really wonder how that same billboard texture would look at both resolutions on a 24", I might have to do some tests myself today.
 
Last edited:
laughs in 1080p plasma
Don't you mean cries? It's time to move on, bud. Even the Kuro Elite plasmas (owned 2, one calibrated) and the Panasonic VT60 just fail to impress these days. OLED has come so far and is relatively inexpensive. Even what are now going to be last gen models have addressed the weaknesses of OLED to such a degree that I think we're nearing the point, like with cell phones, where the average flagship is so capable that you really don't need the latest model (or even the one before that).
 

Melfice7

Member
Don't you mean cries? It's time to move on, bud. Even the Kuro Elite plasmas (owned 2, one calibrated) and the Panasonic VT60 just fail to impress these days. OLED has come so far and is relatively inexpensive. Even what are now going to be last gen models have addressed the weaknesses of OLED to such a degree that I think we're nearing the point, like with cell phones, where the average flagship is so capable that you really don't need the latest model (or even the one before that).

If you wanna buy me one I'll gladly accept :3
 

AGRacing

Member
That is a 5 star shit post, sir. Bravo.

Anyway... I'd rather have the higher framerate at 1440p, especially on a PC monitor. We can talk again in 2026.
 

daveonezero

Banned
I have yet to game at 4k on PC. I look forward to it actually. Whenever I'm ready to spend another 700 dollars on a monitor. Til then 1440p max frames on ultra is more than enough.
Right, on PC it's crazy expensive just to get a monitor that goes up to that.

OLED has its drawbacks for PC use, and monitors are at a premium.
 

Klik

Member
I'm still on 1080p.
Not due to money issues, but my monitors just refuse to kick the bucket and I'm not gonna waste money :messenger_beaming:

And honestly?
Games look great in 1080p!

That's true, games can look amazing in 1080p. Devs nowadays put too much emphasis on resolution instead of graphical fidelity.

I think pushing for 4K right now is a bit too early and resource heavy. I'm sure around the Nvidia RTX 6xxx series, playing at 4K will be much more comfortable.
 
Last edited:

twilo99

Member
Realistically 1440p is enough pixels for games. Native 4K at the expense of frame rate is a waste. Games need frames more than pixels for right now, so drop the internal resolution and do AI upscaling from 1440p to 4K for me all day every day.

Higher pixel density at a higher refresh rate will always win.

A 34-36 inch 4K monitor at 165Hz+ and a GPU that can drive it is probably the best you can do at the moment.

That's like ~120 PPI at 165Hz... very nice.

A 36-inch 8K at 165Hz would give us 244 PPI; that's glorious-level visuals.
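If anyone wants to sanity-check those numbers, the PPI math is just the pixel diagonal over the physical diagonal. A quick throwaway sketch (assuming a standard 16:9 panel; the helper is mine, not from any calculator mentioned in this thread):

# quick PPI sanity check, assuming a 16:9 panel
import math

def ppi(width_px, height_px, diagonal_in):
    # pixels along the diagonal divided by the diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 36), 1))  # 4K at 36 inches -> ~122.4 PPI
print(round(ppi(7680, 4320, 36), 1))  # 8K at 36 inches -> ~244.8 PPI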
 
Last edited:

Hurahn7

Banned
1440P on a monitor is fine. More than fine actually. 4k on a monitor is extra. 4k on your giant TV in your game room is the shit.
 

ClosBSAS

Member
LoL, Walmart sells 4K TVs for like 100 bucks. Shit's overrated, a waste of resources; next gen looks current gen for PC as always. 1440p is the sweet spot for performance and quality. I'd rather do 1440p 60fps like Demon's Souls than 4K 30fps fideshitty mode.
 

ClosBSAS

Member
Try that without any upscalers or frame generation, on the highest settings. Don't think so.
What are you talking about, man? Perfect example... I can run FF15 at 8K with a 3080 Ti. A 4090 would destroy it easily. In fact, it can run it easily above 50fps.

And that's a high end game; anything below that a 4090 can run fine at 8K.
 

Justin9mm

Member
People who game on PCs with monitors know 4K monitors are generally not economical for a variety of reasons. This is why 1440p is the most logical choice. This thread is dumb.
 
Those calculators are bullshit. I have a 48 inch CX OLED that I've been using as a monitor for well over a year. Switching back and forth between 1800p and 4K at a little over a meter viewing distance, it's obvious that one is sharper than the other. 1800p is a great in-between resolution because 1440p is completely insufficient at sizes greater than 32 inches for PC gaming purposes. Again, the calculator is bullshit; it says anything above 1440p is unnoticeable at 3.2 feet. I guess all those manufacturers making 4K 32-inch monitors are doing it for the lulz.
Dude, you are talking about an upscaled picture, and since upscaling destroys edge sharpness (especially without an integer ratio), of course you can see the difference. Upscaled 1440p looks very bad on a 4K display, but the same 1440p looks amazing on a 2560x1440 display.
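To put numbers on the integer-ratio point (just a quick sketch of the arithmetic, nothing anyone here actually ran): 1440p to 4K is a 1.5x factor on each axis, so source pixels can't land cleanly on whole display pixels, while 1080p to 4K is an exact 2x.

# why 1440p -> 4K scaling is awkward: 1.5x is not an integer factor
def scale_factor(src, dst):
    fx, fy = dst[0] / src[0], dst[1] / src[1]
    return (fx, fy), fx.is_integer() and fy.is_integer()

print(scale_factor((2560, 1440), (3840, 2160)))  # ((1.5, 1.5), False) -> no clean 1:1 pixel mapping
print(scale_factor((1920, 1080), (3840, 2160)))  # ((2.0, 2.0), True)  -> clean integer scaling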
 

SmokedMeat

Gamer™
I'd rather have the much higher frame rates at 1440p. I had a 4K monitor I bought years ago, and sold it for a much faster 1440p model.

The performance jump alone made it easy to forget 4K.
 
Last edited:

Sorcerer

Member
For gaming it's still the era of 1440p. Of course, 4K is there but it's not quite as accessible yet. 4K will get its own era, 1440p will fade away in years to come, but then 8K will be lingering on the horizon the same way 4K is now, more for the enthusiast. Something better, just out of reach for most people will always be waiting in the distance to become the new normal.
 
Last edited:

MarkMe2525

Member
I generally do not agree with OP. I would rather devs focus resources on the quality of the pixels rather than the amount of them.
 

Kataploom

Gold Member
When we get affordable GPUs that can do 4K at 240fps+, people will stop signal boosting 1440p over 4K. The problem with 4K is that it comes at a higher fps cost than 1440p, but eventually 4K fps will reach an acceptable level for everyone who's not an eSports gamer, just like 720p, 1080p and 1440p did.
That's what "sweet spot" means: if 4K were free in GPU cost, all of us would go 4K "just because," even when not getting the benefits due to screen size and pixel density... It's not free though; we have to find a balance between performance and IQ, and to be fair, 30 fps looks ugly as fuck to many of us, so 4K at 30 fps isn't even an option.
 
Last edited:

Ev1L AuRoN

Member
Unless you are willing to spend upwards of 1k on a GPU, you are also a part of the people who can't afford 4k.

1440p gives a monitor better PPI than big TVs, and you can perceive it because you are sitting near the damn thing, and your average GPU can drive those babies at very high fps.

4K is great for big TVs; it's also a great resolution to upscale content to. But it is too expensive to drive; most people who game on 4K monitors do so at a lower internal resolution, taking advantage of upscaling techniques.
 
Dude, you are talking about an upscaled picture, and since upscaling destroys edge sharpness (especially without an integer ratio), of course you can see the difference. Upscaled 1440p looks very bad on a 4K display, but the same 1440p looks amazing on a 2560x1440 display.
Acts like I haven't had 1440p monitors before... I know how upscaling works. I had some of the highest end displays during the Xbox 360/PS3 and PS4/Xbox One gens, and in each, having the best TV (best overall picture combined with low input lag) usually meant having one with a higher resolution than the native rez of the games. The handful of decent looking 1080p titles when I had my Pioneer plasma really popped, but most relied on scaling (the Xbox in the 360's case, and I believe the TV in the PS3's, as that system did no internal scaling if I recall). When the first 4K OLEDs came out, you were better off with the 1080p model for most gaming needs. With image reconstruction, the need to stick to the display's native rez is fast becoming antiquated. Dropping major coin on a 4080/4090 and then playing on a $350-400 monitor is a waste of money.
 

Hoddi

Member
I have a pair of 27" monitors at 1440p and 4k on my desk and the difference is 100% noticeable. I wasn't impressed when I first upgraded my 1080p monitor to 1440p but moving to 4k was a whole different ballgame.

I wouldn't tell people to upgrade though. You're only setting yourself up for performance issues and there are more important factors to consider. I'm considering 'downgrading' to the Alienware QD-OLED because colors and black levels are more important than raw display resolution. I'll probably do that and stick with my 2080 Ti instead of spending stupid money on a 4090 and having a crappy LCD.

Rendering resolution matters more than display resolution anyway. Supersampling is a thing and it's why I'm still using my old plasma with my consoles.
 

daveonezero

Banned
It's weird how for a long time PC gamers were playing at a very high res. Then, since 1080p, console gamers started to want higher res. Now it has switched.

On PC it's always about flexibility.

I can't wait to get a 1440p monitor and a Zen 4 to run all the games I want from the past 10 years.
 

Kataploom

Gold Member
It's weird how for a long time PC gamers were playing at a very high res. Then, since 1080p, console gamers started to want higher res. Now it has switched.

On PC it's always about flexibility.

I can't wait to get a 1440p monitor and a Zen 4 to run all the games I want from the past 10 years.
Probably because 1080p is already pretty good anyway at the monitor sizes most people use, especially those for competitive gaming. Also, as PC gaming is about choices, nothing stops a PC player from using a 4K screen for single player stuff if that's what they want.

PC gaming is mostly gameplay driven though. I'm OK with 2K-4K at 60 or more fps (never 30), but others will prefer FHD or 2K at very high frame rates that are literally impossible for consoles. In the end, PC keeps doing what consoles can't, no matter the choice, and it's not just a matter of raw power alone.
 

mxbison

Member
Well if you're rocking a 4090 and can play games at 4K/60fps/RT/Ultra, good for you.

99% of players can't and 4K costs way too much performance for what it gives. 1440p is the sweet spot.
 

tvdaXD

Member
What are you talking about, man? Perfect example... I can run FF15 at 8K with a 3080 Ti. A 4090 would destroy it easily. In fact, it can run it easily above 50fps.

And that's a high end game; anything below that a 4090 can run fine at 8K.
Of course some games will run, but not all of them, not even close. But that's not the point; 4K is overrated and a waste of resources in many cases.
 

CGNoire

Member
I don't know about that, but what I do know is that DLSS artifacts are way worse and more noticeable on Plasma, or any TV tech with high motion resolution, than DF admits (LCD "blur" hides the artifacts).

I personally need Native because I game on Plasma.

Consoles should still aim at 1440p, but for PC it's Native all the way.
 
Last edited:

Scotracer

Neo Member
It's a balance, isn't it?

I want High Frame Rate (100Hz min). I want to have the details turned on and RTX if the game has a good implementation.

My 3080Ti would not do the above at 4K. My monitor is a 3440x1440 Ultrawide (100Hz) and achieves it just fine.

60FPS isn't enough any more.
 

Schmendrick

Member
I don't know about that, but what I do know is that DLSS artifacts are way worse and more noticeable on Plasma, or any TV tech with high motion resolution, than DF admits (LCD "blur" hides the artifacts).

I personally need Native because I game on Plasma.

Consoles should still aim at 1440p, but for PC it's Native all the way.
77" OLED here. Aside from a tiny bit of ghosting on very fine geometry, I've had barely any artifacting with DLSS at the Quality setting since DLSS 2.0 released, and with the current version native resolution is now completely obsolete for me.
 
Last edited:
Acts like I haven't had 1440p monitors before... I know how upscaling works. I had some of the highest end displays during the Xbox 360/PS3 and PS4/Xbox One gens, and in each, having the best TV (best overall picture combined with low input lag) usually meant having one with a higher resolution than the native rez of the games. The handful of decent looking 1080p titles when I had my Pioneer plasma really popped, but most relied on scaling (the Xbox in the 360's case, and I believe the TV in the PS3's, as that system did no internal scaling if I recall). When the first 4K OLEDs came out, you were better off with the 1080p model for most gaming needs. With image reconstruction, the need to stick to the display's native rez is fast becoming antiquated. Dropping major coin on a 4080/4090 and then playing on a $350-400 monitor is a waste of money.
So if you know how upscaling works, then why were you whining about 1440p picture quality when upscaled to 4K? You should know the same picture would look totally different on a 2560x1440 monitor.

I think people who bought high end GPUs are still happy with 1440p. They don't need to rely on imperfect DLSS/FSR as much in RT games, they can downsample from much higher resolutions, making even TAA games look extremely sharp, and something like 144-240Hz also requires a lot of GPU power (not to mention new games will require even faster GPUs). Also, the vast majority of content will look better on a 1440p display, simply because a lower resolution hides imperfections and things like lower texture quality.

If you can't see the pixel structure from where you are sitting, then you are not benefiting in any way from the higher pixel density. Look at your 4K display and see from what distance you can see the pixel structure, and you will know how close you must sit in order to see 4K. Dude, the visual acuity calculator is based on science, and I doubt you would see the pixel structure on a 4K TV from a normal viewing distance, therefore you are not benefiting from having a 4K TV.
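For what it's worth, those viewing-distance calculators mostly just apply the 1-arcminute (20/20 acuity) rule of thumb. A rough sketch of that math (the 48" 4K case is only an example, and the function is my own stand-in, not any calculator's actual code):

# rough sketch of the 1-arcminute (20/20 acuity) rule behind viewing-distance calculators
import math

def max_distance_feet(width_px, height_px, diagonal_in):
    # distance beyond which a single pixel subtends less than 1 arcminute
    ppi = math.hypot(width_px, height_px) / diagonal_in
    pixel_in = 1.0 / ppi
    return pixel_in / math.tan(math.radians(1 / 60)) / 12

print(round(max_distance_feet(3840, 2160, 48), 1))  # ~3.1 ft for a 48" 4K panel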
 
Nope. I have a 4090. I'm very capable of 4K Ultra in most games, but if it comes down to a choice, I'll always take 1440p over 4K if I'm getting less than 60fps.

Most days it doesn't matter; my most played PC game is Total War: Warhammer 3, so yeah, 4K Ultra is better there, but in general, framerate is far better.
 