
Why do game developers neglect 1080p on current gen consoles?

So if you played Returnal, were you just sat there thinking how bad it looked?

Didn't that game use some trickeration to get to a convincing 4k output? No one is against DLSS or some type of temporal upscaling that accomplishes the same thing. Some of these techniques can do wonders with a base internal res of 1080p.

People will always judge resolution by the display they are viewing the content on, and that's never been more true than in the LCD/LED age, where anything less than native is often a mess. When everyone had 1080p screens, that res looked sharp on that screen; take the same image to a 4k screen and it looks smeared in Vaseline.
 

Pagusas

Elden Member
Didn't that game use some trickeration to get to a convincing 4k output? No one is against DLSS or some type of temporal upscaling that accomplishes the same thing. Some of these techniques can do wonders with a base internal res of 1080p.

People will always judge resolution by the display they are viewing the content on, and that's never been more true than in the LCD/LED age, where anything less than native is often a mess. When everyone had 1080p screens, that res looked sharp on that screen; take the same image to a 4k screen and it looks smeared in Vaseline.
100% depends on the quality of the TV and upscaling method; Sony's flagship screens do a brilliant job of upscaling.

And yes, a native 1080p output on a 1080p screen looks sharp, but it still has aliasing and shimmering problems with high-frequency detail, even when displayed natively. Last gen was full of games that just looked rough and aliased, even as temporal systems started becoming an option. There is a lower limit to resolution below which you just can't hide enough of the flaws, and 1080p is not that lower limit; most would agree 1440p appears to be.
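The shimmering with high-frequency detail described here is just undersampling, and a toy sketch makes it concrete. Everything below is illustrative (names and numbers invented for the example, not from any real engine): point-sampling a stripe pattern finer than the pixel grid makes the sampled values flip as the view shifts by a fraction of a pixel, which on screen reads as crawling or shimmering in motion.

```python
import math

# Toy 1-D illustration of shimmer from undersampling: we point-sample a
# stripe pattern that is finer than the pixel grid can represent.
def sample_stripes(num_pixels, stripes, offset):
    # 1 where a bright stripe lands on the pixel centre, 0 otherwise.
    return [int(math.sin(2 * math.pi * stripes * (i + offset) / num_pixels) > 0)
            for i in range(num_pixels)]

still = sample_stripes(8, 5, 0.0)   # 5 stripe pairs across 8 pixels: undersampled
moved = sample_stripes(8, 5, 0.25)  # the view shifts by a quarter of a pixel

# The sampled pattern changes even though the stripes themselves did not,
# which is the shimmer you see in motion at low resolutions.
print(still)
print(moved)
```

More pixels (or an AA filter that averages over each pixel) shrink how much each sub-pixel shift can change the result, which is why higher resolutions shimmer less.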
 
100% depends on the quality of the TV and upscaling method; Sony's flagship screens do a brilliant job of upscaling.

And yes, a native 1080p output on a 1080p screen looks sharp, but it still has aliasing and shimmering problems with high-frequency detail, even when displayed natively. Last gen was full of games that just looked rough and aliased, even as temporal systems started becoming an option. There is a lower limit to resolution below which you just can't hide enough of the flaws, and 1080p is not that lower limit; most would agree 1440p appears to be.

True. Hair is one area where a higher res is surely preferred. The consoles have decent upscalers anyway, better than most TVs, but like you said I think an output of 1440p or greater is better for a 4k set. I game on PC monitors as a general rule, so I never expect much from the inbuilt upscaling on those.
 
Nope, it looks like 1080p. Actually, a 1080p game with no AA or MSAA looks sharper than Returnal.

To each their own, but the general consensus was that the final image was quite good. I almost think they were the ones that upsampled the image twice, but maybe I'm thinking of a different game.
 
To each their own, but the general consensus was that the final image was quite good. I almost think they were the ones that upsampled the image twice, but maybe I'm thinking of a different game.
The image quality is OK, but the upscaling used isn’t magic. IIRC Housemarque say they use CBR + Unreal temporal upscaling, neither of which is impressive.

Even Selene herself, who is close to the camera, looks a tad blurry.

Even comparing Uncharted 4 on PS4 Pro, with its 1440p plus TAA, to Returnal, you can see Uncharted looks better.
 
The image quality is OK, but the upscaling used isn’t magic. IIRC Housemarque say they use CBR + Unreal temporal upscaling, neither of which is impressive.

Even Selene herself, who is close to the camera, looks a tad blurry.

Even comparing Uncharted 4 on PS4 Pro, with its 1440p plus TAA, to Returnal, you can see Uncharted looks better.


DLSS and some of the temporal techniques are getting quite good these days.
 
That’s a UE5 demo. I’m on my phone at work atm so I couldn’t tell you the difference anyway right now lol.

Also, yeah, DLSS 2.3 is great, and it’s far superior to Unreal TAA upscaling.

I’m just saying Returnal specifically could have better IQ even at 1080p, though it’s a bit better than some UE4 games like Crash 4 or Destroy All Humans at 1080p on standard PS4/Pro.
 

Pagusas

Elden Member
The image quality is OK, but the upscaling used isn’t magic. IIRC Housemarque say they use CBR + Unreal temporal upscaling, neither of which is impressive.

Even Selene herself, who is close to the camera, looks a tad blurry.

Even comparing Uncharted 4 on PS4 Pro, with its 1440p plus TAA, to Returnal, you can see Uncharted looks better.
I'll give up sharpness for a more natural and alias-free image, but luckily, as system power and resolution increase, that trade-off becomes less and less necessary. Games like Horizon, in Fidelity mode, look amazing with high-frequency detail (when standing still at least; more power is needed to boost frame rate and temporal resolution, as it does fall apart a bit in motion). We are almost there, almost to the point where we're painting alias-free, highly detailed images at good frame rates. Almost.
 
I'll give up sharpness for a more natural and alias-free image,
DF agrees with you, but I don’t like TAA most of the time except for the best examples like Insomniac. I wish the industry had prioritized sharpness.

I would never want a super bright, colorful Nintendo game to use TAA, and it is kind of counterproductive to have all these nice details in games only for TAA to blur them. TAA works better for more muted-looking games.

Edit: coincidentally, Luigi 3 on Switch has TAA but it works well because Luigi is more muted and dark than something like Mario Odyssey.

What I will say is that native 4k plus TAA can look pretty decently sharp, even with UE4 TAA, but still not as sharp as it should be. Crash 4 on PS5, for example, looks nice but still a touch blurry.

However, DLSS is becoming great, and I’m excited and hopeful that Switch 2 has this capability.
 

Roni

Member
I honestly don't plan on bumping from 1080p anytime soon. I have a 3080 Ti now and make a point of running things like Cyberpunk on full settings, but I leave the resolution at 1080p so I can game at 50-60 FPS.

60 FPS + Max Raytracing can't be beat...
 

tmarmar

Member
I think it's a marketing issue, since the "4K" label sells in the console wars. But the truth is that most games only reach that resolution dynamically. Consoles have no real power for native 4K at 60fps (in even minimally graphically complex games), much less with ray tracing.
 

64bitmodels

Reverse groomer.
And many of us have perfectly fine 1080p sets a few years old. When CRTs were a thing, I remember holding on to them until they died. I also don't remember them being much larger than 32" very often. We made do; people are spoiled today. Why would you throw out something that you spent hundreds on a few years after buying it? Makes no sense. PS5 looks amazing at 1080p. I am sure 4k looks better, but it also means lower FPS.

I just want them to give more settings, like on PC. It's like they think console users are stupid or something, so they don't add sliders.
maybe consider upgrading them? a decent 4k tv costs 300 dollars. also, by a few years old do you mean 2020 or 2016? if it's the former, ok, yeah, i guess i can understand, but if it's the latter you've needed an upgrade for 2 years now.
 

64bitmodels

Reverse groomer.
1080p + TAAU or any of the other modern reconstruction techniques or AI upscaling will look as good as, maybe better than, native 1440p though.
how about you render natively at 1440p and then use the AI tech like DLSS to make it look as good as 4k
1080p is old and ugly as fuck, 1440p on the other hand looks much better and doesn't require that much more horsepower to run. I played Sonic Generations on my XSS at 1440p not too long ago and it's quite possibly one of the most beautiful video games i've played. 1440p should be the new "budget" resolution
 

adamsapple

Or is it just one of Phil's balls in my throat?
how about you render natively at 1440p and then use the AI tech like DLSS to make it look as good as 4k
1080p is old and ugly as fuck, 1440p on the other hand looks much better and doesn't require that much more horsepower to run. I played Sonic Generations on my XSS at 1440p not too long ago and it's quite possibly one of the most beautiful video games i've played. 1440p should be the new "budget" resolution

It's possible in an ever-changing PC environment, but on fixed-hardware consoles they'll need to find ways to reduce strain as much as possible; dropping resolution is usually the first and best way.
 

64bitmodels

Reverse groomer.
It's possible in an ever-changing PC environment, but on fixed-hardware consoles they'll need to find ways to reduce strain as much as possible; dropping resolution is usually the first and best way.
well if 4k is the default, dropping resolution would be dropping to 1440p.
 

rofif

Can’t Git Gud
I was never impressed by 1440p. It is a good starting point for upscaling/good anti-aliasing, but it's not a big enough jump over 1080p.
Anyway. There was a time in 2019 when I changed monitors like a crazy person. I had around 10 monitors between 2018 and 2020.
At one point I had 2 new IPS displays on my desk at the same time: an hp27ea (1080p 27") and an lg27uk650 (4k 27").
Now, to those who say 27" is too small to see the difference:
the 1080p 27" is displayed at the default 100% scaling;
the 4k 27" monitor is displayed at 200% to match the physical size of fonts and objects between the two screens.
The game example is Mirror's Edge, but with a HUGE twist: on the 1080p monitor I am using 4k downsampling. So we are comparing THE SAME 4k image, but running on 4k vs 1080p monitors of the same size.
An interesting observation was how much better YT videos looked on that 4k monitor when played in a smaller window... turns out Windows/Chrome was never bad at downsizing video. It was the lack of res.

I don't need to say which one is which:

[screenshot comparison images]

Now I am using a 48" OLED on the same desk, so 4k is important. But we also have much better anti-aliasing techniques now.
And surprisingly, even coming from 27" 4k, the DPI on that 48" is not bad. Maybe that's because I sit a bit further away, but I do not see the pixels until I almost touch it.
 

adamsapple

Or is it just one of Phil's balls in my throat?
well if 4k is the default, dropping resolution would be dropping to 1440p.

True.

But we're already seeing modern games like Guardians and Dying Light 2 drop to 1080p for their 60 FPS modes.

But I guess it depends on the developer and optimization time just as much.
 
Tbh even 4k is too low and looks aliased without a good AA technique.
resident evil 3 remake @4k w/ no AA is a goddamn mess.
but with max AA... ooooo it's so cleeeeeeean.

also, i played days gone and nioh 2 on PS4, then played them on PC @4k... night and day difference.

resolution, baby--it matters.
but upscaling is likely the future. it's just far more performant, and can even have its own benefits over increasing the native res.
 

rofif

Can’t Git Gud
resident evil 3 remake @4k w/ no AA is a goddamn mess.
but with max AA... ooooo it's so cleeeeeeean.

also, i played days gone and nioh 2 on PS4, then played them on PC @4k... night and day difference.

resolution, baby--it matters.
but upscaling is likely the future. it's just far more performant, and can even have its own benefits over increasing the native res.
yep.
RE Engine games have a lot of dithering and noise in the hair and some materials.
These games are DESIGNED with TAA in mind, and I always see people disabling TAA because "it is blurry". Right alongside disabling motion blur and raising FOV to max lol
 
Now. To those who say 27" is too small to see the difference [between 4k and 1080p].
1080p 27" is displayed at default 100% scaling
4k 27" monitor is displayed at 200% to match physical size of fonts and objects between two screens.
when i got my first 4k tv/monitor, text was the first thing that stood out.

what a difference.

i can now guess a laptop's resolution based solely on text clarity.
 

Danknugz

Member
i have a 4k tv but i made it kind of a goal to not play any games in 4k until i get a 4k HMD, in a poor man's effort to try and equalize the never ending resolution disparity between flat and VR gaming.

i've been successful so far, with the slight exception of capturing skater XL clips in 4k to make 4k montages. but those are only replays and the tricks were done in 1080p (technically 1200p cause my monitor is 1920x1200)

when i finally get a 4k HMD i'll probably still try and play 1080/1200p until i can't stand it anymore. but i grew up with 240, 480p so 1080p still feels fine to me.
 
Got a chance to look at it on my PC; it does look softer with the upscale, but honestly both images look awful lol. Compression?

Anyway, that Unreal engine TAA is very aggressive, definitely not one of the better ones but there are worse of course.

The one image is native, so using no TAA; the other looks quite nice. Compare it to a native 1080p image and the difference would be quite apparent. Or get new glasses. :messenger_tears_of_joy:
 
The one image is native, so using no TAA; the other looks quite nice. Compare it to a native 1080p image and the difference would be quite apparent. Or get new glasses. :messenger_tears_of_joy:
Uh, no. 1440p with no AA (and if that’s what it is, there’s serious compression going on, because it looks blurry as heck) will naturally show more texture detail than with TAA. Goes double for a lower res with TAA. And indeed the image on the left is showing more texture detail, and yeah, looks slightly more aliased.

If you don’t know that TAA scrubs away detail then you need to read more about this.

Wait, dude it even says TAA algorithm in the bottom left corner of the left image!
 
1080p is fine; I watch most of my YouTube content at 1080p... that seems fine. Anything above 1080p is definitely sharper. When these devs do their magic it doesn't even seem like 1080p. The problem is the "taboo" of 1080p. There is a very, very vocal crowd that gets absolutely woozy and dizzy at the mention of 1080p. Once they hear 1080p their eyes stop working and the game is immediately terrible, shit, and not worth more than $10.

All you have to do is look around: Returnal is a fantastic looking game. DF, the graphics whores, loved it. I thought it looked pretty damn good and pretty crispy, not the crispiest thing I've seen but still good. To this day, what do ppl bring up about that game? 1080p and not worth it.
 
Uh, no. 1440p with no AA (and if that’s what it is, there’s serious compression going on, because it looks blurry as heck) will naturally show more texture detail than with TAA. Goes double for a lower res with TAA. And indeed the image on the left is showing more texture detail, and yeah, looks slightly more aliased.

If you don’t know that TAA scrubs away detail then you need to read more about this.

Wait, dude it even says TAA algorithm in the bottom left corner of the left image!

Compression or not, we don't have a raw image, so your argument has no validity here. The issue at hand isn't about questioning the quality of the native image, LOL. It's about how close to native the upscaled image can get. If we had a native 1080p image, it should be inferior to the 1080p upscale.
 
*facepalm* Ok i'm moving on lol

Also, native 1080p with no TAA will be sharper/show more detail than a UE TAA upscale.

You are hilarious, this must be a joke. Please provide an example.

From FSR to DLSS to temporal upscaling, I've never seen a technology demonstrated that reduces image clarity while also adding a rendering cost. You're saying that TSR results in a lower quality image than native 1080p but has an implementation cost? So the developers just implement it as a bad joke then?
 
You are hilarious, this must be a joke. Please provide an example.

From FSR to DLSS to temporal upscaling, I've never seen a technology demonstrated that reduces image clarity while also adding a rendering cost. You're saying that TSR results in a lower quality image than native 1080p but has an implementation cost? So the developers just implement it as a bad joke then?
DLSS is different because it uses machine learning, so it actually can resolve more detail than native. BUT it will still be softer than native without TAA.

TAA upscaling, on the other hand, cannot resolve more surface texture detail because it is essentially just blurring frames together. It might make things like hair or chain-link fences more cohesive (sub-pixel detail), but for regular textures the TAA will look blurrier.

As a good example, you can take the Ratchet and Clank 2016 disc version and compare it side by side with the patched version on base PS4. The former will be sharper but more aliased, because it’s not using TAA but a simple post-process method instead.

TAA is a trade-off: a more cohesive image, but at a loss of sharpness. It cannot be as sharp as native.
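The "blurring frames together" description can be sketched as a simple exponential history blend, which is the core idea behind temporal accumulation. This is a hypothetical 1-D illustration (the function name and the 0.1 blend weight are made up for the example, not from any engine): a hard edge that jitters by one pixel settles to an in-between value, i.e. a more stable but softer edge.

```python
# Hypothetical 1-D sketch of temporal accumulation ("blurring frames together").
def taa_accumulate(history, current, alpha=0.1):
    # Exponential moving average: each output pixel keeps 90% of its history.
    return [h * (1 - alpha) + c * alpha for h, c in zip(history, current)]

# A hard edge whose position jitters by one pixel between frames:
frame_a = [0, 0, 0, 1, 1, 1]
frame_b = [0, 0, 1, 1, 1, 1]

out = frame_a
for _ in range(8):            # alternate the two jittered frames
    out = taa_accumulate(out, frame_b)
    out = taa_accumulate(out, frame_a)

# The pixel under the jittering edge settles between 0 and 1: the edge is
# more cohesive over time, but softer than either input frame.
print([round(p, 2) for p in out])
```

This is exactly the trade-off described above: the accumulated image stops flickering, but any detail that only exists in some frames gets averaged down.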
 
DLSS is different because it uses machine learning, so it actually can resolve more detail than native. BUT it will still be softer than native without TAA.

TAA upscaling, on the other hand, cannot resolve more surface texture detail because it is essentially just blurring frames together. It might make things like hair or chain-link fences more cohesive (sub-pixel detail), but for regular textures the TAA will look blurrier.

As a good example, you can take the Ratchet and Clank 2016 disc version and compare it side by side with the patched version on base PS4. The former will be sharper but more aliased, because it’s not using TAA but a simple post-process method instead.

TAA is a trade-off: a more cohesive image, but at a loss of sharpness. It cannot be as sharp as native.

I was talking about the reconstruction methods (Returnal uses checkerboard) in general. The idea that the reconstructed image looks worse than running at a much lower native res (which will bring its own softness on a 4k screen) while simultaneously incurring a rendering cost is a new concept for me. Checkerboard rendering is not the best thing going at the moment, but it still should result in a better looking image than a native image with an equal pixel count, that's the whole point. Though I did mention temporal upscaling vs. temporal reconstruction, maybe that confused what I was getting at.
 
The idea that the reconstructed image looks worse than running at a much lower native res (which will bring its own softness on a 4k screen) while simultaneously incurring a rendering cost is a new concept for me.
We were talking about temporal reconstruction, not checkerboard. I never actually used the word “worse”; that’s your interpretation of what I’m saying. I said softer, with reduced texture detail as a result of that softness. It’s just the nature of temporal image treatment. Sometimes that works best; sometimes it would be better to have a sharper image without the temporal component. Ghosting is another factor.

Even the best implementation without machine learning, Insomniac's Temporal Injection, comes with a blur hit, though it's minimal compared to UE4 upscaling, and it looks great. Don’t get me wrong, because Ratchet 2016 definitely looks better with the temporal injection, but I did notice it wasn’t as sharp.

Checkerboard rendering can look sharper than the base resolution it’s starting from, or softer, depending on the quality of the implementation. If done properly it’s supposed to look sharper, but with edge artifacts. Dragon Quest 11's OG release is a bad example.

Days Gone on PS5 is the best example I’ve seen, though it also has TAA. I hear that Resident Evil 8 on PS5 has a really nice checkerboard 4k output, but I haven’t got round to playing it yet. Capcom in general did a nice job with reconstruction last gen.
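At its simplest, the checkerboard rendering being discussed shades only half the pixels each frame in an alternating checker pattern and carries the missing half over from the previous frame. A minimal sketch of that idea (the function name is illustrative, and real implementations add motion compensation and filtering on top):

```python
# Minimal sketch of checkerboard reconstruction: each frame shades half the
# pixels in a checker pattern; the other half comes from the previous frame.
def checkerboard_reconstruct(prev_full, current_half, frame_parity):
    out = [row[:] for row in prev_full]            # start from last frame
    for y, row in enumerate(current_half):
        for x, value in enumerate(row):
            if (x + y) % 2 == frame_parity:        # pixel shaded this frame
                out[y][x] = value
            # else: keep the pixel carried over from the previous frame
    return out

scene = [[1, 2],
         [3, 4]]
frame0 = checkerboard_reconstruct([[0, 0], [0, 0]], scene, 0)  # one half shaded
frame1 = checkerboard_reconstruct(frame0, scene, 1)            # other half shaded

print(frame0)  # only the checkered half has real values so far
print(frame1)  # two frames combined reproduce the full scene
```

This also shows where the edge artifacts come from: when something moves, the carried-over half describes the old position, and the two halves disagree along edges until the reconstruction catches up.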
 

daveonezero

Banned
Who knows, but you can buy a decent laptop these days and play all the modern games at that resolution, and do it well. I think it's sort of cool.

That is also why the Steam Deck is doing so well. It targets 720p and can do it well.
 
I've owned a 4K TV for over a year now because that's pretty much all they sell now. I've still yet to view anything in 4K on it and don't feel a need to. 1080p still looks as good as it ever did.

I can understand a game targeting 4K or 1080, but using anything between those as a performance target seems really strange to me...
How does 1080p content look on a 4k screen? I would hate to buy one and have content look like a blurry mess, as many of my games and movies are 1080p or lower resolution. I know on PC, playing a 90s-era 640x480, 800x600, or 1024x768 4:3 game, it looks like ass compared to 1080p, and blurry. Same with playing PS2 games on a 1080p set. Is that what 1080p content looks like on 4k?

I know I heard somewhere that 4k can display 1080p better as it's evenly divisible by it, whereas something like 720p-to-1080p was not. Dude could have been bullshitting, but I don't have a 4k set to test.
 

Jeeves

Member
How does 1080p content look on a 4k screen? I would hate to buy one and have content look like a blurry mess, as many of my games and movies are 1080p or lower resolution. I know on PC, playing a 90s-era 640x480, 800x600, or 1024x768 4:3 game, it looks like ass compared to 1080p, and blurry. Same with playing PS2 games on a 1080p set. Is that what 1080p content looks like on 4k?

I know I heard somewhere that 4k can display 1080p better as it's evenly divisible by it, whereas something like 720p-to-1080p was not. Dude could have been bullshitting, but I don't have a 4k set to test.
All I can tell you is that going from 1080p content on a 1080p TV to 1080p content on a 4K TV, I personally don't notice any difference. I know what you mean about sub-HD on an HDTV, and it's definitely not like that.
 

01011001

Banned
How does 1080p content look on a 4k screen? I would hate to buy one and have content look like a blurry mess, as many of my games and movies are 1080p or lower resolution. I know on PC, playing a 90s-era 640x480, 800x600, or 1024x768 4:3 game, it looks like ass compared to 1080p, and blurry. Same with playing PS2 games on a 1080p set. Is that what 1080p content looks like on 4k?

I know I heard somewhere that 4k can display 1080p better as it's evenly divisible by it, whereas something like 720p-to-1080p was not. Dude could have been bullshitting, but I don't have a 4k set to test.
All I can tell you is that going from 1080p content on a 1080p TV to 1080p content on a 4K TV, I personally don't notice any difference. I know what you mean about sub-HD on an HDTV, and it's definitely not like that.

that is due to 2 factors:
1: 4K is exactly a 2x scale on each axis of a 1080p image, which makes it easy to scale well to 4K (480p to 720p is an uneven 1.5x scale, and 480p to 1080p is an uneven 2.25x scale on each axis, resulting in double-width lines and half-width lines)
and 2: 4K has such a high pixel density that scaling artifacts are way less noticeable than on lower resolution displays; if there was a half-width line it wouldn't stand out nearly as much, which is why even 1440p scaled to 4K looks good even though it's an uneven scale

this is why the Analogue Pocket has such a ridiculously high res 1600x1440 screen btw. while it is a perfect 10x scale for Game Boy and Game Boy Color games, it is not an even scale for many other systems, including the GBA, which it natively supports out of the box. but you will not really notice the uneven scaling due to the fact that there are just so many pixels that you'd really have to look for it on a still image.
the super high pixel density screen makes sure that even unevenly scaled systems look sharp and presentable on it
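The even-vs-uneven scaling point above is easy to verify in a few lines. With nearest-neighbor scaling, each output column maps back to a source column, so counting how many output columns each source column receives shows the resulting line widths (the function name here is illustrative):

```python
from collections import Counter

# With nearest-neighbor scaling, output column x samples source column
# x * src_w // dst_w. Counting how often each source column is sampled
# gives the width of each scaled line.
def source_column_widths(src_w, dst_w):
    mapping = [x * src_w // dst_w for x in range(dst_w)]
    counts = Counter(mapping)
    return [counts[i] for i in range(src_w)]

print(source_column_widths(4, 8))   # 2x scale: every source column is 2 wide
print(source_column_widths(4, 6))   # 1.5x scale: alternating double/single width
```

The 2x case comes out perfectly uniform, while the 1.5x case alternates between double-width and single-width lines, which is exactly the unevenness described for 480p-to-720p scaling.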
 