
Graphical Fidelity I Expect This Gen

Edder1

Member
and most games so far do not perform like a 2080 at all on current consoles.

we have Control or Guardians of the Galaxy that, as I said, barely run better than on a 1070.

Control on PS5 is 1440p60 on low settings. that is between 1070 and 1080 levels of performance.

Guardians of the Galaxy is 1080p60 with a mix of medium and high. and that also is very close to how a 1070 runs the game.

Dying Light 2, 1080p medium, very close to PS5 performance on an overclocked 1070 or a 1070ti on stock clocks.
not quite PS5 levels of performance but not far off either

there are games that favour the RDNA2 GPUs in the consoles but some fall short and are not as impressive.
and like I said, as time goes on and games get less and less optimised for older cards this gap will widen, but right now it's not far off
You're making a bunch of false comparisons.

Taking games designed around last gen hardware and then saying you're not seeing a major improvement on new hardware is not a very good argument for new console hardware not being a big leap over last gen.

The 3 games you mentioned are also known to be very demanding and probably not the best optimised. For every one of those games there are 10 more cross gen games that run at much higher resolution and double the framerate of last gen consoles. It seems you cherry picked the worst optimised games in order to strengthen your argument, which frankly comes across as disingenuous.

What we should really be focusing on is how games designed exclusively for current gen consoles can tap into the new architecture, and see how they perform. If you look at a current gen only game like Rift Apart, designed by a capable developer, you see a generational jump in visuals with an option to play at 1800p at 60fps. So not only are you getting a generational jump in fidelity, you're also getting a 2X jump in resolution and framerate.

Something like a 1070 would absolutely die playing the above-mentioned current gen exclusives at those resolutions and framerates. Saying current gen hardware isn't that impressive or isn't a big jump is a ludicrous statement when we have clear examples in front of our eyes of current gen only titles that prove otherwise.
 
Last edited:
Why can't last gen games be used to measure the next gen consoles? Esp when those games get native patches it should be fair game. The console architecture is so similar to PC now. Also, those three games are not the only examples, I hate to break it to you. Sony can't even get Uncharted 4 to present a leap over the PS4 version with the remaster. So many PS5 upgrades can only squeeze out 1440p/60. They may not be designed around the SSD but they should be able to brute force their way to higher settings, but they rarely do.
 

Edder1

Member
Why can't last gen games be used to measure the next gen consoles? Esp when those games get native patches it should be fair game.
They can, but that will not give you a clear picture of the hardware leap, because consoles come out after multiple GPU generations have passed. There are many architecture changes by the time a new console generation arrives, and these new features can only be accessed by games that are designed to take advantage of them. But even going by last gen games, we see that 90% of cross gen titles show a 2X jump in resolution and framerate. There are some exceptions of course, but these are usually games with poorly optimised engines, or ones that haven't been optimised well enough for the new consoles.
 
Last edited:

Tqaulity

Member
and most games so far do not perform like a 2080 at all on current consoles.

we have Control or Guardians of the Galaxy that, as I said, barely run better than on a 1070.

Control on PS5 is 1440p60 on low settings. that is between 1070 and 1080 levels of performance.

Guardians of the Galaxy is 1080p60 with a mix of medium and high. and that also is very close to how a 1070 runs the game.

Dying Light 2, 1080p medium, very close to PS5 performance on an overclocked 1070 or a 1070ti on stock clocks.
not quite PS5 levels of performance but not far off either

there are games that favour the RDNA2 GPUs in the consoles but some fall short and are not as impressive.
and like I said, as time goes on and games get less and less optimised for older cards this gap will widen, but right now it's not far off
Most games? Not true. You pointed out some specific outliers, games that are documented as not really optimized for the next gen consoles and whose engines aren't a great match for them either. Control, Guardians, and Dying Light 2 are literally some of the worst-case examples of next gen performance. I'd hardly call that "most games".

There are plenty of examples of games that run much better on PS5/XSX than a 2070, never mind a 1070. Call of Duty, Far Cry 6, Battlefield 2042, Deathloop, and many others point to raster performance on console that is above a 2070.

The best way to compare console vs PC perf is to look at a game that released on console and then was ported to PC. What level PC did it take to match the console performance?

One example: Death Stranding Technical Analysis

No, I'm not saying that every game will run optimally and approach 3070 level on console. But this is a better example of a game that started out optimized for consoles and was ported to PC (then re-ported to next gen console in this case), and it is more indicative of the perf gains you can get on console when you don't just do a straight PC port.

7Ylg6Sc.png
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Why can't last gen games be used to measure the next gen consoles? Esp when those games get native patches it should be fair game. The console architecture is so similar to PC now. Also, those three games are not the only examples, I hate to break it to you. Sony can't even get Uncharted 4 to present a leap over the PS4 version with the remaster. So many PS5 upgrades can only squeeze out 1440p/60. They may not be designed around the SSD but they should be able to brute force their way to higher settings, but they rarely do.
ND is just being lazy with the Uncharted port. KojiPro has shown just how powerful the PS5 GPU can be. If anything, the CPU might be holding it back, as seen in NXGamer's test with the Ryzen 2700, which is roughly equivalent to the PS5 CPU.

But yes, last gen games can be used to measure next gen consoles, especially when we have last gen games running ray tracing on next gen consoles. Spider-Man: Miles Morales is a great example. 1080p 30fps on a 1.8 TFLOPs GPU. Native 4K 30fps with 4K checkerboard ray-traced reflections, better character models, more world detail, better hair, way better NPC and traffic density, and even better lighting on a 10 TFLOPs GPU.
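To sanity-check that jump, here's a rough back-of-the-envelope (my own illustrative numbers, peak specs only; it ignores GCN vs RDNA2 efficiency gains and the dedicated RT hardware, which comes on top):

```python
# PS4 (1080p30) vs PS5 (native 4K30): where the extra budget comes from.
ps4_tflops, ps5_tflops = 1.84, 10.28   # peak FP32 compute
ps4_pixels = 1920 * 1080               # 1080p
ps5_pixels = 3840 * 2160               # native 4K

flops_ratio = ps5_tflops / ps4_tflops  # ~5.6x raw compute
pixel_ratio = ps5_pixels / ps4_pixels  # 4.0x pixels at the same 30 fps
headroom = flops_ratio / pixel_ratio   # ~1.4x left over per pixel

print(f"{flops_ratio:.1f}x compute, {pixel_ratio:.1f}x pixels, "
      f"{headroom:.1f}x per-pixel headroom for better models, hair, lighting")
```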

ND just phoned it in with the Uncharted remaster. I expect their next game to be much better.
 

SlimySnake

Flashless at the Golden Globes
We are there lighting-, geometry- and hero-character-wise.
We are just not there in simulation, VFX, and scaling character rendering from just one hero character on screen to everyone.

(this is two-year-old Lumen footage, and Lumen looks way better now)
quentin-marmier-pov-high-render11-0001-1.jpg
This screenshot shows just how big the gulf between current and last gen games really is.

GLzEc0u.jpg


FzJFLUu.jpg
MUlAabI.jpg


Horizon has to blur out and hide so many things with fog because current graphics pipelines simply can't render both near and far objects with the same level of detail.
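For context, the fog trick is cheap by design. A minimal sketch of classic exponential distance fog (my own illustration, not Guerrilla's actual code):

```python
import math

def apply_distance_fog(surface_rgb, fog_rgb, distance, density=0.02):
    """Blend a shaded surface colour toward the fog colour with distance."""
    fog_factor = math.exp(-density * distance)   # 1.0 up close, -> 0.0 far away
    return tuple(s * fog_factor + f * (1.0 - fog_factor)
                 for s, f in zip(surface_rgb, fog_rgb))

# A rock 10 units away keeps ~82% of its own colour; at 200 units it is ~98%
# fog colour, so the renderer never has to resolve its actual detail.
print(apply_distance_fog((0.4, 0.3, 0.2), (0.7, 0.75, 0.8), 10))
print(apply_distance_fog((0.4, 0.3, 0.2), (0.7, 0.75, 0.8), 200))
```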

JbNT9f7iDURRXoM3Dy38mY.png


SE7CzsQ.gif


I am so done with this cross gen bs.
 
quentin-marmier-pov-high-render11-0001-1.jpg
This screenshot shows just how big the gulf between current and last gen games really is.

GLzEc0u.jpg


FzJFLUu.jpg
MUlAabI.jpg


Horizon has to blur out and hide so many things with fog because current graphics pipelines simply can't render both near and far objects with the same level of detail.

JbNT9f7iDURRXoM3Dy38mY.png


SE7CzsQ.gif


I am so done with this cross gen bs.
UE5 looks waaaaaaay better than Forbidden West, but I kinda dig the art style of Forbidden West. It's not realistic, but it sits somewhere between cartoonish and realistic. It certainly has a style to it.

However, UE5 takes the friggin cake when it comes to geometry and lighting. Holy cow.
 

Edder1

Member
I am so done with this cross gen bs.
Same here. I fully expect Guerrilla to outdo what we saw in the UE5 demo with their next project. People forget how insane a jump Killzone Shadow Fall was over Killzone 3 visually, and it was a launch game at that. Guerrilla is easily one of the top dogs when it comes to pushing hardware to the max; it's a pity they had to be tied down by cross gen.
 
Last edited:

ChiefDada

Gold Member
Yea, the lack of bandwidth on these GPUs is keeping them from showing some really awesome stuff.

But they have already shown awesome stuff with the UE5 demos. Prior to the first PS5 UE5 demo, what gaming PC rendered assets with better geometry and textures in real time?

Also, unless you are privy to the GPU and I/O hardware customizations made to improve bandwidth efficiency, how can you be sure what these consoles are capable of? Honestly, I'm fully convinced that if you were shown the original UE5 demo without context, with no one telling you it was running on a console, you would swear to the ends of the earth that the consoles wouldn't be able to produce this due to poor bandwidth or some other perceived hardware shortcomings.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Here is that South Korean game. The level of detail far into the distance, the visibility and the lighting are top notch compared to HFW.

Er2hxjC.gif


E4HaJt9.gif


FVFb0t7.gif


I don't think this is a fair/valid comparison.

To me it looks like the fog in Horizon is clearly artistically driven, not a technical limitation.

Cross-platform games had near limitless draw distance last gen as well; most of the distant objects are rendered at potato quality, but they're visible.

emk3Sw2.jpeg



I don't think there's any reason Horizon couldn't have done it, even on PS4, outside of artistic interpretation.
 

VFXVeteran

Banned
But they have already shown awesome stuff with the UE5 demos. Prior to the first PS5 UE5 demo, what gaming PC rendered assets with better geometry and textures in real time?
The UE5 demo doesn't have geometry perfected. The ground tessellation looks better than any game, but it's a far cry from true CG. I wouldn't stop at UE5 being the best realtime graphics can do... maybe for the GPUs we have today, but there is always room for improvement. Textures look good for static objects, but Crysis Remastered has sharper textures than UE5 upon close inspection. Texture work is still lacking on dynamic objects, and there aren't enough layers of textures. As a reference, I am used to 5 or so texture layers on a single object in film.

Also, unless you are privy to the GPU and I/O hardware customizations made to improve bandwidth efficiency, how can you be sure what these consoles are capable of? Honestly, I'm fully convinced that if you were shown the original UE5 demo without context, with no one telling you it was running on a console, you would swear to the ends of the earth that the consoles wouldn't be able to produce this due to poor bandwidth or some other perceived hardware shortcomings.
I'm an outlier here. I can tell because I'm used to making stuff way better than what UE5 can do. Every film industry person who has worked on CG movies can tell the difference, that it's a game and not CG. We know what to look for. I actually worked on the Matrix movies with the CTO of Epic back in the day, so it didn't surprise me that he'd want to do a demo to see how close GPUs can get to what we worked on.
 
Personally, I'm hoping for better overall physics and effects, instead of just "shiny graphics".

Stuff like terrain deformation, destruction, etc. for shooters.

More realistic damage modeling, vehicle handling, weather effects, etc. for racing games.

More realistic AI, better lighting, more/better particles, etc., etc., etc.


I'm generally more impressed by details like that than I am by super large amounts of polygons that look really shiny and pretty.
 

Hunnybun

Member
Personally, I'm hoping for better overall physics and effects, instead of just "shiny graphics".

Stuff like terrain deformation, destruction, etc. for shooters.

More realistic damage modeling, vehicle handling, weather effects, etc. for racing games.

More realistic AI, better lighting, more/better particles, etc., etc., etc.


I'm generally more impressed by details like that than I am by super large amounts of polygons that look really shiny and pretty.

Generally I think a lot of things that are heavy on the CPU really benefit how a game looks. It's not strictly graphics but so what, it still looks great and that's what matters.

60fps LOOKS way better than 30. It really brings the graphics to life by making them so much clearer in motion.

Physics simulations like objects scattering and destruction just look really cool. They do way more to bring a scene to life than more polygons or better IQ.

Weather simulation is the same. The single most impressed I was by any game in the whole of last gen was Uncharted 4, when Drake is marooned on the beach in the storm. It looked fucking amazing.

Great animation is also hugely impressive visually. Just look at TLOU2.

These are all things that are way more possible this generation but have barely been touched on yet by developers.

This generation will end up being about way more than just better visual effects and high resolutions.
 

SlimySnake

Flashless at the Golden Globes
I don't think this is a fair/valid comparison.

To me it looks like the fog in Horizon is clearly artistically driven, not a technical limitation.

Cross-platform games had near limitless draw distance last gen as well; most of the distant objects are rendered at potato quality, but they're visible.

emk3Sw2.jpeg



I don't think there's any reason Horizon couldn't have done it, even on PS4, outside of artistic interpretation.
Horizon focuses on rendering the highest level of detail up close, whereas AC is far more even. AC just doesn't look as good as Horizon up close for this reason: GG spends their entire rendering budget making the game look great up close.

With Nanite, that won't be an issue anymore.
 

Omegaking

Banned
quentin-marmier-pov-high-render11-0001-1.jpg
This screenshot shows just how big the gulf between current and last gen games really is.

GLzEc0u.jpg


FzJFLUu.jpg
MUlAabI.jpg


Horizon has to blur out and hide so many things with fog because current graphics pipelines simply can't render both near and far objects with the same level of detail.

JbNT9f7iDURRXoM3Dy38mY.png


SE7CzsQ.gif


I am so done with this cross gen bs.
TC52kwq.jpg

I'm done with people posting sh*tty pics of native 4K games. What are you trying to prove?
 

SlimySnake

Flashless at the Golden Globes
All I'm saying is, if you are making comparisons, make it fair; you are comparing a tech demo with a game. You seem to think that both are playable.
I am comparing current gen tech to next gen tech. That's the entire point. All screenshots and gifs in my post are taken from 4K YouTube videos, for both Horizon and the UE5 demo, which is playable on PC. And yet you can see a generational difference in the UE5 YouTube captures.

The entire point of my post is that DISTANT objects and draw distance do not need to be blurred out or hidden in fog. You posting close-up shots of Aloy and high res mountains serves no purpose in this discussion. No one is saying Horizon looks bad. We are talking about what Nanite brings to next gen, which is basically removing LOD management and letting the engine handle everything without having to downgrade objects in the distance like GG has to do for HFW.
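To make that concrete, here's a toy sketch of the traditional distance-based LOD swap that Nanite is meant to replace (hypothetical thresholds and triangle counts, not GG's actual values):

```python
# Traditional discrete LOD: artists author a few fixed meshes per asset and
# the engine swaps them by camera distance. Every threshold is a spot where
# detail pops or has to be hidden behind fog and blur.
LOD_TABLE = [
    (25.0,  "LOD0: full-detail mesh, ~100k triangles"),
    (75.0,  "LOD1: reduced mesh, ~20k triangles"),
    (200.0, "LOD2: low-poly mesh, ~2k triangles"),
]
FALLBACK = "LOD3: billboard/imposter, or culled entirely"

def select_lod(distance_to_camera):
    for max_distance, mesh in LOD_TABLE:
        if distance_to_camera <= max_distance:
            return mesh
    return FALLBACK

for d in (10, 60, 150, 500):
    print(f"{d:>4} units -> {select_lod(d)}")

# Nanite instead streams clusters of triangles and picks density per cluster
# from screen-space error, so there are no hand-authored cutoffs to hide.
```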
 

Neo_game

Member
I am comparing current gen tech to next gen tech. That's the entire point. All screenshots and gifs in my post are taken from 4K YouTube videos, for both Horizon and the UE5 demo, which is playable on PC. And yet you can see a generational difference in the UE5 YouTube captures.

The entire point of my post is that DISTANT objects and draw distance do not need to be blurred out or hidden in fog. You posting close-up shots of Aloy and high res mountains serves no purpose in this discussion. No one is saying Horizon looks bad. We are talking about what Nanite brings to next gen, which is basically removing LOD management and letting the engine handle everything without having to downgrade objects in the distance like GG has to do for HFW.

I am not sure that is the case, as there is always compromise and UE5 is not going to change that. Let's see how Stalker 2 runs on consoles, but the UE5 tech demo was running at around 1440p 30fps, whereas most games not running RT are 4K 30fps. So most of the rendering budget is being used to run at 4K, and these consoles are bandwidth limited as well. We need to wait for a fidelity mode at 1440p 30fps instead.
 

Omegaking

Banned
I am comparing current gen tech to next gen tech. That's the entire point. All screenshots and gifs in my post are taken from 4K YouTube videos, for both Horizon and the UE5 demo, which is playable on PC. And yet you can see a generational difference in the UE5 YouTube captures.

The entire point of my post is that DISTANT objects and draw distance do not need to be blurred out or hidden in fog. You posting close-up shots of Aloy and high res mountains serves no purpose in this discussion. No one is saying Horizon looks bad. We are talking about what Nanite brings to next gen, which is basically removing LOD management and letting the engine handle everything without having to downgrade objects in the distance like GG has to do for HFW.

We don't play games staring at distant objects. Both the UE5 tech demo AND the Matrix demo are sub 4K running at 24 fps. Outside the opening sequence, the Matrix demo is nothing spectacular. The first thought I had of the Matrix demo was how muddy the textures are from afar. I think Horizon is a better presentation and looks better than both, even as a cross-gen game.

dHyRwHQ.jpg
 
Last edited:

Edder1

Member
all I'm saying is if you are making comparisons make it fair, and you are comparing a tech demo with a game.
You are literally handpicking screenshots with the time of day and lighting where Horizon looks the most impressive; hardly a fair comparison on your part.

While Horizon does look impressive for a cross gen game, its geometry and level of detail are nowhere near the UE5 demo. Even that Korean Pokémon clone DokeV looks way more impressive than Forbidden West, because it's not held back by last gen. The lighting in Horizon is now really showing its age; it makes a lot of the scenery look flat in places. I think we'll all see how technically outdated these cross gen games are once next gen (current gen) games become more common.
 
Last edited:

UnNamed

Banned
Horizon FW is good... but the way light interacts with materials is not always optimal. Some materials are not very good, and Aloy sometimes seems detached from the environment, like she's rendered one way and the environment another. Also, the quality of the light and materials tends to go down for objects distant from the camera.
It's not a flaw; HFW is designed that way since it's a crossgen game, and that's why it can run 4K/60. No secret sauce there.

But UE5 is not unknown alien tech, and Sony and Microsoft will implement the same features in their games in the future. Expect a Horizon with graphics comparable to the Matrix demo.
 

amigastar

Member
I honestly don't know what to expect from graphics this gen. I simply look forward to games that will come out.
I look forward to Arma 4 and Starfield and GTA 6 and so on.
 
Horizon FW is good... but the way light interacts with materials is not always optimal. Some materials are not very good, and Aloy sometimes seems detached from the environment, like she's rendered one way and the environment another. Also, the quality of the light and materials tends to go down for objects distant from the camera.
It's not a flaw; HFW is designed that way since it's a crossgen game, and that's why it can run 4K/60. No secret sauce there.

But UE5 is not unknown alien tech, and Sony and Microsoft will implement the same features in their games in the future. Expect a Horizon with graphics comparable to the Matrix demo.
It's called hero lighting…
 

saintjules

Member
I honestly don't know what to expect from graphics this gen. I simply look forward to games that will come out.
I look forward to Arma 4 and Starfield and GTA 6 and so on.

Honestly, I think it will be a generation about options: framerate vs. fidelity modes, faster loading times, and doing things that couldn't be done last gen due to hardware limitations. The ability to add more density to the game. More people on screen, etc.

It will be cool to see where we end up 5-6 years from now.
 

ChiefDada

Gold Member
The UE5 demo doesn't have geometry perfected. The ground tessellation looks better than any game, but it's a far cry from true CG. I wouldn't stop at UE5 being the best realtime graphics can do... maybe for the GPUs we have today, but there is always room for improvement. Textures look good for static objects, but Crysis Remastered has sharper textures than UE5 upon close inspection. Texture work is still lacking on dynamic objects, and there aren't enough layers of textures. As a reference, I am used to 5 or so texture layers on a single object in film.

I agree with everything you're saying except for bolded. I tried to find these better textures on Crysis Remastered PC with no success. Can you provide examples?

Also, assuming Naughty Dog ships TLOU Remake this year, what are your general expectations? Do you think character model and environmental rendering will showcase a generational leap? If not, what PC game that exists today will best a future TLOU remake?

But UE5 is not unknown alien tech, and Sony and Microsoft will implement the same features in their games in the future. Expect a Horizon with graphics comparable to the Matrix demo.

This. People assume UE5 is the epitome of next gen engines simply because Epic was the first to introduce their new engine to the public. I expect the engines of Sony 1st party studios such as Insomniac, Guerrilla, and Naughty Dog to eclipse UE5 as it relates to PS5 games. Why? Because they don't have to accommodate PC and other console architectures. If Sony went through the trouble of making an I/O focused machine, why wouldn't their engines be largely I/O dependent?
 

Edder1

Member
I honestly don't know what to expect from graphics this gen.
Yeah, the lack of a current gen showcase has made many of us question what current gen games will really look like. By this time last gen we already knew very well what that gen was capable of, because there were many examples on hand. So far the only example of a true next gen leap we've seen is Rift Apart, but it's hard to judge this gen based on that game because it uses stylised graphics. Regardless, I think Rift Apart showed a proper generational leap over PS4 Ratchet. In a few months at E3 we should finally get a clear idea of what current gen is all about.
 
Last edited:

Hunnybun

Member
Yeah, the lack of a current gen showcase has made many of us question what current gen games will really look like. By this time last gen we already knew very well what that gen was capable of, because there were many examples on hand. So far the only example of a true next gen leap we've seen is Rift Apart, but it's hard to judge this gen based on that game because it uses stylised graphics. Regardless, I think Rift Apart showed a generational leap over PS4 Ratchet. In a few months at E3 we should finally get a clear idea of what current gen is all about.

I agree with the general point, but there are a few other data points if we're being pedantic.

I thought the new Plague Tale game looked brilliant, and roughly equivalent to Ratchet, albeit with a realistic style.

Of course there's Hellblade 2, which is just stunning.

There are the small snippets from Dead Space and Starfield, which both look fantastic, if of course we can trust that footage as representative.

There's that Arc Raiders game that looks very good too.

There's the Suicide Squad trailer, although that's pretty cartoony too.


So I think there's a fair bit of footage out there to build up a rough idea of what we can expect.
 

SlimySnake

Flashless at the Golden Globes
Yeah, the lack of a current gen showcase has made many of us question what current gen games will really look like. By this time last gen we already knew very well what that gen was capable of, because there were many examples on hand. So far the only example of a true next gen leap we've seen is Rift Apart, but it's hard to judge this gen based on that game because it uses stylised graphics. Regardless, I think Rift Apart showed a proper generational leap over PS4 Ratchet. In a few months at E3 we should finally get a clear idea of what current gen is all about.
I think MS Flight Simulator was the first true next gen showpiece. It looks photorealistic 99% of the time, and I am 100% certain racing games will look that good next gen.

Ratchet is wasting a lot of the GPU on rendering native 4K pixels at 40 fps. Had they targeted 1080p 30 fps, they would've had 4x more GPU power to increase visual fidelity. While Ratchet might be the best looking traditional game on the market right now, it will look last gen by the time this gen is done.
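If anything the 4x figure slightly undersells it once the framerate drop is counted too; a quick illustrative calculation (assuming GPU cost scales linearly with pixels and frames, which real games only approximate):

```python
# Per-pixel budget multiplier when dropping native 4K @ 40 fps to 1080p @ 30 fps.
pixels_4k = 3840 * 2160
pixels_1080p = 1920 * 1080

resolution_gain = pixels_4k / pixels_1080p   # 4.0x fewer pixels to shade
framerate_gain = 40 / 30                     # ~1.33x more time per frame
print(f"~{resolution_gain * framerate_gain:.1f}x per-pixel budget")  # ~5.3x
```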
 
I think MS Flight Simulator was the first true next gen showpiece. It looks photorealistic 99% of the time, and I am 100% certain racing games will look that good next gen.

Ratchet is wasting a lot of the GPU on rendering native 4K pixels at 40 fps. Had they targeted 1080p 30 fps, they would've had 4x more GPU power to increase visual fidelity. While Ratchet might be the best looking traditional game on the market right now, it will look last gen by the time this gen is done.
1080p is way too low IMO… let's go with 1440p upscaled to 4K and 30 or 60 fps… like TLOU II.
 

Sosokrates

Report me if I continue to console war
All I'm saying is, if you are making comparisons, make it fair; you are comparing a tech demo with a game. You seem to think that both are playable
vlD503e.jpg
tSZcl8o.jpg
xX9MQdD.jpg
txn1Ovz.jpg
vhB0syS.jpg
c0dCTjB.jpg
WAQ1rAw.jpg
Really wish I could rent this game just to admire the texture and geometric detail.
 

VFXVeteran

Banned
I agree with everything you're saying except for bolded. I tried to find these better textures on Crysis Remastered PC with no success. Can you provide examples?
If you have the game, run it with max textures (8K) and take a look at the rocks up close. Take note of the detail, then go to the UE5 demo for PC (the one before the Matrix demo) and move up to a rock as close as you can. Then take a screenshot. You will surely see that there is more texture detail on the Crysis Remastered rock than the UE5 demo rock. Remember, we are talking about textures and not geometry detail.

Also, assuming Naughty Dog ships TLOU Remake this year, what are your general expectations? Do you think character model and environmental rendering will showcase a generational leap? If not, what PC game that exists today will best a future TLOU remake?
No, there won't be a generational leap with the TLOU Remake. I seriously doubt they will be running 8K textures. A "generational leap" is definitely going to be extremely hard to achieve this generation unless we are talking about photogrammetry with a pure RT lighting pipeline. I look for every single light source to be a shadow caster, and RT GI with aggressive RT AO on clothing and every object (even if it's not moving). PBR hair shaders need to move beyond the typical low-sample shading with only 1-2 specular lobes and low res shadow maps, and use actual curves instead of clumps.
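To illustrate the "every single light source is a shadow caster" point, here's a toy ray-traced shading loop (my own sketch, not any shipping engine's code): each shading point fires one visibility ray per light, which is exactly why this gets expensive as light counts grow.

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def shadow_ray_blocked(origin, direction, sphere_center, radius, max_t):
    """True if a shadow ray hits the sphere before reaching the light."""
    oc = sub(origin, sphere_center)
    b = dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return False
    t = -b - math.sqrt(disc)
    return 0.0 < t < max_t

def shade(point, normal, lights, occluders):
    """Diffuse shading where EVERY light casts a shadow (one ray per light)."""
    total = 0.0
    for light_pos, power in lights:
        to_light = sub(light_pos, point)
        dist = math.sqrt(dot(to_light, to_light))
        direction = tuple(x / dist for x in to_light)
        if any(shadow_ray_blocked(point, direction, c, r, dist)
               for c, r in occluders):
            continue                       # light is occluded: hard shadow
        total += power * max(0.0, dot(normal, direction)) / (dist * dist)
    return total

lights = [((0, 5, 0), 50.0), ((4, 1, 0), 10.0)]   # key light + fill light
occluders = [((0, 2.5, 0), 1.0)]                  # sphere blocking the key light
print(shade((0, 0, 0), (0, 1, 0), lights, occluders))  # only the fill survives
```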

This. People assume UE5 is the epitome of next gen engines simply because Epic was the first to introduce their new engine to the public. I expect Sony 1st party studios such as Insomniac, Guerilla, and Naughty Dog engines to eclipse UE5 as it relates to PS5 games. Why? Because they don't have to accommodate for PC and other console architecture.
That's false. Every single studio you mentioned is repurposing their graphics engine for PC portability. I know ND is doing it, and we all know that GG and SSM have already done it. The PS5 is no longer the hardware to showcase all of what their graphics engines can do.
 
Last edited:
If you have the game, run it with max textures (8K) and take a look at the rocks up close. Take note of the detail, then go to the UE5 demo for PC (the one before the Matrix demo) and move up to a rock as close as you can. Then take a screenshot. You will surely see that there is more texture detail on the Crysis Remastered rock than the UE5 demo rock. Remember, we are talking about textures and not geometry detail.


No, there won't be a generational leap with the TLOU Remake. I seriously doubt they will be running 8K textures. A generational leap is definitely going to be extremely hard unless we are talking about photogrammetry with a pure RT lighting pipeline. I look for every single light source to be a shadow caster, and RT GI with aggressive RT AO on clothing and every object (even if it's not moving). PBR hair shaders need to move beyond the typical low-sample shading with only 1-2 specular lobes and low res shadow maps, and use actual curves instead of clumps.


That's false. Every single studio you mentioned is repurposing their graphics engine for PC portability. I know ND is doing it, and we all know that GG and SSM have already done it.
While I agree with many of your opinions I find it kinda funny that when talking about a "generational leap", you have this hard line that it has to have 8k textures, photogrammetry mixed with a suite of the most intensive RT possible. The bar you're setting is for the most powerful PC possible when Last of Us Remake will be coming to console. Shouldn't the bar for a generational leap for a console game be much lower? It's like you jumped ahead a generation.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
If you have the game, run it with max textures (8K) and take a look at the rocks up close. Take note of the detail, then go to the UE5 demo for PC (the one before the Matrix demo) and move up to a rock as close as you can. Then take a screenshot. You will surely see that there is more texture detail on the Crysis Remastered rock than the UE5 demo rock. Remember, we are talking about textures and not geometry detail.


No, there won't be a generational leap with the TLOU Remake. I seriously doubt they will be running 8K textures. A generational leap is definitely going to be extremely hard unless we are talking about photogrammetry with a pure RT lighting pipeline. I look for every single light source to be a shadow caster, and RT GI with aggressive RT AO on clothing and every object (even if it's not moving). PBR hair shaders need to move beyond the typical low-sample shading with only 1-2 specular lobes and low res shadow maps, and use actual curves instead of clumps.


That's false. Every single studio you mentioned is repurposing their graphics engine for PC portability. I know ND is doing it, and we all know that GG and SSM have already done it. The PS5 is no longer the hardware to showcase all of what their graphics engines can do.
Remember you said Horizon Forbidden West on PS5 wouldn't look better than Horizon Zero Dawn on PC?

You can be wrong about these things
 
I think MS Flight Simulator was the first true next gen showpiece. It looks photorealistic 99% of the time, and I am 100% certain racing games will look that good next gen.

Ratchet is wasting a lot of the GPU on rendering native 4K pixels at 40 fps. Had they targeted 1080p 30 fps, they would've had 4x more GPU power to increase visual fidelity. While Ratchet might be the best looking traditional game on the market right now, it will look last gen by the time this gen is done.
Hope you're talking about 1080p upscaled to 4K using a technique at minimum as good as Returnal's, because otherwise who the fuck wants to play at 1080p/30? I almost don't care how good a game's graphics are if the image quality and performance are that low.
 

VFXVeteran

Banned
While I agree with many of your opinions I find it kinda funny that when talking about a "generational leap", you have this hard line that it has to have 8k textures, photogrammetry mixed with a suite of the most intensive RT possible. The bar you're setting is for the most powerful PC possible when Last of Us Remake will be coming to console. Shouldn't the bar for a generational leap for a console game be much lower? It's like you jumped ahead a generation.
It will be very difficult to make last gen game features with just higher samples look a generation ahead unless EVERYTHING is higher. You are implying that people should recognize the enhancements a PC GPU makes to a typical 3rd party AAA game over its console versions as a generation ahead, since it simply does "more". I would agree with you on that, but most gamers on these boards don't want to accept that as a truth.
 

VFXVeteran

Banned
Remember you said Horizon Forbidden West on PS5 wouldn't look better than Horizon Zero Dawn on PC?

You can be wrong about these things
I never said it wouldn't look better, I said it *could* look better. If you accept that the PS4 version of HFW looks subpar (ignoring artistic vision and focusing only on the rendering capabilities) compared to the PC version of HZD (which I *hope* you do), then the enhancements they made for the PS5 were extracted from the PC features they implemented for HZD. Aside from different level design, a change in biomes, and a new water shader, there really isn't anything *rendering* related that dwarfs the PC features in HZD. The PC version features of HZD were propagated to the enhancements they put into HFW for PS5, and therefore I was pretty much spot on saying they would *technically* look at least equivalent. I tried to hint to you guys about the new engine features without going on record about it like I have about other things. Even NxGamer took note of the enhancements made to their renderer when the PC port came out. It should have been obvious that the team would use the newfound power of the PS5 to add in the enhancements from the PC feature set. The main advantage the PC version has right now is that it runs > 60FPS while the PS5 runs at 30FPS.

In short, no one can make the claim that HFW has better textures, higher resolution, better shadows, farther LOD models out to the horizon, better texture filtering, more accurate ambient occlusion, etc., to make it "look" better than the PC version of HZD. That was my original point, even if I didn't convey it properly (which is my fault).
 
Last edited:

ChiefDada

Gold Member
Take note of the detail and then go to the UE5 demo for the PC (the one before the Matrix demo) and try to move up to a rock as close as you can.

Well that's why I specifically referred to the PS5 demo. I knew the Valley of the Ancient PC demo had lower texture quality than the original PS5 demo as soon as I saw it running. Epic later mentioned how 8K textures were feasible on PS5 due to the proprietary compression technique. So can we at least agree that Crysis isn't using dense geometry at the levels displayed in the original PS5 demo while also rendering 8K textures? Thus the PS5 was able to render dense geometry with 8K textures due to compression technology not currently available on PC.
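For a sense of scale, rough size math for a single 8K texture (illustrative; Epic hasn't published the exact formats, and the ~2:1 lossless ratio below is a commonly quoted ballpark, not a measured figure for this demo):

```python
texels = 8192 * 8192                  # one 8K x 8K texture

raw_rgba8 = texels * 4                # uncompressed 32-bit RGBA: 256 MiB
bc7_vram = texels * 1                 # BC7 GPU block compression: 1 byte/texel
on_disk_estimate = bc7_vram / 2       # assumed ~2:1 Kraken/Oodle on top

for label, size in [("raw RGBA8", raw_rgba8),
                    ("BC7 in VRAM", bc7_vram),
                    ("estimated on disk", on_disk_estimate)]:
    print(f"{label}: {size / 2**20:.0f} MiB")
```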

Generational leap is definitely going to be extremely hard unless we are talking about Photogrammetry with a pure RT lighting pipeline. I look for every single light source to be shadow casters, RT GI with aggressive RT AO on clothing and every object (even if it's not moving). PBR hair shaders need to enhance the typical low sample shading with only 1-2 specular lobes, low res shadow maps and using actual curves instead of clumps.

This definition seems very specific yet not all encompassing, but of course you're entitled to your opinion of what constitutes a generational leap.

That's false. Every single studio you mentioned is repurposing their graphics engine for PC portability. I know ND is doing it and we all know that GG and SSM has already done it.

Console games with I/O heavy game code can be ported by increasing memory requirements and/or using more CPU/GPU resources for decompression, etc. That doesn't prevent first party studios from designing their engines around consoles such as the PS5.
 

VFXVeteran

Banned
Well that's why I specifically referred to the PS5 demo. I knew the Valley of the Ancient PC demo had lower texture quality than the original PS5 demo as soon as I saw it running.
I don't know if the PS5 demo had 8K textures on the rocks. I remember Brian talking about the statues having 8K texture maps though.

Epic later mentioned how 8K textures were feasible on PS5 due to the proprietary compression technique. So can we at least agree that Crysis isn't using dense geometry at the levels displayed in the original PS5 demo while also rendering 8K textures? Thus the PS5 was able to render dense geometry with 8K textures due to compression technology not currently available on PC.
Absolutely. But if you do a 1:1 comparison of graphics features, the consoles will lose every time. That's IF we are talking about technology bullet points for a game. The consoles just don't have the bandwidth to render too much data or take too many samples. It is what it is.

Console games with I/O heavy game code can be ported by increasing memory requirements and/or using more CPU/GPU resources for decompression, etc. That doesn't prevent first party studios from designing their engines around the consoles such as Ps5.
The bottleneck on the consoles isn't the I/O, obviously. It's the GPU. Assume I'm able to get all the massive data to the GPU compressed. OK. That handles the geometry assets and the textures being loaded in. But now how do I render those assets when I need to run a lighting equation with BRDFs for materials? Making conditions worse, let's ray-trace and test against heavy asset bounding boxes to find that per-pixel triangle we hit. Then let's keep the framebuffer resolution at 4K throughout the entire rendering pipeline so all the buffers combine properly. But we can't, because we've run out of compute resources and have to start culling stuff out. Resolution is the first optimization, but that means we need a reconstruction technique, which will degrade quality... etc., etc. So many moving parts and not enough bandwidth. This is why you are seeing these demos running at 1080p with reconstruction all over the place. AMD failed to develop a proper DLSS-like hardware feature to help out with bandwidth. They literally missed the boat this generation.
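To put a number on the bandwidth point, a crude illustrative budget (peak figures, which are never sustained in practice):

```python
# How much memory traffic can each native-4K pixel afford per frame?
bandwidth_bytes_per_s = 448e9        # PS5 peak, shared with the CPU and I/O
fps = 30
pixels = 3840 * 2160

per_frame = bandwidth_bytes_per_s / fps
per_pixel = per_frame / pixels
print(f"~{per_pixel:.0f} bytes of traffic per pixel per frame")   # ~1800

# Every G-buffer write and read, texture sample, shadow lookup and RT node
# fetch for that pixel has to fit in that budget, which is why dropping to
# 1080p/1440p plus reconstruction buys back so much headroom.
```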
 
Last edited:

Edder1

Member
I think MS Flight Simulator was the first true next gen showpiece. It looks photorealistic 99% of the time, and I am 100% certain racing games will look that good next gen.

Ratchet is wasting a lot of the GPU on rendering native 4K pixels at 40 fps. Had they targeted 1080p 30 fps, they would've had 4x more GPU power to increase visual fidelity. While Ratchet might be the best looking traditional game on the market right now, it will look last gen by the time this gen is done.
I would suggest the following:

1. Quality mode: 1440p30 upscaled to 1800p/4K, possibly with RT.
2. Performance: 1080p60 upscaled to 1440p/1600p
3. An additional RT performance mode, at something like 720p/800p60 upscaled to 1080p/1200p.

The first two would be enough, the third would be a bonus.
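As a quick illustrative check on those suggestions (assuming GPU cost scales with pixels shaded per second): the 60fps mode actually pushes slightly more raw pixels than the quality mode, so the quality mode's extra effects budget comes from the halved framerate, while the RT mode halves the pixel rate to pay for ray tracing.

```python
modes = {
    "1. Quality 1440p30":       (2560, 1440, 30),
    "2. Performance 1080p60":   (1920, 1080, 60),
    "3. RT performance 720p60": (1280,  720, 60),
}
base = 2560 * 1440 * 30   # quality mode as the reference pixel rate

for name, (w, h, fps) in modes.items():
    print(f"{name}: {w * h * fps / base:.2f}x the quality mode's pixel rate")
```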
 
Last edited:

sendit

Member
The bottleneck on the consoles isn't the I/O, obviously. It's the GPU. Assume I'm able to get all the massive data to the GPU compressed. OK. That handles the geometry assets and the textures being loaded in. But now how do I render those assets when I need to run a lighting equation with BRDFs for materials? Making conditions worse, let's ray-trace and test against heavy asset bounding boxes to find that per-pixel triangle we hit. Then let's keep the framebuffer resolution at 4K throughout the entire rendering pipeline so all the buffers combine properly. But we can't, because we've run out of compute resources and have to start culling stuff out. Resolution is the first optimization, but that means we need a reconstruction technique, which will degrade quality... etc., etc. So many moving parts and not enough bandwidth. This is why you are seeing these demos running at 1080p with reconstruction all over the place. AMD failed to develop a proper DLSS-like hardware feature to help out with bandwidth. They literally missed the boat this generation.

Not sure what you're trying to get at? The GPU will always be bandwidth constrained as game requirements increase, regardless of the generation. The only way I can ever see this going away is the cloud, where GPU power is auto-scaled on demand depending on the application. Then again, even that would be limited by the capacity of data centers. However, I do agree: AMD fucked up by not having a similar DLSS solution.
 

VFXVeteran

Banned
Not sure what you're trying to get at? The GPU will always be bandwidth constrained as game requirements increase, regardless of the generation. The only way I can ever see this going away is the cloud, where GPU power is auto-scaled on demand depending on the application. Then again, even that would be limited by the capacity of data centers. However, I do agree: AMD fucked up by not having a similar DLSS solution.
Bingo!!
 