
Poor optimization or is the Series X/PS5 underpowered? Dying light 2 as reference

adamsapple

Gold Member
I can't believe we already have a 1080p 30fps mode on these consoles. Loading times of 25+ seconds aren't great either. Even the dev recommends using DX11 settings, since the game is heavier on DX12. The game is poorly optimized, and it's well known that there were issues during development. From the videos I've seen, the game isn't good-looking enough to be this demanding.

We shouldn't use one example as a generalization for the entire generation.
 

ArtHands

Thinks buying more servers can fix a bad patch
I can't believe we already have a 1080p 30fps mode on these consoles. Loading times of 25+ seconds aren't great either. Even the dev recommends using DX11 settings, since the game is heavier on DX12. The game is poorly optimized, and it's well known that there were issues during development. From the videos I've seen, the game isn't good-looking enough to be this demanding.

It's inevitable, because consoles have a hardware performance limit, unlike PCs.
 

VFXVeteran

Banned
Why would we ever not resort to reconstruction techniques in the future though?
Because I'm talking about hardware having so much more bandwidth for today's games that reconstruction won't be needed. We should all be pushing for more GPU power and the best image quality. Even though DLSS is an excellent reconstruction technique, it still has drawbacks. The other reconstruction techniques degrade image quality too much to be viable, IMO.
 

VFXVeteran

Banned
You can't be serious! Digital Foundry literally has a video up on their channel saying it primarily targeted last gen, with a little bit of polishing for next gen. The game has been in development for several years lol
I don't care about what they targeted. I'm telling you that RT uses DX12 and isn't the same as DX11. They are 2 different code paths.
 

VFXVeteran

Banned
But... DL2 does not use RT for lighting on consoles, just for shadows (which in some cases are still screen space shadows full of artifacts) and ambient occlusion. And still it has much lower resolution and frame rate than Doom Eternal. 1080p@30fps vs dynamic 1800p@60fps.

To be clear, are you saying DL2 has better tech than Doom Eternal? Just that it looks better? Both?
DL2 has better tech than Doom Eternal, IMO. id is still working on their complete realtime RT pipeline. They have been behind the curve for some time now. Their codebase is extremely optimized because they are more concerned with FPS than with graphical features.

Looking better or not is subjective.
 

VFXVeteran

Banned
Are ray traced shadows more expensive than reflections? I'm assuming they are, since so few games on PS5 have used them?
It depends. Reflections, the way games use them, are fairly cheap compared to RT shadows, unless both are casting only 1 ray per pixel and shadows are being cast from only 1 light source. Casting RT shadows the more appropriate way would be significantly more complex than RT reflections (even if the reflections are blurred) *if* reflection recursion is limited to a depth of 1.
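As a back-of-the-envelope sketch of the scaling described above (assuming, hypothetically, 1 shadow ray per pixel per shadow-casting light and a reflection recursion depth of 1; real engines vary a lot):

```python
# Rough per-pixel ray budget. These counts are illustrative assumptions,
# not measurements from any actual engine.

def shadow_rays_per_pixel(num_lights: int, rays_per_light: int = 1) -> int:
    """Shadow cost scales with the number of shadow-casting lights."""
    return num_lights * rays_per_light

def reflection_rays_per_pixel(bounce_depth: int = 1) -> int:
    """Depth-1 reflections trace one ray per pixel per bounce."""
    return bounce_depth

# One light source: shadows and reflections cost about the same per pixel.
print(shadow_rays_per_pixel(1))      # 1
print(reflection_rays_per_pixel())   # 1

# Eight shadow-casting lights: shadow rays alone are 8x the count.
print(shadow_rays_per_pixel(8))      # 8
```

The point of the sketch: with a single light, the two effects are comparable per pixel; shadows only get "significantly more complex" once you add lights (or multiple rays per light for soft shadows), which matches the "it depends" answer.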
 

Justin9mm

Member
Ever since Tom sold his site, Tom's Hardware has been riddled with nonsense like this.
Be it in gaming or cinema, 2K means 1080p.
ethomaz has already given a clear explanation of this.
It's OK to be ignorant of this. The vast amount of knowledge that exists means we are all ignorant of many things.
The issue is insisting on a mistake despite others with clear knowledge explaining the standard.
No one is denying the fact that 1080p is technically 2K.

The problem is you both refusing to accept that 1440p is commonly referred to as 2K throughout the gaming industry.

ethomaz was clearly aware what the person meant when they said DL2 is not 2K in performance mode, and with his reply he chose to imply that the person was not technically correct, which was uncalled for and not really relevant, given that all parties understood what was being said. That's what we call being a jackass.

Maybe try ResetEra to preach your 1080p-is-2K propaganda.
 

ethomaz

Banned
No one is denying the fact that 1080p is technically 2K.

The problem is you both refusing to accept that 1440p is commonly referred to as 2K throughout the gaming industry.

ethomaz was clearly aware what the person meant when they said DL2 is not 2K in performance mode, and with his reply he chose to imply that the person was not technically correct, which was uncalled for and not really relevant, given that all parties understood what was being said. That's what we call being a jackass.

Maybe try ResetEra to preach your 1080p-is-2K propaganda.
DL2 is already 2K in performance mode.
A made-up and weird excuse won't become true just because you believe it is "right" lol
 

assurdum

Member
UE5 is doing something new. I'm not trying to imply that it isn't. But UE5 is using RT for their lighting. The Nanite tech is amazing, but again, we are only talking about 1 company right now. We are talking about all the other companies in existence now.


Our disagreement is that just because 4A can do it, doesn't mean that other companies can do it too without a big hit to the GPU. 4A has always been a leader of graphics tech (likewise with Epic).
From what NXG reported, it seems their engine is very outdated and doesn't support DirectX 12 and async compute properly on consoles; that's why it's so heavy.
 

xrnzaaas

Member
Dying Light 2's development was definitely problematic; the only difference is they've managed to ship a more stable product than CDPR did with Cyberpunk. I definitely think they can and should improve the performance mode, either by raising the resolution or by offering better image quality at 1080p, with sharper textures and less blur.
 

winjer

Member
No one is denying the fact that 1080p is technically 2K.

The problem is you both refusing to accept that 1440p is commonly referred to as 2K throughout the gaming industry.

ethomaz was clearly aware what the person meant when they said DL2 is not 2K in performance mode, and with his reply he chose to imply that the person was not technically correct, which was uncalled for and not really relevant, given that all parties understood what was being said. That's what we call being a jackass.

Maybe try ResetEra to preach your 1080p-is-2K propaganda.

I frequent a few tech forums, which also have a lot of gamers, and no one there thinks 1440p is 2K.
Every gamer who knows his basic tech stuff knows that 1080p is 2K.
Just because some people here think 1440p is 2K does not make it the standard.
You have two options now: continue to spread the wrong information, or inform other people of the actual standard.
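For what it's worth, the naming convention being argued about comes from the DCI cinema standard, where "2K" is a container about 2048 pixels wide (2048x1080). A tiny sketch of that width-based labeling (the 1024 divisor and half-K rounding here are my own informal choices, not part of any standard):

```python
# Label a resolution by its horizontal pixel count, DCI-style:
# "2K" ~ 2048 wide, so 1920x1080 rounds to 2K and 2560x1440 to 2.5K.

def nearest_k_label(width_px: int) -> str:
    """Informal label: width in multiples of 1024, rounded to the
    nearest half-K."""
    k = round(width_px / 1024 * 2) / 2  # round to nearest 0.5
    return f"{k:g}K"

print(nearest_k_label(2048))  # DCI 2K container -> "2K"
print(nearest_k_label(1920))  # 1080p           -> "2K"
print(nearest_k_label(2560))  # 1440p           -> "2.5K"
print(nearest_k_label(3840))  # UHD             -> "4K"
```

Under this convention 1080p is the resolution that sits next to the 2K container, which is the standard winjer is describing; calling 1440p "2K" is the informal gaming usage the other side is defending.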
 

PeteBull

Member
Was Techland ever considered competent?
In terms of gameplay and making an interesting game, sure (although personally it's not my type of game); in terms of industry-leading visuals and perfect optimisation, not in the slightest.
Not all devs can be Naughty Dog, Playground Games, or Sucker Punch. Actually, those are the exceptions that prove the rule; most are way below that, especially if we factor in the multiplatform aspect of games.
 
Falcom released a last-gen game that couldn't do 60fps on the PS5.

Nobody judged the PS5 hardware on that because Falcom are a bunch of kusoge clowns.

(disclaimer I love Falcom games and buy them all)
 

vkbest

Member
Falcom released a last-gen game that couldn't do 60fps on the PS5.

Nobody judged the PS5 hardware on that because Falcom are a bunch of kusoge clowns.

(disclaimer I love Falcom games and buy them all)
Falcom's engine sucks; it has a huge performance hit on transparencies.
 

Dane

Member
The game appears to be bound by one piece of hardware, in this case the GPU; apparently it does run well on PS4.
 

PeteBull

Member
The game appears to be bound by one piece of hardware, in this case the GPU; apparently it does run well on PS4.
"runs well" in this particular case means:
stable 30fps with reduced settings(according to DF xbone and ps4 runs at pc lowest setttings) and on top nasty cut to resolution, 864p so 1536x864 pixels
More info in the vid ofc
Remember when back in the day we all laughed at first titanfall running in 792p on xbox one? That game's res is only a smidge above, and yup-it looks like pixelated mess right away on first glance on both base last gen mashines despite optimalisation(optimalisation meaning turning down to lowest possible settings avaiable on pc).
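Quick pixel-count check on the "only a smidge above" claim. The first Titanfall's 792p mode on Xbox One was widely reported as 1408x792; I'm taking that figure as an assumption here:

```python
# Compare total pixel counts of DL2's last-gen mode and Titanfall's
# Xbox One mode (1408x792 per contemporary reports).

dl2_lastgen = 1536 * 864   # DL2 on base PS4 / Xbox One
titanfall   = 1408 * 792   # Titanfall on Xbox One (assumed width)

print(dl2_lastgen)                         # 1327104 pixels
print(titanfall)                           # 1115136 pixels
print(round(dl2_lastgen / titanfall, 2))   # 1.19 -> ~19% more pixels
```

So roughly a 19% pixel-count difference, which does back up the "smidge above" characterization.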
 

Dane

Member
"runs well" in this particular case means:
stable 30fps with reduced settings(according to DF xbone and ps4 runs at pc lowest setttings) and on top nasty cut to resolution, 864p so 1536x864 pixels
More info in the vid ofc
Remember when back in the day we all laughed at first titanfall running in 792p on xbox one? That game's res is only a smidge above, and yup-it looks like pixelated mess right away on first glance on both base last gen mashines despite optimalisation(optimalisation meaning turning down to lowest possible settings avaiable on pc).

Considering that the PS5 and XSX have around 3x more power than the PS4 Pro and Xbox One X in the GPU department, and way more on the CPU side, I'd have expected it to run way better, at 1440p60.
 

PeteBull

Member
Considering that the PS5 and XSX have around 3x more power than the PS4 Pro and Xbox One X in the GPU department, and way more on the CPU side, I'd have expected it to run way better, at 1440p60.
Again, no idea why you guys keep coming up with this 1440p60 mode. Neither the XSX nor the PS5 has a 1440p60 mode, just a 1080p60 mode, and the XSS has a 1080p30 mode with no 60fps at all.

The PS4 Pro drops frames in its 1080p30 mode, so you can't take it as a baseline; you can take the One X as a baseline because it doesn't drop frames in its 1080p30 mode. Both current-gen machines have a 1080p60 mode, which clearly means at least 2x GPU/CPU performance vs last gen. In terms of settings, there's no ray tracing in this mode either, just like on the last-gen consoles, and obviously the other settings are far from maxed.
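The "at least 2x" arithmetic in the post above can be made explicit with pixel throughput (pixels pushed per second), keeping settings comparable as assumed there:

```python
# Pixel throughput: width * height * framerate.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

lastgen = pixels_per_second(1920, 1080, 30)  # One X 1080p30 baseline
currgen = pixels_per_second(1920, 1080, 60)  # PS5/XSX performance mode

print(currgen / lastgen)   # 2.0 -> the "at least 2x" claim

# And the 1440p60 mode people keep asking for, relative to 1080p60:
print(round(pixels_per_second(2560, 1440, 60) / currgen, 2))  # 1.78
```

So 1440p60 would demand roughly another 78% of pixel throughput on top of the current mode, which helps explain why devs shipped 1080p60 instead.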
 

VFXVeteran

Banned
From what nxg reported it seems their engine is very outdated and doesn't support direct X 12 and async computation properly on console that's why is so heavy.
You can't do RT without DX12. I'm not sure what they use on the PS5, which probably has its own DX12-like API for RT.

I'm saying they have 2 codebases. An engine isn't outdated if it can do the latest features. We also can't say it's completely unoptimized just because it doesn't perform like 4A Games' graphics engine. Basically, there is no way to tell what's going on unless we see the code, so I would rather not judge them as incompetent.
 

ethomaz

Banned

🍿🍿
 

elliot5

Member

seems like the VRR update came through for DL2? 80-110 FPS seems pretty good. Dunno if that could otherwise push to a stable 1440p 60fps, though; would need to check PC comparisons for the same GPU models.


actually, a 5700 XT hits over 60 FPS at high settings (on average?), so I would expect the XSX to be able to handle 1440p/60 pretty well, especially with VRR enabled.
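For context on why the 5700 XT is the usual comparison point: on paper specs (public FP32 TFLOPs figures), the Series X GPU has around 25% more raw compute, though the architectures differ (RDNA1 vs RDNA2), so this is only a ballpark sketch:

```python
# Paper-spec comparison. Figures are public launch FP32 TFLOPs numbers;
# real game performance depends on far more than raw compute.

specs_tflops = {
    "RX 5700 XT (RDNA1)": 9.75,
    "Xbox Series X (RDNA2)": 12.15,
}

ratio = specs_tflops["Xbox Series X (RDNA2)"] / specs_tflops["RX 5700 XT (RDNA1)"]
print(round(ratio, 2))  # 1.25 -> ~25% more raw compute on paper
```

Which is why "if a 5700 XT averages over 60 FPS at high settings, the XSX should manage 1440p/60" is at least a plausible inference, not a guarantee.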
 

OZ9000

Gold Member

seems like the VRR update came through for DL2? 80-110 FPS seems pretty good. Dunno if that could otherwise push to a stable 1440p 60fps, though; would need to check PC comparisons for the same GPU models.


actually, a 5700 XT hits over 60 FPS at high settings (on average?), so I would expect the XSX to be able to handle 1440p/60 pretty well, especially with VRR enabled.
Based on this, the PS5 and Xbox should hit 1440p60, not 1080p60.
 

elliot5

Member
Based on this, the PS5 and Xbox should hit 1440p60, not 1080p60.
even a 6500 XT, which I think is similar to, if not even more gimped than, the Series S GPU, practically hits 1440p/30. Makes you wonder if, with a bit more love, the Series S could hit 1440p/30.
 

assurdum

Member
You can't do RT without DX12. I'm not sure what they use for the PS5 which probably has it's own DX12-like API for RT.

I'm saying they have 2 codebases. It's not outdated if it can do the latest features. We also can't say it's completely unoptimized because it doesn't perform like 4AGames graphics engine. Basically there is no way to tell what's going on unless we see the code, so I would rather not judge them as incompetent.
I haven't said the game doesn't support DX12, just that it supports it badly. You should ask NXG; he's the one who said it in his video analysis.
 

Redneckerz

Those long posts don't cover that red neck boy
We shouldn't use one example as a generalization for the entire generation.
But you know that will happen, because it happens every single damn time.

You can just use GAF search and you will find that people ask the same questions and use the same arguments (''x console is underpowered'', etc.) every single generation, with the same answers (''Console potential usually unlocks at the end of the generation'') over and over.

It's like the collective memory has a shelf life of one generation before the circle starts anew.

With the same generalizations.
Was Techland ever considered competent?
They are more competent than your post.
 

adamsapple

Gold Member
Now that we know the game/engine has enough headroom to run north of 100 FPS almost everywhere with the VRR unlocked mode ...

Can we put this topic to rest? It's not an issue of the hardware being underpowered; it's just the developers putting their priorities in one place and not another.
 

Swift_Star

Gold Member
Now that we know the game/engine has enough headroom to run north of 100 FPS almost everywhere with the VRR unlocked mode ...

Can we put this topic to rest? It's not an issue of the hardware being underpowered; it's just the developers putting their priorities in one place and not another.
Has the brick wall guy posted again? He was insisting to hell and back that it was the consoles' fault.
 
I don't envy people who have to play it at 1080p on a screen without good upscaling. Thank God for my Bravia. Though the sharpness seems a bit improved with the last patch? Maybe it's just a placebo effect.

You know that the upscaling is done by the console and not the TV itself, right?

Unless you change the HDMI system output to 1080p.
 
I don't envy people who have to play it at 1080p on a screen without good upscaling. Thank God for my Bravia. Though the sharpness seems a bit improved with the last patch? Maybe it's just a placebo effect.
It's placebo. I thought the same thing at first, but 1080p is still shit for this game.
 

b0uncyfr0

Member
Looking forward to someone digging into VRR mode.

But essentially VRR mode is out for the Series X; people are reporting 80-110 fps, which is way smoother than 60.

Close the thread. Yes, they did leave a shitload of performance on the table. It was patched in only a few days after release; they should've had this from the beginning! Also, if 1080/120 is attainable, so is 1440p/60.
 

Kamina

Golden Boy
We have these threads every gen really.
Some devs use the power of the consoles they work with to a greater extent, others cheap out a little.
 

Cryio

Member
We have these threads every gen because people got used to the discrepancy between early console performance/visuals and PC performance/cost in the PS1/PS2/PS3 era.

During these particular console eras, games would be much more potent on consoles, and PC ports would look worse, run worse or generally require significantly more powerful systems than consoles for the same games.

PS1 era: we had 3D games while 3D acceleration was in its infancy on PC, so performance and visuals were poor. Performance scaling with framerate, resolution, AA, and even anisotropic filtering was also poor.

PS2 era: we started getting Pixel Shader 2.0, where performance on PC was skewed heavily by the extremely poor PS 2.0 performance of the GeForce FX (5000 series), and even the GeForce 4 was very poor in comparison to ATI's PS 2.0-capable Radeon 9000 series.

PS3 era: this was the shader era. The irony here was that the PS3/X360 were built on GPUs not really well designed for shaders, whereas on PC the GeForce 8 and HD 3000 series and above were really good at pushing shaders compared to the GeForce 7 and ATI X1300-X1900. Performance on PC also advanced tremendously, especially on the GPU side.

PS4 generation: this was the compute era. Most GPUs in the PS3 era were good at pushing shaders but weren't good at compute. AMD GPUs had been compute powerhouses since 2011 with GCN1, while Nvidia only reached compute parity with the RTX 2000 Turing GPUs.

We also went from getting new tech and VASTLY faster GPUs every 6-12 months to every 24 months to every 36 months.

PS5 era: we're still in the full compute era, but we now have real-time ray tracing on top. People want higher resolutions, higher framerates, better visuals AND new tech features with every console gen. This was feasible in the PS1, PS2, PS3, and even PS4 eras (DirectX 11, tessellation, GPU particles, screen-space effects). This time around, however, the economy broke due to crypto, inflation, and delayed technological advancement.

If this economy continues, the PS6 era will be an even smaller jump than the PS5's. 8 years after the PS5, we'll have had 2 to 3 new GPU generations. People will want more reliable 120 fps. People will want more comprehensive ray tracing effects. People will want holistically better visuals. And the kicker: we could barely do 4K in 2016-2017 on 8th-gen consoles. We could barely do 4K in 2020 on 9th gen. We'll barely be able to do a locked 4K on 10th gen. New consoles will still be $400-600 for an entire system, but people will expect 8K60 and 8K120, alongside all the other features mentioned above. Also, hell, 16x AF.

People are in for a rough awakening with the next console gen.
 

ethomaz

Banned
Maybe?

I have the feeling most GAFers will turn a blind eye if a game looks as bad as last-gen games but has a high framerate.

60fps games will look like last-gen 30fps games, IMO.
The new Horizon game proved my point again.

Do you want a generational jump in gaming? Stay at 30fps; if not, get used to last-gen graphics.
 