
Digital Foundry: Dragon's Dogma 2 PS5/Xbox Series X/S and PC performance

Bojji

Member
I know people weren't complaining about GTA4. I went back and read like 50 pages of that OT a while back. It's pure excitement and sharing of gameplay moments. Only some complaints that it's a bit dark.

Why are people using this fake argument that 30 belongs in the 90s?! We had 120Hz CRTs back then too. I had a 120Hz Eizo, dude.
Resolutions and sizes of TVs went up. Refresh rate is still refresh rate. 30 was, is and will remain playable. OLED didn't change that. It's still fine.

I completed GTA4 on PS3 and enjoyed the game, but performance was TERRIBLE. It didn't get any extra points for that; 20FPS doesn't help with aiming at all.

I was into Lens of Truth stuff and console comparisons as well, but back then 30fps was the norm. Right now you see people freaking out when a game is 30fps only, whereas back then people were acting like that for 20-25FPS games (which weren't uncommon on PS3). But the number of people interested in the technical side of games was smaller, for sure.
 

yamaci17

Member
I completed GTA4 on PS3 and enjoyed the game, but performance was TERRIBLE. It didn't get any extra points for that; 20FPS doesn't help with aiming at all.

I was into Lens of Truth stuff and console comparisons as well, but back then 30fps was the norm. Right now you see people freaking out when a game is 30fps only, whereas back then people were acting like that for 20-25FPS games (which weren't uncommon on PS3). But the number of people interested in the technical side of games was smaller, for sure.
that comparison is dumb anyways. not because gta 4 performed poorly (gta 4 ran on a horrible CELL architecture that was difficult to work with, with split 512 MB memory, and of course it had an outdated GPU chip even by 2006 standards). despite those, gta 4 made huge advancements in graphics, animations and simulations over san andreas in just over 4 years. of course most people managed to look past the issues and framerates back then. games were advancing a lot; before your very eyes you would see massive jumps.

gta 4 made the best of the PS3 while dragon's dogma 2 squanders the PS5 with inefficient rendering that barely looks next-gen

I'm pretty sure gta 4 npcs are smarter and have more stuff going on for them than dragon's dogma 2 NPCs, by the way
 

rofif

Can’t Git Gud
I completed GTA4 on PS3 and enjoyed the game, but performance was TERRIBLE. It didn't get any extra points for that; 20FPS doesn't help with aiming at all.

I was into Lens of Truth stuff and console comparisons as well, but back then 30fps was the norm. Right now you see people freaking out when a game is 30fps only, whereas back then people were acting like that for 20-25FPS games (which weren't uncommon on PS3). But the number of people interested in the technical side of games was smaller, for sure.
the ps3 version of gta4 was worse, true.
I played on 360 and it was great. I knew it was dropping very low but the game was so awesome I did not care.
As you say, it was doing SO MUCH MORE than san andreas.
30fps was also standard on ps4 and we played our fav games this way.
Nothing magically changed on ps5. I don't suddenly have a brain tumor preventing me from playing at 30fps.
It's the devs and their ability to make that 30fps bearable, that's it. Go back to bloodborne and see it still plays great.
TBH when I returned to bloodborne, the jaggy 1080p bothered me way more than its 30fps did. you know why? because I moved to a bigger tv on my desk, so 1080p no longer looks as good on a bigger screen. But 30fps is still 30fps.
sure, oled makes 30fps a bit worse but not as much as people are saying. it's not 24fps and most games have some sort of motion blur to help.

It's fair that people are freaking out because ps5 was mainly 60fps games and they got used to it. Of course.
But they can get used to 30 again if it's worth it and it's not the sluggish kind.
And 1080p nowadays is also way better than 1080p from 2015, because we have TAA, fsr (god no) and dlss. 1080p can look pristine, just a bit blurry, not pixelated and awful like 2015 1080p games. like bloodborne.

I really don't want to talk about this anymore. People are just stubborn and don't want to take a moment to get used to it and try something outside their scope. And that's fine, move on.
As to whether DD2 does anything so well that it deserves its poor performance? probably not, but I think it's quite impressive
 

Bojji

Member
that comparison is dumb anyways. not because gta 4 performed poorly (gta 4 ran on a horrible CELL architecture that was difficult to work with, with split 512 MB memory, and of course it had an outdated GPU chip even by 2006 standards). despite those, gta 4 made huge advancements in graphics, animations and simulations over san andreas in just over 4 years. of course most people managed to look past the issues and framerates back then. games were advancing a lot; before your very eyes you would see massive jumps.

gta 4 made the best of the PS3 while dragon's dogma 2 squanders the PS5 with inefficient rendering that barely looks next-gen

I'm pretty sure gta 4 npcs are smarter and have more stuff going on for them than dragon's dogma 2 NPCs, by the way

That's completely true, the game suffered from a poor framerate, but I was blown away by the generational jump in... EVERYTHING compared to GTA VC (the last GTA I played, on PS2). So yeah, poor performance could be excused because everyone could see how developers were pushing boundaries back then.

the ps3 version of gta4 was worse, true.
I played on 360 and it was great. I knew it was dropping very low but the game was so awesome I did not care.
As you say, it was doing SO MUCH MORE than san andreas.
30fps was also standard on ps4 and we played our fav games this way.
Nothing magically changed on ps5. I don't suddenly have a brain tumor preventing me from playing at 30fps.
It's the devs and their ability to make that 30fps bearable, that's it. Go back to bloodborne and see it still plays great.
TBH when I returned to bloodborne, the jaggy 1080p bothered me way more than its 30fps did. you know why? because I moved to a bigger tv on my desk, so 1080p no longer looks as good on a bigger screen. But 30fps is still 30fps.
sure, oled makes 30fps a bit worse but not as much as people are saying. it's not 24fps and most games have some sort of motion blur to help.

It's fair that people are freaking out because ps5 was mainly 60fps games and they got used to it. Of course.
But they can get used to 30 again if it's worth it and it's not the sluggish kind.
And 1080p nowadays is also way better than 1080p from 2015, because we have TAA, fsr (god no) and dlss. 1080p can look pristine, just a bit blurry, not pixelated and awful like 2015 1080p games. like bloodborne.

I really don't want to talk about this anymore. People are just stubborn and don't want to take a moment to get used to it and try something outside their scope. And that's fine, move on.
As to whether DD2 does anything so well that it deserves its poor performance? probably not, but I think it's quite impressive

Late into the PS3 gen I switched to PC and played all games at 60 FPS, but I also bought a PS4 in 2016 to check out PS exclusives and completed all the interesting ones.

A few days ago I was wondering how I was able to play them at 30FPS no problem (I don't remember complaining). I have TLoU2 installed on PS5 (the PS4 build); yesterday I switched from the 60FPS mode to the 30FPS mode (the game has the same GFX quality in both) and was blown away: IT WAS FINE! I could play the entire game like that, low input lag and the game looked OK in motion, and this on an OLED TV. The key factor is camera motion blur, it smooths the picture so there's no obvious stuttering; Sony devs were masters of 30FPS modes in the PS4 era. Bloodborne has very low input lag and it's playable, but it's not smooth at all (I have it installed all the time) with its improper frame pacing.

At the same time I CAN'T play FFVII Rebirth at 30FPS, it stutters like a motherfucker. FFXVI was not perfect but fine enough with its motion blur implementation.

People have higher standards now and that's GOOD; if devs can't implement a proper 30FPS mode they should give players 60FPS or 40FPS modes. DD2 is a complete mess: the game has CPU-related frametime stuttering on consoles (in cities) and (on PS5 at least) stutters all the time because the framerate is left in no man's land, so VRR and 120Hz panels can't help in any way. Pure incompetence from the developer.
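Just to be clear what improper frame pacing means here: a "30FPS" game can show every frame for a steady 33.3ms, or it can alternate short and long frames and still average 30, and the second one is what judders (the Bloodborne problem). A rough sketch of how you'd spot it in a frame-time log, with made-up numbers:

```cpp
// Illustration only: two ways to average "30fps", with made-up frame times.
#include <cmath>
#include <cstdio>
#include <vector>

// Standard deviation of frame times in ms: a quick-and-dirty judder indicator.
double pacing_jitter(const std::vector<double>& frame_ms) {
    double mean = 0.0;
    for (double t : frame_ms) mean += t;
    mean /= frame_ms.size();
    double var = 0.0;
    for (double t : frame_ms) var += (t - mean) * (t - mean);
    return std::sqrt(var / frame_ms.size());
}

int main() {
    // Well-paced 30fps: every frame held for 33.3ms.
    std::vector<double> paced(60, 33.3);
    // Badly paced "30fps": frames alternate 16.7ms / 50.0ms (same average).
    std::vector<double> juddery;
    for (int i = 0; i < 60; ++i) juddery.push_back(i % 2 ? 50.0 : 16.7);

    std::printf("well paced jitter: %.1f ms\n", pacing_jitter(paced));   // ~0 ms
    std::printf("juddery    jitter: %.1f ms\n", pacing_jitter(juddery)); // ~16.7 ms
}
```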
 

yamaci17

Member
That's completely true, the game suffered from a poor framerate, but I was blown away by the generational jump in... EVERYTHING compared to GTA VC (the last GTA I played, on PS2). So yeah, poor performance could be excused because everyone could see how developers were pushing boundaries back then.



Late into the PS3 gen I switched to PC and played all games at 60 FPS, but I also bought a PS4 in 2016 to check out PS exclusives and completed all the interesting ones.

A few days ago I was wondering how I was able to play them at 30FPS no problem (I don't remember complaining). I have TLoU2 installed on PS5 (the PS4 build); yesterday I switched from the 60FPS mode to the 30FPS mode (the game has the same GFX quality in both) and was blown away: IT WAS FINE! I could play the entire game like that, low input lag and the game looked OK in motion, and this on an OLED TV. The key factor is camera motion blur, it smooths the picture so there's no obvious stuttering; Sony devs were masters of 30FPS modes in the PS4 era. Bloodborne has very low input lag and it's playable, but it's not smooth at all (I have it installed all the time) with its improper frame pacing.

At the same time I CAN'T play FFVII Rebirth at 30FPS, it stutters like a motherfucker. FFXVI was not perfect but fine enough with its motion blur implementation.

People have higher standards now and that's GOOD; if devs can't implement a proper 30FPS mode they should give players 60FPS or 40FPS modes. DD2 is a complete mess: the game has CPU-related frametime stuttering on consoles (in cities) and (on PS5 at least) stutters all the time because the framerate is left in no man's land, so VRR and 120Hz panels can't help in any way. Pure incompetence from the developer.
now try the native remaster build of tlou part 2. the 30 fps mode is horrible. I literally downloaded both and gave them a try

certain ps4 games often use special 30 fps implementations, but on PS5 most devs seem to enable system-wide half-refresh vsync and call it a day. I played gow ragnarok at 30 fps on ps4 and it was fine; on PS5 it is horrible. my theory is that this happens when a game has a 60 FPS mode, because developers think no one would actively choose to play at 30 FPS, so they don't even optimize it

on PC it is possible to get the best 30 fps/40 fps on all fronts aside from vsync smoothness: low lag, no tearing, decent image stability, no judder. it is why I love PC. ironically, the PS5 30 fps experience will often be worse than the PC 30 fps experience in 2024. it is the funniest 180. before VRR became mainstream on PC and before low-latency frame limiters became popular, it was horrible to play at 30 FPS on PC because you would often use half-refresh vsync on a 60 Hz display and it was incredibly laggy compared to consoles, which made many people think that playstation had some special-sauce 30 FPS. now it is the other way around

I'm sure with more games returning to the 30 FPS norm, proper 30 FPS implementations will make a comeback. but thankfully on PC you can do that with any game you want

being able to get the best out of 30/40 fps on pc with vrr is the reason I cling to my aging ryzen 2700. but I guess dragon's dogma 2 will be the first game I'm ever unable to lock to 30 lol
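for anyone wondering what that setup actually boils down to: an accurate external frame-time cap, with VRR handling presentation so you don't need vsync at all. a bare-bones sketch of such a limiter (coarse sleep plus a short spin for accuracy; real tools like RTSS are far more sophisticated, and simulate_and_render() here is just a placeholder):

```cpp
// Bare-bones frame-time limiter: sleep away most of the 33.3ms budget, then
// spin the last bit so frames land close to the target and stay evenly paced.
// simulate_and_render() is only a stand-in for the real per-frame work.
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

void simulate_and_render() {
    std::this_thread::sleep_for(std::chrono::milliseconds(5)); // placeholder work
}

int main() {
    const auto frame_budget =
        std::chrono::duration_cast<Clock::duration>(std::chrono::duration<double>(1.0 / 30.0));
    auto deadline = Clock::now() + frame_budget;

    for (int frame = 0; frame < 300; ++frame) {
        simulate_and_render();

        // Coarse wait: OS sleep up to ~2ms before the deadline (sleep is imprecise).
        std::this_thread::sleep_until(deadline - std::chrono::milliseconds(2));
        // Fine wait: busy-spin the remainder for accurate pacing.
        while (Clock::now() < deadline) { /* spin */ }

        deadline += frame_budget; // next frame's target, keeps cadence steady
    }
}
```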
 

Bojji

Member
now try the native remaster build of tlou part 2. the 30 fps mode is horrible. I literally downloaded both and gave them a try

certain ps4 games often use special 30 fps implementations, but on PS5 most devs seem to enable system-wide half-refresh vsync and call it a day. I played gow ragnarok at 30 fps on ps4 and it was fine; on PS5 it is horrible. my theory is that this happens when a game has a 60 FPS mode, because developers think no one would actively choose to play at 30 FPS, so they don't even optimize it

on PC it is possible to get the best 30 fps/40 fps on all fronts aside from vsync smoothness: low lag, no tearing, decent image stability, no judder. it is why I love PC. ironically, the PS5 30 fps experience will often be worse than the PC 30 fps experience in 2024. it is the funniest 180. before VRR became mainstream on PC and before low-latency frame limiters became popular, it was horrible to play at 30 FPS on PC because you would often use half-refresh vsync on a 60 Hz display and it was incredibly laggy compared to consoles, which made many people think that playstation had some special-sauce 30 FPS. now it is the other way around

I'm sure with more games returning to the 30 FPS norm, proper 30 FPS implementations will make a comeback. but thankfully on PC you can do that with any game you want

being able to get the best out of 30/40 fps on pc with vrr is the reason I cling to my aging ryzen 2700. but I guess dragon's dogma 2 will be the first game I'm ever unable to lock to 30 lol

HOLY SHIT, you are probably right. I tried to get games running smoothly at 30FPS on PC years ago and the results were always terrible compared to PS4.

Now they are using lazy implementations of 30FPS modes in their games; this explains a lot. I tried GOWR at 30FPS on PS5 and it was bad even on a 4K LCD, but GOW 2018 was OK on PS4...

So yeah, 60FPS is the standard right now and 30FPS modes are treated badly in many games.

Fuck, Series X is a framerate monster! Smooth as silk and better than all other platforms. No wonder it's the console of choice this gen.

It's not better, it looks like shit compared to PS5, but it can be smoother with VRR, no doubt. PC is the best version anyway.
 

Giallo Corsa

Gold Member
now try the native remaster build of tlou part 2. the 30 fps mode is horrible. I literally downloaded both and gave them a try

certain ps4 games often use special 30 fps implementations, but on PS5 most devs seem to enable system-wide half-refresh vsync and call it a day. I played gow ragnarok at 30 fps on ps4 and it was fine; on PS5 it is horrible. my theory is that this happens when a game has a 60 FPS mode, because developers think no one would actively choose to play at 30 FPS, so they don't even optimize it

on PC it is possible to get the best 30 fps/40 fps on all fronts aside from vsync smoothness: low lag, no tearing, decent image stability, no judder. it is why I love PC. ironically, the PS5 30 fps experience will often be worse than the PC 30 fps experience in 2024. it is the funniest 180. before VRR became mainstream on PC and before low-latency frame limiters became popular, it was horrible to play at 30 FPS on PC because you would often use half-refresh vsync on a 60 Hz display and it was incredibly laggy compared to consoles, which made many people think that playstation had some special-sauce 30 FPS. now it is the other way around

I'm sure with more games returning to the 30 FPS norm, proper 30 FPS implementations will make a comeback. but thankfully on PC you can do that with any game you want

being able to get the best out of 30/40 fps on pc with vrr is the reason I cling to my aging ryzen 2700. but I guess dragon's dogma 2 will be the first game I'm ever unable to lock to 30 lol

Holy shit man, you might be on to something - there are countless posts on the internet (reddit especially) mentioning that the same 30fps games perform differently on PS4 vs PS5: played on a PS5 (native PS4 versions), they have (more) judder than when played on a PS4, on the same TV to boot (even on OLEDs, which have inherent judder with sub-60fps content due to their instant response time).

I stumbled upon this after countless debates/discussions in here about 30fps games on OLED screens, where some people say they see no judder at all/can't feel or see it vs people like myself who can't do 30fps anymore on OLED due to said judder.

After many hours of fiddling with settings, I've found that if you have VRR for "unsupported games" enabled in the PS5 system options, said judder actually becomes worse, to the point of 30fps games having actual stutter (at least, that's how I perceive it).

I took a look online and stumbled on posts describing EXACTLY what you're saying, brother.

The more you know...

PS: For me, 30fps games on my LG C2 are literally a no-go because of it; it actually makes me feel dizzy/nauseous after a while, to the point that had I actually witnessed it beforehand I'd have opted for a FALD/mini-LED TV set.

Cheers and thanks for the explanation!
 

SKYF@ll

Member
This is a frame rate video from another YouTuber.
It's interesting that the frame rate of PS5 and XSX changes depending on the scene.
CPU or GPU drops, drops due to alpha effects, etc.
 
CPU-limited areas (towns) perform better on the PS5. Strange, given the XSX CPU clock is higher than the PS5's.


Always remember the PS5 is the target platform for most games. It's the same issue the One X had: despite it being way more powerful than the 4 Pro, devs optimized for the Pro, minimizing the differences in performance.
 

Bernardougf

Gold Member
Why force RT on these $499 cheapass boxes from 4 years ago... and then not optimize accordingly... these devs man... I just can't
 
Summary:

Features:
- High fidelity character rendering praised
- RTGI present on consoles (minus Series S)
- Light bounce off of surfaces praised, but noted that lighting can sometimes be muted in interiors
- Very little RT noise and breakup spotted in typical viewing distance
- Shadow maps look fine but can show aliasing.
- SSR used in water bodies, SSR artifacts typical of RE engine noted but not as bad as previous games
- Geometry and assets praised.
- Camera placement and lip sync in static dialogue scenes are underwhelming

PS5 / SX:
- PS5 / SX have matching visual settings, both use checker-board
- Currently, SX has issues with image resolve, showing a 'fine grid' of CB artifacts; DF hopes this will be fixed with a patch soon
- Pixel counting on SX is difficult due to the above issue; PS5 was tested and noted to checkerboard to 2160p
- Both have unlocked frame rate between 30~45 FPS
- NPC draw is limited to very near the player to reduce CPU draw
- Combat can see frame rate drop to mid 20s

Performance:
- SX is roughly 10% faster in GPU-stressed areas and traversal
- When CPU-limited, the PS5 can have a similar FPS advantage
- SX VRR @ 120Hz output with low frame-rate compensation can make the game look a lot smoother, but variance in city areas is still easily present / noticeable (see the LFC sketch after this summary)
- PS5 does not support system-level low frame-rate compensation unless 120Hz support is specifically patched into the game, so this is not applicable here

Series S:
- Series S strips out RTGI
- Shadow resolution is lower and flickery with some texture bugs also noted in Series S
- Series S version also has the same checker-boarding issue seen in the SX version
- Same unlocked juddery frame rate with performance on-par with PS5

PC section:
- Shader pre-compilation took 2 minutes on a Ryzen 7800
- Some instances of shader-related stutter still seen in opening areas despite the pre-comp
- The stutters were still fewer than in previous RE Engine games on PC
- The PC version's menu options are unruly and hard to select with a mouse
- In city areas, the frame rate takes a notable dip and frame times are not consistent, so it doesn't feel smooth
- DLSS support present but not perfect, small vegetation is blurred

Conclusion:
- A lot to appreciate but notable areas that could use improvement
- The Series S and X checkerboarding resolve issue and the CPU bottlenecks need some improvement.
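On the LFC bullet above: low frame-rate compensation simply keeps the panel inside its VRR window by showing each frame a whole number of times, e.g. a 28 FPS frame is scanned out twice for an effective 56Hz on a 120Hz set. A toy sketch of that multiplier choice (the 48-120Hz window is an assumption; real displays and drivers handle this on their own):

```cpp
// Toy LFC: pick the smallest repeat count that lifts the effective refresh
// into the panel's VRR window. Window limits here are assumed for the example.
#include <cstdio>

int lfc_repeat_count(double game_fps, double vrr_min_hz, double vrr_max_hz) {
    for (int repeats = 1; repeats <= 10; ++repeats) {
        double effective_hz = game_fps * repeats;
        if (effective_hz >= vrr_min_hz && effective_hz <= vrr_max_hz)
            return repeats;
    }
    return 1; // give up, fall back to plain presentation
}

int main() {
    // Assumed 48-120 Hz VRR window (typical for a 120 Hz HDMI 2.1 TV).
    for (double fps : {28.0, 35.0, 45.0, 60.0}) {
        int n = lfc_repeat_count(fps, 48.0, 120.0);
        std::printf("%4.0f fps -> each frame shown %dx (%.0f Hz)\n", fps, n, fps * n);
    }
}
```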
There is no shader stutter; what Alex described there was traversal stutter, which affects all platforms.
 
So theoretically, considering this, would the allegedly upcoming PS5 Pro benefit this game without Capcom needing to patch it?
In GPU-limited scenarios it could on paper deliver 45% more fps. In CPU-limited scenarios (which is what this game suffers from most) the gains in fps could be less than 5%, as the 5 Pro only has a 10% bonus to clock speed and nothing else.
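A quick back-of-the-envelope for why: each frame is gated by whichever of the CPU or GPU takes longer, so a faster GPU buys nothing once you're CPU-bound. The sketch below uses made-up frame times plus the 45%/10% uplift figures quoted above; in practice a 10% clock bump rarely translates 1:1 into fps, hence "less than 5%".

```cpp
// Back-of-the-envelope bottleneck model: a frame is done only when the slower
// of CPU and GPU is done. All numbers are illustrative, not measurements.
#include <algorithm>
#include <cstdio>

double fps(double cpu_ms, double gpu_ms) {
    return 1000.0 / std::max(cpu_ms, gpu_ms);
}

int main() {
    // Hypothetical CPU-bound town scene on a base PS5: CPU 38ms, GPU 25ms.
    const double cpu_ms = 38.0, gpu_ms = 25.0;
    const double base = fps(cpu_ms, gpu_ms);

    // Assumed uplifts from the post above: ~45% faster GPU, ~10% higher CPU clock.
    const double pro = fps(cpu_ms / 1.10, gpu_ms / 1.45);

    std::printf("base: %.1f fps, hypothetical Pro: %.1f fps (+%.0f%%)\n",
                base, pro, 100.0 * (pro / base - 1.0));
    // The GPU frame time drops from 25ms to ~17ms, but the frame still waits on
    // the CPU, so the gain tracks the ~10% CPU bump at best, not the 45% GPU bump.
}
```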
 
It is such a rare issue that I doubt it will be addressed
Depends what you mean by rare. Every VRR gaming vendor has had LFC, starting with Nvidia, who came to market with VRR first in 2013, then AMD, then Intel, then Microsoft, and finally we have Sony come in with VRR on the PS5 and decide they are too good to actually provide a quality VRR experience, so they put out the most basic HDMI implementation and left it at that. Sony literally lowered the standard of what one could expect out of a gaming VRR device. I'd expect that out of Nintendo, and maybe not even them, because their hardware and software tooling is made by Nvidia. Such a step backwards from what Nvidia figured out in 2013.
 

DavidGzz

Member
You have Digital Foundry gushing about the graphics, and you can see how even Capcom, a competent dev, isn't able to get this game to a stable frame rate on modern consoles, and yet people are going to pretend the graphics are bad. Hey, if you've got an issue with the art style or the microtransactions, that's one thing, but otherwise you're blind. I'll listen to DF over random haters.
 

rofif

Can’t Git Gud
DF says you might want to wait to play this game till there are patches out, whereas Eurogamer's review gives it 5 stars. What on earth? Honestly, the performance is horrific, and the graphics look awful. The game may be great, but no way I'm picking it up till it runs better.
The graphics are amazing tho. The animations are amazing, HDR looks great. Particle effects and physics are too.

Seriously. Find some nice 4K feed of the ps5 version and check it out. The game looks way nicer in person, and it's a great game.
It runs like dogshit on ps5 but I don't have a problem with it. It still plays just fine, it's mostly 30, so all it could use is an fps cap
 

Clear

CliffyB's Cock Holster
Finding it harder and harder to not respond sarcastically to people -and this goes especially for a lot of gaming YouTubers*- who claim x game "isn't optimized" based on nothing more than visual inspection and them spending a lot of money on their hardware.

It's incredibly ignorant and simple-minded. And although it's not entirely unforgivable, as most people have no clue how complicated modern software is, what I find most difficult not to react viscerally to is the vehemence and certainty with which the accusations are levelled.

GamersNexus is doing a pretty interesting series at the moment talking to one of the guys on Intel's driver team about what they do in terms of optimization. Remember this is just the abstraction layer between the program, the OS, and the hardware, and even in this thin sliver (which conceptually is just a pipe or bridge) in reality an extremely complex profiling system is required to diagnose faults and determine changes needed for optimal performance.

They aren't eyeballing this stuff, and neither are the skilled people building the applications themselves. So for the love of God please stop pretending a lay-person can simply point and say, "that's unoptimized" with any sort of surety.

This goes doubly so when the comparatives are so low resolution that the base assumption seems to be that "any game of a certain type is supposed to run like this". The workload of something like Dogma 2 is evidently way higher than, for example, Elden Ring in every aspect, but it's being given zero credit for this.



*certain idiotic comments by YT "personalities" are what specifically inspired this post.
 
This is a frame rate video from another YouTuber.
It's interesting that the frame rate of PS5 and XSX changes depending on the scene.
CPU or GPU drops, drops due to alpha effects, etc.

Wow, I didn't realize the framerate can be so much higher on PS5 in cities, up to 10fps higher in some areas. Plenty of big frame-pacing spikes on Xbox (like on PC). Also, the guy completely missed the far worse IQ on Xbox. But this youtuber is a known Xbox fan, so it makes sense.
 

Wildebeest

Member
Finding it harder and harder to not respond sarcastically to people -and this goes especially for a lot of gaming YouTubers*- who claim x game "isn't optimized" based on nothing more than visual inspection and them spending a lot of money on their hardware.

It's incredibly ignorant and simple-minded. And although it's not entirely unforgivable, as most people have no clue how complicated modern software is, what I find most difficult not to react viscerally to is the vehemence and certainty with which the accusations are levelled.

GamersNexus is doing a pretty interesting series at the moment talking to one of the guys on Intel's driver team about what they do in terms of optimization. Remember this is just the abstraction layer between the program, the OS, and the hardware, and even in this thin sliver (which conceptually is just a pipe or bridge) in reality an extremely complex profiling system is required to diagnose faults and determine changes needed for optimal performance.

They aren't eyeballing this stuff, and neither are the skilled people building the applications themselves. So for the love of God please stop pretending a lay-person can simply point and say, "that's unoptimized" with any sort of surety.

This goes doubly so when the comparatives are so low resolution that the base assumption seems to be that "any game of a certain type is supposed to run like this". The workload of something like Dogma 2 is evidently way higher than, for example, Elden Ring in every aspect, but it's being given zero credit for this.



*certain idiotic comments by YT "personalities" are what specifically inspired this post.
You absolutely do not need to be an Intel engineer to notice that a game is running like poop and that it isn't doing anything so impressive that it has to bring a high-end PC to its knees. I think on youtube and twitch you are sadly going to see more "influencers" edging into shill territory, who know that having a reputation for not talking shit about games from big publishers is going to pay them more in the long term, and that gamers even appreciate a proper "good vibes only" shill. Not that I think GamersNexus are shills.
 

Panajev2001a

GAF's Pleasant Genius
Finding it harder and harder to not respond sarcastically to people -and this goes especially for a lot of gaming YouTubers*- who claim x game "isn't optimized" based on nothing more than visual inspection and them spending a lot of money on their hardware.

It's incredibly ignorant and simple-minded. And although it's not entirely unforgivable, as most people have no clue how complicated modern software is, what I find most difficult not to react viscerally to is the vehemence and certainty with which the accusations are levelled.

GamersNexus is doing a pretty interesting series at the moment talking to one of the guys on Intel's driver team about what they do in terms of optimization. Remember this is just the abstraction layer between the program, the OS, and the hardware, and even in this thin sliver (which conceptually is just a pipe or bridge) in reality an extremely complex profiling system is required to diagnose faults and determine changes needed for optimal performance.

They aren't eyeballing this stuff, and neither are the skilled people building the applications themselves. So for the love of God please stop pretending a lay-person can simply point and say, "that's unoptimized" with any sort of surety.

This goes doubly so when the comparatives are so low resolution that the base assumption seems to be that "any game of a certain type is supposed to run like this". The workload of something like Dogma 2 is evidently way higher than, for example, Elden Ring in every aspect, but it's being given zero credit for this.



*certain idiotic comments by YT "personalities" are what specifically inspired this post.


Then again, we have people who are in the know and who (as part of ICE) have probably profiled lots of titles, and they do loudly wonder where the HW progress is being spent.
 

Clear

CliffyB's Cock Holster
You absolutely do not need to be an Intel engineer to notice that a game is running like poop and that it isn't doing anything so impressive that it has to bring a high-end PC to its knees. I think on youtube and twitch you are sadly going to see more "influencers" edging into shill territory, who know that having a reputation for not talking shit about games from big publishers is going to pay them more in the long term, and that gamers even appreciate a proper "good vibes only" shill. Not that I think GamersNexus are shills.

Yeah, but you need actual knowledge to know WHY it's running like it does, and without that you can't know HOW to fix things. Because unlike what smooth-brains like DF seem to believe, it's more than pushing a few sliders around!

Hell, even in the DF video you can see a pretty decent spread of occupancy across the cores. It's not like it's choking on a single over-burdened thread. So pray tell, what exactly is going wrong there?

Sorry, but stop with this retarded shit about a faster processor and/or GPU fixing every issue. You can't brute-force every engineering issue, because reality is more complicated than that. There are tolerances to be factored in for every step in the mechanism; simply adding more theoretical "grunt" doesn't mean it's going to be utilized.

Sick of this arrogance.
 

rofif

Can’t Git Gud
Finding it harder and harder to not respond sarcastically to people -and this goes especially for a lot of gaming YouTubers*- who claim x game "isn't optimized" based on nothing more than visual inspection and them spending a lot of money on their hardware.

It's incredibly ignorant and simple-minded. And although it's not entirely unforgivable, as most people have no clue how complicated modern software is, what I find most difficult not to react viscerally to is the vehemence and certainty with which the accusations are levelled.

GamersNexus is doing a pretty interesting series at the moment talking to one of the guys on Intel's driver team about what they do in terms of optimization. Remember this is just the abstraction layer between the program, the OS, and the hardware, and even in this thin sliver (which conceptually is just a pipe or bridge) in reality an extremely complex profiling system is required to diagnose faults and determine changes needed for optimal performance.

They aren't eyeballing this stuff, and neither are the skilled people building the applications themselves. So for the love of God please stop pretending a lay-person can simply point and say, "that's unoptimized" with any sort of surety.

This goes doubly so when the comparatives are so low resolution that the base assumption seems to be that "any game of a certain type is supposed to run like this". The workload of something like Dogma 2 is evidently way higher than, for example, Elden Ring in every aspect, but it's being given zero credit for this.



*certain idiotic comments by YT "personalities" are what specifically inspired this post.
These nouveau gamers think they are smarter than the devs. They have no idea what the game is doing in the background, how everything is simulated and physics-based.
Thank god Capcom didn't cut down any of this stuff to get to 60fps. These crazy interactions and physics make the game special, not 60fps.
In 10 years people will remember an amazing game, not how it ran.

Edit: imagine if dark souls had been significantly cut down to run at 60fps on the 360. I think devs know what they are doing and consumers should focus on consuming and giving feedback after playing
 

Wildebeest

Member
Yeah, but you need actual knowledge to know WHY it's running like it does, and without that you can't know HOW to fix things. Because unlike what smooth-brains like DF seem to believe, it's more than pushing a few sliders around!

Hell, even in the DF video you can see a pretty decent spread of occupancy across the cores. It's not like it's choking on a single over-burdened thread. So pray tell, what exactly is going wrong there?

Sorry, but stop with this retarded shit about a faster processor and/or GPU fixing every issue. You can't brute-force every engineering issue, because reality is more complicated than that. There are tolerances to be factored in for every step in the mechanism; simply adding more theoretical "grunt" doesn't mean it's going to be utilized.

Sick of this arrogance.
I don't see why it is their problem to "fix the game" for the devs. Sometimes people do take on that role, with fixes to override graphics settings, remove DRM, fix bugs with scripting or mods, and so on. But if you say that about things like Skyrim, people normally get even more angry and say that it is the devs' responsibility to do that, and it still leaves the sacred console-only gamer without any recourse, so it is perhaps racist or something to have those fixes and workarounds in existence at all.
 

Clear

CliffyB's Cock Holster
I don't see why it is their problem to "fix the game" for the devs. Sometimes people do take on that role, with fixes to override graphics settings, remove DRM, fix bugs with scripting or mods, and so on. But if you say that about things like Skyrim, people normally get even more angry and say that it is the devs' responsibility to do that, and it still leaves the sacred console-only gamer without any recourse, so it is perhaps racist or something to have those fixes and workarounds in existence at all.

Pointing at a burning building and saying "it's on fire" doesn't make a person a fire-fighter or a fire inspector, FFS. It's rubber-necking, just with the added wrinkle that they get PAID for their uninformed opinions!

They aren't fixing shit. They are just more journos doing regular review work packaged with a patina of faux-expertise. They are the Edge magazine of graphics, and mostly just as full of it.
 

Wildebeest

Member
Pointing at a burning building and saying "it's on fire" doesn't make a person a fire-fighter or a fire inspector, FFS. It's rubber-necking, just with the added wrinkle that they get PAID for their uninformed opinions!

They aren't fixing shit. They are just more journos doing regular review work packaged with a patina of faux-expertise. They are the Edge magazine of graphics, and mostly just as full of it.
The world is full of people who use social cues to make it seem like you should have a lot more confidence in them than their arguments merit, and people who claim more than they can prove. But if we are just looking at things like the performance of a game, that is a claim that can have proof. It is a claim without proof to say something like DD2 has some new revolutionary physics model that cannot be optimised such that the player would not notice much of a difference in a town and would prefer not having their CPU overheat and their frames drop. There are at least two claims without proof there.
 

Buggy Loop

Member
The world is full of people who use social cues to make it seem like you should have a lot more confidence in them than their arguments merit, and people who claim more than they can prove. But if we are just looking at things like the performance of a game, that is a claim that can have proof. It is a claim without proof to say something like DD2 has some new revolutionary physics model that cannot be optimised such that the player would not notice much of a difference in a town and would prefer not having their CPU overheat and their frames drop. There are at least two claims without proof there.

And a claim from the devs themselves that they are looking into the issues on PC 🤷‍♂️

Nothing to do with a revolutionary AI/physics or whatnot.

bUt iT’s CoMPliCaTed!!

Please understand, it's complicated and the performance is probably where it is because it's intended! /s

Remember Jedi survivor?







The above cannot have been done by anything other than a junior developer with no peer review from seniors on the team. Nobody should be pushing this code with the performance graphs it has.

We can witness the incompetence; there are so many tools for that now. But there's nothing a modder can do at these depths of the code, though outside aid like the frame gen mod helped.

Devs are mostly incompetent with releases now. This is not a super controversial statement. The examples of it are piling up.
 

Clear

CliffyB's Cock Holster


That's not optimization. That's just adding menu options to cut down on workload at the cost of certain features. Capping frame-rate ironically even throws away performance, because the only way you do that is by waiting idle when you finish a frame with time to spare in the cycle.
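To put a rough number on that idle time (figures made up purely for illustration):

```cpp
// Illustration only: how much of the frame budget a 30fps cap leaves idle
// when the hardware could have finished the frame sooner.
#include <cstdio>

int main() {
    const double cap_ms  = 1000.0 / 30.0; // 33.3ms budget under a 30fps cap
    const double work_ms = 20.0;          // hypothetical time the frame actually needs
    const double idle_ms = cap_ms - work_ms;
    std::printf("%.1f ms of every %.1f ms frame is spent waiting (%.0f%% of the budget)\n",
                idle_ms, cap_ms, 100.0 * idle_ms / cap_ms);
}
```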

Proves my point that all these guys are is PC-bros who want more toggles and sliders to tinker with. That's the depth of their insight.
 

Wildebeest

Member
And a claim from the devs themselves that they are looking into the issues on PC 🤷‍♂️

Nothing to do with a revolutionary AI/physics or whatnot.

bUt iT’s CoMPliCaTed!!
The point of expertise is not to make a subject as muddy and complicated as it could possibly be. The idea of someone presenting themselves as an "expert" and claiming the performance of DD2 is bottlenecked by "a revolutionary physics sim", and is therefore worth it, is hypothetical.
 

King Dazzar

Member
Finding it harder and harder to not respond sarcastically to people -and this goes especially for a lot of gaming YouTubers*- who claim x game "isn't optimized" based on nothing more than visual inspection and them spending a lot of money on their hardware.

It's incredibly ignorant and simple-minded. And although it's not entirely unforgivable, as most people have no clue how complicated modern software is, what I find most difficult not to react viscerally to is the vehemence and certainty with which the accusations are levelled.

GamersNexus is doing a pretty interesting series at the moment talking to one of the guys on Intel's driver team about what they do in terms of optimization. Remember this is just the abstraction layer between the program, the OS, and the hardware, and even in this thin sliver (which conceptually is just a pipe or bridge) in reality an extremely complex profiling system is required to diagnose faults and determine changes needed for optimal performance.

They aren't eyeballing this stuff, and neither are the skilled people building the applications themselves. So for the love of God please stop pretending a lay-person can simply point and say, "that's unoptimized" with any sort of surety.

This goes doubly so when the comparatives are so low resolution that the base assumption seems to be that "any game of a certain type is supposed to run like this". The workload of something like Dogma 2 is evidently way higher than, for example, Elden Ring in every aspect, but it's being given zero credit for this.



*certain idiotic comments by YT "personalities" are what specifically inspired this post.
What word would you rather people use instead of unoptimised? Shit, awful, poor, unfinished, badly performing, technically castrated? Does it matter what label is used? Performance simply isn't where it should be. It reads like you want us to play a violin for the developers. Instead we could all of us talk about the complications we went through to earn our money, which Capcom wants. But Capcom doesn't care, as long as we give them the money. And really, we shouldn't worry either about the technical hurdles the developers need to overcome. So when they deliver an underperforming game, it's right that it gets called out.
 

Clear

CliffyB's Cock Holster
What word would you rather people use instead of unoptimised? Shit, awful, poor, unfinished, badly performing, technically castrated? Does it matter what label is used? Performance simply isn't where it should be. It reads like you want us to play a violin for the developers. Instead we could all of us talk about the complications we went through to earn our money, which Capcom wants. But Capcom doesn't care, as long as we give them the money. And really, we shouldn't worry either about the technical hurdles the developers need to overcome. So when they deliver an underperforming game, it's right that it gets called out.

Well, it depends on what your intention is, and what your expectations are.

If you just want to vent or talk shit, you can just say what you feel. No one should (and no one will) care that much, though.

If you want to understand the likelihood/possibility of change, then you need to actually have a grasp of what enacting that entails. So more granularity is necessary if you want to know if waiting for a patch is worthwhile and what sort of time-scale it involves, what the trade-offs are likely to be, etc.

The bottom line is pretty simple. Low observed performance cannot simply be chalked up to a lack of optimization, because workload, and the complexity of that workload, is not a constant and needs to be factored in. Surely you yourself must have run into a situation in your life where what appears to be a simple task actually ends up being unexpectedly difficult for reasons you weren't immediately aware of or able to foresee?

Now imagine how much more unimaginably complex a piece of high-end software is, and how much of a twat you come across as to the people working on such things when you start demanding they wave a magic "optimization" wand at every issue irrespective of the difficulty involved.
 

King Dazzar

Member
Well, it depends on what your intention is, and what your expectations are.

If you just want to vent or talk shit, you can just say what you feel. No one should (and no one will) care that much, though.

If you want to understand the likelihood/possibility of change, then you need to actually have a grasp of what enacting that entails. So more granularity is necessary if you want to know if waiting for a patch is worthwhile and what sort of time-scale it involves, what the trade-offs are likely to be, etc.

The bottom line is pretty simple. Low observed performance cannot simply be chalked up to a lack of optimization, because workload, and the complexity of that workload, is not a constant and needs to be factored in. Surely you yourself must have run into a situation in your life where what appears to be a simple task actually ends up being unexpectedly difficult for reasons you weren't immediately aware of or able to foresee?

Now imagine how much more unimaginably complex a piece of high-end software is, and how much of a twat you come across as to the people working on such things when you start demanding they wave a magic "optimization" wand at every issue irrespective of the difficulty involved.
I guess, but that also comes across as no small element of semantics to me. Capcom themselves have said they are working on further improving performance, beyond the toggles just announced. This is surely something that could have been done prior to release. To my layman's mind it reads as though they are further optimising the game. I'm happy to change that word to improving, if it helps.
 

Clear

CliffyB's Cock Holster
I guess, but that also comes across as no small element of semantics to me. Capcom themselves have said they are working on further improving performance, beyond the toggles just announced. This is surely something that could have been done prior to release. To my layman's mind it reads as though they are further optimising the game. I'm happy to change that word to improving, if it helps.

It's more PR management than semantics, to be honest.

The thing people tend to forget is that telling a coder to "optimize" something is such a woolly instruction it's basically meaningless!

For a start, like any other sort of work, the #1 thing is time-scale. Just because someone has the knowledge to do a thing, it doesn't follow that they can achieve it within a reasonable or designated time. Nobody's getting unlimited time or a blank cheque to do anything. They are going to be given a task and a schedule, and will attempt to do the best they can within those parameters.

The problem I have is the inference that if something isn't perfect, then the people responsible somehow fucked up, and can always un-fuck what they did in no time. I cannot think of any complex real-world problem that works like that.
 
Always remember the PS5 is the target platform for most games. It's the same issue the One X had: despite it being way more powerful than the 4 Pro, devs optimized for the Pro, minimizing the differences in performance.

You should blame Microsoft for this. If they had made the Series X massively more powerful, this would never happen. But since the two systems are similar, with each having its strengths, you end up getting comparisons like this.
 

Clear

CliffyB's Cock Holster
Complete nonsense.

In terms of memory budgeting and other non-negotiable resource allocations PS5 is most likely the target. Logic is pretty obvious; Series X is interchangeable for the most part, Series S likely has SDK/profiling tools to ease down-speccing based on X, and PC being an open platform has no real "floor", just an unlimited ceiling and is practically the lowest risk (although the most work).

Just because most of the development work is done on PC, it simply does not follow that it's also the lead SKU. I think the distinction evades quite a few people because they are unfamiliar with the hard strictures of working to closed systems.
 

fatmarco

Member
That's not optimization. That's just adding menu options to cut down on workload at the cost of certain features. Capping frame-rate ironically even throws away performance, because the only way you do that is by waiting idle when you finish a frame with time to spare in the cycle.

Proves my point that all these guys are is PC-bros who want more toggles and sliders to tinker with. That's the depth of their insight.
Where does he say optimization? He says "improvements", which an FPS cap objectively is.

I don't see how giving us options is a non-improvement in the context of a game where the developers objectively didn't set the game's settings up correctly to begin with. Most games with roughly this setup have launched with a non-RT "Performance" mode and an RT 30fps "Graphics" mode, both with frame rate caps (and sometimes a cap remover).

So again, these new "features" bring it in line with how the game should have launched to begin with, but my theory is they only wanted footage of the ray-traced global illumination mode going around in review coverage to ensure the game looked its best. It's pretty clear they had all these features ready to go, but intentionally delayed them for that purpose.
 

adamsapple

Or is it just one of Phil's balls in my throat?
This is a frame rate video from another YouTuber.
It's interesting that the frame rate of PS5 and XSX changes depending on the scene.
CPU or GPU drops, drops due to alpha effects, etc.


120Hz LFC is a band-aid solution and not the real answer to the game's performance, but it really does make a perceptible difference in cases like this.
 

fatmarco

Member
120Hz LFC is a band-aid solution and not the real answer to the game's performance, but it really does make a perceptible difference in cases like this.
Yeah, after playing at 60hz for about 10 hours, seeing the Digital Foundry review mention it, and then switching my Xbox to 120hz it genuinely started feeling a lot better to play.

Still ridiculously awful in the main city, but everywhere else where combat takes place it's been a clear improvement.
 

Poppyseed

Member
The graphics are amazing tho. The animations are amazing, HDR looks great. Particle effects and physics are too.

Seriously. Find some nice 4K feed of the ps5 version and check it out. The game looks way nicer in person, and it's a great game.
It runs like dogshit on ps5 but I don't have a problem with it. It still plays just fine, it's mostly 30, so all it could use is an fps cap
It does sound like there's a good game underneath the technical issues. I have a 4090 PC but I'm still going to wait for some patches.
 