
[DF] Guardians of the Galaxy: PS5 vs Xbox Series X/S - A Great Game But 60FPS Comes At A Cost

Topher

Gold Member
still tho, the 2060 runs it better than the PS5, that is not what you would expect.

it might not run well on PC either, but it seems to run better on equivalent PC parts than on consoles

not to mention that it takes a resolution hit of 75% and lower settings just to get the game running at an unstable 60fps... that is just fucking weird

This game actually runs better on an RTX 2060 than on either console. I'm wondering if it simply performs better on Nvidia than on AMD.
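For reference, the "75% resolution hit" quoted above is just pixel math: 4K has exactly four times the pixels of 1080p. A quick sanity check (plain Python, my own illustration, not anything from the DF video):

Code:
# 4K (quality mode) vs 1080p (performance mode) pixel counts
pixels_4k   = 3840 * 2160   # 8,294,400 pixels
pixels_1080 = 1920 * 1080   # 2,073,600 pixels

print(pixels_1080 / pixels_4k)      # 0.25 -> one quarter of the pixels
print(1 - pixels_1080 / pixels_4k)  # 0.75 -> the 75% cut mentioned above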
 
Last edited:

Bogroll

Likes moldy games
I remember it being very smooth, but I was also used to 30fps across the board for consoles last gen. I played primarily on PC for that reason. Now I find myself more open to playing multiplatform games on console because of the option for 60fps. These days I'm finding it extremely hard to go back to 30. Just feels like a jittery mess.



Yep. I was wondering if anyone would pick up on that. But, of course, there are plenty of people in that thread who are saying the lack of optimization is the reason XSX is running worse.

At this point, we might as well stop trying to declare winners because of the "optimization" card, which either side will play when it is convenient to do so. Well, that and the fact that the differences are so minuscule it really doesn't matter anyway.
Yes, but I would say the optimization card is a legitimate call in both games.
GR runs like a bag of shit on the Series consoles; the X version is miles from PS5, and the Series S version is a joke.
GOTG just doesn't seem to scale down very well, even at 1080p (1/4 the resolution) with lower settings on X and PS5.
 

Tripolygon

Banned
That 53fps is a min, not an avg. PS5 is below the 60fps cap, at 53fps, while the 2060S is at 65fps. It would be pointless to compare a segment that's locked at 60fps; that's why I didn't include the XSX and only made a passing comment.
Both consoles hover around 60 during normal gameplay, drop to the mid-to-high 50s during intense battles, and on rare occasions to the low 40s. In that opening cutscene, PS5 drops to the low 50s for a few seconds and recovers back to 60 if you look at the frame time graph. Again, the 60fps cap of the consoles lowers their average.
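To illustrate why the cap skews averages, here's a toy example (made-up frame-rate numbers, not DF's captures): a limiter clips everything above 60 but leaves the drops untouched, so a capped console will always report a lower average than an uncapped PC of identical performance.

Code:
# Same hypothetical GPU measured uncapped vs with a 60fps limiter
uncapped = [75, 72, 70, 68, 53, 55, 74, 71]   # instantaneous fps samples
capped   = [min(f, 60) for f in uncapped]     # what a 60fps cap reports

print(sum(uncapped) / len(uncapped))  # 67.25 avg uncapped
print(sum(capped) / len(capped))      # 58.5 avg capped, same hardware
print(min(capped))                    # 53 -> drops survive the cap intact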
 

S0ULZB0URNE

Member
1080p 30fps for XSS, with graphical cutbacks
Yikes!
 

CrustyBritches

Gold Member
Both consoles hover around 60 during normal gameplay, drop to the mid-to-high 50s during intense battles, and on rare occasions to the low 40s. In that opening cutscene, PS5 drops to the low 50s for a few seconds and recovers back to 60 if you look at the frame time graph. Again, the 60fps cap of the consoles lowers their average.
In that sequence, the lowest a Ryzen 1600 + 2060S goes is 63fps. It's at about 71fps avg / 63fps min at MAX settings. PS5 has factually lower performance; I don't understand what you're getting at.
 
Last edited:

Sosokrates

Report me if I continue to console war
All of those games have performance issues as well. I don't see your point.
Concern Concern said that it being 1080p @ 60fps on PS5/XSX is a reflection on the devs, not the hardware, and you said that this is not the case and that it rivals anything else out there, which is not true because there are plenty of cross-gen games on PS5/XSX that do 60fps at much higher than 1080p.
 

OmegaSupreme

advanced basic bitch
Concern Concern said that it being 1080p @ 60fps on PS5/XSX is a reflection on the devs, not the hardware, and you said that this is not the case and that it rivals anything else out there, which is not true because there are plenty of cross-gen games on PS5/XSX that do 60fps at much higher than 1080p.
Then the argument to be made is that this game is doing more than those. It's also brand new; it'll no doubt be patched. These guys made Shadow of the Tomb Raider. They aren't incompetent.
 

Tripolygon

Banned
In that sequence, the lowest a Ryzen 1600 + 2060S goes is 63fps. It's at about 71fps avg / 63fps min at MAX settings. PS5 has factually lower performance; I don't understand what you're getting at.
My point is that your comparison makes very little sense; a few seconds of an opening cutscene that drops to the low 50s and then recovers is not a good comparison point. Open a taxing level where both PS5 and XSX are ranging in the mid-50s; that will give you a much better picture of how well your 2060S is doing. Also note your boost clock in those moments, as that will tell you how hard the GPU is being pushed.
 

Sosokrates

Report me if I continue to console war
Then the argument to be made is that this game is doing more than those. It's also brand new; it'll no doubt be patched. These guys made Shadow of the Tomb Raider. They aren't incompetent.

Ironically, SOTTR runs at a much higher res.
Also, the visuals of GOTG seem pretty normal for an AAA gen-8 game.
 

SenkiDala

Member
I saw many people not believing the performance mode was 1080p, and then I got the game 2 days ago and I have to admit I agree with them. I was like "mmmh, no, it must be at least 1440p; I almost don't see any difference in image quality from the 4K mode", and it really is 1080p. Wow. It's one of the best 1080p presentations I've ever seen; the checkerboard must be responsible for this.

Btw, to me the performance mode is the way to go. The difference between 30 and 60 is, at least to my eyes, obvious - the 30fps mode is sluggish as hell - but the difference between 4K and 1080p in this game is fairly irrelevant. The performance mode is also the default mode when you launch the game, so it's the decision the developers made, and I understand why.
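For anyone wondering what checkerboarding actually does, here's a minimal toy sketch (my own simplification, not Eidos Montreal's actual reconstruction): each frame shades only half the pixels in a checkerboard pattern and fills the other half from the previous frame, so two frames together cover the full grid for roughly half the shading cost.

Code:
import numpy as np

h, w = 4, 4
yy, xx = np.mgrid[0:h, 0:w]
mask = (xx + yy) % 2 == 0          # checkerboard: half the pixel grid

prev_frame  = np.full((h, w), 1.0) # pixels shaded on the previous frame
new_samples = np.full((h, w), 2.0) # pixels shaded this frame

# Reconstruction: take fresh samples where the mask is set,
# reuse last frame's samples everywhere else
frame = np.where(mask, new_samples, prev_frame)
print(frame)                       # interleaved old/new samples

Real implementations add motion-vector reprojection and edge heuristics on top, but halving the shaded pixels per frame is the core of the trick.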
 
Looks like a bad port on all consoles. 1080p60 on the big bois with severely reduced settings is a joke. Just for comparison, the game runs at 1080p120 on Ultra on an RTX 2070S, and even a GTX 1070 manages to deliver above 60fps on Ultra.

I hope they patch the game. If I had to play it in the current state, XSX is the best version. I'd honestly wait, tho.
 
Last edited:

Topher

Gold Member
A 5700XT manages 90 fps at 1080p on ultra. The consoles are massively underperforming.

Eh?

The game is quite heavy on the GPU even at 1080p. It's not an apples-to-apples comparison, but the RX 5700's perf here at 1080p, below the Ultra setting, should give you an idea of how demanding it is.

48fps vs 54fps

PC
PBF2DXw.jpg

PS5
YDZvoDE.png

Not an XT, but still....

Edit: Actually, I saw some benchmarks on YouTube that were around 90fps as well for the XT, so... yeah... I don't know.
 
Last edited:

CrustyBritches

Gold Member
My point is that your comparison makes very little sense; a few seconds of an opening cutscene that drops to the low 50s and then recovers is not a good comparison point. Open a taxing level where both PS5 and XSX are ranging in the mid-50s; that will give you a much better picture of how well your 2060S is doing. Also note your boost clock in those moments, as that will tell you how hard the GPU is being pushed.
That scene starts very alpha-heavy from all the grass being rendered, and as you move towards the house it's less demanding. You're damage controlling and making it sound like a little drop and recovery; that shit goes on for 8 secs straight and NEVER recovers to 60fps in the DF clip.


Average clock during that scene for my 2060S is ~1920MHz with a 2010MHz peak. Furthermore, I'm using Ultra/max settings, so: "higher quality textures, improved texture filtering and - while subtle - more refined ambient occlusion. Geometry LODs also are much improved, meaning that pop-in is less obvious. Elsewhere, shadow draw and foliage density are boosted in quality mode too."

And I'm CPU-limited and this isn't even accounting for DLSS. Total victory for 2060S, nonetheless.
 
the damn 1070 outperforms the PS5... like, something is up here.
the 1070 usually had just barely better performance than the Xbox One X; the One X was somewhere between a 1060 and a 1070
Yep. Also puts the 1080p30 of the XSS into perspective. XSS should easily push 1080p60 at those settings.
 

CrustyBritches

Gold Member
This is definitely a game that favors Nvidia. The 2060S is somewhere around an RX 5700 in relative performance, and in your benchmark it has a ~27% advantage.

The rest can be explained by being CPU-limited. I'm CPU-limited on a Ryzen 1600. Flute benchmarks put PS5 CPU performance around a Ryzen 1700, and I'm sure they've reserved a core for the OS, and they have a lower clock speed (I'm running 3.8GHz).
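For what it's worth, if the ~27% is measured against the RX 5700's 48fps from the screenshot earlier in the thread (my reading; the exact inputs aren't stated), the implied 2060S figure is about 61fps:

Code:
# Back-of-envelope, assuming the 48fps RX 5700 result is the baseline
rx_5700_fps = 48
advantage   = 0.27
print(round(rx_5700_fps * (1 + advantage)))  # ~61fps implied for the 2060S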
 
Last edited:
This is definitely a game that favors Nvidia. The 2060S is somewhere around an RX 5700 in relative performance, and in your benchmark it has a ~27% advantage.

The rest can be explained by being CPU-limited. I'm CPU-limited on a Ryzen 1600. Flute benchmarks put PS5 CPU performance around a Ryzen 1700, and I'm sure they've reserved a core for the OS, and they have a lower clock speed (I'm running 3.8GHz).
According to that benchmark, a Ryzen 1600X can push 83 fps. So the consoles shouldn't be CPU limited when targeting 60 fps.

(The console CPUs are roughly 10-20% faster than a 1600X)
 

CrustyBritches

Gold Member
According to that benchmark, a Ryzen 1600X can push 83 fps. So the consoles shouldn't be CPU limited when targeting 60 fps.

(The console CPUs are roughly 10-20% faster than a 1600X)
I have a Ryzen 1600 (OC'd to 3.8GHz) and a 2060S OC'd to 1900-2000MHz core / 14400MHz mem, and I've run the built-in benchmark dozens of times in different configs. I'm up to ~74fps avg now at 1080p/Ultra.

I get almost no performance increase with DLSS enabled = CPU-limited, almost guaranteed.
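Spelled out, the rule of thumb here (my own framing of the reasoning, not a real profiler): DLSS lowers the internal render resolution, which only buys frames if the GPU was the limit, so near-zero scaling points at the CPU.

Code:
# Toy bottleneck check based on the reasoning above
def likely_cpu_bound(fps_native, fps_dlss, min_gain=0.05):
    # If DLSS buys less than ~5%, the GPU wasn't setting the frame rate
    return (fps_dlss - fps_native) / fps_native < min_gain

print(likely_cpu_bound(74, 75))  # True  -> the ~no-gain case described above
print(likely_cpu_bound(48, 70))  # False -> GPU-bound, DLSS helps a lot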
 
Last edited:
I have a Ryzen 1600 (OC'd to 3.8GHz) and a 2060S OC'd to 1900-2000MHz core / 14400MHz mem, and I've run the built-in benchmark dozens of times in different configs. I get almost no performance increase with DLSS enabled = CPU-limited, almost guaranteed.
Sure, it's CPU-limited, but at what frame rate? You shouldn't be limited to sub-60fps. PS5 sometimes drops to 40; the CPU doesn't explain such huge drops.
 

Md Ray

Member
Looks like a bad port on all consoles. 1080p60 on the big bois with severely reduced settings is a joke. Just for comparison, the game runs at 1080p120 on Ultra on an RTX 2070S, and even a GTX 1070 manages to deliver above 60fps on Ultra.

I hope they patch the game. If I had to play it in the current state, XSX is the best version. I'd honestly wait, tho.
Uh, not in this section, no. The 2070S PC at 1080p is ahead by a couple of percent, but by and large they're somewhat similar. I bet both PS5/XSX would get close to 120fps too (if unlocked) in areas where the 2070S is doing 120fps, with XSX probably matching the 2070S more often.

The settings are Ultra on the 2070S side. Note: there's a very small difference in perf between the lowest preset and Ultra on PC (it ranges from 5-8% on average), and the settings on XSX/PS5 in perf mode are probably in between PC's Low and Ultra, which means the perf cost of Ultra and the consoles' custom preset ought to be close.
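Taking the 5-8% figure at face value, the spread is small in absolute terms too (simple arithmetic on the numbers above, nothing I've measured):

Code:
# If Ultra does 120fps and Low is only 5-8% faster...
ultra_fps = 120
for gain in (0.05, 0.08):
    print(round(ultra_fps * (1 + gain)))  # 126 and 130 -> a 6-10fps window

So a console preset sitting anywhere between Low and Ultra should land within a few fps of the Ultra numbers.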

dnZtjIT.jpg


YDZvoDE.png
 

TrebleShot

Member
Relax, relax, relax. It just needs more time in the oven; a patch will sort most issues out. The game felt rushed out from announcement anyway.
 

NoviDon

Member
Insomniac would've had an upscaled 4K 60 version with limited raytracing, a native 4K 30 version with full raytracing, and a 1440p upscaled raytracing and high-performance version. And yet Square can't manage 1080p and a solid 60fps performance mode on current-gen systems? Please, this is pathetic.
 
Last edited:

Md Ray

Member
Eh?



Not an XT, but still....

Edit: Actually, I saw some benchmarks on YouTube that were around 90fps as well for the XT, so... yeah... I don't know.
That section in the screenshot is one of the heaviest you'll encounter in the beginning. Past this chapter, 2070S and 5700 XT class GPUs can do ~100fps and more... hence the confusion. The in-game benchmark isn't helping either, as it doesn't stress the HW like that one section does.
 

Md Ray

Member
Insomniac would've had an upscaled 4K 60 version with limited raytracing, a native 4K 30 version with full raytracing, and a 1440p upscaled raytracing and 60fps version. And yet Square can't manage 1080p and a solid 60fps performance mode on current-gen systems? Please, this is pathetic.
Tbf to Eidos Montreal, Insomniac had to deal with just one console spec for R&C as opposed to 7-8 different console versions EM had to work on for GotG (and that is excluding PC). It is not gonna be as easy.
 
Last edited:
Some people here need a dose of reality about the looks of Guardians of the Galaxy and the limits in manpower and budget of the team that developed it for many systems in parallel.

Here is the PC version, maxed out:



And here is some Ratchet and Clank: Rift Apart footage from me with the Raytracing Performance mode enabled (1440p/60):



So in Ratchet and Clank, the levels are generally bigger, and the game also features its standout point with its portals, which load entire new levels in less than a second.
Also, in battles there is more going on.

People really need to use their brains. Only really big studios will ever make (full) use of the consoles' low-level APIs. In a medium-budget AAA / all-platforms / all-generations title, the new consoles will never be used to their full potential.

That comparison above should make it pretty clear:
If properly used, the PS5 can output more than what Guardians of the Galaxy shows on PC @ Ultra settings, plus/minus a couple of adjustments.

I tried to show a comparably busy scene from both games. Make the experiment yourself: start both vids in their respective embedded windows. Which game looks better?

And remind yourself that this is PC Ultra vs a PS5 exclusive.
Then think again about what the PS5 version of Guardians of the Galaxy brings to the table.
 
Last edited:

ethomaz

Banned
I saw many people not believing the performance mode was 1080p, and then I got the game 2 days ago and I have to admit I agree with them. I was like "mmmh, no, it must be at least 1440p; I almost don't see any difference in image quality from the 4K mode", and it really is 1080p. Wow. It's one of the best 1080p presentations I've ever seen; the checkerboard must be responsible for this.

Btw, to me the performance mode is the way to go. The difference between 30 and 60 is, at least to my eyes, obvious - the 30fps mode is sluggish as hell - but the difference between 4K and 1080p in this game is fairly irrelevant. The performance mode is also the default mode when you launch the game, so it's the decision the developers made, and I understand why.
There is a massive difference in IQ from the 1080p mode to the 4K mode, and it is not just the higher resolution. It is probably one of the games with the biggest gap between Performance and Quality.

That is why people are questioning why they had to go so low in the 60fps mode… it really doesn't make sense.

DF in most cases chooses 60fps over 30fps, but in this case, due to the heavy compromises in the 60fps mode, they chose to recommend the 4K30 mode.
 
Last edited:

ManaByte

Member
Some people here need a dose of reality about the looks of Guardians of the Galaxy and the limits in manpower and budget of the team that developed it for many systems in parallel.

Here is the PC version, maxed out:



And here is some Ratchet and Clank: Rift Apart footage from me with the Raytracing Performance mode enabled (1440p/60):



So in Ratchet and Clank, the levels are generally bigger, and the game also features its standout point with its portals, which load entire new levels in less than a second.
Also, in battles there is more going on.

People really need to use their brains. Only really big studios will ever make (full) use of the consoles' low-level APIs. In a medium-budget AAA / all-platforms / all-generations title, the new consoles will never be used to their full potential.

That comparison above should make it pretty clear:
If properly used, the PS5 can output more than what Guardians of the Galaxy shows on PC @ Ultra settings, plus/minus a couple of adjustments.

I tried to show a comparably busy scene from both games. Make the experiment yourself: start both vids in their respective embedded windows. Which game looks better?

And remind yourself that this is PC Ultra vs a PS5 exclusive.
Then think again about what the PS5 version of Guardians of the Galaxy brings to the table.


I played through Guardians on the PS5 (and Platinumed Ratchet). Enjoyed Guardians a lot more. Thought it was more visually impressive, especially the character models (GAMORA!), and the variety in the worlds you visited and how even when you revisited a location they managed to make it look different so it didn't feel like a retread. And Guardians has no loading screens either (unless you're loading a save).
 
I played through Guardians on the PS5 (and Platinumed Ratchet). Enjoyed Guardians a lot more. Thought it was more visually impressive, especially the character models (GAMORA!), and the variety in the worlds you visited and how even when you revisited a location they managed to make it look different so it didn't feel like a retread. And Guardians has no loading screens either (unless you're loading a save).
I don't think any in-game character in other games can hold up against Ratchet or Rivet with all their fur. (Edit: and DF claims they use actual LOD 0 models, like those usually used in cutscenes in other games.) And the levels... I don't know, the ones in Guardians of the Galaxy, from the videos I have seen, do nothing really for me.
They look good at times, but a lot of the time it looks rather cheap.
But my post was anyway aimed more at showing how poorly optimized the PS5 version has to be for it to achieve so much less in comparison to R&C.
 
Last edited:
I played through Guardians on the PS5 (and Platinumed Ratchet). Enjoyed Guardians a lot more. Thought it was more visually impressive, especially the character models (GAMORA!), and the variety in the worlds you visited and how even when you revisited a location they managed to make it look different so it didn't feel like a retread. And Guardians has no loading screens either (unless you're loading a save).

All the tunnels were loading screens
 