
VGTech: Cyberpunk 2077 PS5 vs Xbox Series X|S Frame Rate Comparison (Next-Gen Update)

Sosokrates

Report me if I continue to console war
Why should they? It's not their fault that some people were fooled into believing that a theoretical peak TF number is the only thing that matters.

We even had a developer from Crytek explain why it's not the most important thing but he was promptly hounded and silenced.

The funny thing is we will probably go this entire gen without knowing the reasons why the PS5 has been punching above its weight.

Alex from DF (@34:42) thinks the industry is heading in a direction that leverages compute more, which will see greater returns for the XSX, and that the PS5 is seeing greater gains at the start because it has higher-clocked fixed-function hardware.

 

Mr Moose

Member
So I looked at the two screenshots outside Tom's Diner and zoomed in, because it's the one with the biggest difference.

XSX vs PS5, performance mode

can you tell which one is which?

[Image: OGRsvAU.png]
Right is PS5.
 

DJ12

Member
The funny thing is we will probably go this entire gen without knowing the reasons why the PS5 has been punching above its weight.

Alex from DF (@34:42) thinks the industry is heading in a direction that leverages compute more, which will see greater returns for the XSX, and that the PS5 is seeing greater gains at the start because it has higher-clocked fixed-function hardware.


That's it then, Alex has said so.

Couldn't pick a more pointless commentator's opinion.

He doesn't understand closed boxes, or even why AMD don't have more than 10 shaders in an array.
 
Wow, I'm starting to wonder if the trade-off in image quality is even worth it right now
VRS makes no difference in image quality, it only reduces details in invisible portions of the screen and gives you a 20% increase in performance for essentially no drawbacks!

They probably use a software-based version, otherwise it would be amazing how the Series consoles would perform and look; you would not be able to believe your own eyes!
 

Lysandros

Member
VRS makes no difference in image quality, it only reduces details in invisible portions of the screen and gives you a 20% increase in performance for essentially no drawbacks!
That's quite an idealised description of VRS. It essentially reduces shading resolution in portions that are not always so 'invisible'. A good implementation can be very subtle and worth the (minimal) IQ loss, of course. The performance gain is more like ~5% to 20% in general.
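For anyone curious what this actually looks like at the API level, here's a minimal D3D12 sketch of the two VRS tiers (this is the PC DirectX 12 Ultimate shape of the feature, not whatever CDPR ships on console; device and tier checks are omitted):

```cpp
// Minimal VRS sketch (illustrative only). Assumes the device reports
// D3D12_VARIABLE_SHADING_RATE_TIER_1 or TIER_2 support.
#include <d3d12.h>

void ApplyVrs(ID3D12GraphicsCommandList5* cmd, ID3D12Resource* rateImage)
{
    // Tier 1: one coarse rate for the following draws. 2x2 shades one
    // value per 2x2 pixel block (~75% fewer pixel shader invocations on
    // covered geometry), which is cheap but visible if used everywhere.
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);

    // Tier 2: a screen-space image selects the rate per hardware tile, so
    // a careful implementation only coarsens low-contrast, shadowed or
    // motion-blurred regions; that is where the "subtle" part comes in.
    D3D12_SHADING_RATE_COMBINERS combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH, // keep the per-draw rate
        D3D12_SHADING_RATE_COMBINER_OVERRIDE,    // let the rate image win
    };
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
    cmd->RSSetShadingRateImage(rateImage);
}
```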

Edit: With that, I missed the sarcasm I think; that's the second time.
 
Last edited:

Sosokrates

Report me if I continue to console war
That's it then, Alex has said so.

Couldn't pick a more pointless commentator's opinion.

He doesn't understand closed boxes, or even why AMD don't have more than 10 shaders in an array.

It's not just Alex who says this; in a recent Epic presentation, the engineer said that more GPU compute is the most useful thing to them.
 

ethomaz

Banned
VRS makes no difference in image quality, it only reduces details in invisible portions of the screen and gives you a 20% increase in performance for essentially no drawbacks!

They probably use a software-based version, otherwise it would be amazing how the Series consoles would perform and look; you would not be able to believe your own eyes!
Invisible portions are not rendered.
VRS is applied to visible parts.
It can be subtle or very noticeable but it indeed downgrades the IQ for better performance.
And… no, 20% better performance is a hypothetical case and probably only happens in cases where the VRS is very noticeable.

If you take Gears' use of VRS Tier 2 as an example, they could get up to 14% better performance… and getting near 14% is probably a very rare case.
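To put a number like that in frame-time terms, here's a quick back-of-envelope (my arithmetic only, assuming a fully GPU-bound frame, which is the best case for VRS):

```cpp
// Back-of-envelope: what an N% GPU-side saving buys at 60 fps.
// Pure arithmetic, not measured data.
#include <cstdio>

int main()
{
    const double frameMs = 1000.0 / 60.0;           // 16.67 ms budget at 60 fps
    const double saving  = 0.14;                    // the Gears-style "up to 14%"
    const double newMs   = frameMs * (1.0 - saving);
    std::printf("%.2f ms -> %.2f ms (%.1f fps -> %.1f fps)\n",
                frameMs, newMs, 1000.0 / frameMs, 1000.0 / newMs);
    // Prints roughly: 16.67 ms -> 14.33 ms (60.0 fps -> 69.8 fps).
    // In a DRS game the saving shows up as higher resolution, not higher fps.
    return 0;
}
```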

Edit - Wait… sarcasm?
 
Last edited:

sircaw

Banned
Damn man, some of y'all are relentless. Here you go, same phone under different lighting and a steadier hand lol.

Bonus Duck Donuts.




[Image: img-20220221-wa0003i6kun.jpg]
I hope those donuts have not been sitting on the shelf like that copy waiting to be unwrapped.

Might need to change the name to Yuk Donuts in that case :lollipop_disappointed:
 

Dream-Knife

Banned
So I looked at the two screenshots outside Tom's Diner and zoomed in, because it's the one with the biggest difference.

XSX vs PS5, performance mode

can you tell which one is which?

[Image: OGRsvAU.png]
Right looks better. Colors pop more, but I'm colorblind.
I was joking, 5% isn't shit. Also I ignored Digital Foundry's because it was Tom.
What's wrong with Tom? I like him better than the other two idiots (not Richard OG).
So basically the Matrix demo.
Basically all games in the coming years.
 

Mr Moose

Member
Right looks better. Colors pop more, but I'm colorblind.

What's wrong with Tom? I like him better than the other two idiots (not Richard OG).

Basically all games in the coming years.
I haven't trusted a word from Tom's mouth since his Genshin video; the dude is blind as fuck. And he said FIFA was 4K on Series S. He is constantly wrong.


Character detail is higher on PS5.
 

GHG

Gold Member
Oh you're from Dubai fam?

I'm also a Middle Eastern boy who now lives in the US. I haven't been to Dubai in years but I really want to go back there to see how much it's changed over the last 5 years.

Not from here, just your stereotypical British expat.

A lot has changed in 5 years; it feels like every other month there's something new to see or do here, which keeps things fresh. It's funny because every time I go back to the UK I feel like absolutely nothing has changed. It's strange to see from that perspective.
 
IMO CP2077 is a good benchmark; this game was not even meant for previous-gen consoles. I am sure they can optimize the engine more in the future, but this is what to expect this gen from these consoles. Unreal 5 games will be similar in performance too.
Yeah. CDPR has always been a PC-first developer, but it hadn't really run into problems with console ports before, mainly because the previous games came fairly early in the gen (Witcher 2 on 360, Witcher 3 on PS4/Xbone), when the hardware was somewhat in line with lower-end PC specs. By the time Cyberpunk came out they were last-gen consoles and waaaaaaay below minimum PC specs. It's like they somehow expected the porting process to go as smoothly as it did before, when in reality they were basically trying to fit a square peg in a round hole.
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
How can the XSX have a higher resolution all the time if it has a lower floor!?

That's the RT mode; these are VGTech's recorded readings in Perf mode. SX almost always has a very slight advantage:


Kabuki Entrance - PS5: 2176x1224, Series X: 2304x1296
Near Police Station - PS5: 2435x1370, Series X: 2560x1440
Outside Tom's Diner - PS5: 2506x1410, Series X: 2631x1480
Corpo Start Building - PS5: 2656x1494, Series X: 2744x1544
Streetkid Start - PS5: 2062x1160, Series X: 2062x1160


Even then, a single lower recorded reading doesn't mean the average isn't higher in RT mode either.
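Those per-scene numbers are the classic signature of a dynamic resolution controller settling at whatever scale fits the frame budget. A generic sketch of the feedback loop (my illustration of the common pattern, not CDPR's actual code; the constants are assumptions):

```cpp
// Generic dynamic-resolution feedback loop (illustrative sketch).
// Each frame, nudge the axis scale so GPU time tracks the 16.6 ms budget.
#include <algorithm>
#include <cmath>

struct DrsState {
    double scale    = 1.0;   // fraction of the max axis resolution (1440p here)
    double minScale = 0.80;  // floor, e.g. 1160/1440 seen on both consoles
};

void UpdateDrs(DrsState& s, double gpuMs, double budgetMs = 16.6)
{
    // GPU cost scales roughly with pixel count (scale^2), so step the
    // axis scale by the square root of the budget/actual ratio.
    const double step = std::sqrt(budgetMs / gpuMs);
    s.scale = std::clamp(s.scale * step, s.minScale, 1.0);
    // The same loop on a slightly faster GPU settles at a slightly higher
    // scale, which is all a consistent ~5% resolution gap really tells you.
}
```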
 

Elios83

Member
PS5 even better than what Tom was saying?
I'm shocked 👀 :goog_eek:

Btw at this point it's sad to see the usual people who are still trying to find winners between pretty much identical versions.
PS5 and XSX have proved to be the same in real world performance.
This will be the situation until and IF mid-gen refreshes are released.
 
That's the RT mode; these are VGTech's recorded readings in Perf mode. SX almost always has a very slight advantage:





Even then, a single lower recorded reading doesn't mean the average isn't higher in RT mode either.
Yep, my mistake, but I would add that the dynamic nature of this makes it silly to say one way or the other who "wins".

For example, on average when the res dips hard you are more likely to see the higher res on the XSX, but that's not to say there are no instances where res favors PS5. Much like on average when the FPS takes a hit you are more likely to see higher FPS on PS5, but that's not to say there are no instances where the FPS favors XSX. Picking a winner here is just tribalism, that is all.

Key indicators that it's really close:

"But but but VRR"

And

"Not bad for a $400 console"
 
Last edited:
Yep, my mistake, but I would add that the dynamic nature of this makes it silly to say one way or the other who "wins".

For example, on average when the res dips hard you are more likely to see the higher res on the XSX, but that's not to say there are no instances where res favors PS5. Much like on average when the FPS takes a hit you are more likely to see higher FPS on PS5, but that's not to say there are no instances where the FPS favors XSX. Picking a winner here is just tribalism, that is all.

Key indicators that it's really close:

"But but but VRR"

And

"Not bad for a $400 console"

I understand that winning is winning but sometimes it’s by so little that it doesn’t really matter. I doubt that most people are even going to notice these differences. Maybe between the XSS and the other two systems but not between the premium ones.
 
higher clocks with liquid metal thermal paste. not black magic at all.
Which is a first for consoles. Consoles have traditionally been lower-clocked but with bigger/more CUs, and now the PS5 is the first narrow but high-clocked system. Which is also why it's the first to use liquid metal.
 
I think it's basically even; the PS5 is punching above its weight, but in a game with this much streaming (at least while driving) it's honestly what I'd expect, and that's a niche case.

I'm playing in ray tracing mode (30fps) on SX and I still get noticeable FPS drops blitzing through busy unknown areas at 180 MPH while getting bombarded with phone calls (I even crashed the game doing this). But every framerate drop there just seems like an IO hitch, similar to what you got in Arkham Knight on old-gen consoles if you went too far too fast: it basically pauses for a second trying to catch up (although the pauses are shorter in CP2077). Stuff like that I would assume the PS5 might handle better thanks to the faster IO and the slightly smaller file size. The more static indoor stuff the SX probably takes, when the streaming is negligible.
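That "pauses to catch up" behavior is what open-world streamers tend to do when the IO backlog outruns the drive. A bare-bones sketch of the usual pattern (my illustration; all names and numbers are hypothetical):

```cpp
// Bare-bones world-streaming loop (illustrative; names/numbers hypothetical).
// If requests pile up faster than the drive drains them, you get exactly the
// "hitch while it catches up" behavior described above; a faster IO path
// just raises the speed at which that starts happening.
#include <cstddef>
#include <deque>

struct StreamRequest { std::size_t bytes; };

class WorldStreamer {
public:
    void Enqueue(StreamRequest r) { pending_.push_back(r); }

    // Called once per frame with the drive's effective budget for that frame.
    // Returns false when the backlog is deep enough to force a hitch.
    bool Update(std::size_t bytesPerFrameBudget)
    {
        std::size_t drained = 0;
        while (!pending_.empty() && drained < bytesPerFrameBudget) {
            drained += pending_.front().bytes;  // pretend the read completed
            pending_.pop_front();
        }
        return pending_.size() < kBacklogHitchThreshold;
    }

private:
    static constexpr std::size_t kBacklogHitchThreshold = 64; // made up
    std::deque<StreamRequest> pending_;
};
```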
 
Last edited:

SlimySnake

Flashless at the Golden Globes
IMO CP2077 is a good benchmark; this game was not even meant for previous-gen consoles. I am sure they can optimize the engine more in the future, but this is what to expect this gen from these consoles. Unreal 5 games will be similar in performance too.
Yeah, the Matrix Awakens demo basically sealed the deal. When they first revealed the Valley of the Ancient demo, they said that for software Lumen, BOTH the PS5 and XSX would target 1440p 30 fps, and for hardware-accelerated Lumen using RT cores, they would be targeting 1080p 30 fps. Six months later, it proved to be completely accurate. Both dropped frames in the same sections. Both ran at the same locked resolution. They are both essentially equal, with each having its own strengths that come out in rare scenarios.

IIRC, the original Cyberpunk demo ran at 1080p 30 fps on a 1080 Ti. Both PS5 and XSX are basically equivalent to that card in standard rasterization. It is pretty remarkable that CD Projekt managed to double the FPS of the original vertical slice while making it look better, though I know the crowd and some other features were downgraded.
 

ChiefDada

Gold Member
Interesting that it was so easy but MS engineers couldn't figure it out. You should apply for a job there, you clearly have a talent 👏

It's not that they necessarily couldn't figure it out; it's just that Microsoft's architecture strategy diverged from AMD/Sony's approach of higher clocks with efficient cache. I don't know why they did this when they chose RDNA 2, but maybe they have undisclosed proprietary hardware/software that justifies their chosen design as it relates to memory.
 

Elios83

Member
I think the fact that the clocks are not extremely variable is the mysterious part.
It's not mysterious.
The system was designed with a power cap high enough that it doesn't need to downclock in pretty much any real-world code execution situation.
You could probably make a synthetic stress test out of only the instructions associated with peak power consumption and you would see a downclock. But that's not a real-world scenario, and that's the reasoning behind Cerny's approach.
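A toy model of that "fixed power budget, variable clock" idea as Cerny described it publicly (my sketch only; the wattage cap, the calibration point and the cubic power law are all assumptions for illustration):

```cpp
// Toy model of a power-capped boost clock (illustration of the publicly
// described idea, not Sony's actual controller; constants are made up).
#include <cmath>

double GpuClockMHz(double activity,           // 0..1, modelled instruction-mix "heat"
                   double capWatts = 180.0,   // assumed SoC power budget
                   double maxClock = 2230.0)  // PS5 GPU frequency ceiling (MHz)
{
    // Calibrate so a heavy but realistic mix (activity = 0.9) just fits the
    // cap at full clock; dynamic power is modelled as activity * f^3, since
    // frequency drags voltage along with it.
    const double k = capWatts / (0.9 * std::pow(maxClock, 3.0));
    const double power = activity * k * std::pow(maxClock, 3.0);
    if (power <= capWatts)
        return maxClock;                       // the common real-world case
    // Only an unusually hot mix exceeds the cap, and because power ~ f^3 a
    // small cube-root downclock sheds a lot of power (activity = 1.0 gives
    // ~2153 MHz here, a ~3.5% drop).
    return maxClock * std::cbrt(capWatts / power);
}
```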
 