
Call of Duty Vanguard PS5 vs Xbox Series X|S Frame Rate Comparison

I know, I watched the interview. But they still didn't implement VRS on PC. Why? I just don't get it.
It's only compatible with newer hardware: the RX 5000 series and GTX 1000 series don't support it, for example. Maybe that was the reason. Then again, they implemented ray tracing despite all of this, so maybe that's not a good reason.
 
Last edited:

Arioco

Member
It's only compatible with newer hardware. RX 5000 series and GTX 1000 series don't support it, for example. Maybe that was the reason.


Exactly like Ray Tracing, it's only supported by newer hardware, and devs did implement RT on PC. 🤷‍♂️

What's more, the graphics cards that support RT are THE SAME ones that support VRS. Why implement one and not the other, when the devs said they wish they could implement VRS on all versions?
 

Cherrypepsi

Member
you can't make this up

people arguing about which technique is better to make parts of the screen look like shit

have we reached a new low in console wars?

are you guys still gaming?

 

Riky

$MSFT
Did they use it on Gears 5 PC? Maybe they thought it's not necessary as you can brute force your way on PC.
 

Darsxx82

Member
Sure, all techs can be improved. But the hardware limitation of RDNA2 VRS will always be there. It seems to be a big problem as well, since the devs apparently mentioned it specifically to Leadbetter.
At no point do they say that using it causes problems, much less serious ones LOL

Again, I don't think you know what you're discussing, or the implications of the comparison. You're not framing the thread correctly.
You're talking as if every studio is going to implement, pay for and adapt their games to a VRS solution of their own creation, like Activision did for COD, to the point that hardware VRS makes no sense.

I repeat once more... COD's software VRS is designed solely for the COD engine, built ad hoc. Even the game's base art and atmosphere are tuned so that the result looks good (many night phases, dark and shadowed places, muted colors...).

Software VRS is something you will see in very, very few cases, and generally in proprietary engines.
VRS Tier 2 is open to any graphics engine, and implementing it requires no extra work. The implication is what has already been seen in several multiplatform games: XSX benefits from its use while PS5 does not, because it lacks the dedicated hardware.
The question is whether its use in XSX cross-platform games becomes widespread in the future. And of course its results will improve; the excellent result in Gears is only the first step.
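To make the hardware/software distinction concrete: both approaches boil down to producing a per-tile shading-rate map, the difference being whether fixed-function hardware consumes it (Tier 2) or the engine's own shaders apply it. Here is a minimal illustrative sketch in Python of the kind of heuristic involved; the tile size, thresholds and rate-selection rule are all invented for the example, not anything from COD's actual implementation:

```python
# Illustrative sketch of the idea behind software VRS: shade screen tiles
# with little visible detail (dark or low-contrast regions) at a coarser
# rate. Tile size, thresholds and the heuristic are invented for this
# example; this is NOT Activision's actual implementation.

TILE = 8  # assumed tile size in pixels

def tile_rate(luma_tile, low_contrast=4.0, dark=16.0):
    """Pick a shading rate ('1x1', '2x2', '4x4') for one tile of 0-255 luma."""
    lo, hi = min(luma_tile), max(luma_tile)
    if hi < dark:               # very dark tile: detail is barely visible
        return "4x4"
    if hi - lo < low_contrast:  # flat tile: coarse shading is invisible
        return "2x2"
    return "1x1"                # detailed tile: full-rate shading

def build_rate_map(luma_rows):
    """luma_rows: list of rows of luma values; returns a grid of tile rates."""
    h, w = len(luma_rows), len(luma_rows[0])
    grid = []
    for ty in range(0, h, TILE):
        row = []
        for tx in range(0, w, TILE):
            tile = [luma_rows[y][x]
                    for y in range(ty, min(ty + TILE, h))
                    for x in range(tx, min(tx + TILE, w))]
            row.append(tile_rate(tile))
        grid.append(row)
    return grid
```

This also shows why the "dark places, shadows, muted colors" point above matters: the more of the frame that falls under the dark/low-contrast thresholds, the more the technique saves.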
 

Arioco

Member
Did they use it on Gears 5 PC? Maybe they thought it's not necessary as you can brute force your way on PC.


In my opinion, the PC is where VRS could be most useful (if and when it provides enough extra performance). PC users often like to play at frame rates way higher than the 120 fps we've seen on consoles, and VRS could help them get there. That's another reason I don't get why they decided not to implement it on PC.
 

Lysandros

Member
Not sure that's a good example. Doom Eternal runs at an almost perfect 120 fps even on hardware that doesn't support VRS Tier 2, so if the game runs at a rock-solid 120 fps it's probably thanks to the engine itself, which is amazing and one of the best in the business, and not its particular VRS implementation. At best, VRS Tier 2 allows for an overall higher resolution (at the expense of degrading certain parts of the image, of course, so it's a trade-off). The game uses DRS and the average resolution is often higher on Series X than on PS5, but how much of that extra resolution is thanks to VRS, and how much thanks to the compute and bandwidth advantage the Series X has even before we factor in VRS?
I find this very debatable, especially at the whole-system level, where PS5 is in a more comfortable position on the SSD and GPU-cache bandwidth side. Further into the generation, it's quite possible we'll be talking more about an XSX bandwidth 'disadvantage' than an advantage.
 
Last edited:

Lysandros

Member
It's a technical tie and that's fine.

Though according to Digital Foundry, the Xbox devkits got a major software upgrade that substantially improves performance across all Series consoles. Back in January.
You'll see. Anytime now.
So you foresee a change in favor of XSX?

Edit: Nevermind, missed the sarcasm.
 
Last edited:

ToTTenTranz

Banned
I think it was sarcasm. But as we have seen recently even PS5 has performance improvements with updates.
Yes. Both consoles are expected to push more effectively from the hardware as developers get more time to try out new things.

IMO the guys at DF are/were just trying to save face after being thrown off-balance when they started testing the games and measuring similar performance between the two consoles. And they have been doing so for the past year.

They got on the GitHub 8-9 TF train. Then the PS5 came out with RDNA2 GPU clocks that left it with roughly an 18% compute deficit versus the Series X but around 20% higher fillrate and geometry setup. Then they banked hard on the "it's probably downclocking all the time" theory because they were still stuck on that GitHub idea, until Cerny came out to clarify that downclocks are rare.
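For what it's worth, those percentages roughly check out against the commonly quoted GPU specs. A quick back-of-the-envelope in Python; the CU counts, clocks, and the 64-ROP figure for both GPUs are the widely reported numbers, assumed here rather than confirmed anywhere in this thread (and the fillrate gap actually comes out a bit above 20%):

```python
# Back-of-the-envelope check of the 18% / 20% figures, using commonly
# quoted GPU specs (Series X: 52 CUs @ 1825 MHz; PS5: 36 CUs @ ~2233 MHz;
# both reported with 64 ROPs). These specs are assumptions for the example.
# FP32 TFLOPS = CUs * 64 lanes * 2 ops/clock * clock.

def teraflops(cus, mhz):
    return cus * 64 * 2 * mhz * 1e6 / 1e12

xsx_tf = teraflops(52, 1825)  # ~12.15 TF
ps5_tf = teraflops(36, 2233)  # ~10.29 TF

compute_gap = (xsx_tf - ps5_tf) / ps5_tf  # ~18% in Series X's favour

# With equal ROP counts, pixel fillrate scales with clock alone:
fillrate_gap = (2233 - 1825) / 1825       # ~22% in PS5's favour
```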

Then they probably heard of the Series X getting a devkit upgrade that was supposedly solving the memory contention issues, and banked on that as well.
The Series X/S devkit probably did get an upgrade that solved those issues, but it's not like Sony stood still and didn't come up with optimizations of their own.

Then we had Alex go on forums claiming the PS5 is a less advanced console because that same hardware was supposed to release in 2019, and that DirectX12U VRS Tier 2 was the next big thing (which is actually getting replaced by software VRS in some modern engines)... and again, that was neither proven nor validated with actual results.

I guess that's why they made such a big fuss around Control in screenshot mode. They finally got the "20% faster" vindication they were desperately trying to find... on a crossgen game and in a mode that is not meant for actual gaming. Yay..




Thankfully, both consoles have very similar raw performance and that simply means no version is getting capped by having to adapt the games to a slower competition. It's just the best outcome for gamers.
 
Last edited:
Nice summary!
 

Lysandros

Member
Nice documentary.
 

Shmunter

Member
Where did they say it's a "big problem"? Of course those developers are talking up their solution, but the proof of the pudding is in the eating, as they say.

Gears 5 and Doom Eternal run at practically locked 120fps with hardware Tier 2 VRS whilst Call Of Duty runs anywhere from just over 60fps to sometimes 120fps.
It's not even close.

Also the Coalition said this,

"While we were able to implement VRS for all the passes that gave us the biggest bang for the buck, it was not plumbed into the entire engine due to time constraints. A deeper integration would allow VRS to provide even larger GPU savings."
"Talking up their solution"... The Coalition (an MS studio) talking up an Xbox feature 🤪

Comparing framerates across different games and engines? *Shakes head*

Do better, Riky!
 

Shmunter

Member
That's very weird. Not that I'm a fan of VRS of any kind anyway, but Richard himself said in the Doom Eternal next-gen patch analysis that hardware VRS Tier 2 was "way more interesting, much higher quality". And now it's been beaten by a software solution this early in the generation? And the downside of the software solution is that it's more difficult to implement? Come on...
Also, pretty much all AA methods have gone software-based with temporal solutions, moving away from built-in hardware. Flexibility and quality can be refined; hardware is static and immovable. It seems hardware VRS was made redundant before it even got a chance to be used.
 