
Microsoft Game Stack VRS update (Series X|S) - Doom Eternal, Gears 5 and UE5 - 33% boost to Nanite Performance - cut deferred lighting time in half

Loxus

Member
Architecture has little or nothing to do with it. Nvidia's GPUs were the first to include VRS-compatible hardware. That it is present in XS and AMD RDNA 2 GPUs has a lot to do with the overlapping development timelines and the cooperation between both companies.
The fact is that there are already quite a few games out there using VRS on XS consoles and PC (Nvidia/AMD), but not in the PS5 version. And no, Nvidia's focus with VRS is not eye tracking; that's just one of its uses. Nvidia's purpose is to include in its GPUs all the technology possible, plus the hardware needed to run it at the lowest possible cost.

Finally, you are right: the process of optimizing and making better use of hardware applies to any platform. Here, however, we are talking about improvements in the optimization and use of a technology that seems to be unique to Xbox consoles. As I said, it may end up being little used or useless in the future, but it is there, and at least the Xbox first-party studios (which are many, using very different engines) are putting emphasis on developing and using this technology in their games on XS consoles to gain better performance.
Do you know how Nvidia GPUs do VRS?
Cause it isn't the same way as AMD, but somehow Nvidia GPUs can do VRS just fine without RB+ ROPs.
 

Riky

$MSFT
"It's also interesting to note that Xbox Series consoles use the hardware-based tier two VRS feature of the RDNA2 hardware, which is not present on PlayStation 5. VRS stands for variable rate shading, adjusting the precision of pixel shading based on factors such as contrast and motion. Pre-launch there was plenty of discussion about whether PS5 had the feature or not and the truth is, it doesn't have any hardware-based VRS support at all"

"In combination with Series X's extra GPU grunt, the use of variable rate shading may explain why the Microsoft console achieves higher resolutions overall.'

DF spell it out nice and clearly.
 

Darsxx82

Member
Do you know how Nvidia GPUs do VRS?
Cause it isn't the same way as AMD, but somehow Nvidia GPUs can do VRS just fine without RB+ ROPs.
Sure, not in the same way as AMD or XSX (totally different architectures), but the important point is that it includes specific hardware so the feature can be used at the lowest cost. It is exactly the same as having or not having hardware for RT: if you don't have specific hardware on your GPU, using the technology doesn't make sense. That you see games using VRS on XS, Nvidia and AMD RDNA2 but not on PS5 says a lot, I think. Add to that the fact that Sony never mentions it, while MS does not stop highlighting it and its use on XSX...
 

assurdum

Banned
Sure, not in the same way as AMD or XSX (totally different architectures), but the important point is that it includes specific hardware so the feature can be used at the lowest cost. It is exactly the same as having or not having hardware for RT: if you don't have specific hardware on your GPU, using the technology doesn't make sense. That you see games using VRS on XS, Nvidia and AMD RDNA2 but not on PS5 says a lot, I think. Add to that the fact that Sony never mentions it, while MS does not stop highlighting it and its use on XSX...
Sony never says much about anything inside the PS5, just vague chats and the Cerny presentation. That doesn't mean the PS5 is an empty box.
 
"It's also interesting to note that Xbox Series consoles use the hardware-based tier two VRS feature of the RDNA2 hardware, which is not present on PlayStation 5. VRS stands for variable rate shading, adjusting the precision of pixel shading based on factors such as contrast and motion. Pre-launch there was plenty of discussion about whether PS5 had the feature or not and the truth is, it doesn't have any hardware-based VRS support at all"

"In combination with Series X's extra GPU grunt, the use of variable rate shading may explain why the Microsoft console achieves higher resolutions overall.'

DF spell it out nice and clearly.
Nothing against the transcription specifically, but a source would be helpful for anyone interested in double-checking. It's the right way to do it.

EDIT: Source: https://www.eurogamer.net/articles/digitalfoundry-2021-doom-eternal-next-gen-patch-tested
 
Last edited:

onesvenus

Member
They are. VRS and sparse lighting completely off. These things were not completely off on the UE5 or Alpha Point demo.
How would you compute the performance increase of VRS on its own if it's not disabling all other performance optimizations?

It's obvious they are only talking about the effect VRS has on rendering when nothing else is used. That's the way you measure techniques, although it's true that it doesn't give you the full picture.
 

Loxus

Member
Sure, not in the same way as AMD or XSX (totally different architectures), but the important point is that it includes specific hardware so the feature can be used at the lowest cost. It is exactly the same as having or not having hardware for RT: if you don't have specific hardware on your GPU, using the technology doesn't make sense. That you see games using VRS on XS, Nvidia and AMD RDNA2 but not on PS5 says a lot, I think. Add to that the fact that Sony never mentions it, while MS does not stop highlighting it and its use on XSX...
Oh, so only Nvidia and Microsoft can have custom hardware but not Sony?

Look at the Tempest Engine, the I/O Complex and the CPU's FPU.
Sony has many custom parts not found in other APUs/GPUs.

Why wouldn't they have something custom for VRS, especially since it's needed for Foveated Rendering?
 

Corndog

Banned
VRS is not a good thing… we are supposed to be getting better image quality with new generations, not worse.

I hate every single technique like this that's designed to save performance at the cost of quality, like screen space effects, and damn can they die already.

Some of this is just too funny to me: we had stable reflections in Mario 64 but not in games from 2021 xD

How is a lower quality feature something to be bragged about?!
Checkerboard rendering?
 

Schmick

Member
Oh, so only Nvidia and Microsoft can have custom hardware but not Sony?

Look at the Tempest Engine, the I/O Complex and the CPU's FPU.
Sony has many custom parts not found in other APUs/GPUs.

Why wouldn't they have something custom for VRS, especially since it's needed for Foveated Rendering?
With all due respect, it's the Sony fans who have brought the PS5 into the conversation. This thread was started following a tech talk from MS and solely discusses that talk.

And you seem to have missed the first part of the response to the question, which, looking back, now appears to be a loaded question. Nowhere is it suggested that the PS5 does not have an equivalent solution. Darsxx82 clearly suggests that different techniques are used.

Sure, not in the same way as AMD or XSX (totally different architectures), but the important point is that it includes specific hardware so the feature can be used at the lowest cost. It is exactly the same as having or not having hardware for RT: if you don't have specific hardware on your GPU, using the technology doesn't make sense. That you see games using VRS on XS, Nvidia and AMD RDNA2 but not on PS5 says a lot, I think. Add to that the fact that Sony never mentions it, while MS does not stop highlighting it and its use on XSX...
 
I said DF, Digital Foundry.
That's a bit like saying "to be or not to be - shakespeare"

Could be or could not be, as we live in a reality where plenty of information is second-hand and wrong. It's imperative to know where it's coming from, when it was said, etc. Also, imagine it were somehow proved wrong later in 2022 but we were still quoting it in 2023; being able to check the date of the article is useful as well.

Wikipedia works like that too; it's just a good modus operandi. To quote it you had to go to the article anyway.
You do realize it's starting to appear in PlayStation games via software… and I hate it there too, dude.

This isn't a PS vs. Xbox thing; this is a 'VRS is a bad thing' thing.
I also dislike Dynamic Resolution Scaling quite a bit but it's everywhere and in some cases avoids things I would hate even more like constant dips in performance.

I think it's a "worse than" rather than a downright horrible thing, especially at 4K target resolutions where, to be honest, I think if it's done right I won't exactly notice. If the hardware can't pull it off otherwise, it's as valid a method as anything else.

It won't be a miracle performer though, so I hope it's not overused or it'll be bad/lazy.
 

Loxus

Member
With all due respect, it's the Sony fans who have brought the PS5 into the conversation. This thread was started following a tech talk from MS and solely discusses that talk.

And you seem to have missed the first part of the response to the question, which, looking back, now appears to be a loaded question. Nowhere is it suggested that the PS5 does not have an equivalent solution. Darsxx82 clearly suggests that different techniques are used.
Did you read his whole post?
 

Three

Member
For the last time dude. Reading comprehension is important.
I can't believe you're talking about reading comprehension when the person is replying to you coherently and following the conversation. This is how it started:
It will be interesting to see the difference this brings in Doom, it looked kinda iffy before.

The VRS we've seen so far does make it look worse in games like Doom Eternal. If this version improves it then that's great.
Which you replied to with:
I kept telling people that was Tier 1 and not Tier 2.

He told you that Doom was Tier 2. He's right. Stop trying to insult the person for being right.

How would you compute the performance increase of VRS on its own if it's not disabling all other performance optimizations?

It's obvious they are only talking about the effect VRS has on the rendering when nothing else is used. That's the way you measure techniques, although it's true that doesn't give you the full picture
Nothing wrong with comparing anything. I was just clarifying that the 5.7ms to 2.7ms is in fact comparing NO VRS (software, Tier 1 or Tier 2) and no sparse lighting against their current best VRS and sparse lighting. That's why the saving seems bigger than it actually is compared to things shown before, which had some form of VRS and sparse lighting.

The problem comes from people using this comparison for anything other than what it is intended for, e.g. the likes of Riky and co claiming a 3x performance advantage with "hardware SFS" in the past instead of comparing to the PRT+ solutions games were already using, or him trying to use this news now as redemption for VRS secret sauce.

That's not to take away anything from the hard work developers are doing but some are trying to frame it as some huge untapped potential that hasn't been realised yet. It's just the steady progression of optimisation you always see. HFW actually does something similar already using compute shaders.
 
Last edited:

Darsxx82

Member
Oh, so only Nvidia and Microsoft can have custom hardware but not Sony?

Look at the Tempest Engine, the I/O Complex and the CPU's FPU.
Sony has many custom parts not found in other APUs/GPUs.

Why wouldn't they have something custom for VRS, especially since it's needed for Foveated Rendering?
Who denies that PS5 may have some technology of its own? We are talking about whether it has specific hardware for VRS in particular, and so far there is not a single sign of it.

The question for you is... If PS5 has specific hardware for VRS, what is the reason that there are no games using VRS, the reason that Sony has not made a single mention at this point (quite the opposite of MS)?

It is simple.
 
Last edited:

Riky

$MSFT
Back to the topic it was good to see some pretty huge gains for Series S, when we then also factor in the gains MS showed from SFS we can see that the console will have some much better results in the future.

[Image: rXmbhFb.jpg]
 

Darsxx82

Member
Sony never says much about anything inside the PS5, just vague chats and the Cerny presentation. That doesn't mean the PS5 is an empty box.
We are talking about dedicated hardware specifically for VRS... There are already plenty of games with VRS on XS and on compatible AMD and Nvidia GPUs, but not in the PS5 version. Add to that Sony's total silence on the subject, and the sources indicating the lack of specific VRS hardware on PS5 (including a PS5 engineer in that private conversation)... Sorry, but anyone with a bit of common sense should see that there are more reasons to conclude it does not exist than the opposite.
I am sure that if it were the other way around (PS5 games using VRS but not the XSX versions, Sony highlighting the technology and a silent MS) you would be citing the same evidence.
 

CeeJay

Member
That's a bit like saying "to be or not to be - shakespeare"

Could be or could not be as we live in a reality where plenty information is second hand and wrong. It's imperative to know where it's coming, when it was said, etc. Also , imagine it was somehow proved wrong later in 2022, but we'd still be quoting that in 2023, being able to check the date of the article is useful as well.

Wikipedia also works like that, it's just a good modus operandi, to quote it you had to go to the article anyway.

I also dislike Dynamic Resolution Scaling quite a bit but it's everywhere and in some cases avoids things I would hate even more like constant dips in performance.

I think it's a "worser than" rather than a downright horrible thing specially at 4K target resolutions where to be honest, I think if done right I won't exactly notice. If the hardware can't pull it otherwise, it's as valid of a method as anything else.

It won't be a miracle performer though, so I hope it's not overused or it'll be bad/lazy.
I just googled the entire quote and it came up with Eurogamer (DF) as the top result, so it was pretty easy to fact-check what Riky wrote.
 

Loxus

Member
Who denies that PS5 may have some technology of its own? We are talking about whether it has specific hardware for VRS in particular, and so far there is not a single sign of it.

The question for you is... If PS5 has specific hardware for VRS, what is the reason that there are no games using VRS, the reason that Sony has not made a single mention at this point (quite the opposite of MS)?

It is simple.
Maybe because there are better solutions than VRS.

If VRS is so important, why aren't all games using it?

And why aren't you guys doing the math?

Let's use this as an example.
[Image: Tb87jBD.png]

If we assume all the Microsoft numbers are part of the total frame time.
30fps = 33.33ms
60fps = 16.66ms
0.73ms speed up.

We never saw Nanite running on consoles at 60fps.

So 33.33ms - 0.73ms = 32.6ms
1000 / 32.6 = 30.67fps
30.67fps - 30fps = 0.67fps, which translates to roughly a 1fps improvement.
What kinda flex is that?

I'm open minded, so let's say they managed to get Nanite running at 60fps, which would be pretty cool.

16.66ms - 0.73ms = 15.93ms
1000 / 15.93 = 62.77fps
62.77fps - 60fps = 2.77fps, which would be roughly a 3fps improvement.

Good for keeping a locked 60, but how is this different from doing it via software only?
I think all this hardware VRS (outside of VR) is getting overhyped, if you ask me.
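The arithmetic above can be sketched as a small helper (a toy calculation; the 0.73ms saving is the figure quoted from the slide, everything else is plain frame-time math):

```python
# Convert a fixed per-frame time saving into an fps gain.

def fps(frame_ms: float) -> float:
    """Frames per second for a given frame time in milliseconds."""
    return 1000.0 / frame_ms

def fps_gain(base_fps: float, saving_ms: float) -> float:
    """FPS improvement from shaving saving_ms off every frame."""
    return fps(1000.0 / base_fps - saving_ms) - base_fps

print(round(fps_gain(30.0, 0.73), 2))  # ~0.67fps at a 30fps target
print(round(fps_gain(60.0, 0.73), 2))  # ~2.75fps at a 60fps target
# (the 2.77 above comes from rounding the frame time to 16.66ms first)
```

The same millisecond saving is worth more the shorter the frame, which is why a fixed saving looks tiny at a 30fps target and larger at 60fps.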
 
Last edited:

DaGwaphics

Member
It will be interesting to see the difference this brings in Doom, it looked kinda iffy before.

The VRS we've seen so far does make it look worse in games like Doom Eternal. If this version improves it then that's great.

Not really. VRS can make the peripheral areas where it is used lower res, but since the key areas are not dropping resolution because the tech is used, the final image should look sharper during game play (in comparison to dropping res across the entire scene).
 

Three

Member
The question for you is... If PS5 has specific hardware for VRS, what is the reason that there are no games using VRS, the reason that Sony has not made a single mention at this point (quite the opposite of MS)?

It is simple.
No games using VRS? There are plenty of games with VRS and the fact that you haven't noticed or the game hasn't performed worse in comparison is a testament to that.

They just released another one a couple of days ago:


As for games with VRS, there are plenty doing a good job of it and performing just as well, if not better.

With all due respect, it's the Sony fans who have brought the PS5 into the conversation. This thread was started following a tech talk from MS and solely discusses that talk.

And you seem to have missed the first part of the response to the question, which, looking back, now appears to be a loaded question. Nowhere is it suggested that the PS5 does not have an equivalent solution. Darsxx82 clearly suggests that different techniques are used.
He brought up PS5 himself. He also said this:
Here however we talk about the improvement in optimization and use of a technology that seems to be unique in Xbox consoles
And then changed to this
Sure not in the same way as AMD or XSX (totally diferent architecture), but the important point is that it includes specific hardware for use at the lowest cost
And this simply isn't true even if you were to accept the suggestion that PS5 doesn't have the "lowest cost VRS". This would be trying to suggest that AMD cards have some capability Nvidia doesn't have and it simply isn't true. A lot of people don't understand hardware abstraction layers and APIs, that's all.
 

oldergamer

Member
I can't believe you're talking about reading comprehension when the person is replying to you coherently and following the conversation. This is how it started:
No, that's NOT how it started! Go back and look at my first post! What is it with some of you arguing points that have little or nothing to do with what someone has said?

It started with my post saying:

"I kept telling people that were trying to claim it made image quality worse, that it wasn't the real fact."


He told you that Doom was Tier 2. He's right. Stop trying to insult the person for being right.

.... and? I never stated Doom was anything else. My entire post was talking about Dirt 5 (LAST YEAR) and referring to people claiming a year ago that VRS was the direct cause of poor image quality. Again, I will state that I said "that's NOT true" last year. You want to tell me again how I'm "insulting" him for being "right" about something I wasn't arguing?
 
Last edited:

Riky

$MSFT
No, that's NOT how it started! Go back and look at my first post! What is it with some of you arguing points that have little or nothing to do with what someone has said?

It started with my post saying:

"I kept telling people that were trying to claim it made image quality worse, that it wasn't the real fact."




.... and? I never stated Doom was anything else. My entire post was talking about Dirt 5 (LAST YEAR) and referring to people claiming a year ago that VRS was the direct cause of poor image quality. Again, I will state that I said "that's NOT true" last year. You want to tell me again how I'm "insulting" him for being "right" about something I wasn't arguing?

Yes, you were talking about Dirt 5; they are just literally making things up now out of desperation.
DF said that Doom Eternal was the first third-party game to use Tier 2 VRS. Dirt 5 appeared way before that, and DF did an interview with Springate, so by deduction they were using Tier 1.
Also, all the talk about VRS artifacts was superseded by Codemasters admitting they had in fact used the wrong Series S settings on the Series X version and having to apologise and fix it.
 
Last edited:

oldergamer

Member
Yes, you were talking about Dirt 5; they are just literally making things up now out of desperation.
DF said that Doom Eternal was the first third-party game to use Tier 2 VRS. Dirt 5 appeared way before that, and DF did an interview with Springate, so by deduction they were using Tier 1.
Also, all the talk about VRS artifacts was superseded by Codemasters admitting they had in fact used the wrong Series S settings on the Series X version and having to apologise and fix it.
You hit the nail on the head. I couldn't find that DF interview but I remember it.
 
I just googled the entire quote and it came up with Eurogamer (DF) as the top search so it was pretty easy to fact check what Riky Riky wrote
That's the way I got to it as well. But I stand by what I suggested, as sometimes it's not as clear.

A source is more proof than a quote; it's just good netiquette.
Maybe because there are better solutions than VRS.
Well, games like Horizon FW, Call of Duty Modern Warfare 2020 and probably a few more are using Tier 1 implementations in software on PS4, so if there were a better solution to get the performance they want, I'm sure they would use it instead, as software VRS is CPU-reliant and we know how underpowered the Jaguar cores are.
If VRS is so important, why aren't all games using it?
Because they don't need to.

This is like Dynamic Resolution: using it isn't inherently better than not using it; it's something you use when your game can't hit your resolution/framerate target. And as with the early days of Dynamic Resolution, it'll probably see increasing use throughout the generation, on XSS and XSX at least.

As pointed out before in this thread, this can be considered good or bad. I certainly notice when Dynamic Resolution is aggressive and feel it's a downgrade; VRS will be similar, but output resolutions are higher than ever too.
 
Last edited:

elliot5

Member
No, that's NOT how it started! Go back and look at my first post! What is it with some of you arguing points that have little or nothing to do with what someone has said?

It started with my post saying:

"I kept telling people that were trying to claim it made image quality worse, that it wasn't the real fact."




.... and? I never stated Doom was anything else. My entire post was talking about Dirt 5 (LAST YEAR) and referring to people claiming a year ago that VRS was the direct cause of poor image quality. Again, I will state that I said "that's NOT true" last year. You want to tell me again how I'm "insulting" him for being "right" about something I wasn't arguing?

Yes, you were talking about Dirt 5; they are just literally making things up now out of desperation.
DF said that Doom Eternal was the first third-party game to use Tier 2 VRS. Dirt 5 appeared way before that, and DF did an interview with Springate, so by deduction they were using Tier 1.
Also, all the talk about VRS artifacts was superseded by Codemasters admitting they had in fact used the wrong Series S settings on the Series X version and having to apologise and fix it.
Listen, y'all, I'm not trying to be combative. I'm sorry if there's been a misunderstanding. Maybe you were referring to Dirt 5 at first, oldergamer, but it wasn't made explicit until after my first response to you. That's okay. I also recall people pooh-poohing Dirt 5. I understand what your position is.

My take on Dirt 5 is that it is using VRS Tier 2

Here are the devs explicitly saying it in this AMD video at 4:21. Yes, it's for their new graphics cards for PC, but that's because that's what AMD was selling directly in late 2020. Occam's razor says they would use this on XBS as it's already integrated into the game engine and XBS supports it, so why not? Same reason every other dev like CDPR, TC and id use it. If Dirt 5 produced poor, artifact-ridden image quality with VRS Tier 2, I think we can all agree it's due to a poor implementation and not a universal reflection of the optimization's quality. That's all.
 
Last edited:

Riky

$MSFT
Listen, y'all, I'm not trying to be combative. I'm sorry if there's been a misunderstanding. Maybe you were referring to Dirt 5 at first, oldergamer, but it wasn't made explicit until after my first response to you. That's okay. I also recall people pooh-poohing Dirt 5. I understand what your position is.

My take on Dirt 5 is that it is using VRS Tier 2

Here are the devs explicitly saying it in this AMD video at 4:21. Yes, it's for their new graphics cards for PC, but that's because that's what AMD was selling directly in late 2020. Occam's razor says they would use this on XBS as it's already integrated into the game engine and XBS supports it, so why not? Same reason every other dev like CDPR, TC and id use it. If Dirt 5 produced poor, artifact-ridden image quality with VRS Tier 2, I think we can all agree it's due to a poor implementation and not a universal reflection of the optimization's quality. That's all.


I don't think he's talking about you, and neither was I; it was the other poster.
Springate never said Tier 2 VRS, he just said VRS. Also, Tier 2 VRS has at least three different performance profiles according to The Coalition, so that muddies the waters further. We know DF did a deep dive on Dirt 5, and we know they then came out and said that Doom Eternal was the FIRST third-party game to use Tier 2 VRS, so that strongly implies Dirt 5 didn't. We just don't have a definitive answer, really.
We also know that what people were blaming on VRS turned out to be the Series S settings applied to the Series X version at launch; Codemasters apologised and fixed it.
 

Darsxx82

Member
Maybe because there are better solutions than VRS.
What were those better solutions on PS5 in the games where the XS and PC versions offered VRS?

If VRS is so important, why aren't all games using it?
1. Because it is new technology and not every platform supports it, in a time full of cross-gen games.
2. Because its use does not always make sense if the characteristics of the game are not suitable or it was not made with VRS in mind.
3. Because its use is unnecessary if performance and IQ goals are already achieved.
And why aren't you guys doing the math?

Let's use this as an example.
[Image: Tb87jBD.png]

If we assume all the Microsoft numbers are part of the total frame time.
30fps = 33.33ms
60fps = 16.66ms
0.73ms speed up.

We never saw Nanite running on consoles at 60fps.

So 33.33ms - 0.73ms = 32.6ms
1000 / 32.6 = 30.67fps
30.67fps - 30fps = 0.67fps, which translates to roughly a 1fps improvement.
What kinda flex is that?

I'm open minded, so let's say they managed to get Nanite running at 60fps, which would be pretty cool.

16.66ms - 0.73ms = 15.93ms
1000 / 15.93 = 62.77fps
62.77fps - 60fps = 2.77fps, which would be roughly a 3fps improvement.
I don't know what this has to do with the issue of the absence or presence of VRS hardware in one console or another.😅
Good for keeping a locked 60, but how is this different from doing it via software only?
I think all this hardware VRS (outside of VR) is getting overhyped, if you ask me.

The difference with a software solution is that you are not limited to a single in-house graphics engine, and the developer saves the time and money lost in creating a software solution of their own. Hardware VRS is implemented "automatically", and the developer decides whether it is useful when it comes to reaching a performance and IQ target.
Besides hardware VRS, software VRS can also be used alongside it where present.
 

DaGwaphics

Member
For those talking 33fps vs. 30fps, I think you are missing the point. The video seems to be alluding to the overall gains from putting the render-time savings to use in another area, whether that is boosting overall resolution or increasing effects, etc. In a 30fps game, I don't think the target is to increase the fps to 33fps; rather, it is to use the savings to improve the final images that are returned at 30fps.
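That reinvestment can be sketched with a toy model (all numbers hypothetical: it assumes some portion of the frame budget scales linearly with pixel count, which real frames only approximate):

```python
def resolution_headroom(pixel_bound_ms: float, saving_ms: float) -> float:
    """Factor by which pixel count could grow if a per-frame saving is
    reinvested into resolution, assuming the pixel-bound part of the
    frame scales linearly with pixel count (a simplification)."""
    return (pixel_bound_ms + saving_ms) / pixel_bound_ms

# Hypothetical 30fps frame where 20ms of the 33.3ms budget scales with resolution:
scale = resolution_headroom(20.0, 0.73)
print(f"{(scale - 1) * 100:.1f}% more pixels at the same 30fps")
```

With these made-up numbers the saving buys roughly 3-4% more pixels; the point is simply that a small per-frame saving is typically spent on resolution or effects rather than on a barely visible fps bump.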
 
Last edited:

Darsxx82

Member
Maybe because there are better solutions than VRS.

If VRS is so important, why aren't all games using it?

And why aren't you guys doing the math?

Let's use this as an example.
[Image: Tb87jBD.png]

If we assume all the Microsoft numbers are part of the total frame time.
30fps = 33.33ms
60fps = 16.66ms
0.73ms speed up.

We never saw Nanite running on consoles at 60fps.

So 33.33ms - 0.73ms = 32.6ms
1000 / 32.6 = 30.67fps
30.67fps - 30fps = 0.67fps, which translates to roughly a 1fps improvement.
What kinda flex is that?

I'm open minded, so let's say they managed to get Nanite running at 60fps, which would be pretty cool.

16.66ms - 0.73ms = 15.93ms
1000 / 15.93 = 62.77fps
62.77fps - 60fps = 2.77fps, which would be roughly a 3fps improvement.

Good for keeping a locked 60, but how is this different from doing it via software only?
I think all this hardware VRS (outside of VR) is getting overhyped, if you ask me.

No games using VRS? There are plenty of games with VRS and the fact that you haven't noticed or the game hasn't performed worse in comparison is a testament to that.

They just released another one a couple of days ago:


Games with VRS though, there are plenty that are doing a good job of it and performing just as well if not better.


He brought up PS5 himself. He also said this:

And then changed to this

And this simply isn't true even if you were to accept the suggestion that PS5 doesn't have the "lowest cost VRS". This would be trying to suggest that AMD cards have some capability Nvidia doesn't have and it simply isn't true. A lot of people don't understand hardware abstraction layers and APIs, that's all.
No idea what you're talking about. You probably have a reading comprehension problem.

Then
1. There are many games using VRS on XSX and PC that don't use it on PS5. It has to do with compatibility and the existence of specific hardware.

2. Software VRS and hardware VRS. You should be aware of the differences. Otherwise there is no need to continue the discussion.
 

elliot5

Member
For those that are talking 33fps vs. 30fps, I think you are missing the point. Video seems to be alluding to the overall gains associated with putting the render time savings to use in another area, whether that is boosting overall resolution or increasing effects, etc. On a 30fps game, I don't think the target is to increase the fps to 33fps, rather it is to use the savings to improve the final images that are returned at 30fps.
And in other cases it ensures a game running at an unstable 27fps hits a locked 30. It's just optimization, and as the Doom engineer says, you take everything you can get.
 

Loxus

Member
No idea what you're talking about. You probably have a reading comprehension problem.

Then
1.There are many games using VRS on XSX and PC that don't use it on PS5. It has to do with the compatibility and existence of specific hardware.

2. Software VRS and hardware VRS. You should be aware of the differences. Otherwise there is no need to continue the discussion.
The real question is what the performance gains are of no VRS vs. software VRS vs. hardware VRS, especially since hardware VRS only gives a 1-3fps improvement.
 

Riky

$MSFT
The real question is what the performance gains are of no VRS vs. software VRS vs. hardware VRS, especially since hardware VRS only gives a 1-3fps improvement.


Explained here is why they moved Gears Tactics from software Tier 1 to Tier 2; the increase depends on several factors, so it can be a lot bigger than 1-3fps.
 

Darsxx82

Member
The real question is what the performance gains are of no VRS vs. software VRS vs. hardware VRS, especially since hardware VRS only gives a 1-3fps improvement.
The performance improvement in some games is usually between 5-30% depending on the case, scene and conditions. Typically it's in the 8-15% range. You can see it in the video.

Then, there is no need to compare with software VRS in that respect. If a developer sets out to create their own VRS in software for their engine and game, they can achieve similar results; that's not the question. The point is to provide a solution that does not require extra work from the developer, works independently of the graphics engine and costs nothing to use, quite the opposite. You have to ask yourself how many studios with their own engine would find it worthwhile to build a software VRS solution. You will hardly see it except in exceptional cases where the effort pays off, like Activision, which uses the same engine for every COD, all cut from the same cloth (full of dark or night scenes, post-processing effects, motion blur...).

That said, software VRS and hardware VRS can be used at the same time. Who knows, maybe when the time comes Activision will even investigate that option, now that it is becoming part of Xbox.
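To put rough numbers on ranges like these, here's a small sketch (the function and figures are my own illustration, not from the video) of how a percentage saving in GPU frame time translates into framerate:

```python
# Rough sketch (illustrative numbers, not from the video): converting a
# percentage saving in GPU frame time into a framerate gain.

def fps_after_saving(base_fps: float, gpu_saving_pct: float) -> float:
    """New framerate if VRS trims gpu_saving_pct% off the frame time,
    assuming the frame is entirely GPU-bound."""
    frame_ms = 1000.0 / base_fps
    new_ms = frame_ms * (1.0 - gpu_saving_pct / 100.0)
    return 1000.0 / new_ms

# The 8-15% range quoted above (plus the 30% upper end), at a 60fps target:
for pct in (8, 15, 30):
    print(f"{pct:2d}% saving: {fps_after_saving(60, pct):.1f} fps")
```

Note the assumption: if the frame is CPU-bound rather than GPU-bound, the saving buys nothing, which matters for the discussion later in the thread.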
 

Killer8

Member
Obviously didn't watch the video where they say the opposite several times.

They said the savings in render time can be put into increasing the resolution and can thereby create better image quality. That isn't the same as "VRS increases image quality". It's clever wording by them to describe one specific usage of the technology.

VRS is literally, by design, a way of degrading image quality in a way that is just enough to a) free up render time while b) the player doesn't notice (debatable in the games released using VRS so far). Of course that is useful for optimization, in the same way that lowering a graphics setting's quality might be, but how you then use that freed up rendering time is going to depend on a case-by-case basis. It's not nearly as simple as "you turn on VRS you get better image quality, simple as".

Some developers may devote that freed rendering time to higher quality effects, some may devote it to better framerates. Some, like Microsoft claim, may devote it to higher overall dynamic resolution so that despite these degradations, the overall image as a whole may benefit. But then there are developers who are already seeing their games not reaching their performance targets, dipping to say 28fps or 55fps - VRS might be able to bring that up to speed, so at that point it just becomes another tool in the box to get it running. No improvement to effects or image quality will be happening in those instances.
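As a toy illustration of the trade-off being described: a Tier 2 style implementation picks a shading rate per screen tile, shading "cheap" tiles coarsely to free up render time. This sketch is entirely invented for illustration (the thresholds and function are mine, not any engine's heuristic):

```python
# Hedged sketch of the idea behind Tier 2 VRS: pick a coarser shading
# rate for screen tiles with low contrast or high motion. Thresholds
# are made up for illustration.

def tile_shading_rate(contrast: float, motion: float) -> str:
    """Return a shading rate like '1x1' (full rate) or '2x2' (one shade
    per 2x2 pixel block, i.e. a quarter of the shading work)."""
    if contrast < 0.1 or motion > 0.8:
        return "2x2"   # flat or fast-moving: hardest to notice degradation
    if contrast < 0.3 or motion > 0.5:
        return "2x1"
    return "1x1"       # detailed, slow-moving: keep full-rate shading

print(tile_shading_rate(contrast=0.05, motion=0.0))  # flat sky tile
print(tile_shading_rate(contrast=0.9, motion=0.1))   # detailed static tile
```

The point of the debate above lives in those thresholds: set them aggressively and you save more time but the degradation becomes visible; set them conservatively and the savings shrink.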
 

Riky

$MSFT
They said the savings in render time can be put into increasing the resolution and can thereby create better image quality. That isn't the same as "VRS increases image quality". It's clever wording by them to describe one specific usage of the technology.

VRS is literally, by design, a way of degrading image quality in a way that is just enough to a) free up render time while b) the player doesn't notice (debatable in the games released using VRS so far). Of course that is useful for optimization, in the same way that lowering a graphics setting's quality might be, but how you then use that freed up rendering time is going to depend on a case-by-case basis. It's not nearly as simple as "you turn on VRS you get better image quality, simple as".

Some developers may devote that freed rendering time to higher quality effects, some may devote it to better framerates. Some, like Microsoft claim, may devote it to higher overall dynamic resolution so that despite these degradations, the overall image as a whole may benefit. But then there are developers who are already seeing their games not reaching their performance targets, dipping to say 28fps or 55fps - VRS might be able to bring that up to speed, so at that point it just becomes another tool in the box to get it running. No improvement to effects or image quality will be happening in those instances.

That was a better explanation.
The point, though, is that as seen with the 2077 1.5 patch, the pixel counters looking at blown-up still images didn't spot this implementation of Tier 2 VRS, so the chances of seeing it on your TV while playing, since it's dynamic frame by frame, are almost zero.
So the degradation to picture quality you can't actually perceive is far outweighed by the performance gains.
 

Riky

$MSFT
And in other cases it can ensure a game running at an unstable 27 hits a locked 30. It's just optimization, and like the Doom engineer says, you take everything you can get.

This is a great point. If we look at Hitman 3, we get a locked 4K 60fps in 99% of the game, apart from a few drops in one foliage-heavy level. Could Tier 2 VRS correct this? Most probably.
 

Killer8

Member
That was a better explanation.
The point, though, is that as seen with the 2077 1.5 patch, the pixel counters looking at blown-up still images didn't spot this implementation of Tier 2 VRS, so the chances of seeing it on your TV while playing, since it's dynamic frame by frame, are almost zero.
So the degradation to picture quality you can't actually perceive is far outweighed by the performance gains.

The Cyberpunk 2077 1.5 patch is a perfect example of the balancing act between framerate and resolution. I believe it runs at a higher average resolution on Xbox Series X than on PS5. Yet it also uses VRS. And it also runs worse than PS5. This raises several questions:

How would performance differ on XSX if it ran at the same resolution as PS5? Would the resolution vs framerate balancing act tip back in favor of framerate on XSX?
How would performance differ without VRS? Would it still need it just to achieve parity with PS5 in a like for like resolution scenario?
Did the higher dynamic resolution on XSX make VRS not as noticeable? In fact do you need resolution to be higher to not notice it? If the resolution was like for like on PS5 and XSX, would pixel counters have been able to spot its implementation more easily?

If VRS was not noticeable, sure this is good from an image degradation point of view. You could say it is not harming it. But it also did not really improve the image either which runs counter to Microsoft's claims. And it certainly was not a revelation for performance either if the game is still running worse. So how has Microsoft's theoretical ~50% actually translated into something for the developers?
 

ZywyPL

Banned
If we assume all the Microsoft numbers are part of the total frame time.
30fps = 33.33ms
60fps = 16.66ms
0.73ms speed up.

We never saw Nanite running on consoles at 60fps.

So 33.33ms - 0.73 = 32.6ms
1000 / 32.6 = 30.67fps
30.67fps - 30fps = .67, which translates to 1fps improvement.
What kinda flex is that?

I'm open minded, so let's say they managed to get Nanite running at 60fps, which would be pretty cool.

16.66ms - 0.73ms = 15.93ms
1000/15.93 = 62.77fps
62.77fps - 60fps = 2.77, which would be 3 fps improvement.

Good for keeping a lock 60, but how is this different from doing it via software only?
I think all this hardware VRS (outside of VR) is getting over hyped if you ask me.

LMAO, that's not how the math works, not at all xD Take away 33% of 33.33ms and what do you get? Something like 44-45fps. That's A LOT of performance boost, and it's what might actually help achieve Nanite at 60fps somewhere in the future.
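For anyone following along, both readings of the numbers can be checked in a few lines (illustrative arithmetic only; the function names are mine):

```python
# The two readings of Microsoft's numbers argued above.

def fps_minus_fixed_ms(base_fps: float, saved_ms: float) -> float:
    """Reading 1: VRS shaves a fixed number of milliseconds off the frame."""
    return 1000.0 / (1000.0 / base_fps - saved_ms)

def fps_minus_percent(base_fps: float, saved_pct: float) -> float:
    """Reading 2: VRS removes a percentage of the whole frame time."""
    return 1000.0 / ((1000.0 / base_fps) * (1.0 - saved_pct / 100.0))

print(fps_minus_fixed_ms(30, 0.73))   # ~30.7 fps: the "1fps improvement" reading
print(fps_minus_percent(30, 33.0))    # ~44.8 fps: the "33% boost" reading
```

Which reading applies depends on whether the quoted 0.73ms is a fixed saving on one pass or the 33% figure applies to the whole frame; the two interpretations differ by a factor of twenty here.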
 
I don’t think you understand how it works. You lower quality of low visibility areas and can raise quality of highly visible areas.
I understand. They hope you don’t notice the lower quality areas but it’s still on screen. It’s a trade off.
Checkerboard rendering?
What about it?

It’s obviously worse than native. Usually it’s worse than whatever resolution it’s starting from, but for some games it’s better to have, e.g., checkerboard 4K than 1440p. Days Gone, Resident Evil and DMC 5 do a good job, but native 4K would still be better.

But it’s not comparable to what we are talking about, because it’s just a reconstruction method; it’s not saving any performance, it’s just trying to look better than the resolution it starts from.
 
Last edited:

Riky

$MSFT
The Cyberpunk 2077 1.5 patch is a perfect example of the balancing act between framerate and resolution. I believe it runs at a higher average resolution on Xbox Series X than on PS5. Yet it also uses VRS. And it also runs worse than PS5. This raises several questions:

How would performance differ on XSX if it ran at the same resolution as PS5? Would the resolution vs framerate balancing act tip back in favor of framerate on XSX?
How would performance differ without VRS? Would it still need it just to achieve parity with PS5 in a like for like resolution scenario?
Did the higher dynamic resolution on XSX make VRS not as noticeable? In fact do you need resolution to be higher to not notice it? If the resolution was like for like on PS5 and XSX, would pixel counters have been able to spot its implementation more easily?

If VRS was not noticeable, sure this is good from an image degradation point of view. You could say it is not harming it. But it also did not really improve the image either which runs counter to Microsoft's claims. And it certainly was not a revelation for performance either if the game is still running worse. So how has Microsoft's theoretical ~50% actually translated into something for the developers?

It's an interesting case. When you look at it in detail, though, it becomes clearer: the drops on Series X are not GPU related, and here is why I believe that to be the case.
We all know the game uses a DRS scale with upper and lower bounds identical between the two consoles. The entire point of this scale is that, when GPU bound, it reduces resolution to keep the framerate up until it reaches the lower bound.
What we see in those stress areas, as pixel counted by VGtech, is that resolution is still higher on Series X. Why would that be when DRS would lower the resolution to compensate? When you look closely at those areas, they are very AI heavy and so in fact CPU bound; that's why the framerate drops.
In the NXGamer analysis, he found that in combat-heavy scenes the reverse is true: Series X outperforms PS5 framerate-wise. This again makes sense, as those scenes are more GPU bound with effects, and the two-hour presentation on VRS we just witnessed shows big gains when it is applied to these effects.
So why does the CPU seem to be the problem? The general consensus is that these last-gen engines don't use the Velocity Architecture and the Series consoles' compression blocks, so the CPU has more to do.
We can also see this in reverse in loading times: Series X matches PS5 in loading, yet we know PS5 has over twice the throughput. So either Series X has its own custom hardware that makes up the difference (not the case), or the engine simply doesn't take full advantage of PS5 either.
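A toy model of why a CPU-bound drop would leave resolution high: a DRS controller typically reacts to GPU time, so a CPU spike tanks the framerate without touching the resolution scale. This sketch is my own simplification, not how any shipped engine actually works:

```python
# Toy dynamic-resolution controller (my own simplification): it only
# reacts to GPU frame time, so a CPU-bound frame leaves the resolution
# scale untouched, matching the pattern described above.

def drs_step(scale: float, gpu_ms: float, cpu_ms: float,
             budget_ms: float = 16.6, lo: float = 0.6, hi: float = 1.0) -> float:
    """Nudge the resolution scale toward the frame-time budget."""
    if gpu_ms > budget_ms:                      # GPU over budget: drop resolution
        scale *= budget_ms / gpu_ms
    elif max(gpu_ms, cpu_ms) < budget_ms * 0.9:
        scale *= 1.05                           # clear headroom: creep back up
    return min(hi, max(lo, scale))

# CPU-bound frame: GPU is fine, CPU blows the budget. The framerate drops,
# but the resolution scale stays pinned at the upper bound.
print(drs_step(1.0, gpu_ms=12.0, cpu_ms=22.0))

# GPU-bound frame: the controller lowers the scale to compensate.
print(drs_step(1.0, gpu_ms=20.0, cpu_ms=10.0))
```

This is why "higher resolution during the drops" is evidence for a CPU bottleneck rather than against one: the scaler only has a lever for GPU load.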
 
Last edited:

assurdum

Banned
This is a great point, If we look at Hitman 3 we get a locked 4k 60fps in 99% of the game apart from a few drops in one foliage heavy level, could Tier 2 VRS correct this, most probably.
The thing you won't understand is that VRS (like CBR or other techniques) doesn't work flawlessly in every single scenario; you can't just drop it in blindly to save performance. It all depends on the whole engine pipeline, and it can cause visible degradation in IQ (Doom Eternal says hi). Now post your laugh gif as always.
 
Last edited:

Riky

$MSFT
The thing you won't understand is that VRS (like CBR or other techniques) doesn't work flawlessly in every single scenario; you can't just drop it in blindly to save performance. It all depends on the whole pipeline, and it can cause visible degradation in IQ (Doom Eternal says hi). Now post your laugh gif as always.

Here's an article about plugging it into an existing engine and saving performance.


They talk about image quality in Doom Eternal and how it is increased by Tier 2 VRS; DF also confirm this in their breakdown, calling the PS5 version "visibly blurrier". So yes, Doom Eternal does say hi, with an up to 30% resolution advantage.
 

assurdum

Banned
Here's an article about plugging it into an existing engine and saving performance.


They talk about image quality in Doom Eternal and how it is increased by Tier 2 VRS; DF also confirm this in their breakdown, calling the PS5 version "visibly blurrier". So yes, Doom Eternal does say hi, with an up to 30% resolution advantage.
Dude, stop posting things from the MS channel like it's PR and go back to your precious DF channel to watch the Doom Eternal video comparison before claiming PS5 looks more blurred. And anyway, that's not even my point. You can't apply VRS to every engine and expect it to work flawlessly. It's not that easy.
Here we go, laugh gif. God bless fanboy ignorance. Poor child. And people blame me for just trolling in this thread and ask why I don't discuss. Why discuss against idiocy?
 
Last edited:

Riky

$MSFT
Dude, stop posting things from the MS channel like it's PR and go back to your precious DF channel to watch the Doom Eternal video comparison before claiming PS5 looks more blurred. And anyway, that's not even my point. You can't apply VRS to every engine and expect it to work flawlessly.

I provided you the article, it's a developer blog, aimed at developers.
You brought up a game; I gave you what an independent pixel counter said. You don't like it, so you just dismiss it. You provide no proof of what you're saying at all, as usual.
 
Last edited by a moderator: