
Microsoft Game Stack VRS update (Series X|S) - Doom Eternal, Gears 5 and UE5 - 33% boost to Nanite Performance - cut deferred lighting time in half

The lesser version, yes, I am aware of. The improved version, no. Right?

No, not yet, but it works: it's been tested on a real shipped game, Doom Eternal on Series X. There hasn't been a public Series X update released with this new VRS in place yet, but one of these improved methods is now working and running in a real game, and the rest of these techniques will be available to future games that put in the work to implement them.
 
My understanding was that VRS doesn't get "improved"; it's the underlying formulas and uses of the shaders within the graphics pipeline that are being improved to increase the efficiency of VRS rendering. It's not like VRS got updated to v2.1.1 in the XDK and now all games get an improvement.

Believe it or not, it actually DID get improved with Microsoft's new shader model releases. New capabilities were introduced, which Series X apparently still supports, since its VRS feature set was always beyond Tier 2.
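For context on what "beyond Tier 2" means in API terms: D3D12 exposes the VRS tier and the optional extra capabilities as queryable features. A minimal sketch, using the real D3D12 enums, but otherwise my own illustration rather than anything from the talks:

```cpp
#include <d3d12.h>

// Sketch: query a device for its VRS tier and the optional
// "additional shading rates" capability (the coarser 2x4/4x2/4x4 modes).
bool SupportsVrsBeyondTier2(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))))
        return false;

    // Tier 1: per-draw shading rate only.
    // Tier 2: adds per-primitive rates and a screen-space shading-rate image.
    const bool tier2 =
        options6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;

    return tier2 && options6.AdditionalShadingRatesSupported;
}
```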
 

Loxus

Member
I seriously had no idea VRS was this versatile.

Do you know how all this translates into how much frame-time and frame-rate improvement you get?

Don't have time to watch both of those 1-hour-long videos.
 
Do you know how all this translates into how much frame-time and frame-rate improvement you get?

Don't have time to watch both of those 1-hour-long videos.

It depends on a case-by-case, game-by-game basis, but the one consistent fact is that it will automatically allow the Series X|S GPUs, on average, to utilize their available resources better, producing higher resolutions and better framerates by limiting the cases where you miss your frame-time budget. Depending on the game, all of that can go into making it better.

Funnily enough, though they stress that in this particular case Alpha Point is just a demo and not a real game, The Coalition says early tests so far suggest that VRS in Nanite-based Unreal Engine 5 content is a potentially bigger performance win than in non-Nanite games.

They said they never saw an average shading rate greater than 1.2 in Gears 5 with VRS, and yet with UE5 and Nanite they seem to be getting even better results.

They do stress that the 1.4 result comes from a demo and not a real game, but they believe it makes sense that Nanite-based UE5 titles would get better performance out of VRS.
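As a rough back-of-the-envelope (my own reading, assuming "average shading rate" means the average number of pixels covered per pixel-shader invocation), the fraction of shading work saved would be:

$$\text{saved} \approx 1 - \frac{1}{\text{avg. rate}}, \qquad 1 - \frac{1}{1.2} \approx 17\%, \qquad 1 - \frac{1}{1.4} \approx 29\%$$

which would explain why a 1.4 average is such a meaningful jump over 1.2.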

 
They also mention that some objects in a game world, such as the digital screens in Doom Eternal, didn't work right with Tier 2, so id's solution was simply to tag that specific object and tell it to use just Tier 1 VRS; just like that, Tier 2 could keep operating on the rest of the image. Very versatile.
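In D3D12 terms, that per-object opt-out maps naturally onto the shading-rate combiners. A minimal sketch (real API names, but the usage is my illustration, not id's actual code):

```cpp
// Assumes cmdList is an ID3D12GraphicsCommandList5 and a Tier 2
// shading-rate image is already bound via RSSetShadingRateImage().

// Normal scene draws: 1x1 base rate, but let the screen-space
// shading-rate image override it per tile.
D3D12_SHADING_RATE_COMBINER useImage[2] = {
    D3D12_SHADING_RATE_COMBINER_PASSTHROUGH, // base vs. per-primitive rate
    D3D12_SHADING_RATE_COMBINER_OVERRIDE     // take the rate image's value
};
cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, useImage);
// ... draw the rest of the scene ...

// The tagged object (e.g. a digital screen): ignore the rate image
// entirely, so it behaves like a per-draw Tier 1 object at full rate.
D3D12_SHADING_RATE_COMBINER ignoreImage[2] = {
    D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
    D3D12_SHADING_RATE_COMBINER_PASSTHROUGH  // keep the 1x1 base rate
};
cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, ignoreImage);
// ... draw the screen mesh ...
```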

They also point out the advantages of Tier 1, even though the majority of their focus was on Tier 2.

 

oldergamer

Member
It was not Tier 1.

I just finished watching these and it's really cool stuff. The lighting VRS was the most impressive, with it being truly undetectable. And the deblocking may be useful for cleaning up artifacts seen in Halo or Doom in some scenarios.

The part on HDR tone-mapping VRS is also cool. This headroom will allow for more graphical effects and higher-framerate modes. The lighting VRS even seemed like it could help cut down raytracing costs by reducing the number of rays necessary to calculate lighting per pixel block.
It was Tier 1. The example used back then was Dirt 5.
 

GHG

Member
No, it is not wait-and-see or promises. They make it clear that the implementation of VRS is evolving and improving, and that these gains are expected both in UE5 and in future games on the id engine.
Obviously, concrete and quantifiable results only exist for games already released with VRS... but the idea they leave is that there will be evolution both in the implementation across different engines (compatibility with UE5/Nanite) and in IQ and performance results.

Another thing is if you want to be skeptical and show doubts (that's OK and respectable) about what they claim regarding the usefulness of VRS, but that is a different topic.

It's already clear the tech is useful; that's never been in doubt for me. However, how much of a difference it will make going forward remains to be seen. It's not as if it's the only technique developers are improving while everything else stays static.

So for anyone to hold this up as the one thing that will make all the difference going forward is a bit odd, to say the least.
 

RaZoR No1

Member
[Good Burger reading GIF]


So... does that mean better utilization of the GPU? VRS was the tech that reduces the "pixels" for the areas that are not in focus, or was it something else?
 
It was a meme because the expectation was that Xbox tools would improve but somehow PS tools would remain stagnant, which birthed the Xbox tools meme in the comparison threads.

Nobody ever suggested, not once, that PS5 wouldn't improve. But the fact does remain that the Xbox consoles have hardware support for more advanced graphics features that PS5's GPU does not support. That's all people were saying. Nobody is saying new techniques and tools won't be created to get the most out of PS5.

In fact, don't even compare to PS5 in the context of these advancements, as that isn't the point. The point is simply that it will mean better games for Xbox, with no regard for what it means for PS5 or how they compare.
 

ToTTenTranz

Banned
It saved 2.9ms total render time.
The Series consoles have been shown to use UE5 at 30FPS. That's 33ms per frame.
With this, the Series consoles run at 30ms per frame.


It's a 10% boost, from 30FPS to 33FPS. A bit less than that in the case of Series X.
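Spelling out that arithmetic (taking the 33.3 ms frame at face value):

$$\frac{1000}{33.3 - 2.9} \approx 32.9\ \text{FPS}, \qquad \frac{32.9}{30} - 1 \approx 10\%$$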
I also don't get why they're claiming an initial Nanite processing time of 5.57ms, considering last year they claimed it was taking less than 2.1ms on the Series X.
Is it not the same Alpha Point demo from Coalition? Perhaps they're using worst-case scenarios (best case for VRS time savings).

It's still a great implementation, but nowhere near what's been claimed in the thread title.





I get that some people seem to need some kind of validation / vindication for throwing around stuff they knew very little about in the past, but this is unfortunately not their time yet.
Don't worry, though. If you believe it hard enough then it'll come true. Or at least that's what they say on my 6yo's cartoons.
 

LiquidMetal14

hide your water-based mammals
Good stuff. Need those Gears 5 VRS improvements on PC. It's glitchy for me when I use it, even though it seems to run better, so I disable it.
 
It saved 2.9ms total render time.
The Series consoles have been shown to use UE5 at 30FPS. That's 33ms per frame.
With this, the Series consoles run at 30ms per frame.


It's a 10% boost, from 30FPS to 33FPS. A bit less than that in the case of Series X.
I also don't get why they're claiming an initial Nanite processing time of 5.57ms, considering last year they claimed it was taking less than 2.1ms on the Series X.
Is it not the same Alpha Point demo from Coalition? Perhaps they're using worst-case scenarios (best case for VRS time savings).

It's still a great implementation, but nowhere near what's been claimed in the thread title.





I get that some people seem to need some kind of validation / vindication for throwing around stuff they knew very little about in the past, but this is unfortunately not their time yet.
Don't worry, though. If you believe it hard enough then it'll come true. Or at least that's what they say on my 6yo's cartoons.
This small boost could, in some cases, be the difference between a locked 60fps game and one that drops frames and causes screen tearing. It might not be as revolutionary as some are claiming, but it could well be a nice QoL improvement that makes gaming performance more stable moving forward. Your bad attitude is kind of gross, tbh.
 

sendit

Member
Nobody ever suggested, not once, that PS5 wouldn't improve. But the fact does remain that the Xbox consoles have hardware support for more advanced graphics features that PS5's GPU does not support. That's all people were saying. Nobody is saying new techniques and tools won't be created to get the most out of PS5.

In fact, don't even compare to PS5 in the context of these advancements, as that isn't the point. The point is simply that it will mean better games for Xbox, with no regard for what it means for PS5 or how they compare.

[happiness forgetting GIF]
 
It saved 2.9ms total render time.
The Series consoles have been shown to use UE5 at 30FPS. That's 33ms per frame.
With this, the Series consoles run at 30ms per frame.


It's a 10% boost, from 30FPS to 33FPS. A bit less than that in the case of Series X.
I also don't get why they're claiming an initial Nanite processing time of 5.57ms, considering last year they claimed it was taking less than 2.1ms on the Series X.
Is it not the same Alpha Point demo from Coalition? Perhaps they're using worst-case scenarios (best case for VRS time savings).

It's still a great implementation, but nowhere near what's been claimed in the thread title.





I get that some people seem to need some kind of validation / vindication for throwing around stuff they knew very little about in the past, but this is unfortunately not their time yet.
Don't worry, though. If you believe it hard enough then it'll come true. Or at least that's what they say on my 6yo's cartoons.

You're literally saying stuff that goes against 2+ hours' worth of presentations across two videos from different game developers who worked on actual games using some of this already, and who are actively using everything you're trying to claim is not so. I commend your boldness.
 

ToTTenTranz

Banned
You're literally saying stuff that goes against 2+ hours' worth of presentations across two videos from different game developers who worked on actual games using some of this already

You mean the math part where saving 3ms results in 33FPS instead of 30FPS, or the part where I linked to a video presentation from the same developers whose statements you're egregiously twisting to claim ridiculous performance boosts?



I think that's just the 4k vs 1080p difference. Vertex counts go up with resolution on Nanite.
Nope, UE5 demos on new-gen consoles have always been 1080p internal resolution.




You're right that Nanite depends on resolution, and at a 4K internal resolution it would cost roughly 4x more than at 1080p.
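That 4x figure is just the pixel-count ratio (assuming, as above, that Nanite's per-pixel rasterization cost scales with resolution):

$$\frac{3840 \times 2160}{1920 \times 1080} = \frac{8{,}294{,}400}{2{,}073{,}600} = 4$$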
 

Shmunter

Member
Wait, so they're implementing their own software VRS, "reinhardluma2x", instead?

Seems to be echoing what the COD devs found, where software VRS was more flexible and more performant on balance. Also soon to be a Microsoft tech.
 

Loxus

Member
In theory these are amazing gains.

It will be interesting to see if/when we start seeing this in practice.

Probably not any time soon.
It depends on whether it's 1440/30 or 1440/60.
If it's 30fps = 33.33ms, it's only a 2-3 fps performance increase.

But if it's 60fps = 16.66ms, that's an 11-12 fps performance increase, which is pretty good if you ask me.

If we use the UE5 Matrix demo as a benchmark, VRS isn't looking good.
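Running the numbers with the same ~2.9 ms saving from the slides (assuming it carries over unchanged, which is my assumption, not theirs):

$$\frac{1000}{33.33 - 2.9} \approx 32.9\ \text{FPS (+2.9)}, \qquad \frac{1000}{16.67 - 2.9} \approx 72.6\ \text{FPS (+12.6)}$$

The identical millisecond saving buys far more frames at 60fps because each frame's budget is smaller to begin with.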
 

adamsapple

Or is it just one of Phil's balls in my throat?
It depends on whether it's 1440/30 or 1440/60.
If it's 30fps = 33.33ms, it's only a 2-3 fps performance increase.

But if it's 60fps = 16.66ms, that's an 11-12 fps performance increase, which is pretty good if you ask me.

If we use the UE5 Matrix demo as a benchmark, VRS isn't looking good.

I don't see a studio like Coalition aiming for anything but 60 FPS for the next Gears after how well Gears 5 was received for its performance.
 

Corndog

Banned
VRS in Doom Eternal didn't make anything worse. Well, that's not entirely true: there is a hit, but there are more wins.

Thanks to VRS, you achieved at moments up to 30% more average resolution than what you got on PS5, and you got a noticeably crisper image while playing. Another thing is that by zooming in on objects outside the player's attention you see a certain loss of detail... but that's the idea behind VRS.

How will it evolve and how much will VRS help the Series consoles? We will see, but in this video the authors present it as an almost obligatory option for future Series games, both in UE5 and in the id engine.

P.S. Apparently Cyberpunk 2077 patch 1.5 uses VRS and the result is practically invisible even at 400% zoom, according to Alex on Twitter.
Might be necessary given they don't have a tensor core equivalent.
 

Loxus

Member
I don't see a studio like Coalition aiming for anything but 60 FPS for the next Gears after how well Gears 5 was received for its performance.
With how next-gen games are running on these consoles, do you think it's possible for them to achieve 4K/60 on XBSX and 1440/60 on XBSS?

It would be good if they can do it, but I think they're going to do Performance/Quality modes.
 

Hoddi

Member
You mean the math part where saving 3ms results in 33FPS instead of 30FPS, or the part where I linked to a video presentation from the same developers whose statements you're egregiously twisting to claim ridiculous performance boosts?




Nope, UE5 demos on new-gen consoles have always been 1080p internal resolution.




You're right that Nanite depends on resolution, and at a 4K internal resolution it would cost roughly 4x more than at 1080p.
Yes, but those Alpha Point numbers they're showing in the video are at 4K. I can't really think of another reason why it would go from 2.1ms to 5.6ms.
 

Loxus

Member
I wouldn't be surprised if some of it helps Sony as well. My understanding is they don't have Tier 2, but some of it may help with Tier 1 stuff. I guess we will see.
Depends, don't forget the PS5 and Foveated Rendering.

Foveated Rendering is still dependent on a form of VRS to work.

When reading the patent on this, it reads like it's grouping pixels together depending on the object, similar to how Nanite groups triangles to objects.
 

adamsapple

Or is it just one of Phil's balls in my throat?
With how next-gen games are running on these consoles, do you think it's possible for them to achieve 4K/60 on XBSX and 1440/60 on XBSS?

It would be good if they can do it, but I think they're going to do Performance/Quality modes.

No, I don't think we should expect 4K. The UE5 demo had incredible IQ and it's reconstructing from a low resolution. Improvements and optimizations will only get better; aiming for native 4K when reconstruction is that good is pointless.
 

Shmunter

Member
Why is he comparing VRS to the geometry engine? Both consoles have a geometry engine, and Xbox has VRS support built in as well (not sure about PS5?).
The suggestion is the PS5 GE is a custom design that culls much earlier on. No need to shade something at all if it's been removed from the equation upfront.

It's plausible, with Sony doing PSVR2 eye tracking, to dynamically adjust quality based on the user's focus. Sounds like a lot more going on than just variable shading.
 

elliot5

Member
Sony have their own solutions

This is comparing apples to oranges. This guy is like, "computing shaders on fewer triangles isn't as gainful as not having those triangles to begin with!"

Like, no shit? But that is still possible on XSX as far as I'm aware, via mesh shaders. According to the Beyond3D forums, this guy even said you'd use the Geometry Engine with VRS later in the pipeline (but couldn't say at the time if PS5 had VRS due to NDA, lol). Y'all need to stop resorting to this man's tweet; I've seen it like 10 times.
It was Tier 1. The example used back then was Dirt 5.
What are you talking about? I'm referring to DOOM Eternal. That uses Tier 2 on XSX/S and PC. Same with Gears 5/Tactics. Same with Cyberpunk 2077 (according to Alex from DF). Even Dirt 5 was Tier 2 AFAIK, just not well implemented. It's a tool that needs to be custom implemented and utilized properly for the best performance-quality result.
 

3liteDragon

Member
No, I don't think we should expect 4K. The UE5 demo had incredible IQ and it's reconstructing from a low resolution. Improvements and optimizations will only get better; aiming for native 4K when reconstruction is that good is pointless.
I've seen a few people on here complain about why, for example, Insomniac & Guerrilla go for native 4K in their quality modes and waste GPU resources on a higher native res instead of using them for something else. People need to understand that these are cross-gen games we've been getting so far, and they're not going to push these new consoles to their absolute limits anyway. I expect current-gen-only titles starting next year to run at a 1440p resolution minimum, depending on what else they're using the GPU for in their respective games.
 

oldergamer

Member
This is comparing apples to oranges. This guy is like, "computing shaders on fewer triangles isn't as gainful as not having those triangles to begin with!"

Like, no shit? But that is still possible on XSX as far as I'm aware, via mesh shaders. According to the Beyond3D forums, this guy even said you'd use the Geometry Engine with VRS later in the pipeline (but couldn't say at the time if PS5 had VRS due to NDA, lol). Y'all need to stop resorting to this man's tweet; I've seen it like 10 times.

What are you talking about? I'm referring to DOOM Eternal. That uses Tier 2 on XSX/S and PC. Same with Gears 5/Tactics. Same with Cyberpunk 2077 (according to Alex from DF). Even Dirt 5 was Tier 2 AFAIK, just not well implemented. It's a tool that needs to be custom implemented and utilized properly for the best performance-quality result.
I don't know what you're talking about. You were the person who replied to my post saying Dirt 5 used VRS Tier 2, which I'm certain isn't correct. It used Tier 1 for performance. No game had used Tier 2 when Dirt 5 released.
 

elliot5

Member
I don't know what you're talking about. You were the person who replied to my post saying Dirt 5 used VRS Tier 2, which I'm certain isn't correct. It used Tier 1 for performance. No game had used Tier 2 when Dirt 5 released.
I pulled that picture off AMD's website talking about DX12 VRS... you don't get that level of granularity with Tier 1 VRS. It uses Tier 2.

You said "I kept telling people that was Tier 1 and not Tier 2." to Mr Moose who referred to DOOM Eternal, to which I said [Doom Eternal] used Tier 2. Nobody had made any reference to DIRT 5 until you did lol. Then I corrected you again saying DIRT 5 did use Tier 2...
 