
Graphics Tech | OT

Some people can't tell the difference between RT and normal rasterization for the features showing up in games. I'd like to clear up some of these misconceptions.

RT shadows - the key to recognizing this technique is first understanding how RT shadows work. We aren't just casting a single ray to get sharp shadows; you need to recognize how a sharp shadow looks versus how an RT shadow looks when it's importance sampling an area light.
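To make that concrete, here's a minimal toy sketch of the idea (Python, with made-up scene numbers, not code from any engine): sample points across an area light, fire a shadow ray at each sample, and the lit fraction gives you a penumbra - smooth with many rays, noisy with few.

import random

# Toy 2D scene (hypothetical numbers, purely for illustration):
# - area light: horizontal segment at height LIGHT_H, spanning LIGHT_X0..LIGHT_X1
# - occluder:   horizontal segment at height OCC_H,   spanning OCC_X0..OCC_X1
# - receiver:   the ground line y = 0
LIGHT_H, LIGHT_X0, LIGHT_X1 = 10.0, -0.5, 0.5   # 1-unit-wide area light
OCC_H, OCC_X0, OCC_X1 = 2.0, -1.0, 1.0          # occluder 2 units above the ground

def visibility(x, rays):
    # fraction of shadow rays from ground point (x, 0) that reach the light
    unblocked = 0
    for _ in range(rays):
        lx = random.uniform(LIGHT_X0, LIGHT_X1)      # pick a point on the area light
        x_at_occ = x + (lx - x) * (OCC_H / LIGHT_H)  # where the ray crosses the occluder's height
        if not (OCC_X0 <= x_at_occ <= OCC_X1):
            unblocked += 1
    return unblocked / rays

for x in (0.0, 1.0, 1.15, 1.25, 1.35, 1.5, 2.0):
    # a handful of rays per pixel (game budget) is noisy; hundreds approach the reference
    print(f"x={x:4.2f}  4 rays: {visibility(x, 4):.2f}   1024 rays: {visibility(x, 1024):.2f}")

The 4-ray column is exactly the kind of estimate a denoiser has to clean up; the 1024-ray column is the film-style reference.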

Here is a screen of ray-traced shadows vs. cascade soft shadows

Cascade shadows will be blurred all around the edge of the shadow. In the case of Demon's Souls, I have NO idea what kind of shadows those are, but they fail to look like RT shadows. In fact, they fail to look like cascaded shadow maps too. The very low sampling around the edges is troubling. I hope they aren't using 1-2 ray casts around the edges to mimic soft shadows. IF those are RT shadows, then that would explain their low cost and thus why they can render this scene @ 60 FPS, but they have got to be the ugliest shadows I've ever seen in a game, tbh. Let's hope those are NOT RT shadows, or at the very least that they're cleaned up in the 30 FPS mode of the game.

I have no idea why they look bad to you, especially after I focused your attention on them.
Can you prove (with pictures) WHY they are bad? Which game has better shadows to you? Does their last game (the SotC remake) have better shadows too?

To me they look better than anything non-RT - shadow softness depends on object distance, they have all the variety you expect from RT shadows, and they fit perfectly into the environment. You don't see that in any other non-RT game.
Ratchet and Clank has amazing shadows, but sure, they are not THAT realistic.

And how do you explain this noise pattern then? You can clearly see it in all RT games, and also in the GT7 trailer (which has noise with no explanation other than low-quality RT).
Yes, some games (CryEngine-based ones, for example) use a dithering or smoothing pattern on shadows that looks similar, but that is used to smooth low-resolution shadows and looks different.
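For reference, the noise level you'd expect from a low ray count is easy to estimate. A quick sketch (Python, assuming a true visibility of 0.5 inside the penumbra):

import math

p = 0.5                                     # assumed true visibility inside the penumbra
for rays in (1, 2, 4, 16, 64, 256):
    stderr = math.sqrt(p * (1 - p) / rays)  # standard error of the per-pixel estimate
    print(f"{rays:4d} rays -> noise ~ +/-{stderr:.3f}")

Noise only falls off with the square root of the ray count, which is why 1-2 rays per pixel lean so hard on denoising, and why the residue looks different from the fixed dither patterns used to soften shadow maps.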
 

VFXVeteran

Banned
I have no idea why they look bad to you, especially after I focused your attention on them.
Can you prove (with pictures) WHY they are bad? Which game has better shadows to you? Does their last game (the SotC remake) have better shadows too?

To me they look better than anything non-RT - shadow softness depends on object distance, they have all the variety you expect from RT shadows, and they fit perfectly into the environment. You don't see that in any other non-RT game.
Ratchet and Clank has amazing shadows, but sure, they are not THAT realistic.

And how do you explain this noise pattern then? You can clearly see it in all RT games, and also in the GT7 trailer (which has noise with no explanation other than low-quality RT).
Yes, some games (CryEngine-based ones, for example) use a dithering or smoothing pattern on shadows that looks similar, but that is used to smooth low-resolution shadows and looks different.

Sure.. before you can know what a rendering feature should look like, you need a reference image of what it should look like. In this case, we always go back to actual path tracing, which is what a lot of modern films use. I've taken a zoomed-in grab of an actual path-traced area-light shadow here:

kIHrDBO.png


Notice the soft edges that show no noise at all. Also note how the edge is sharp at the bottom of the shadow, right underneath the pawn geometry, and slowly becomes blurred the further the receiving surface is from the occluding geometry. That's the true hallmark of an RT area light.
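That contact-hardening behaviour follows from similar triangles. A quick worked sketch (Python, with assumed light size and distances):

light_size = 0.3            # assumed width of the area light (metres)
light_to_occluder = 2.0     # assumed distance from the light to the pawn (metres)
for occluder_to_receiver in (0.0, 0.1, 0.3, 0.6, 1.0):
    penumbra = light_size * occluder_to_receiver / light_to_occluder
    print(f"{occluder_to_receiver:.1f} m past the contact point -> penumbra ~ {penumbra:.3f} m")

At the contact point the penumbra is zero (a sharp edge), and it widens the further the shadow falls from the occluder.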

Now look at Demon's Souls' shadows compared to Tomb Raider's RT shadows (which we know for certain are using RT, as advertised):

KRxDns3.png


Which shadow looks the closest to the correct implementation of the first image?

There's no bias in these comparisons.. they are sheer facts that can't be argued. The DS shadows look nowhere near what a true RT implementation should look like.
 
Sure.. before you can know what a rendering feature should look like, you need a reference image of what it should look like. In this case, we always go back to actual path tracing, which is what a lot of modern films use. I've taken a zoomed-in grab of an actual path-traced area-light shadow here:

kIHrDBO.png


Notice the soft edges that show no noise at all. Also note how the edge is sharp at the bottom of the shadow, right underneath the pawn geometry, and slowly becomes blurred the further the receiving surface is from the occluding geometry. That's the true hallmark of an RT area light.

Now look at Demon's Souls' shadows compared to Tomb Raider's RT shadows (which we know for certain are using RT, as advertised):

KRxDns3.png


Which shadow looks the closest to the correct implementation of the first image?

There's no bias in these comparisons.. they are sheer facts that can't be argued. The DS shadows look nowhere near what a true RT implementation should look like.
Those from DS are much lower quality, obviously, as expected, but it's definitely a step up from last-gen shadow maps. No more, no less.
 

VFXVeteran

Banned
Those from DS are much lower quality, obviously, as expected, but it's definitely a step up from last-gen shadow maps. No more, no less.

It depends on the shadow implementation. To say that those shadows look better than ANY shadow implementation last gen is a stretch. There are some really good cascaded shadow map algorithms that look better than DS. Also keep in mind that DS is limited by bandwidth, so it has to shortcut everything that's RT-expensive (just like the reflections in Spider-Man: Miles Morales). Sometimes it's worth keeping the rasterization implementation instead, because it can look better for less cost.

In The Witcher 3, you can see its shadow maps look extremely well done.

v8GTVUf.png


I'm sure I can find other games that use conventional (rasterized) shadow techniques but do them better than what we see here in Demon's Souls, at significantly less cost.
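For contrast, this is roughly how a rasterized shadow map gets its soft edge: percentage-closer filtering (PCF) averages a fixed neighbourhood of depth comparisons, so the blur width stays constant instead of growing with occluder distance. A minimal sketch (Python/NumPy, toy data, not any particular engine's code):

import numpy as np

shadow_map = np.zeros((8, 8))      # depths stored from the light's point of view (toy data)
shadow_map[:, 4:] = 1.0            # right half of the map: nearest occluder is far away

def pcf(u, v, receiver_depth, radius=1):
    # average the lit/shadowed result over a (2*radius+1)^2 neighbourhood of texels
    lit = []
    for dv in range(-radius, radius + 1):
        for du in range(-radius, radius + 1):
            d = shadow_map[np.clip(v + dv, 0, 7), np.clip(u + du, 0, 7)]
            lit.append(1.0 if receiver_depth <= d else 0.0)
    return sum(lit) / len(lit)

print(pcf(3, 3, 0.5), pcf(4, 3, 0.5), pcf(5, 3, 0.5))   # 0.33, 0.67, 1.0 - a fixed-width gradient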
 
Sure.. before you can know what a rendering feature should look like, you need a reference image of what it should look like. In this case, we always go back to actual path tracing, which is what a lot of modern films use. I've taken a zoomed-in grab of an actual path-traced area-light shadow here:

kIHrDBO.png


Notice the soft edges that show no noise at all. Also note how the edge is sharp at the bottom of the shadow, right underneath the pawn geometry, and slowly becomes blurred the further the receiving surface is from the occluding geometry. That's the true hallmark of an RT area light.

Now look at Demon's Souls' shadows compared to Tomb Raider's RT shadows (which we know for certain are using RT, as advertised):

KRxDns3.png


Which shadow looks the closest to the correct implementation of the first image?

There's no bias in these comparisons.. they are sheer facts that can't be argued. The DS shadows look nowhere near what a true RT implementation should look like.
There are sharp shadows in real life too, it depends on the light source and its relative location.



Demon's Souls' shadows are far less noisy than that image would imply. That was a drastic change in brightness with a moving object. Minecraft RTX's path tracing shows similarly massive noise when an object moves and there is a strong change in brightness.


Look at how clean the left wall shadows and right wall shadows are.
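On the noise-during-motion point: most real-time RT accumulates samples across frames, and when that history is rejected (fast motion, a big brightness change) the image briefly falls back to single-sample noise. A toy sketch of the behaviour (Python, assumed numbers, not any specific game's denoiser):

import random

true_value = 0.7          # the converged shadow/lighting value we're trying to estimate
history = true_value      # start fully converged
alpha = 0.1               # how much of the new 1-ray sample is blended in each frame
for frame in range(8):
    sample = true_value + random.uniform(-0.3, 0.3)   # one ray per pixel: very noisy
    if frame == 4:
        history = sample          # motion/disocclusion: history rejected, noise spikes
    else:
        history = (1 - alpha) * history + alpha * sample
    print(frame, round(abs(history - true_value), 3))  # error jumps at frame 4, then settles again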
 

VFXVeteran

Banned
There are sharp shadows in real life too, it depends on the light source and its relative location.



Demon's Souls' shadows are far less noisy than that image would imply. That was a drastic change in brightness with a moving object. Minecraft RTX's path tracing shows similarly massive noise when an object moves and there is a strong change in brightness.


Look at how clean the left wall shadows and right wall shadows are.


I don't think DS's 60 FPS gameplay mode is using RT shadows. If it is, then they are nothing to write home about. They could just as easily get away with rasterized shadows. As you always like to say, the difference with RT is negligible. It's certainly going to be even more negligible on console games that don't have enough bandwidth to brute-force good approximations.
 
I don't think DS's 60 FPS gameplay mode is using RT shadows. If it is, then they are nothing to write home about. They could just as easily get away with rasterized shadows. As you always like to say, the difference with RT is negligible. It's certainly going to be even more negligible on console games that don't have enough bandwidth to brute-force good approximations.
UE5 has good approximations for global illumination (expected to run at 60 fps on PS5), and as I said in the other thread, SVOGI was considered viable had the consoles had 2.5 TFLOPS, from what I heard.

In one or two generations, when full path tracing or the equivalent becomes viable, then we'll see the benefits of hardware ray tracing.

The Marbles demo with full path tracing is still more impressive than basically all the other uses of RTX we've seen.
 

VFXVeteran

Banned
UE5 has good approximations for global illumination (expected to run at 60 fps on PS5), and as I said in the other thread, SVOGI was considered viable had the consoles had 2.5 TFLOPS, from what I heard.

You won't see 60FPS with UE5 or any other GI implementation on consoles unless you are willing to render all the way down to 1080p.
 

GreyHand23

Member
You won't see 60FPS with UE5 or any other GI implementation on consoles unless you are willing to render all the way down to 1080p.

They already said they are expecting to hit 60 fps @ 1440p with Lumen by next year. It's still a new process and hasn't been finalized. Also, why are launch games being judged as if this is the extent of what developers will be able to achieve? Every generation, we know that as developers learn the hardware, the graphics improve. The difference between a launch PS4 game and The Last of Us 2 is pretty massive.
 

VFXVeteran

Banned
They already said they are expecting to hit 60 fps @ 1440p with Lumen by next year. It's still a new process and hasn't been finalized. Also, why are launch games being judged as if this is the extent of what developers will be able to achieve? Every generation, we know that as developers learn the hardware, the graphics improve. The difference between a launch PS4 game and The Last of Us 2 is pretty massive.

I don't know what the settings will be for Lumen, or how it will look. We just know nothing about it. You can't get 60 FPS @ 1440p in Crysis with SVOGI, and that uses a similar technique.
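For readers who haven't seen it: SVOGI (sparse voxel octree global illumination) approximates GI by marching cones through a pre-voxelized, mipmapped copy of the scene rather than tracing triangle-accurate rays. A heavily simplified sketch of the cone-march loop (Python, fake voxel data; not CryEngine's actual implementation):

import math, random

def sample_voxels(position, level):
    # stand-in for a voxel mip lookup: level 0 is fine, higher levels are coarser/blurrier
    random.seed(hash((tuple(round(p / 2 ** level) for p in position), level)))
    return random.random() * 0.3, random.random()      # (occlusion, bounced radiance)

def cone_trace(origin, direction, aperture, steps=8):
    radiance, transmittance, t = 0.0, 1.0, 1.0
    for _ in range(steps):
        footprint = t * aperture                             # the cone widens with distance...
        level = max(0, int(math.log2(max(footprint, 1.0))))  # ...so sample coarser mips
        pos = [o + t * d for o, d in zip(origin, direction)]
        occlusion, bounce = sample_voxels(pos, level)
        radiance += transmittance * bounce * occlusion       # front-to-back accumulation
        transmittance *= 1.0 - occlusion
        t += max(footprint, 1.0)                             # step size grows with the footprint
    return radiance

# a handful of wide cones per pixel stands in for thousands of diffuse rays
print(sum(cone_trace((0.0, 0.0, 0.0), d, 0.6) for d in [(0, 1, 0), (0.7, 0.7, 0), (-0.7, 0.7, 0)]) / 3)

The cost knobs are the number of cones, steps and bounces, which is exactly why "SVOGI at 1440p/60" means little without knowing the settings behind it.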

To answer your next question: graphics don't improve based on tech unless it's on open-ended hardware like the PC. There is nothing from last gen that was improved upon to be made better by the end of the generation. The hardware stayed the same for consoles, and the only thing that changed was the art direction. Anisotropic filtering didn't get better. Games still ran at 4K/30 FPS. The same types of shaders were still used (PBR). SSR was used throughout the entire generation. Normal maps, parallax occlusion mapping, SSAO, GI using light probes, one shadow cast per light source, a fixed LOD for both shadowing and geometry, etc.

We have to be honest with ourselves and look at the tech for what it is. RTX is expensive. There is no way around it. If the hardware doesn't have DLSS or some equivalent, the devs aren't going to somehow extract bandwidth where there is none to be found.

We are not in the days of the PS3 where the hardware was hard to develop on because it was something radical. These are x86 boxes with typical GPUs and memory subsystems.
 
You won't see 60FPS with UE5 or any other GI implementation on consoles unless you are willing to render all the way down to 1080p.

I don't know what the settings will be for Lumen, or how it will look. We just know nothing about it. You can't get 60 FPS @ 1440p in Crysis with SVOGI, and that uses a similar technique.
You can probably get SVOGI at 30 fps at 4K on PS5, or 60 fps at 1440p; Crysis probably has a few framerate-burning, GPU-selling RTX bells and whistles.

2.5 TFLOPS was expected to be enough for 1080p SVOGI; the PS5 has 4x that, so it should be able to handle 4x the 1080p pixel count, or 4K.
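The pixel arithmetic behind that scaling claim, which assumes the per-pixel cost stays constant:

pixels = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
for name, count in pixels.items():
    print(f"{name}: {count / pixels['1080p']:.2f}x the pixels of 1080p")
# 1080p: 1.00x, 1440p: 1.78x, 4K: 4.00x - hence "4x the flops -> 4x the pixels"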

Epic already said, IIRC, that UE5's Lumen runs notably higher than 30 fps on PS5 dev kits, and they expect to reach 1440p/60 fps. That's what they claim; that's what we know.
 

VFXVeteran

Banned
You can probably get SVOGI at 30 fps at 4K on PS5, or 60 fps at 1440p; Crysis probably has a few framerate-burning, GPU-selling RTX bells and whistles.

2.5 TFLOPS was expected to be enough for 1080p SVOGI; the PS5 has 4x that, so it should be able to handle 4x the 1080p pixel count, or 4K.

Epic already said, IIRC, that UE5's Lumen runs notably higher than 30 fps on PS5 dev kits, and they expect to reach 1440p/60 fps. That's what they claim; that's what we know.

Again, without settings we don't know anything about quality. I can claim anisotropic filtering is included at 60 FPS, but until I know it's at the max setting of 16x, it doesn't really amount to anything. With RT, the more rays you cast, the higher the burden and the cleaner the render... optimization or not. It costs, no matter what form it takes. Saying that a PS5 can run SVOGI @ 1440p/60 FPS isn't the same as disproving my claim that the PS5 can't run the Crysis remake (which uses SVOGI) @ 1440p/60 FPS. Don't equate the two as facts when one has a clear metric, since it's out now on hardware, and the other does not.
 

GymWolf

Gold Member
Imagine having to zoom in or post other examples to even notice whether RTX is used or not...

Next big thing since 2D to 3D, my ass :lollipop_squinting:

Let's see if anyone needs careful analysis to notice realistic animations or physics in a game...
 
I'm really disappointed that no one can notice global illumination. That's a heartbreaker. Also, 8K textures.. it's so night-and-day it's not even funny.
The problem is when you have prebaked lighting that can be put side by side with real-life photos and it is difficult to tell which is real and which is real-time.

Of course, if this were before the advances in faking it, it would be very noticeable.
Again, without settings we don't know anything about quality. I can claim anisotropic filtering is included at 60 FPS, but until I know it's at the max setting of 16x, it doesn't really amount to anything. With RT, the more rays you cast, the higher the burden and the cleaner the render... optimization or not. It costs, no matter what form it takes. Saying that a PS5 can run SVOGI @ 1440p/60 FPS isn't the same as disproving my claim that the PS5 can't run the Crysis remake (which uses SVOGI) @ 1440p/60 FPS. Don't equate the two as facts when one has a clear metric, since it's out now on hardware, and the other does not.
Remember the Crysis wireframes showing flat walls with extremely heavy tessellation just to bring down performance on AMD cards? I wouldn't be surprised if they push excessive RTX features to try and sell NVIDIA GPUs.
 

Rikkori

Member
You can probably get svogi at 30fps at 4k on ps5
Easily. I ran KCD earlier to test & even with Lighting on ultra high at 1800p on an RX 480 it stayed comfortably above 30 fps, and that GPU is below the one in an X1X. A PS5 will have no issue adding SVOGI to absolutely anything.
 

VFXVeteran

Banned
Easily. I ran KCD earlier to test & even with Lighting on ultra high at 1800p on an RX 480 it stayed comfortably above 30 fps, and that GPU is below the one in an X1X. A PS5 will have no issue adding SVOGI to absolutely anything.

KCD's GI settings aren't the same as Crysis Remastered's GI settings. Crysis is way more expensive. Also, I have no doubt that the PS5 could probably do 4k/30 or even 1440p/30 with RT GI on. But that would be the ONLY RT feature it would have. My point in all of this is that the consoles only have enough power for one RT feature (maybe two, if the other feature is pretty light, like RT shadows). It will be used in a limited capacity. In no way will the consoles have multiple RT features like Cyberpunk, Metro, and Control do.
 

OmegaSupreme

advanced basic bitch
Not sure if this is the place for this but why can't we get the hair right in 98 percent of games? It's usually just a clump of mud that barely moves. I want my handsome/hot hero to have beautiful flowing hair. Is that too much to ask?
 

Rikkori

Member
KCD's GI settings aren't the same as Crysis Remastered's GI settings.
Yes, but they are adjustable. If you want, you can easily equalize them through console commands and even push them way above, but there's simply no point - the law of diminishing returns. Also, Crysis Remastered is still not doing full-scene SVOGI either, so performance and results are more or less in the same ballpark.

Crysis is way more expensive.
Nope, it can even run on the Switch. Otherwise:

"Performance
The performance depends on which GI settings are used. Usually on Xbox One it takes 3-4 ms of GPU time and on an average PC it takes 2-3 ms (AO + Sun bounce, no point lights, low-spec mode). The fastest configuration is AO only mode; this provides large scale AO at a cost of less than 2 ms on Xbox One."
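Put against frame budgets, the quoted numbers work out like this (simple arithmetic, assuming the 3-4 ms figure holds):

gi_ms = 3.5                         # middle of the quoted 3-4 ms Xbox One figure
for fps in (30, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps: {budget_ms:.1f} ms budget, GI pass ~ {gi_ms / budget_ms:.0%} of the frame")
# 30 fps: ~11% of the frame; 60 fps: ~21% of the frame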

Also, I have no doubt that the PS5 could probably do 4k/30 or even 1440p/30 with RT GI on.
We're not talking about RTGI.
You said
"You can't get 60FPS @ 1440p in Crysis with SVOGI and that uses similar technique. "
That is unequivocally wrong. SVOGI is simply not that expensive for the shipped versions.



But that would be the ONLY RT feature it would have.
Wrong again: Minecraft is path-traced on XSX. And keep in mind this is just the beginning; there's insane room for optimisation down the road, particularly for denoisers. There's no reason why they couldn't do multiple RT effects even in "AAA" titles, but it usually just doesn't make sense because you still get more bang for your buck from other techniques.
 

VFXVeteran

Banned
Yes, but they are adjustable. If you want, you can easily equalize them through console commands and even push them way above, but there's simply no point - the law of diminishing returns. Also, Crysis Remastered is still not doing full-scene SVOGI either, so performance and results are more or less in the same ballpark.


Nope, it can even run on the Switch. Otherwise:

"Performance
The performance depends on which GI settings are used. Usually on Xbox One it takes 3-4 ms of GPU time and on an average PC it takes 2-3 ms (AO + Sun bounce, no point lights, low-spec mode). The fastest configuration is AO only mode; this provides large scale AO at a cost of less than 2 ms on Xbox One."


We're not talking about RTGI.
You said

That is unequivocally wrong. SVOGI is simply not that expensive for the shipped versions.




Wrong again: Minecraft is path-traced on XSX. And keep in mind this is just the beginning; there's insane room for optimisation down the road, particularly for denoisers. There's no reason why they couldn't do multiple RT effects even in "AAA" titles, but it usually just doesn't make sense because you still get more bang for your buck from other techniques.



It's funny how you refute me by using PC hardware instead of next-gen hardware metrics. You don't know whether what I'm saying is wrong or not.

1) Prove that SVOGI isn't being used in the entire scene.

2) How can you refute my claim that SVOGI is quite expensive at the settings I'm using, while saying it only takes 2-3 ms without knowing what those settings are? You are basically saying, "you're wrong about it being expensive at ultra settings on PC, because they say it only takes 2-3 ms, even though I have no idea what parameter values produced that 2-3 ms, so I'll assume it was at the PC's max ultra settings." Do you see how misleading that argument is?

If I say anisotropic filtering is expensive at 16x, you can't come back and say - no it's pretty cheap - even though the console is only running it at 4x.

3) Lastly, if I make a claim without having any proof... how can you refute my claim while also having no proof? Neither of us can determine what resolution/fps the next-gen consoles can run the Crysis remake's SVOGI at. What's certain, though, is that we will find out. And I can bet my claim will be closer to reality than yours.
 

Rikkori

Member
It's funny how you refute me by using PC hardware instead of next-gen hardware metrics. You don't know whether what I'm saying is wrong or not.

1) Prove that SVOGI isn't being used in the entire scene.
All the info you need to understand that is in the previous post I made. Read the cryengine entry and then watch the console command in the video.


2) How can you refute my claim that SVOGI is quite expensive at the settings I'm using, while saying it only takes 2-3 ms without knowing what those settings are? You are basically saying, "you're wrong about it being expensive at ultra settings on PC, because they say it only takes 2-3 ms, even though I have no idea what parameter values produced that 2-3 ms, so I'll assume it was at the PC's max ultra settings." Do you see how misleading that argument is?

Again, watch the video and add 1 and 1 together. It's all there.

If I say anisotropic filtering is expensive at 16x, you can't come back and say - no it's pretty cheap - even though the console is only running it at 4x.

3) Lastly, if I make a claim without having any proof... how can you refute my claim while also having no proof? Neither of us can determine what resolution/fps the next-gen consoles can run the Crysis remake's SVOGI at. What's certain, though, is that we will find out. And I can bet my claim will be closer to reality than yours.

I have presented proof, but you don't seem to understand, so here it goes:

We have an idea of general SVOGI costs for DIFFUSE (!) illumination. You can see this in the CryEngine wiki. We KNOW what Crysis Remastered is doing, and it's ALSO the diffuse illumination variant (JUST LIKE KCD!) - look at the console command. We KNOW what the full cost of it is (a 14% hit in this config), and you can SEE IT in the video above, which was done on a 1070 Ti. A 1070 Ti is strictly INFERIOR to both the PS5 and XSX. Right now, full-scene SVOGI (in CryEngine) is experimental and not used in ANY shipped product - and don't even ask me; go find it and figure it out yourself.
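For scale, a 14% hit converts to frame time roughly like this (the 60 fps baseline here is an assumption for illustration, not a number from the video):

base_fps = 60.0
fps_with_gi = base_fps * (1 - 0.14)             # a 14% framerate hit
extra_ms = 1000 / fps_with_gi - 1000 / base_fps
print(f"{extra_ms:.1f} ms added per frame")     # ~2.7 ms, in line with the wiki's 2-3 ms figure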

Do you now understand how 1440p 60 fps + SVOGI (equivalent to CR's highest SVOGI setting) is a piece of cake & fully doable on consoles?
 

VFXVeteran

Banned
All the info you need to understand that is in the previous post I made. Read the cryengine entry and then watch the console command in the video.




Again, watch the video and add 1 and 1 together. It's all there.



I have presented proof, but you don't seem to understand, so here it goes:

We have an idea of general SVOGI costs for DIFFUSE (!) illumination. You can see this in the CryEngine wiki. We KNOW what Crysis Remastered is doing, and it's ALSO the diffuse illumination variant (JUST LIKE KCD!) - look at the console command. We KNOW what the full cost of it is (a 14% hit in this config), and you can SEE IT in the video above, which was done on a 1070 Ti. A 1070 Ti is strictly INFERIOR to both the PS5 and XSX. Right now, full-scene SVOGI (in CryEngine) is experimental and not used in ANY shipped product - and don't even ask me; go find it and figure it out yourself.

Do you now understand how 1440p 60 fps + SVOGI (equivalent to CR's highest SVOGI setting) is a piece of cake & fully doable on consoles?

SVOGI for diffuse is still SCENE wide! WTF? Every single leaf on a tree has to evaluate the GI contribution on that leaf. You are talking about the rendering equation, NOT the scene. Diffuse is one contribution to the rendering equation. Specular would be another, but that's covered by RT reflections for mirror-like surfaces. So you are saying things that don't make sense from a graphics programmer's perspective. Any technique that covers the "scene" means every object in the scene contributes to the overall illumination.

Now that we've gotten that cleared up.. you showed me a video of a guy flipping a boolean in the console to turn SVOGI on or off. Where are my settings for how many rays are cast? How much storage is allocated for the BVH? What's my recursion limit? How many bounces can I make? These are implicit settings that the gamer never sees when going from "Low" to "Can it run Crysis" settings.
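To illustrate the kind of hidden knobs being referred to - these names are hypothetical, not actual CryEngine cvars:

gi_quality = {
    "cones_per_pixel": 4,         # fewer cones = cheaper, but blurrier/noisier GI
    "trace_steps": 16,            # how far each cone marches through the voxel grid
    "bounce_count": 1,            # every extra bounce multiplies the trace work
    "voxel_grid_resolution": 64,  # memory footprint and voxelization cost
}
# a crude proxy: cost scales roughly with cones x steps x bounces, so two builds can
# both report "SVOGI on" while differing several-fold in what it actually costs
relative_cost = gi_quality["cones_per_pixel"] * gi_quality["trace_steps"] * (1 + gi_quality["bounce_count"])
print("relative cost units:", relative_cost)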

Total%20Illumination%20location.jpg



You guys should stop trying to be armchair programmers because you look really silly talking to one with your vague comments that show you have no idea how any of this stuff works.
 

Rikkori

Member
SVOGI for diffuse is still SCENE wide! WTF? Every single leaf on a tree has to evaluate the GI contribution on that leaf. You are talking about the rendering equation, NOT the scene. Diffuse is one contribution to the rendering equation. Specular would be another, but that's covered by RT reflections for mirror-like surfaces. So you are saying things that don't make sense from a graphics programmer's perspective. Any technique that covers the "scene" means every object in the scene contributes to the overall illumination.

Now that we've gotten that cleared up.. you showed me a video of a guy flipping a boolean in the console to turn SVOGI on or off. Where are my settings for how many rays are cast? How much storage is allocated for the BVH? What's my recursion limit? How many bounces can I make? These are implicit settings that the gamer never sees when going from "Low" to "Can it run Crysis" settings.

Total%20Illumination%20location.jpg



You guys should stop trying to be armchair programmers because you look really silly talking to one with your vague comments that show you have no idea how any of this stuff works.

The current default configuration provides only diffuse GI. For specular lighting we still need light probes.

Integration Mode: GI computations may be used in several ways:

0 = AO + Sun bounce
Large scale ambient occlusion (static) modulates or replaces the default ambient lighting.
Single light bounce (fully real-time) is supported for sun and, with limitations, for projectors.
This mode takes less memory and only the opacity is voxelized. This also works acceptably on consoles.

1 = Diffuse GI mode (experimental)
GI completely replaces the default diffuse ambient lighting.
Two indirect light bounces are supported for sun and semi-static lights. Use _TI in light name.
Single fully dynamic light bounce is supported for projectors. Use _TI_DYN in light name.
Default ambient specular is modulated by the intensity of the diffuse GI.

2 = Full GI mode (very experimental)
Both ambient diffuse and ambient specular lighting is computed by voxel cone tracing.
This mode works fine only on high-end modern PCs.

April 2020:


I'm only gonna work with mode 0 because frankly that's the only one that I'd consider valid for shippable game

I tried to be fair with you, but you really didn't deserve it. I'm done.
 

VFXVeteran

Banned
I tried to be fair with you, but you really didn't deserve it. I'm done.

You have said nothing to the contrary of what I'm talking about. GI for specular is indeed blurred reflections or simply a single reflected ray cast into the scene.

Again, if you aren't hired to do this kind of stuff, then you really shouldn't be trying to "show up" a developer who has been paid in the real world for knowing this stuff.
 