
Digital Foundry, Alan Wake 2 PC - Rasterisation Optimised Settings Breakdown - Is It Really THAT Demanding?

winjer

Gold Member
With mesh shaders, who knows; this is the first game using them. Usually the 3070 and 2080 Ti are on par outside of some RT workloads.

Exactly, that's why I said that it might be because of pixel fill rate, as this is the one thing that kinda lines up with the performance we get from each GPU.
And since mesh shaders increase the amount of primitives to rasterize, it might be the main performance factor for this game, at those settings.
Remember that comparison is using low RT settings, so neither the 3070 nor the 2080 Ti is being pushed in this metric.

EDIT: and this is not the first game using mesh shaders. UE5 supports mesh shaders and primitive shaders for Nanite.
 

Bojji

Member
Exactly, that's why I said that it might be because of pixel fill rate, as this is the one thing that kinda lines up with the performance we get from each GPU.
And since mesh shaders increase the amount of primitives to rasterize, it might be the main performance factor for this game, at those settings.
Remember that comparison is using low RT settings, so neither the 3070 nor the 2080 Ti is being pushed in this metric.

EDIT: and this is not the first game using mesh shaders. UE5 supports mesh shaders and primitive shaders for Nanite.

Isn't Nanite running in software mode in the games released so far?

Nope. Chinese MMORPG Justice was the first game to use Mesh Shaders.

Yeah, Alex mentioned this one a few months ago on YT. But it's hilarious that we have two games since 2018.
 

Vergil1992

Member
They should focus these videos on PC: how the game runs on different configurations, what the recommended settings are for each GPU, whether it's well optimized, and things like that. But that would be asking too much of Alex; what he likes is comparing the PS5 with PC to make the console look bad, and that's it. It's an absurd comparison.
Why?

Have you read the article? The analysis is precisely what you're asking for...

A separate issue is that, for some reason, it bothers you that it's compared to the PS5 (they didn't have the Xbox code until launch day), but the analysis covers different graphics settings. Consoles are used as a reference because they usually run what Alex calls "optimized settings."


This video is very interesting for anyone with a GPU similar to the PS5/XSX. In fact, it's much more useful than the typical PS5/XSX comparisons, which are essentially the same almost every time.
 

Mr Moose

Member
Wait, since when does the PS5 have mesh shaders? Didn't they repeat like a mantra for years that only the XSX had mesh shaders?

What's their new goalpost in this odd predicament?
 

Gaiff

SBI’s Resident Gaslighter
Wait, since when does the PS5 have mesh shaders? Didn't they repeat like a mantra for years that only the XSX had mesh shaders?

What's their new goalpost in this odd predicament?
It has primitive shaders with API functions that presumably allow it to effectively use mesh shaders. I'm still reading about it but am confused; no one really explains how they got it working. Primitive shaders were actually introduced with AMD Vega, but DX doesn't provide support for them, so nothing can really be done with them on pre-RDNA2 cards.
 

Mr Moose

Member
Lysandros The captures were done with a 3600 with Vsync off.

[image: Alan Wake 2 GPU benchmark chart]


The 3070 is unusually strong here, comfortably beating the 2080 Ti when they're usually within 1-5% of each other, with the 2080 Ti even winning sometimes. The 2080 Ti is 29% faster than the PS5, which is within the norm. The PS5 is in turn 9% faster than the 2070S, again within normal range. The 3070 is a bit of an outlier, albeit not a massive one.
I wouldn't mind seeing more AMD cards tested. Does he only have two AMD cards?
 

Vergil1992

Member
I wouldn't mind seeing more AMD cards tested. Does he only have two AMD cards?
It seems not; AMD doesn't exist for Alex on PC.

And they're the most relevant for comparing performance with the PS5/XSX, but Alex seems to be obsessed with comparing them to Nvidia GPUs. In this case, the game also technically favors Nvidia GPUs, as was already the case with Control.


In this case, from what I've been able to find out, the PS5 would be close to a 6700 XT. Between a 6600 XT and a 6700 XT.
 

winjer

Gold Member
Isn't Nanite running in software mode in the games released so far?

You are probably mistaking Lumen for Nanite.
Nanite is a technique for virtualized geometry, which relies on mesh shaders, or primitive shaders, or emulation of mesh shaders through compute.
Lumen is a ray-tracing technique. Software Lumen uses signed distance fields; hardware Lumen uses hardware ray-tracing.
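As a rough sketch of that fallback chain (hypothetical names; UE5's actual selection logic is Epic's own and more involved), it boils down to something like:

[code]
#include <cstdio>

// Hypothetical sketch of Nanite-style geometry path selection: prefer hardware
// mesh shaders, fall back to primitive shaders, else emulate through compute.
enum class GeoPath { MeshShaders, PrimitiveShaders, ComputeEmulation };

GeoPath PickGeometryPath(bool meshShaderHW, bool primitiveShaderHW) {
    if (meshShaderHW)      return GeoPath::MeshShaders;      // e.g. Turing+, RDNA2+
    if (primitiveShaderHW) return GeoPath::PrimitiveShaders; // e.g. Vega, RDNA1
    return GeoPath::ComputeEmulation;                        // e.g. Pascal, Polaris
}

int main() {
    // A 5700 XT exposes primitive shaders but no DX12U mesh shaders.
    GeoPath p = PickGeometryPath(false, true);
    std::printf("selected path: %d\n", static_cast<int>(p));
    return 0;
}
[/code]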
 

Lysandros

Member
The PS5 being ~10% faster than the RTX 2070S is expected, since its GPU is faster in quite a few areas besides pixel fill rate. The RTX 2080/S is generally a better match (as shown in this game too), but the PS5 is still ahead of those in fixed-function throughput, including fill rate.

By the way, how do architecturally closer RDNA2 GPUs like the RX 6700, with its high pixel fill rate (156.8 Gpixel/s), compare to Nvidia equivalents in this game? Do we have benchmarks? This could shed some light on the pixel fill rate theory.
 

winjer

Gold Member
The PS5 being ~10% faster than the RTX 2070S is expected, since its GPU is faster in quite a few areas besides pixel fill rate. The RTX 2080/S is generally a better match (as shown in this game too), but the PS5 is still ahead of those in fixed-function throughput, including fill rate.

By the way, how do architecturally closer RDNA2 GPUs like the RX 6700, with its high pixel fill rate (156.8 Gpixel/s), compare to Nvidia equivalents? Do we have benchmarks?

The only site I know of that used to benchmark specific metrics of a GPU was AnandTech, but they stopped doing GPU reviews a few years ago.
They did make one for the 5700 XT and 2070S, though.

[image: AnandTech synthetic benchmark chart for the 5700 XT and 2070S]
 

Lysandros

Member
The only site I know of that used to benchmark specific metrics of a GPU was AnandTech, but they stopped doing GPU reviews a few years ago.
They did make one for the 5700 XT and 2070S, though.

[image: AnandTech synthetic benchmark chart for the 5700 XT and 2070S]
Sorry, I edited my post later. I meant in Alan Wake 2: showing, let's say, the RTX 2080/S versus the RX 6700.
 

Zathalus

Member
The PS5 being ~10% faster than the RTX 2070S is expected, since its GPU is faster in quite a few areas besides pixel fill rate. The RTX 2080/S is generally a better match (as shown in this game too), but the PS5 is still ahead of those in fixed-function throughput, including fill rate.

By the way, how do architecturally closer RDNA2 GPUs like the RX 6700, with its high pixel fill rate (156.8 Gpixel/s), compare to Nvidia equivalents? Do we have benchmarks?
RDNA1 and RDNA2 are roughly 10% behind Turing. A 2070 Super and a 5700 XT are equal in basically everything (clock speed, bandwidth, shaders, ROPs), but the 2070 Super is 10% faster. A 6700 XT is only 10% faster than a 2080 despite having a 23% advantage in compute and a 40% advantage in pixel rate.
 

Gaiff

SBI’s Resident Gaslighter
The PS5 being ~10% faster than the RTX 2070S is expected, since its GPU is faster in quite a few areas besides pixel fill rate. The RTX 2080/S is generally a better match (as shown in this game too), but the PS5 is still ahead of those in fixed-function throughput, including fill rate.

By the way, how do architecturally closer RDNA2 GPUs like the RX 6700, with its high pixel fill rate (156.8 Gpixel/s), compare to Nvidia equivalents in this game? Do we have benchmarks? This could shed some light on the pixel fill rate theory.

Relevant GPUs using DLSS/FSR are highlighted. This is at 1080p High, so it won't 100% match Alex's results, but it gives us something to cross-reference.
[image: Alan Wake 2 1080p High benchmark chart]


If we use the 3070 as a baseline:

3070: 100%
2080 Ti: 92.9% (Alex: 90.3%)
3060 Ti: 87%
6700 XT: 82.3%
2080: 75.3%
6600 XT: 75.3%
PS5: 69.9% (only in Alex's benchmark; it would probably be at around 64-70 fps here, so more like 75-80%)
2070S: 63.9% (Alex)

2070: 62.4%

Since those sites obviously don't benchmark a PS5, it would fall within the range of a 2080/S. Probably still below the 6700 XT.
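For reference, the arithmetic behind those percentages is just each card's average fps divided by the 3070's. A minimal sketch in C++, using made-up fps values since the chart's exact numbers aren't quoted here:

[code]
#include <cstdio>

int main() {
    // Hypothetical average-fps figures; only the ratios matter.
    struct Entry { const char* gpu; double fps; } chart[] = {
        {"3070",    85.0},  // baseline = 100%
        {"2080 Ti", 79.0},  // -> 92.9%
        {"6700 XT", 70.0},  // -> 82.4%
        {"2070S",   54.3},  // -> 63.9%
    };
    const double baseline = chart[0].fps;
    for (const auto& e : chart)
        std::printf("%-8s %5.1f%%\n", e.gpu, 100.0 * e.fps / baseline);
    return 0;
}
[/code]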
 

Zathalus

Member
Relevant GPUs using DLSS/FSR are highlighted. This is at 1080p High, so it won't 100% match Alex's results, but it gives us something to cross-reference.
[image: Alan Wake 2 1080p High benchmark chart]
Yeah, this proves the game doesn't heavily favour either Nvidia or AMD when RT is not involved. Seems like it's optimized for everything.
 

Lysandros

Member
Only rasterization. No RT.
[image: Alan Wake 2 rasterization benchmark, 2560x1440]
Thanks; sadly those specific GPUs aren't on the list. I was curious about your mesh (primitive) shaders/pixel fill rate theory, since equivalent RDNA2 GPUs have higher pixel fill rates than their Nvidia counterparts, so in this context they would be expected to perform better. Based on the RX 6600 XT results here, this doesn't seem to be the case, since it apparently performs worse than the RTX 3060, which has a meager 85 Gpixel/s of pixel fill rate.
 

acm2000

Member
Wait, since when does the PS5 have mesh shaders? Didn't they repeat like a mantra for years that only the XSX had mesh shaders?

What's their new goalpost in this odd predicament?
From my understanding, the PS5 supports mesh shaders in hardware but not amplification shaders (Xbox and PC do), which Remedy didn't use for Alan Wake 2. The more cynical people could probably claim the PS5 is holding back PC/Xbox by not allowing Remedy to make use of that feature. :messenger_tears_of_joy:
 

Bojji

Member
You are probably mistaking Lumen for Nanite.
Nanite is a technique for virtualized geometry, which relies on mesh shaders, or primitive shaders, or emulation of mesh shaders through compute.
Lumen is a ray-tracing technique. Software Lumen uses signed distance fields; hardware Lumen uses hardware ray-tracing.

Only rasterization. No RT.
[image: Alan Wake 2 rasterization benchmark, 2560x1440]

Yeah, I know it can be accelerated by hardware, but so far I haven't seen any proof that any shipped game uses hardware Nanite, only software through compute. In Alan Wake, the 6600 XT is 50% faster than the 5700 XT thanks to hardware mesh shaders.

Now look at this:

[images: 1920x1080 benchmark charts for three Nanite games]


In games with Nanite, the 5700 XT is basically on par with the 6600 XT; either all of them are running software Nanite, or the hardware acceleration is worth shit (for now).
 

winjer

Gold Member
Yeah, I know it can be accelerated by hardware, but so far I haven't seen any proof that any shipped game uses hardware Nanite, only software through compute. In Alan Wake, the 6600 XT is 50% faster than the 5700 XT thanks to hardware mesh shaders.

Now look at this:

[images: 1920x1080 benchmark charts for three Nanite games]


In games with Nanite, the 5700 XT is basically on par with the 6600 XT; either all of them are running software Nanite, or the hardware acceleration is worth shit (for now).

But we do know that UE5 uses mesh shaders and primitive shaders.
The 6600XT is using mesh shaders and the 5700XT is using primitive shaders.
 

Gaiff

SBI’s Resident Gaslighter
Thanks; sadly those specific GPUs aren't on the list. I was curious about your mesh (primitive) shaders/pixel fill rate theory, since equivalent RDNA2 GPUs have higher pixel fill rates than their Nvidia counterparts, so in this context they would be expected to perform better. Based on the RX 6600 XT results here, this doesn't seem to be the case, since it apparently performs worse than the RTX 3060, which has a meager 85 Gpixel/s of pixel fill rate.
That was actually winjer's theory. Additionally, I wouldn't rely on all the spec sheets because, as I said in this thread, NVIDIA underreports their clocks. The 2080 Ti's page has it at a 1545MHz boost clock, which is simply not true. That's the absolute minimum boost clock and will NEVER happen in real-life gaming scenarios, because it would mean a 2080 Ti that's like 20% slower than the average one due to those anemic clocks. Real ones are in the 1890-1920MHz range.

For instance, the TechPowerUp page lists the 2080 Ti as having a pixel fill rate of 136 Gpixel/s, but that's based on the 1545MHz boost clock reported by NVIDIA.

[images: monitoring screenshots showing real-world 2080 Ti boost clocks]


With real-world clocks of 1900MHz, the 2080 Ti's pixel fill rate is actually around 167 Gpixel/s, a whopping 23% above what NVIDIA reports, simply because the gaming clocks are much higher than the pathetic 1545MHz boost on the spec sheet.

Another example is the 2070S' 113.3 Gpixel/s vs the 5700 XT's 121.9. The 2070S's boost clock is said to be 1770MHz. But in reality? It's more like 1920MHz.

[images: monitoring screenshots showing real-world 2070S boost clocks]

With real clocks (1920MHz), the 2070S slightly edges out the 5700 XT at 122.8 Gpixel/s.

AMD's boost clocks are generally reliable; NVIDIA's clocks with Ampere and especially Turing aren't. I didn't really check Lovelace. Whatever the case, the pixel fill rate advantage of the 3070 over the 2080 Ti is about 10%: they have similar boost clocks, but the 3070 has an additional 8 ROPs and slightly higher average clocks too. Another factor, in addition to the 3070's pixel fill rate advantage, is the massive 2xFP32 advantage of the Ampere and Lovelace architectures over their predecessors. This could perhaps explain the not-so-massive but still significant 8-11% advantage.
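To make the arithmetic explicit: pixel fill rate here is just ROPs x clock, assuming the usual simplification of one pixel per ROP per clock. A quick sketch using the figures from this post (the 5700 XT clock is back-derived from TechPowerUp's 121.9 figure):

[code]
#include <cstdio>

// Pixel fill rate (Gpixel/s) = ROP count * clock in GHz (1 pixel per ROP per clock).
static double FillRateGpix(int rops, double clockGHz) { return rops * clockGHz; }

int main() {
    std::printf("2080 Ti @ 1.545 GHz spec boost: %6.1f Gpixel/s\n", FillRateGpix(88, 1.545)); // ~136
    std::printf("2080 Ti @ 1.900 GHz observed:   %6.1f Gpixel/s\n", FillRateGpix(88, 1.900)); // ~167
    std::printf("2070S   @ 1.770 GHz spec boost: %6.1f Gpixel/s\n", FillRateGpix(64, 1.770)); // ~113.3
    std::printf("2070S   @ 1.920 GHz observed:   %6.1f Gpixel/s\n", FillRateGpix(64, 1.920)); // ~122.9
    std::printf("5700 XT @ 1.905 GHz boost:      %6.1f Gpixel/s\n", FillRateGpix(64, 1.905)); // ~121.9
    return 0;
}
[/code]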
 

Bojji

Member
But we do know that UE5 uses mesh shaders and primitive shaders.
The 6600XT is using mesh shaders and the 5700XT is using primitive shaders.

OK, that would explain it. DX12U doesn't use primitive shaders on RDNA1 (or Vega), but a game engine probably can.

Too bad they didn't test Pascal for comparison.
 
From my understanding, the PS5 supports mesh shaders in hardware but not amplification shaders (Xbox and PC do), which Remedy didn't use for Alan Wake 2. The more cynical people could probably claim the PS5 is holding back PC/Xbox by not allowing Remedy to make use of that feature. :messenger_tears_of_joy:
So it's another goalpost then? The PS5 supports mesh shaders, but it doesn't support the True Mesh Shaders?
 

adamosmaki

Member
So now he shifts goalposts and says the 3070 is 50% faster than a PS5 while it's actually a little more than 40%. Also, no shit that a card that costs twice as much as a PS5 non-disc edition outperforms a PS5. It's such a weird flex to make. That PC is at least 1200 dollars. But then again, it's Alex, so he'll do anything to make the PS5 look bad for some reason.
What the hell is up with PS5 fanboys and DF?
He said nearly 50% and it's 43%, so where exactly did he say something wrong? Also, a 3070/6700/6750 XT, all of which perform similarly, usually cost between $320-370, so how is that exactly twice the PS5?
 

winjer

Gold Member
OK, that would explain it. DX12U doesn't use primitive shaders on RDNA1 (or Vega), but a game engine probably can.

Too bad they didn't test Pascal for comparison.

We have to remember that when Turing and RDNA1 were being developed, MS was still deciding on the specs for the DX12_2 standard.
AMD presented primitive shaders and NVidia presented mesh shaders. In the end, MS chose mesh shaders.
The thing with RDNA1 and primitive shaders is not that it's old tech, but rather that it's the path not taken.
Unlike a GPU like Pascal or Polaris, which has support for neither mesh nor primitive shaders.
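On the PC side, this is what an engine queries at startup. A minimal sketch (C++/D3D12, assuming an already-created ID3D12Device) of checking the DX12 Ultimate mesh shader tier, which Turing/Ampere and RDNA2 report and Pascal, Polaris, Vega, and RDNA1 don't:

[code]
#include <d3d12.h>

// Returns true if the device exposes DX12 Ultimate mesh shaders (Tier 1).
// Primitive shaders have no equivalent query here: DX12 never exposed them.
bool SupportsMeshShaders(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &opts7, sizeof(opts7))))
        return false; // runtime/driver too old to even know about mesh shaders
    return opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}
[/code]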
 

Gaiff

SBI’s Resident Gaslighter
So it's another goalpost then? The PS5 supports mesh shaders, but it doesn't support the True Mesh Shaders?
Amplification shaders, or task shaders as they're called outside of DX12, are just an optional stage before mesh shaders that allows for a couple of additional, nicer functions. I still don't 100% understand the difference between mesh and primitive shaders. There's plenty of documentation about mesh shaders, but precious little about primitive shaders, let alone their implementation in a single title.
 

Gaiff

SBI’s Resident Gaslighter
My guess is Hellblade 2 will probably use Mesh Shaders and Task Shaders as well.
 

Senua

Member
What the hell is up with PS5 fanboys and DF?
He said nearly 50% and it's 43%, so where exactly did he say something wrong? Also, a 3070/6700/6750 XT, all of which perform similarly, usually cost between $320-370, so how is that exactly twice the PS5?
I get why PS warriors are against Alex, he poo-poos their favourite toy, but the rest of DF? That's just dumb
 

shamoomoo

Member
What the hell is up with PS5 fanboys and DF?
He said nearly 50% and it's 43%, so where exactly did he say something wrong? Also, a 3070/6700/6750 XT, all of which perform similarly, usually cost between $320-370, so how is that exactly twice the PS5?
If the performance figure was 33%, would you count that as nearly a 40% difference?
 

SlimySnake

Flashless at the Golden Globes
I can't believe Alex didn't mention that the RT transparency setting handles reflections. I feel so stupid because I didn't read the description at all.

The reflections are fantastic and not a big hit to the framerate. In the city, I got only a 20% hit to performance and they look great. In the forest, which is very heavy, I'm getting drops to 35 fps in the swamp area, but I'm going to leave them on because I doubt most of the game is set in that area.
 

sinnergy

Member
It's good that MS exclusives are showing the true power of Mesh Shaders! /s
I read reports that some areas run worse on PS5, which could be explained by the different results of primitive vs mesh shaders; I've read the mesh shader implementation is a bit more to-the-metal while primitive shaders are driver-driven. Especially in forest areas. On Series X here, I didn't notice drops.
 

Bojji

Member
I read reports that some areas run worse on PS5, which could be explained by the different results of primitive vs mesh shaders; I've read the mesh shader implementation is a bit more to-the-metal while primitive shaders are driver-driven. Especially in forest areas. On Series X here, I didn't notice drops.

Yeah, this could be it. It could also be that their engine performs better on Xbox; Control had a higher framerate (with the frame limit unlocked in photo mode) on XSX.

Some games prefer the PS5's architecture and some prefer Xbox's solution, that's the story of this generation. Mesh shaders may or may not have a performance impact in this scenario.
 
What can I expect to get on a 2080 Ti?

I want games to look good, but if there was one last game I wish I could run fine before having to upgrade, it's this one. I hate to think I already need to upgrade.
 

Bojji

Member
Frame gen off:

[image: screenshot with frame generation off]


Frame gen on:

[image: screenshot with frame generation on]


4K DLSS Performance, "High" preset with Path Tracing on Low. The game looks and runs great on the 4070 Ti. The screenshots are way darker than actual gameplay with HDR.
 

SlimySnake

Flashless at the Golden Globes
Playing it maxed out on an R7 3700X/3080 10GB system (1440p, DLSS Quality, RT Max) and locked it at 40FPS.

Plays well enough with all the bells and whistles, with a small sacrifice to fluidity.
Interesting. I'm playing on a 3080 with an 11700K and found that I could get 4K DLSS Performance at 60 fps with no ray tracing, dropping to around 50 fps in the city when staring at reflections. In the swamps, though, it dropped to 35 fps. Didn't feel too bad though.

I will try 1440p DLSS Quality with RT maxed out and see if I can lock it to 40 fps without a big drop-off in image quality. Thanks!
 