Digital Foundry, Alan Wake 2 PC - Rasterisation Optimised Settings Breakdown - Is It Really THAT Demanding?

Gaiff

SBI’s Resident Gaslighter
Also, it seems the PS5 outperforms the 2070 Super, while before, Alex always mentioned the PS5 couldn't even touch 2060 performance?
That was with copious amounts of ray tracing. In Watch_Dogs Legion, the PS5 in RT mode performed on the level of a 2060S.

Alex insisted on comparing consoles to a 2060S in optimized settings guides, but it pretty much always lost to the PS5/SX. The closest equivalents without RT are the 2070S/2080, or sometimes even the 2080S. If you throw in lots of RT, they start leaving the PS5 behind, but this doesn't happen all that often, and you need so many RT effects that the cards also start falling into unplayable territory anyway.
 
He has said that about certain games that used hardware RT. He didn't say that about Alan Wake 2, which uses software RT on consoles.
So now he shifts goalposts and says the 3070 is 50% faster than a PS5 when it's actually a little more than 40% faster. Also, no shit that a card that costs twice as much as a PS5 non-disc edition outperforms a PS5. It's such a weird flex to make. That PC is at least $1,200. But then again, it's Alex, so he'll do anything to make the PS5 look bad for some reason.
 

peronmls

Member
If you're using frame generation then you're not running at 100+ fps, and I say this as someone who owns a 4090. I really wish people would stop counting fake frames. It's like taking $50 worth of bills, combining it with another $50 worth of counterfeit bills, then trying to pass it off as $100. You still only have $50. I guess NVIDIA's idiotic marketing is working.
So how well does it run on a 4090 then???
 

Gaiff

SBI’s Resident Gaslighter
So now he shifts goalposts and says the 3070 is 50% faster than a PS5 when it's actually a little more than 40% faster. Also, no shit that a card that costs twice as much as a PS5 non-disc edition outperforms a PS5. It's such a weird flex to make. That PC is at least $1,200. But then again, it's Alex, so he'll do anything to make the PS5 look bad for some reason.
The cheapest 3070 is $360 USD now. The cheapest in my region is also $360 USD after currency conversion. It shouldn't cost twice the MSRP of a digital PS5 unless PC hardware prices are really high in your region. That card is also 3 years old by now. You should be able to build a rig centered around a 3070 for around $900, but I wouldn't recommend buying an 8GB card in 2023 unless it's entry-level.

At that price point, you're better off going with the 6800, which is faster in raster, though you do lose DLSS and some ray tracing performance. Still, you also get twice the VRAM.

Hopefully, NVIDIA does come out with their Super cards so they can knock prices down some. It's desperately needed at the moment.
 

Leonidas

Member
So now he shifts goalposts and says the 3070 is 50% faster than a PS5 when it's actually a little more than 40% faster.
Where did he shift goalposts? You misquoted him...

He says nearly 50%. The PS5 was fluctuating between 57 and 58 FPS in the comparison. The 3070 was at 83 FPS. The 3070 was 2-4 FPS away from being 50% faster.

I think "nearly" fits the description well. My 3070 (before I upgraded to a much faster GPU) would have been 50% faster (or more) thanks to a 5% overclock.

Also, no shit that a card that costs twice as much as a PS5 non-disc edition outperforms a PS5. It's such a weird flex to make.
The RTX 3070, a GPU that is much faster than the PS5's GPU in hardware RT, launched 3 years ago at $500.
 

Senua

Member
Can Alex be a whiny PC fanboy? Yes, but man some of you PS5 gamers ITT are even more fragile. The video is informative and his statements are backed up. The game seems to perform about right on every system relative to its power. Why is this upsetting some of you?
 

rofif

Can’t Git Gud
He is right tho. 100fps with frame gen is 50-60 real fps.
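For the arithmetic behind that claim, here's a minimal Python sketch. It assumes DLSS 3-style frame generation, which inserts one generated frame between every two rendered frames, so half of the displayed frames are derived rather than rendered:

```python
# Minimal sketch of the frame generation arithmetic. Assumes one
# interpolated frame between every two rendered frames (DLSS 3 style),
# so half of the displayed frames are generated rather than rendered.

def rendered_fps(displayed_fps: float) -> float:
    """Frames the engine actually renders per second."""
    return displayed_fps / 2

print(rendered_fps(100))  # 50.0 -> "100 fps with frame gen is ~50 real fps"
print(rendered_fps(120))  # 60.0
```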
 

drotahorror

Member
Wait wut? Why does everyone seem to have a hate boner for DF all of a sudden?

This vid was actually helpful for getting my 3070 performance up.

Edit: Image resolve is blurry though. I countered it by adding CAS through ReShade.
No offense to DF, but his settings are literally the in-game Medium preset. I checked my settings after seeing the optimized ones and they were exactly the same; I had already gone with Medium.
 

shamoomoo

Member
Where did he shift goalposts? You misquoted him...

He says nearly 50%. The PS5 was fluctuating between 57 and 58 FPS in the comparison. The 3070 was at 83 FPS. The 3070 was 2-4 FPS away from being 50% faster.

I think "nearly" fits the description well. My 3070 (before I upgraded to a much faster GPU) would have been 50% faster (or more) thanks to a 5% overclock.


The RTX 3070, a GPU that is much faster than the PS5's GPU in hardware RT, launched 3 years ago at $500.
Alex would still be wrong. If the difference were 45%, then saying 'nearly 50%' would be accurate, but it wasn't. His use of 'nearly 50%' was to make the FPS difference seem even bigger. Also, the drop on the PS5 is in a stressed area, so we don't know how fast the game actually runs overall.


The PS5 could possibly run the game above 60 but have heavier drops because of its CPU.
 

Lysandros

Member
Where is that image from?
Further questions: yet again, what's the freaking CPU used in the PC, and is Vsync engaged on it like on the PS5 version? Some users are reporting a higher than usual performance cost for Vsync in this title, of about 10-15% (needs confirmation). Knowing all too well the 'little games' played by the poster of the comparison and all...
 

Mister Wolf

Member
He is right tho. 100fps with frame gen is 50-60 real fps.

How? Frames is short for Frames of Animation. Whatever is indicated in benchmarking is literally the amount of animation per second the TV/monitor is displaying. I know the 30-series owners are annoyed they don't have access to it, but this silliness has got to stop. Too much sour grapes. Latency and framerate are two separate units of measurement: the former is for the response time of a signal, the latter for animation.
 

shamoomoo

Member
Further questions: yet again, what's the freaking CPU used in the PC, and is Vsync engaged on it like on the PS5 version? Some users are reporting a higher than usual performance cost for Vsync in this title, of about 10-15% (needs confirmation). Knowing all too well the 'little games' played by the poster of the comparison and all...
Probably a Ryzen 3600, as he has that CPU and a 12900K.
 

Leonidas

Member
Alex would still be wrong. If the difference were 45%, then saying 'nearly 50%' would be accurate, but it wasn't.
The difference was around 45% though.

83 FPS (3070) is 43-46% faster than 57-58 FPS (PS5)
84 FPS would have been 45-47% faster than PS5
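For anyone checking the math, here's a quick Python sketch of those percentages (the FPS figures are the ones quoted above; everything else is just illustration):

```python
# Quick check of the speedup figures quoted above.

def speedup_pct(fast_fps: float, slow_fps: float) -> float:
    """Percentage by which fast_fps exceeds slow_fps."""
    return (fast_fps / slow_fps - 1.0) * 100.0

for ps5_fps in (57, 58):
    print(f"83 FPS vs {ps5_fps} FPS: {speedup_pct(83, ps5_fps):.1f}% faster")
# 83 FPS vs 57 FPS: 45.6% faster
# 83 FPS vs 58 FPS: 43.1% faster

# A 5% overclock (83 * 1.05 = 87.15 FPS) crosses the 50% mark vs 58 FPS:
print(f"{speedup_pct(83 * 1.05, 58):.1f}%")  # 50.3%
```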


The PS5 could possibly run the game above 60 but have heavier drops because of its CPU.
That's irrelevant since there is only one PS5.
 

baphomet

Member
At 4K DLSS Quality with max settings, path tracing, and frame gen, I'm getting between 80 and 120 fps.

DLSS Performance had it locked at 120 fps.
 

Zathalus

Member
Alex would still be wrong. If the difference were 45%, then saying 'nearly 50%' would be accurate, but it wasn't. His use of 'nearly 50%' was to make the FPS difference seem even bigger. Also, the drop on the PS5 is in a stressed area, so we don't know how fast the game actually runs overall.


The PS5 could possibly run the game above 60 but have heavier drops because of its CPU.

Further questions: yet again, what's the freaking CPU used in the PC, and is Vsync engaged on it like on the PS5 version? Some users are reporting a higher than usual performance cost for Vsync in this title, of about 10-15% (needs confirmation). Knowing all too well the 'little games' played by the poster of the comparison and all...
The CPU would not matter. At no point does this game stress the CPU. My older laptop has a 5800H, and it runs north of 100 fps when I lower all the settings to get into a CPU-limited scenario.

The PS5 is performing like a 2080 here, in line with almost all releases. What exactly is the controversy here?
 

Lysandros

Member
Alex would still be wrong. If the difference were 45%, then saying 'nearly 50%' would be accurate, but it wasn't. His use of 'nearly 50%' was to make the FPS difference seem even bigger. Also, the drop on the PS5 is in a stressed area, so we don't know how fast the game actually runs overall.


The PS5 could possibly run the game above 60 but have heavier drops because of its CPU.
Not only is 50% inaccurate for anyone with a basic familiarity with a certain thing called "mathematics", this 43% difference is specific to this forest area/section, and it is highly speculative and wishful to present it as an absolute average to feed the narrative at hand. Especially considering the open questions of CPU and Vsync differences.
 

Leonidas

Member
Not only is 50% inaccurate for anyone with a basic familiarity with a certain thing called "mathematics", this 43% difference is specific to this forest area/section, and it is highly speculative and wishful to present it as an absolute average to feed the narrative at hand. Especially considering the open questions of CPU and Vsync differences.
It's the most fair comparison he could have made. You people would complain more if he showed the difference in other areas, since the PS5 is locked to 60 FPS.

The 3070 will be over 50% faster in areas where the PS5 is locked but the 3070 is not.
 

DenchDeckard

Moderated wildly
If you're using frame generation then you're not running at 100+ fps, and I say this as someone who owns a 4090. I really wish people would stop counting fake frames. It's like taking $50 worth of bills, combining it with another $50 worth of counterfeit bills, then trying to pass it off as $100. You still only have $50. I guess NVIDIA's idiotic marketing is working.
Mine's running at 60-ish frames without frame generation, so I'm getting the input response of 60 fps but a perceived framerate of 120-plus on my OLED. So yes, it is pretty amazing tech. It still looks and has the fluidity of 120 Hz, and with the controller input response of 60 fps, it looks and feels incredible.
 

shamoomoo

Member
The CPU would not matter. At no point does this game stress the CPU. My older laptop has a 5800H, and it runs north of 100 fps when I lower all the settings to get into a CPU-limited scenario.

The PS5 is performing like a 2080 here, in line with almost all releases. What exactly is the controversy here?
Why not at the same settings as the performance mode? And your CPU is better than the PS5's; by how much, I don't know.
 

shamoomoo

Member
It's the most fair comparison he could have made. You people would complain more if he showed the difference in other areas, since the PS5 is locked to 60 FPS.

The 3070 will be over 50% faster in areas where the PS5 is locked but the 3070 is not.
Unless I've missed something, we don't know how fast the 3070 runs; the only thing we know is that the lower bound is 43% faster than the PS5.
 

Zathalus

Member
Why not at the same settings as the performance mode? And your CPU is better than the PS5's; by how much, I don't know.
Because I wanted to demonstrate CPU limits? The 5800H is indeed faster, but the difference is like 25%. Even a 3600 should be above 80 fps at all times.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not


Needing to run with FSR 2 or DLSS, as he recommends, isn't a good port.
Alex is learning to cope, as are many here.

Wait, are we still NOT recommending people use DLSS? If you have the horsepower, use DLAA; native rendering is for fools.
 

Vergil1992

Member
Further questions: yet again, what's the freaking CPU used in the PC, and is Vsync engaged on it like on the PS5 version? Some users are reporting a higher than usual performance cost for Vsync in this title, of about 10-15% (needs confirmation). Knowing all too well the 'little games' played by the poster of the comparison and all...
He uses a Ryzen 3600. He has said it in one of his tweets.

I don't quite understand your point about v-sync on or off. If the game were locked at 60 fps with vsync, it's true that we couldn't know its real performance, but Alex compared stressful situations where the PS5 drops to around 50 fps and the 3070 runs at 80 fps. Vsync activated or deactivated would not make any difference there. He made the comparison correctly. It would remain to be seen whether that 43% higher performance holds as an average or not, but there is nothing reprehensible here.

Vsync on or off has no impact on performance.
 

Gaiff

SBI’s Resident Gaslighter
What about between adaptive and triple buffered?
Triple buffering below the display's max refresh rate shouldn't have any impact on the raw fps, especially not when the framerate is almost flat like in Alex's example. It might change the frame pacing, but that's it. Unless it's bugged, which happens at times. Maybe it has issues with Alan Wake, but I haven't been able to reproduce them.
 

Zathalus

Member
So what is wrong with PC players being recommended to use it? Especially with DLSS. If you don't want to use RT, you can get away with not using DLSS; a 4070 will do 1440p 60 fps no problem without RT. But anyone sensible will just enable DLSS Quality and get that RT goodness.
 

rofif

Can’t Git Gud
How? Frames is short for Frames of Animation. Whatever is indicated in benchmarking is literally the amount of animation per second the TV/monitor is displaying. I know the 30-series owners are annoyed they don't have access to it, but this silliness has got to stop. Too much sour grapes. Latency and framerate are two separate units of measurement: the former is for the response time of a signal, the latter for animation.
It's just interpolation. A bit better interpolation. I like it well enough, but when someone says the game is running at 100 fps and it's later revealed it runs at 100 fps with FG... that's like saying my dick is 50 cm, measured from my butt.
 

Denton

Member
So now he shifts goalposts and says the 3070 is 50% faster than a PS5 when it's actually a little more than 40% faster. Also, no shit that a card that costs twice as much as a PS5 non-disc edition outperforms a PS5. It's such a weird flex to make. That PC is at least $1,200. But then again, it's Alex, so he'll do anything to make the PS5 look bad for some reason.
You PS fanboys are so insufferably idiotic
 

Vergil1992

Member
What about between adaptive and triple buffered?
Adaptive Vsync: if you fall below the target refresh rate, v-sync is deactivated.

Triple-buffered Vsync: v-sync stays engaged at all times.

I ask again, what's the point? Be that as it may, it does not reduce performance. The only difference is that with adaptive v-sync in those circumstances (50 fps), tearing would appear once v-sync got disabled (although sometimes that is better, because it reduces the latency of the frame drops). This is easy to verify: take any game, toggle between adaptive and triple-buffered vsync, and you will see that the performance is the same.
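To illustrate the difference between the two modes, here's a hypothetical Python sketch of the policies described above (an assumption for illustration, not any actual driver's implementation):

```python
# Hypothetical sketch of the two vsync policies described above;
# not any driver's actual implementation.

def vsync_engaged(mode: str, current_fps: float, refresh_hz: float) -> bool:
    if mode == "adaptive":
        # Adaptive: sync only while the game keeps up with the display.
        # Below the refresh rate, vsync drops out and tearing can appear.
        return current_fps >= refresh_hz
    if mode == "triple_buffered":
        # Triple buffered: vsync stays engaged at all times; the extra
        # back buffer lets the GPU keep rendering while a frame is queued.
        return True
    raise ValueError(f"unknown mode: {mode}")

# The 52 fps example from above, on a 60 Hz display:
print(vsync_engaged("adaptive", 52, 60))         # False -> tearing, lower latency
print(vsync_engaged("triple_buffered", 52, 60))  # True  -> no tearing, same fps
```

Either way, the policy only decides when the frame is presented, which is why the raw framerate stays the same in both modes.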
 

Mister Wolf

Member
It's just interpolation. A bit better interpolation. I like it well enough, but when someone says the game is running at 100 fps and it's later revealed it runs at 100 fps with FG... that's like saying my dick is 50 cm, measured from my butt.

You do know interpolation adds frames of animation?
 

Mr Moose

Member
Adaptive Vsync: if you fall below the target refresh rate, v-sync is deactivated.

Triple-buffered Vsync: v-sync stays engaged at all times.

I ask again, what's the point? Be that as it may, it does not reduce performance. The only difference is that with adaptive v-sync in those circumstances (50 fps), tearing would appear once v-sync got disabled (although sometimes that is better, because it reduces the latency of the frame drops). This is easy to verify: take any game, toggle between adaptive and triple-buffered vsync, and you will see that the performance is the same.
Ask Alex, he said triple-buffered vs. adaptive does have a performance hit.
 

rofif

Can’t Git Gud
You do know interpolation adds frames of animation?
It doesn't add frames to animations.
It has the current frame and the next frame, and it constructs a frame to put in between.
It doesn't fucking touch animations. What are you talking about?

It's just measuring the movement between two frames, and the AI tries to figure out where this newly created frame would sit between them. That includes any moving objects. It's just interpolation, and these frames look awful if you check the recordings. It looks OK in motion because it's just one frame every 2nd or 3rd frame. It cheats your brain.

It's not rocket science.
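The crudest version of what's being described is a plain blend between two rendered frames. Here's a toy NumPy sketch of that idea (real frame generation like DLSS 3 or FSR 3 warps pixels along motion vectors and optical flow rather than averaging, but the principle, that the middle frame is derived rather than rendered, is the same):

```python
import numpy as np

# Toy sketch of building an in-between frame from the current and next
# rendered frames. Real frame generation (DLSS 3 / FSR 3) warps pixels
# along motion vectors and optical flow instead of blending, but either
# way the inserted frame is derived from two real frames, not rendered.

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Linearly blend two frames; t=0.5 gives the halfway frame."""
    return ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

# Two 4x4 grayscale "frames": the scene brightens from 0 to 200.
current = np.zeros((4, 4), dtype=np.uint8)
nxt = np.full((4, 4), 200, dtype=np.uint8)
print(interpolate(current, nxt))  # every pixel is 100, halfway between
```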
 

Vergil1992

Member
Ask Alex, he said triple-buffered vs. adaptive does have a performance hit.
In no game that I've tried (double-buffered vsync is another issue). What Alex may be referring to is that the frametime may be more unstable and the game may run less 'smooth', but the frame rate is going to be the same.

Example 1: You have a game that targets 60 fps. It drops to 52 fps while running with adaptive vsync. V-sync in this case will be completely disabled.

Example 2: You have a game that targets 60 fps. It drops to 52 fps while running with triple-buffered vsync. Vsync will remain active at all times.

But the framerate will be the same in both cases. It won't make a difference if you switch from adaptive v-sync to standard.
 

Vergil1992

Member
I am talking about a video from before. I'll try and find it.
It doesn't really matter; I don't think you're lying. If he said that, he was either wrong or taken out of context. According to the screenshot above, he said the same thing I'm saying now. He would be contradicting himself.

Regardless of what Alex says, anyone can verify that you will not lose a single frame using adaptive or normal vsync.
 