
Digital Foundry: The Witcher 3 PC - Next-Gen - Game-Changing Visuals But What About Performance?

adamsapple

Or is it just one of Phil's balls in my throat?
[Image: D7UVpDZ.png]
The 5 people here saying they're having a great experience on 4090s: this setup above is closer to what most people might have.

And this is a pretty bad experience. In context, much worse than the console versions running at 30 FPS with RTGI and AO.
 

Alex11

Member
[Image: D7UVpDZ.png]
The 5 people here saying they're having a great experience on 4090s: this setup above is closer to what most people might have.

And this is a pretty bad experience. In context, much worse than the console versions running at 30 FPS with RTGI and AO.
Not even that, I think a 2080/3070 would be more realistic, no? And I imagine the experience would be much worse.

Gotta say, a shame with this update as I was really looking forward to this.
 

skit_data

Member
I’ve noticed that the game hitches for about half a second every now and then on PS5 (it seems to be related to quest completion, mostly). Haven’t had it happen during fights or anything, but it’s mildly annoying. It’s possible they’ve fixed it already, because I pretty much only played it on launch day.
 

Thebonehead

Banned
The 5 people here saying they're having a great experience on 4090s
Ahem. Make that 6, as I also have a 12900K / 4090 combo.

Ran great for me when I tried it.

That was after popping down the shop to get some milk in my 911 Turbo S though, as that apparently makes a difference. Maybe I should have taken the Tesla instead, for my eco credentials.
 

Dr.D00p

Member
DX12 strikes again.

What a stinking pile of hot garbage it's been for (PC) games, in terms of performance, if not features.

No wonder Sony & its developers get so much more out of the PS5, using their own custom low-level API tool set, whilst Xbox developers are lumbered with the DX12 turd-like environment.
 

GymWolf

Member
Looking at Eurogamer's article, it looks like CDPR went with Nvidia's RTX tools once again, as if they hadn't learned from the clusterfuck that resulted with Cyberpunk.

You know Nvidia's RTX (anti-)optimization tools have gone too far when even their 2000€ "consumer" GPU fails to provide decent framerates in a 7-year-old game that got the raytracing treatment.
Unless they're actually trying to warm people up to buying the RTX 5090 for a modest 4000€, a year from now.


Considering how the Matrix demo looks on the measly PS5 / Series X at 4K30, just imagine what a 5x more powerful RTX 4090 should be able to do.
It's not "something that looks like Witcher 3 RTX", that much we should all agree on.
I think that Matrix demo ran at 1440p on consoles.
 

Filben

Member
Not having reached Novigrad or another taxing location yet, I'm running 3440x1440 with RT maxed, DLSS Balanced, and capped at 40fps (DLSS Quality manages 40fps in many cases, but e.g. the village by the well in White Orchard only hits 36-37). So DLSS Balanced should give enough headroom, and I'm going to use that until I notice too big of an image-quality drop. RTX 3080, Ryzen 3600.
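For anyone wondering what the 40fps cap buys: it's just a fixed 25 ms budget per frame, which is what an external limiter like RTSS enforces. A minimal sketch of the idea, with render_frame() as a made-up stand-in for the game's per-frame work:

Code:
// Minimal sketch of a 40fps frame cap: 1000 ms / 40 = 25 ms per frame.
#include <chrono>
#include <thread>

void render_frame(); // stand-in for the game's per-frame work

void run_capped() {
    using clock = std::chrono::steady_clock;
    constexpr std::chrono::microseconds budget(25'000); // 25 ms = 40fps
    auto next = clock::now() + budget;
    for (;;) {
        render_frame();                      // do the frame's work
        std::this_thread::sleep_until(next); // burn whatever budget is left
        next += budget;                      // fixed cadence -> even pacing
    }
}

The headroom from DLSS Balanced just means render_frame() reliably finishes inside those 25 ms, so the sleep (and therefore the pacing) stays consistent.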

Feels good enough and looks amazing. Wish they'd offer the same DualSense support as on PS5, but some features are missing, unfortunately.

RT mode on PS5 has a far too unstable 30fps; it's nowhere near as smooth as some 30fps-capped titles, not to mention 40fps on PC (despite the highly annoying #stutterstruggle, which, in truth, is quite ruining PC gaming for me).
 

ToTTenTranz

Banned
I think that Matrix demo ran at 1440p on consoles.
IIRC it's 1080p reconstructed through TSR to 4K, but I'm assuming temporal reconstruction technologies are there for everyone to use (literally everyone, after AMD made FSR2 open source), so an RTX 4090 could and should use them in any situation.


DX12 strikes again.

What a stinking pile of hot garbage it's been for (PC) games, in terms of performance, if not features.
This has nothing to do with DX12, but rather with the low-performance DX11->DX12 "wrapper" the devs decided to use in order to run Nvidia's RTX SDK instead of updating the game engine.
AFAIR there's no discernible performance deficit between the DX12 "high-level" mode and DX11.
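To illustrate the overhead (a toy sketch only; these are simplified stand-ins, not the real D3D11On12 interfaces): every immediate-mode, DX11-style draw has to be re-encoded as explicit DX12 state and command-list work, which is extra CPU time per call on top of what the engine already does.

Code:
// Toy model of a DX11->DX12 translation layer. Names are simplified
// stand-ins, NOT the real D3D11On12 API; the point is only that each
// legacy-style draw gains extra per-call CPU work.
#include <unordered_map>

struct CommandList {
    void set_pipeline_state(int pso) { /* record a state change */ }
    void draw_instanced(int verts, int instances) { /* record a draw */ }
};

class Dx11StyleWrapper {
    CommandList list_;
    std::unordered_map<int, int> pso_cache_; // state hash -> PSO id
    int bound_pso_ = -1;
public:
    // One legacy Draw() becomes: hash lookup, possibly a PSO build
    // (a compile on first use = a visible hitch), then recording.
    void Draw(int vertex_count, int state_hash) {
        auto it = pso_cache_.find(state_hash);
        if (it == pso_cache_.end())
            it = pso_cache_.emplace(state_hash, build_pso(state_hash)).first;
        if (it->second != bound_pso_) {
            list_.set_pipeline_state(it->second);
            bound_pso_ = it->second;
        }
        list_.draw_instanced(vertex_count, 1);
    }
private:
    int build_pso(int state_hash) { return state_hash; } // cheap stub; expensive in reality
};

A native DX12 engine would build those PSOs and command lists directly (and up front), which is exactly the "updating the game engine" part that got skipped.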
 

damidu

Member
So: no way to get anything resembling playable with DirectX 12 on my 3070, and no RT with DirectX 11. Plus shader stutter on top.

PS5 performance mode it is.
 

rodrigolfp

Haptic Gamepads 4 Life
[Image: D7UVpDZ.png]
The 5 people here saying they're having a great experience on 4090s: this setup above is closer to what most people might have.

And this is a pretty bad experience. In context, much worse than the console versions running at 30 FPS with RTGI and AO.
I am having a far better experience on my 3080 12GB than on any console, with their slideshow, massive input lag, and no RT shadows, reflections, or mods...
 

tvdaXD

Member
Not even that, I think a 2080/3070 would be more realistic, no? And I imagine the experience would be much worse.

Gotta say, a shame with this update as I was really looking forward to this.
I have a 2080 Ti. I settled for the 30FPS experience, with a max of 60fps in certain areas at 1440p with DLSS on Performance. All settings maxed tho, because I am a sucker for graphics.
Once I reached Novigrad though, and it loaded in the NPCs the first time you're there... it tanked to almost single-digit framerates. It's terrible! I hope they fix it soon...
 

adamsapple

Or is it just one of Phil's balls in my throat?
I am having a far better experience on my 3080 12GB than on any console, with their slideshow, massive input lag, and no RT shadows, reflections, or mods...

Congratulations. But the picture shows that a more moderate 3080 setup is struggling to reach 30 FPS in stress areas in DF's testing, which is closer to the experience most other people are likely having.
 

rodrigolfp

Haptic Gamepads 4 Life
Congratulations. But the picture shows that a more moderate 3080 setup is struggling to reach 30 FPS in stress areas in DF's testing, which is closer to the experience most other people are likely having.
With max settings, something consoles can only dream about, and with far better input lag still. Don't know what their problem is, but I never go below 40 fps in Novigrad at 4K DLSS Quality after the stutter fixes.
 

brian0057

Banned
Cyberpunk 2077 being a broken mess at launch was unfortunate.
CDPR breaking a perfectly working Witcher 3 over 7 years after launch is truly impressive.
 

Filben

Member
it's like living in the car instead because that's all you need apparently with most games being GPU bound. If specs for CPU rises I'm gonna upgrade. Until then that shiny car is all I need.

Oh, and of course property prices gonna need to drop.
 

ToTTenTranz

Banned
Imagine celebrating poverty.

And a 4090 is for running gazillions of games, not only a few exclusives like a PS5.

Dude, Nvidia duped you into paying 2000€ for a GPU that plays the same games as a $500 console, with minimal visual differences that you wouldn't discern had you not watched Digital Foundry videos with 400% zoomed stills.

That's okay, we all make unreasonable choices in our lives. Just don't expect us to celebrate yours. Try Nvidia's subreddit for that.
 

Gaiff

Gold Member
Dude, Nvidia duped you into paying 2000€ for a GPU that plays the same games as a $500 console, with minimal visual differences that you wouldn't discern had you not watched Digital Foundry videos with 400% zoomed stills.

That's okay, we all make unreasonable choices in our lives. Just don't expect us to celebrate yours. Try Nvidia's subreddit for that.
Don't you own a 6900 XT?
 

ToTTenTranz

Banned
Don't you own a 6900 XT?
Yes, and it was an unreasonable expenditure of 450€ (after selling my Vega 64 for 550€) in the middle of 2021, and only after I had tried for about 3 months to get a 6800XT at MSRP through AMD.com's weekly stock drops.
At some point I could either lose the opportunity to sell my Vega 64 for more than what I paid for it (because back then the ETH PoS transition was planned for late '21) or just spend 450€ to get the 6900XT.

Note: in my country I couldn't get Nvidia GPUs at MSRP.
 

Gaiff

Gold Member
Yes, and it was an unreasonable expenditure of 450€ (after selling my Vega 64 for 550€) in the middle of 2021, and only after I had tried for about 3 months to get a 6800XT at MSRP through AMD.com's weekly stock drops.
At some point I could either lose the opportunity to sell my Vega 64 for more than what I paid for it (because back then the ETH PoS transition was planned for late '21) or just spend 450€ to get the 6900XT.

Note: in my country I couldn't get Nvidia GPUs at MSRP.
So, you paid 1000 euros for a card that plays the same games as a $500 console, with even fewer visual differences?

Doesn't sound like you have a lot of ground to be lecturing anyone on their spending.

I also like that the 4090 is 2000 euros because, apparently, selling your old GPU is a non-factor when it comes to the 4090, but you make sure to deduct the price you sold your GPU for from what you paid for the 6900 XT.

It almost sounds like you're being disingenuous.
 

ToTTenTranz

Banned
So, you paid 1000 euros for a card that plays the same games as a $500 console, with even fewer visual differences?

Doesn't sound like you have a lot of ground to be lecturing anyone on their spending.

I also like that the 4090 is 2000 euros because, apparently, selling your old GPU is a non-factor when it comes to the 4090, but you make sure to deduct the price you sold your GPU for from what you paid for the 6900 XT.

It almost sounds like you're being disingenuous.

Be my guest and link to my posts bragging about buying a 6900XT while shitting on all console gamers because they're poor.


I admitted just now it was an unreasonable expenditure myself.
Paying 1000€ for a 6900XT 18 months ago wasn't a good deal, and I should have persisted in trying to get a 6800XT at MSRP. I did try to secure a high resale value on my Vega 64 at the time, but I now know it wasn't a sensible option.


The narrative you're trying to push just isn't here.
 

Gaiff

Gold Member
Be my guest and link to my posts bragging about buying a 6900XT while shitting on all console gamers because they're poor.


I admitted just now it was an unreasonable expenditure myself.
Paying 1000€ for a 6900XT 18 months ago wasn't a good deal, and I should have persisted in trying to get a 6800XT at MSRP. I did try to secure a high resale value on my Vega 64 at the time, but I now know it wasn't a sensible option.


The narrative you're trying to push just isn't here.
You're the one who claimed that NVIDIA tricked the other poster into buying a 2000-euro GPU to play console games with barely any visible differences. Considering that the 4090 absolutely dunks on the 6900 XT that you spent 1000 euros on, who are you mocking, exactly? If the 4090 runs games with visuals indiscernible from a PS5's, the 6900 XT might as well be running PS3 graphics.
 

geary

Member
Has anyone found stable, optimized settings for 60 FPS at 1440p with RT and ~max settings on a 3080?
 

Gaiff

Gold Member
Speaking of being disingenuous...
You're the one owning a 6900 XT, which is an even worse purchase than the 4090, even considering the time of their releases. And you're right, you should have gone for the 6800 XT. The 6900 XT is a piece-of-crap top-of-the-line card that can't even run ray tracing decently.
 

ToTTenTranz

Banned
You're the one owning a 6900 XT, which is an even worse purchase than the 4090, even considering the time of their releases. And you're right, you should have gone for the 6800 XT. The 6900 XT is a piece-of-crap top-of-the-line card that can't even run ray tracing decently.
Aaaww, little bunny got offended that I didn't buy a GPU from his Nvidia overlord?

Considering your recent behavior, you could just admit what you're really doing here. It's okay, you can tell us the truth.
 

Arsic

Member
Just another game where CPU limitations and shader compilation tank performance. It's the gold standard for lazy devs these days.

It's not something you should be missing at this point. It's so common that how to avoid it is openly discussed by all developers. You'd have to actively go out of your way to develop a game or update that does this.

As for ray tracing, only a handful of devs have made it useful and effective without destroying performance. It's absolutely a waste of resources. Years into this tech existing, it's generally a failure to implement unless you like 30fps or stuttery gameplay.

Basically, put this on ultra without ray tracing, and you don't get much of an improvement over what was here before.
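The standard mitigation devs talk about is exactly that: enumerate your shader/pipeline variants up front and compile them during the loading screen instead of at first draw. A rough sketch of the idea (names are made up, not any engine's real API):

Code:
// Sketch: pay shader/PSO compilation cost on the loading screen, so the
// per-draw lookup never compiles (a runtime compile is the hitch players feel).
#include <future>
#include <unordered_map>
#include <utility>
#include <vector>

struct Pipeline { /* compiled GPU state */ };

Pipeline compile_pipeline(int variant_id) { return {}; } // stands in for expensive driver work

class PipelineCache {
    std::unordered_map<int, Pipeline> cache_;
public:
    // Called from the loading screen: compile every known variant, in parallel.
    void warm(const std::vector<int>& variants) {
        std::vector<std::future<std::pair<int, Pipeline>>> jobs;
        for (int v : variants)
            jobs.push_back(std::async(std::launch::async, [v] {
                return std::make_pair(v, compile_pipeline(v));
            }));
        for (auto& j : jobs) {
            auto r = j.get();
            cache_[r.first] = std::move(r.second);
        }
    }
    // Called per draw: a miss here at runtime is exactly what causes a hitch.
    const Pipeline& get(int variant_id) const { return cache_.at(variant_id); }
};

Whether that's feasible depends on knowing the variant list ahead of time, which is why you also see games with a one-time "compiling shaders" step at launch.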
 

GHG

Gold Member
Dude, Nvidia duped you into paying 2000€ for a GPU that plays the same games as a $500 console, with minimal visual differences that you wouldn't discern had you not watched Digital Foundry videos with 400% zoomed stills.

That's okay, we all make unreasonable choices in our lives. Just don't expect us to celebrate yours. Try Nvidia's subreddit for that.

"minimal differences". Yeh ok.

Some of us are playing the game looking like this with performance far better than anything consoles can offer in their performance modes:

[Image: RKQl7Q.png]


Meanwhile you're crying and ranting here. It's a shame.
 

rodrigolfp

Haptic Gamepads 4 Life
Dude, Nvidia duped you into paying 2000€ for a GPU that plays the same games as a $500 console
I don't own a 2000€ GPU. OverHeat does. And same games? How are Star Citizen, WoW Dragonflight, and Pokémon Scarlet on some $500 console, with perfect IQ, mods, tons of controller options, the lowest input lag, and high framerates? And how are the input lag, performance, IQ, and mods of The Witcher 3: https://www.neogaf.com/threads/digi...formance-modes-tested.1648321/#post-267196989 ?
 

yamaci17

Member
[Image: D7UVpDZ.png]
The 5 people here saying they're having a great experience on 4090s: this setup above is closer to what most people might have.

And this is a pretty bad experience. In context, much worse than the console versions running at 30 FPS with RTGI and AO.
This is dishonesty AT ITS PEAK.

1) the scene is HEAVILY CPU LIMITED
2) it is MAX settings (ultra+, with increased draw distance, NPC count and LODs over consoles)
3) the ryzen 3600 is pretty much similar to the consoles. it even lacks 2 cores. they're the same architecture. you can't practically expect more performance out of the same architecture on a different platform.

the exact same scene runs at around 23-25 FPS on consoles as well. it is clearly what the ZEN 2 architecture can output in this horribly optimized code. it has nothing to do with the RTX 3080 (it is severely underutilized due to the heavy CPU bottleneck)

[Image: Sel40KV.jpg]



Throwing some pics out there, CLAIMING it is running at 30 FPS SPECIFICALLY THERE (because you're comparing the spot where it drops to 24, implying that the console is always locked at 30, even in such places.)

you will have the exact same 30 or even 40 FPS locked experience with that rig in locations OTHER THAN NOVIGRAD.

IF and only IF your "magic" consoles were able to lock to 30 in Novigrad whereas the 3600 couldn't, THEN you would have a point.
 

yamaci17

Member
Not even that, I think a 2080/3070 would be more realistic, no? And I imagine the experience would be much worse.

Gotta say, a shame with this update as I was really looking forward to this.
no it wouldn't, because it is a CPU limitation. the user dishonestly picked the most problematic spot in the game, Novigrad, which destroys all CPUs, including the 12900k.

the other user "rofif" claiming the 12900k is "merely" getting 17 FPS over consoles is also a BLATANT lie and dishonesty, once again, AT ITS PEAK. not only does he disregard the consoles running at 23-24 FPS there, he also disregards the settings discrepancy. why? because it is a PC, apparently. a 12900k on its best day should be around 1.8 times faster than the consoles in single-thread performance, which is what limits Novigrad performance. the fact that it performs at 1.95x over the consoles (roughly 24 FPS x 1.95 ≈ 47 FPS) is proof enough that you practically get what you ARE SUPPOSED TO GET in RELATION to what the consoles are getting. if said cutbacks could be applied to PC, the difference would be even bigger to some degree

the consoles drop to 24 FPS there (due to the CPU, once again) with CUTBACKS to ray tracing and other settings that are not even achievable on PC.

meanwhile I can get an almost locked 40 FPS with my low-end 2700 CPU and RTX 3070



would it be HONEST to claim that my game runs like this EVERYWHERE, or, let's say, in NOVIGRAD?? no it wouldn't. i literally show in the exact same video how Novigrad performs.

yet people here do not see anything wrong with comparing the consoles' flawless performance in places OTHER THAN NOVIGRAD (in other words, a locked 30 FPS) to the PC's Novigrad performance, AS IF the consoles were able to lock to 30 FPS in those problematic scenes whereas the PC can't.

anyone with a 2700 / 3600 + 3070 / 2070 will get a locked 30 or 40 FPS in the open-world skellige / velen areas. the problematic parts are the crowded cities such as novigrad / toussaint. there you will have drops into the 25s. SAME as the consoles (the consoles are not IMMUNE to it, in other words, unlike what you people try to IMPLY. again. IMPLY)
 

adamsapple

Or is it just one of Phil's balls in my throat?
This is dishonesty AT ITS PEAK.

1) the scene is HEAVILY CPU LIMITED
2) it is MAX settings (ultra+, with increased draw distance, NPC count and LODs over consoles)
3) the ryzen 3600 is pretty much similar to the consoles. it even lacks 2 cores. they're the same architecture. you can't practically expect more performance out of the same architecture on a different platform.

the exact same scene runs at around 23-25 FPS on consoles as well. it is clearly what the ZEN 2 architecture can output in this horribly optimized code. it has nothing to do with the RTX 3080 (it is severely underutilized due to the heavy CPU bottleneck)



1. Yes, it's also the same on console.
2. Most console settings are Ultra+ as well, minus one or two exceptions, as per NXGamer:

[Image: 5tK0xAP.png]



3. And that is a more appropriate representation of what PC gamers might have. The people with 12700Ks and 4090s are the exception, not the rule.
 

Gaiff

Gold Member
And this is a pretty bad experience. In context, much worse than the console versions running at 30 FPS with RTGI and AO.
It's not "much worse". They're both equally bad.

The only way to enjoy ray tracing in this game with acceptable levels of performance is with an RTX 40 card and frame generation. Otherwise, use DX11 on PC or select Performance Mode on consoles.

RT mode is absolutely brutal on all platforms.
 

adamsapple

Or is it just one of Phil's balls in my throat?
It's not "much worse". They're both equally bad.

The only way to enjoy ray tracing in this game with acceptable levels of performance is with an RTX 40 card and frame generation. Otherwise, use DX11 on PC or select Performance Mode on consoles.

RT mode is absolutely brutal on all platforms.

"Much worse" was probably an exaggerated statement, but I'd have expected better optimization and performance from a primarily PC-first developer like CDPR.
 