
[Digital Foundry] The Witcher 3: PS5 vs Xbox Series X Hands-on - Ray Tracing + 60FPS Modes Tested

RoadHazard

Gold Member
I think the choppiness is because of the motion blur. It's an old and not very good implementation, but that may change.
As for ray tracing, it will make the picture look "right" and easy on the eye, and it won't bother people like me. I personally hate bad shadows, disappearing reflections, pop-in, etc.

This will have neither RT shadows (just RTGI and RTAO) nor RT reflections (they are still screen-space, just higher quality in the 30fps mode) on console, so you'll still have all those issues. And RT has nothing to do with pop-in.
 

rodrigolfp

Haptic Gamepads 4 Life
PC version:



Shame about Twitter compression:

[Screenshots of the PC version]
 
At less than 1440p, with no texture enhancements, dropping to the low 40s during chaos, low anisotropic filtering, and FSR 1.0 with blur trails.
What game does not drop during chaos? The framerate is actually very stable here 99.99% of the time, and the RT shadows are really nice to have at 60fps. I just think the PS5 game at launch was a very good deal at $10 (and I don't even like the game that much, I just played a few hours).
 

MaKTaiL

Member
Yes? And you will get that version whether you buy the new PS5 version or upgrade your PS4 version for free.
I don't think you quite know how these PS4/PS5 versions work. Yes, we will get the PS5 version for free, but that doesn't mean the PS4 version will receive a 60fps patch when playing it on PS5. The PS4 version will be patched with the Netflix stuff, but that might be it. Also, a native PS5 version of any game gets its own set of trophies; that's why I'm hoping the PS4 version on PS5 plays at 60fps as well, so I can continue on it.
 

RoadHazard

Gold Member
I don't think you quite know how these PS4/PS5 versions work. Yes, we will get the PS5 version for free, but that doesn't mean the PS4 version will receive a 60fps patch when playing it on PS5. The PS4 version will be patched with the Netflix stuff, but that might be it. Also, a native PS5 version of any game gets its own set of trophies; that's why I'm hoping the PS4 version on PS5 plays at 60fps as well, so I can continue on it.

Yeah, I guess "patched" wasn't the right word. Your original post was asking if the PS4 version would run at 60fps on PS5, and I thought you hadn't realized that if you have the PS4 version you will get the full PS5 upgrade for free. I missed the part about trophies, and yeah, I guess it's true that the upgraded version will have its own set since it's now a PS5 game.

But then no, the PS4 version will definitely not be updated to run at 60. I don't think even the Pro could handle that (and base PS4 obviously couldn't). I guess they could in theory include an unlocked framerate mode in the PS4 version specifically for your use case, but it seems unlikely to happen.
 

DragonNCM

Member
What a waste the quality mode + RT is on the new-gen consoles... everyone will switch the graphics option to quality, play five minutes (at best), say "OK, this looks cool", then swap to performance mode and finish the game in that mode.
A FUCKING WASTE OF DEVELOPERS' TIME & MONEY :(
 

Gaiff

Gold Member
RT is expensive. Modern GPUs are really expensive and power hungry. An RTX 4090 is $1599.00 and draws up to 450 Watts. The PS5/XSX are $400.00-500.00 devices that top out at about 200 Watts. So even if you want to inflate the price and wattage for the next generation in 2028, they probably aren't going to go above $600 and I doubt they'd come anywhere near 300 Watts. They'll punch above their weight being dedicated devices and will have new tech with better efficiencies but it's unrealistic to think they'll bridge that gap in price/power. The current consoles are still chasing the 4K dream.
A couple of wrong things.

For one, the $1600 price tag is almost irrelevant when it comes to consoles. They pay nowhere near that price for their parts. NVIDIA's profit margins, and even AMD's, are pretty insane. The estimated cost to produce the 4090 is less than half its selling price. There's a reason the PS5 in its entirety costs $500 and Sony still turns a slight profit, whereas an equivalent GPU (the 6650 XT) came out with an MSRP of $399, the same price as the digital PS5. And that's ignoring the fact that the PS5 has an entire APU, an optical drive, a cooling solution, an integrated PSU, an SSD, etc.

Two, the wattage question is a bit more nuanced than this. The 4090 at a 100% power limit does draw about 450W, but that's chiefly because NVIDIA has cranked it up far past its maximum efficiency. It's actually, by a wide margin, the most power-efficient GPU currently on the market, and in gaming scenarios at 4K it draws slightly less power than a 3090 Ti despite being over 50% faster, and 70%+ faster in heavy RT workloads.

Here's how power efficient it actually is:

[Chart: RTX 4090 undervolting game-performance scaling (QuasarZone)]


With proper undervolting and power management, it performs at 94.7% of its full potential while drawing a meager 232W. This stomps all over everything else, including the 4080, which is itself quite a bit faster than previous-gen flagships. My 4090, despite being this rumored out-of-control power-hungry beast, consumes less power than my 2080 Ti at 3440x1440/120Hz most of the time.
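To put rough numbers on that (back-of-the-envelope math using the figures above, nothing more):

# Rough performance-per-watt comparison using the quoted figures:
# ~94.7% of stock performance at 232W vs. 100% at 450W.
stock_perf, stock_watts = 1.000, 450
undervolt_perf, undervolt_watts = 0.947, 232

stock_ppw = stock_perf / stock_watts              # performance per watt at stock
undervolt_ppw = undervolt_perf / undervolt_watts  # performance per watt undervolted

print(f"Undervolted perf/W is {undervolt_ppw / stock_ppw:.2f}x stock")  # ~1.84x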

When the consoles came out in 2020, the flagship of 6 years prior was the 980 Ti (actually the 980 because it came out in September 2014 whereas the Ti came out in June 2015 but they're on the same process node with the same architecture). 6 years from now is 3 generations worth of graphics cards at the current pace and there's no way something like an RTX 7060 will be slower than a 4090, just like a 3060 isn't slower than a 980 Ti.
 

hyperbertha

Member
In games like Dying Light 2, ray tracing does create huge differences, but in The Witcher 3, all you need for "full ray tracing" is a contrast adjustment!
 
This will have neither RT shadows (just RTGI and RTAO) nor RT reflections (they are still screen-space, just higher quality in the 30fps mode) on console, so you'll still have all those issues. And RT has nothing to do with pop-in.
I know. I was just generalizing about what RT can bring to the picture overall. As for pop-in, it's part of the "graphics won't disturb you" approach to remasters.
 

NeonGhost

uses 'M$' - What year is it? Not 2002.
Why is it that only Sony developers are using unlocked frame rates in 4K modes with VRR? Why aren't most other developers doing this by now?
 

A.Romero

Member
Why is it that only Sony developers are using unlocked frame rates in 4K modes with VRR? Why aren't most other developers doing this by now?

Most likely it's an effort issue. The market segment with access to all that tech is probably way smaller than the overall market, so it wouldn't make a difference in sales and, therefore, wouldn't be worth the investment. Especially in this case, as the upgrade is free.
 

MaKTaiL

Member
Yeah, I guess "patched" wasn't the right word. Your original post was asking if the PS4 version would run at 60fps on PS5, and I thought you hadn't realized that if you have the PS4 version you will get the full PS5 upgrade for free. I missed the part about trophies, and yeah, I guess it's true that the upgraded version will have its own set since it's now a PS5 game.

But then no, the PS4 version will definitely not be updated to run at 60. I don't think even the Pro could handle that (and base PS4 obviously couldn't). I guess they could in theory include an unlocked framerate mode in the PS4 version specifically for your use case, but it seems unlikely to happen.
Yeah. Skyrim was the only case I've seen where the PS4 version was updated to play at 60fps on PS5 while also releasing a PS5 native version. I'll keep my hopes up.
 

bender

What time is it?
A couple of wrong things.

For one, the $1600 price tag is almost irrelevant when it comes to consoles. They pay nowhere near that price for their parts. NVIDIA's profit margins, and even AMD's, are pretty insane. The estimated cost to produce the 4090 is less than half its selling price. There's a reason the PS5 in its entirety costs $500 and Sony still turns a slight profit, whereas an equivalent GPU (the 6650 XT) came out with an MSRP of $399, the same price as the digital PS5. And that's ignoring the fact that the PS5 has an entire APU, an optical drive, a cooling solution, an integrated PSU, an SSD, etc.

Two, the wattage question is a bit more nuanced than this. The 4090 at a 100% power limit does draw about 450W, but that's chiefly because NVIDIA has cranked it up far past its maximum efficiency. It's actually, by a wide margin, the most power-efficient GPU currently on the market, and in gaming scenarios at 4K it draws slightly less power than a 3090 Ti despite being over 50% faster, and 70%+ faster in heavy RT workloads.

Here's how power efficient it actually is:

[Chart: RTX 4090 undervolting game-performance scaling (QuasarZone)]


With proper undervolting and power management, it performs at 94.7% of its full potential while drawing a meager 232W. This stomps all over everything else, including the 4080, which is itself quite a bit faster than previous-gen flagships. My 4090, despite being this rumored out-of-control power-hungry beast, consumes less power than my 2080 Ti at 3440x1440/120Hz most of the time.

When the consoles came out in 2020, the flagship of 6 years prior was the 980 Ti (actually the 980 because it came out in September 2014 whereas the Ti came out in June 2015 but they're on the same process node with the same architecture). 6 years from now is 3 generations worth of graphics cards at the current pace and there's no way something like an RTX 7060 will be slower than a 4090, just like a 3060 isn't slower than a 980 Ti.

Of course it is more nuanced. It's a few-sentence description meant as a simple explanation, so many factors are left out, like the fact that a PC takes more than a GPU, which skews the cost/power calculation further even with components not at 100% draw. And even at your 94.7% efficiency number, your GPU alone runs at a higher wattage than either console. So no, nothing was really wrong.
 

rodrigolfp

Haptic Gamepads 4 Life
Graphical glitches, performance issues
Why would the PC version have more glitches and performance issues?

Plus, framerate and input lag are still far better on PC (and that's not counting mods). The only problem would be real-time shader compilation, but it's a CDPR game, so they shouldn't screw that up.
 

Gaiff

Gold Member
Of course it is more nuanced. It's a few-sentence description meant as a simple explanation, so many factors are left out, like the fact that a PC takes more than a GPU, which skews the cost/power calculation further even with components not at 100% draw. And even at your 94.7% efficiency number, your GPU alone runs at a higher wattage than either console. So no, nothing was really wrong.
Everything was wrong, though. You said that the 4090 draws 450W and that consoles draw 200W or less. You can get something reasonably close to the 450W 4090 at 232W, so a 200W part in 2028 outclassing the full 450W 4090 certainly isn't impossible. In fact, it would be unprecedented if it didn't easily beat the 4090. That 450W argument is ignoring some serious context. At 326W, the 4090 essentially performs the same.

Second, precedent dictates that an x60-class card 3 generations or 6 years later easily beats the Titan from that far back. Hell, even the shitty 3050 comes reasonably close to the 980 Ti. It might even beat it, but I can't recall off the top of my head.

Consoles releasing in 2028 drawing 200W should easily be able to beat a 4090 unless there's some major stagnation in the evolution of semiconductor technology in the coming years. The 4090 looks like some unbeatable monster now but 6 years in that industry is a fucking century in the real world.
 

bender

What time is it?
Everything was wrong, though. You said that the 4090 draws 450W and that consoles draw 200W or less. You can get something reasonably close to the 450W 4090 at 232W, so a 200W part in 2028 outclassing the full 450W 4090 certainly isn't impossible. In fact, it would be unprecedented if it didn't easily beat the 4090. That 450W argument is ignoring some serious context. At 326W, the 4090 essentially performs the same.

Second, precedent dictates that an x60-class card 3 generations or 6 years later easily beats the Titan from that far back. Hell, even the shitty 3050 comes reasonably close to the 980 Ti. It might even beat it, but I can't recall off the top of my head.

Consoles releasing in 2028 drawing 200W should easily be able to beat a 4090 unless there's some major stagnation in the evolution of semiconductor technology in the coming years. The 4090 looks like some unbeatable monster now but 6 years in that industry is a fucking century in the real world.

I said it "draws up to". That's just for the GPU whereas these consoles draw up to 200W and that's for the entire system. But please continue to argue in bad faith.
 

Gaiff

Gold Member
I said it "draws up to". That's just for the GPU whereas these consoles draw up to 200W and that's for the entire system.
That it draws up to 450W is completely irrelevant because it's far past peak efficiency. 232W is just to show how power efficient the node is at the moment. Imagine in 6 years. Even the entire system at 200W could beat a 4090 six years from now.

The current consoles are comparable in raster to a 1080 Ti, the flagship from Pascal that came out over 4 years before they entered the market. The consoles at 200W trade blows with the 250W flagship from two generations back, and they actually have hardware-accelerated ray tracing and more modern tech, and we're supposed to believe that in fucking 6 years they won't surpass the 4090 because it's $1600 and gains 1% in performance by adding 125W? Sure thing.
But please continue to argue in bad faith.
Not doing that. You came up with a completely flawed argument that was easily debunked and you get mad at me for correcting you.
 

scydrex

Member
That it draws up to 450W is completely irrelevant because it's far past peak efficiency. 232W is just to show how power efficient the node is at the moment. Imagine in 6 years. Even the entire system at 200W could beat a 4090 six years from now.

The current consoles are comparable in raster to a 1080 Ti, the flagship from Pascal that came out over 4 years before they entered the market. The consoles at 200W trade blows with the 250W flagship from two generations back, and they actually have hardware-accelerated ray tracing and more modern tech, and we're supposed to believe that in fucking 6 years they won't surpass the 4090 because it's $1600 and gains 1% in performance by adding 125W? Sure thing.

Not doing that. You came up with a completely flawed argument that was easily debunked and you get mad at me for correcting you.

So you think in 6 years there will be a complete system or console with a power draw of 200W that has a GPU equal to a 4090 or better?
 

MikeM

Member
Same here. When I first got a 4K TV, I thought fidelity or bust. But as time has gone on with the new consoles, I have finally seen the performance light.

Graphical glitches, performance issues, micro-stutters due to shader compilation issues, etc.

I think the notion that PC is inherently the best-performing version simply because it has the most powerful CPUs & GPUs available has been fading away of late.
Correct. As a recent convert to PC, it lacks the overall refinement you naturally get with a console.
Why would the PC version have more glitches and performance issues?

Plus, framerate and input lag are still far better on PC (and that's not counting mods). The only problem would be real-time shader compilation, but it's a CDPR game, so they shouldn't screw that up.
PC has its pluses for sure. But let's be real: issues like shader compilation stutter are fucking game-breaking. As is constant crashing. Some of this is a symptom of developing for a platform with thousands of hardware combos. But some of it is also just poor development choices.
 

bender

What time is it?
Not doing that. You came up with a completely flawed argument that was easily debunked and you get mad at me for correcting you.

Who is mad? I'm not sure you debunked anything, because you seem to be laser-focused on specifics when I'm talking in generalities (a forest-for-the-trees situation), nor do I think our positions are all that different even if we disagree slightly on the outcomes. I'm happy to be wrong about industry trends, but that's just how I see it.
 

rodrigolfp

Haptic Gamepads 4 Life
it lacks the overall refinement you naturally get with a console.
Like more input lag, lower resolution/blurry IQ, lower framerate, lack of RT effects?
issues like shader compilation stutter are fucking game-breaking. As is constant crashing. Some of this is a symptom of developing for a platform with thousands of hardware combos. But some of it is also just poor development choices.
No, they are not. At least not here, beyond some insignificant "hiccups" here and there (I didn't play CP2077 before the patch, but it was patched). This is a symptom of not pre-compiling shaders, like several games do.
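The idea, as a toy sketch (illustrative Python only, with hypothetical stand-in functions, not actual engine code): compile every pipeline variant behind the loading screen instead of on the frame that first needs it.

import time

def compile_pipeline(variant: str) -> str:
    # Stand-in for a driver-level shader/pipeline compile; the sleep mimics the stall.
    time.sleep(0.05)  # a real compile can take tens of milliseconds or more
    return f"binary({variant})"

cache: dict[str, str] = {}

def draw(variant: str) -> None:
    # Lazy path: the compile happens on the first frame that needs the variant,
    # which is the hitch players feel mid-gameplay.
    if variant not in cache:
        cache[variant] = compile_pipeline(variant)

def precompile(variants: list[str]) -> None:
    # Pre-compilation path: pay the cost up front, behind a loading screen.
    for v in variants:
        cache[v] = compile_pipeline(v)

precompile(["opaque", "foliage", "skin", "hair"])  # done once at load
draw("foliage")  # cache hit during gameplay, no stutter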
 

Gaiff

Gold Member
So you think in 6 years there will be a complete system or console with a power draw of 200W that has a GPU equal to a 4090 or better?
Not necessarily that there will be one but if there is a new console in 2028 that's on RDNA6 or whatever AMD decides to call it, yes, its GPU will outperform a 4090. If it doesn't, it'd be a huge disappointment.
Who is mad? I'm not sure you debunked anything, because you seem to be laser-focused on specifics when I'm talking in generalities (a forest-for-the-trees situation), nor do I think our positions are all that different even if we disagree slightly on the outcomes. I'm happy to be wrong about industry trends, but that's just how I see it.
Then this should make it clear: a 2028 console should have a GPU that outperforms the 4090. Six years is ancient history in this industry.
 

MikeM

Member
Like more input lag, lower resolution/blurry IQ, lower framerate, lack of RT effects?

No, they are not. At least not here, beyond some insignificant "hiccups" here and there (I didn't play CP2077 before the patch, but it was patched). This is a symptom of not pre-compiling shaders, like several games do.
The average user would not notice the input lag or anything else listed. I swap between PS5 and my PC for Warzone and don't notice any difference on my LG C1 besides the downgrade in quality.
 

Filben

Member
Oh boy, I really like DF's technical insights, but the way they're waxing lyrical about this seems so out of touch, and I'm usually a sucker for small details, good anti-aliasing, etc., and graphics effects in general that I typically want to max out.

But here? They zoom in like 200% onto a mug that looks almost identical, same with Geralt's armour, while saying "much more pop", "pretty visible". And I'm already wearing my glasses and watched it in 4K on my desktop monitor from 40cm away! In one scene he says that with RT on it looks "beyond recognition". Sorry, beyond recognition is something like Tomb Raider 2013 low vs ultra (one of the best-scaling games ever). Maybe it's that typical American thing of overstatement, but if you throw words around as carelessly as this, they lose their meaning and become empty phrases.

Any DualSense support?

Adaptive triggers?
For PS5, they confirmed this in their stream. They specifically said "PS5" though, so I wouldn't bet on PC getting it. CP2077 also never got patched, and modders had to build (incredible!) adaptive trigger support for the PC version themselves.
 

JaksGhost

Member
You underestimate the number of fans of PowerPoint and input lag.
If you even mention anything other than inserting a disc into a console, my brother glitches. A 40-year-old man who loves fighters, first-person shooters and Metal Gear Solid, and doesn't give one damn about framerate. My best friend is the same way. The man has a 55" TV in the living room but is playing Darksiders for the first time on his PS4 on a cheap 720p 24" Insignia TV in his kitchen so his kids can watch Disney. And he's having the time of his life, too. This may be anecdotal, but the average consumer does not care at all :messenger_tears_of_joy: We're the outliers here.
 
Why is it that only Sony developers are using unlocked frame rates in 4K modes with VRR? Why aren't most other developers doing this by now?
This! It's unbelievable that developers don't have the foresight to include a "balanced mode" option for every game that gets released. That way people can decide for themselves if they want to put up with an inconsistent framerate to get the best possible graphics at a higher fps than a capped 30. This also future-proofs every game for the inevitable PS5 Pro and PS6, which will be backwards compatible.

An absolutely asinine decision for devs not to include this, especially a dev like CD Projekt with their history of releasing games on PC. So now we're stuck in the same cycle, with games having a performance mode that's disappointing in terms of visuals and a locked 30 in the future. There's zero excuse for this! At least with the Xbox One/PS4, devs didn't know about backwards compatibility with the PS5/XSX.

There are so many good PS4/Xbox One games that are doomed to be 30fps forever because of this. I called it that this Witcher 3 update would be disappointing, because the CP2077 update was pretty disappointing.

Actually, most next-gen patches have been disappointments, and I've played pretty much all of them; only Metro, Spider-Man Remastered, Doom Eternal and Hitman 1/2 have delivered what I would've expected from these consoles. Most of them just do an input-lag-ridden 30fps fidelity mode and a scaled-down 60fps mode. Control Ultimate, Cyberpunk, GTA V and Uncharted 4 remastered were all anticipated next-gen patches and all pretty disappointing in their own way.

The consoles are not strong enough to deliver ray tracing and better settings at 60fps, but some games have been able to squeak out 40fps. It's pure laziness, or disinterest in ensuring a great next-gen patch akin to what PC players get, that results in these half-hearted updates.
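For what it's worth, the appeal of those 40fps modes is simple frame-time arithmetic (my own illustration, not something from the video):

# Frame times for common caps: 40fps sits exactly halfway between 30 and 60
# in frame-time terms, and divides evenly into a 120Hz display (3 refreshes per frame).
for fps in (30, 40, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 40 fps -> 25.0 ms, 60 fps -> 16.7 ms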
 