
DF: Cyberpunk 2077 Next-Gen Patch: The Digital Foundry Verdict

VFXVeteran

Banned
Once again, no one has ever claimed this.

Argue Peace Out GIF
/ignore.
 

RoadHazard

Gold Member
:messenger_tears_of_joy::messenger_tears_of_joy:

I didn't know that if I wasn't working on a game using the UE4 engine for our realtime graphics needs, I didn't qualify. Sorry man. I will tell the team they need to stop using Unreal and Unity, since we aren't working on games.

It doesn't qualify you to speak as if you are an expert on console game development.

And UE isn't the only engine that exists. It also isn't the engine that has historically achieved the most impressive results on consoles. And I'm willing to bet that won't be the case with UE5 either, as impressive as Lumen and Nanite may look right now. You always lose something by targeting everything over targeting just one thing.

Edit: Ignored me, lol. That's how you win an argument!
 
Last edited:
I have the PS4 disc copy of Cyberpunk. Is there any way to stop the PS4 version icon from always showing up on my PS5 home screen next to the PS5 version I now have installed?
 

Markio128

Member
The green rats initially laughed at the difference in power between Series X and PS5, yet are now having to try incredibly hard to convince everyone that the Series X versions of games are somehow better - usually because of VRR. Not quite the 33% difference we were all led to believe.
 

Zathalus

Member
These features were introduced in Nvidia boards years ago. These engines already have implemented the features you speak of.
This is simply not true. While Turing introduced features like Sampler Feedback, Mesh Shaders, and hardware Variable Rate Shading, almost no games take advantage of these feature sets.

No games out at the moment use Sampler Feedback, and Mesh Shaders are only utilized by a single obscure Chinese MMO (named Justice). Tier 2 VRS has only become a thing thanks to the Xbox Series X/S, and before the Gears 5 next-gen update no game took advantage of it.

Mesh Shading especially requires a substantial rework of the Geometry pipeline to see performance benefits.
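The geometry-pipeline rework Zathalus mentions is concrete: before a mesh shader can draw anything, the engine has to pre-split every mesh into small "meshlets" that each fit one GPU workgroup. A minimal sketch of that offline step, assuming the commonly cited limits of 64 vertices and 126 triangles per meshlet; the function name and data layout are illustrative, not any engine's actual API:

```python
def build_meshlets(indices, max_verts=64, max_tris=126):
    """Greedily split a flat triangle index buffer into meshlets.

    Each meshlet stores its own small vertex set plus triangles
    re-indexed into that local set, which is roughly what a mesh
    shader workgroup consumes.
    """
    meshlets = []
    verts, tris = {}, []  # local vertex remap and local triangles

    def flush():
        nonlocal verts, tris
        if tris:
            meshlets.append({"vertices": list(verts), "triangles": tris})
        verts, tris = {}, []

    for i in range(0, len(indices), 3):
        tri = indices[i:i + 3]
        new = sum(1 for v in set(tri) if v not in verts)
        # Start a fresh meshlet if this triangle would overflow the limits.
        if len(verts) + new > max_verts or len(tris) + 1 > max_tris:
            flush()
        for v in tri:
            verts.setdefault(v, len(verts))
        tris.append([verts[v] for v in tri])
    flush()
    return meshlets
```

Doing this per asset (and re-exporting all content) is why a "free" win from mesh shaders rarely materializes in shipped games.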
 

VFXVeteran

Banned
No games out at the moment use Sampler Feedback, and Mesh Shaders are only utilized by a single obscure Chinese MMO (named Justice). Tier 2 VRS has only become a thing thanks to the Xbox Series X/S, and before the Gears 5 next-gen update no game took advantage of it.

Mesh Shading especially requires a substantial rework of the Geometry pipeline to see performance benefits.
I'll ask this: what does mesh shading provide that will make rendering go faster? What specific bandwidth constraints in the rendering pipeline do you think can be recaptured? How many milliseconds do you think a game would gain by using mesh shaders instead of the conventional triangle setup? These GPUs are fillrate-limited. How do mesh shaders expand on that bandwidth?
 

Mr Moose

Member
Some no longer vetted vfx guy (suppose he was cosplaying before) telling everyone how it is to everyone’s amusement.
He knows his shit, but he's also not above the console warring (his job isn't really related to gaming, it's movies, but he has experience with UE).
 
Well, just because his tag got changed doesn't mean his experience changed. He's not "cosplaying." :pie_eyeroll:

Weird that he lost the vetted part. I don’t think they discovered he was faking it or anything but maybe it was just something that was causing problems here so they removed it.
 

ethomaz

Banned
I give up with you. I'll bookmark this comment, wait three years, and then come back to see if we have somehow squeezed full-on RT games with Nanite geometry and 8K textures at native 4K/60 FPS. Until then, enjoy your dreams of low-level code-to-the-metal API access on a 2080-like GPU.
You create some illusory targets… c’mon.
 
Last edited:

ethomaz

Banned
I'll ask this: what does mesh shading provide that will make rendering go faster? What specific bandwidth constraints in the rendering pipeline do you think can be recaptured? How many milliseconds do you think a game would gain by using mesh shaders instead of the conventional triangle setup? These GPUs are fillrate-limited. How do mesh shaders expand on that bandwidth?
You are right here.
Mesh Shaders just simplify the classic geometry pipeline… so it is supposed to be easier to do things with it in terms of programming.
The GPU work continues to be similar, and so does the performance.
 
Last edited:

Shmunter

Member
I am not seeing any Tomb Raider cosplay, though I would like to.
Is the suggestion that's him? On what grounds tho? Could just as easily be a loon pretending to be someone else. Not saying one way or another, just raising the possibility especially since the conduct and things said simply do not stand up to scrutiny.

Regardless, the stripping of the vetted status is indeed better - can say any stupid shit like the rest of us without anyone getting taken for a ride and we can all get on with enjoying the banter.
 

Mr Moose

Member
Is the suggestion that's him? On what grounds tho? Could just as easily be a loon pretending to be someone else. Not saying one way or another, just raising the possibility especially since the conduct and things said simply do not stand up to scrutiny.

Regardless, the stripping of the vetted status is indeed better - can say any stupid shit like the rest of us without anyone getting taken for a ride and we can all get on with enjoying the banter.
I am pretty sure that's VFXVeteran, yes.
 
Last edited:

ManaByte

Member
Not for me. I enjoy gaming on the PS5, but I really don't like the DualSense controller. It is fine, but it just does not feel right to me. I know some really like it, and that is great; we all have our own opinions/preferences when it comes to controllers. Playing the PS5 version with the DualSense, I did not feel its features made that much of a difference.
I think the HD rumble is fine, but I’d give it up for a PS5 controller that had the same ultra low latency that the XS controller uses. Response time >>>> haptics.
 

Zathalus

Member
I'll ask this: what does mesh shading provide that will make rendering go faster? What specific bandwidth constraints in the rendering pipeline do you think can be recaptured? How many milliseconds do you think a game would gain by using mesh shaders instead of the conventional triangle setup? These GPUs are fillrate-limited. How do mesh shaders expand on that bandwidth?
I was just pointing out that virtually none of these new features have been taken advantage of. Microsoft and others have several tech blogs and videos going over the benefits of each.

Sampler Feedback:
[embedded videos]

Mesh Shading:
[embedded videos]

VRS:
[embedded videos]

If you don't have time to watch the experts talk about them, just download the following benchmarks to see what benefits they provide:
[links]
It's pretty obvious that smart usage of all three feature sets will bring significant performance boosts, be it on console or PC.
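The kind of boost at stake for VRS can be put in rough numbers: a Tier 2 rate map lets the GPU shade one sample per 2x1 or 2x2 pixel block in low-detail regions. A back-of-the-envelope sketch, with an entirely hypothetical distribution of shading rates across a frame:

```python
def vrs_invocation_savings(tile_counts):
    """Estimate the fraction of pixel-shader invocations saved by VRS.

    tile_counts maps a coarse shading rate (w, h) to the number of
    equal-sized screen tiles shaded at that rate. A (2, 2) rate runs
    one invocation per 2x2 block, i.e. 1/4 of the full-rate work.
    """
    full = sum(tile_counts.values())  # invocations if everything were 1x1
    coarse = sum(n / (w * h) for (w, h), n in tile_counts.items())
    return 1 - coarse / full

# Hypothetical frame: 60% of tiles at full rate, 25% at 2x1, 15% at 2x2.
saved = vrs_invocation_savings({(1, 1): 60, (2, 1): 25, (2, 2): 15})
```

For that made-up distribution, roughly a quarter of the pixel-shader work disappears, which is why a game can trade it for higher output resolution; the actual win depends entirely on how much of the frame tolerates coarse shading.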
 

Hugare

Member
Are you guys really investigating users on IMDB and such?

Get off the internet some more, ffs

On topic: got Horizon 2 and Dying Light 2, but can't stop playing Cyberpunk despite having finished it twice already.

Just finished the heist. It looks gorgeous.

Perfect 30 fps cap. No stutters, no drops, perfectly locked. Very satisfying.
 
Last edited:

DenchDeckard

Moderated wildly
I'm really hoping that Intel's machine learning upscaling launches soon and offers some benefits to these AMD GPUs, as I believe it is going to be open source.
Fingers crossed the PS5 GPU and XSX/XSS have enough resources to utilise it, if possible.
 
Last edited:
Indeed, there is also the 10% of one core dedicated to I/O processing on the XSX side to consider, which leaves the 'difference' at something like ~2% (very negligible, to say the least), even if we completely ignore the interconnect cache latency differences between the CPUs, which I don't think is a wise thing to do if we really want to understand what's going on under the hood...
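The ~2% figure can be reproduced with simple arithmetic. A sketch assuming the commonly reported clocks (XSX at 3.6 GHz with SMT enabled, PS5 at up to 3.5 GHz) and treating throughput as cores × clock, which deliberately ignores IPC, SMT yield, and the latency effects mentioned above:

```python
# Rough effective-throughput comparison (illustrative only; "core-GHz"
# is not a real performance metric, just a first-order approximation).
XSX_CLOCK_SMT = 3.6   # GHz, with SMT enabled
PS5_CLOCK = 3.5       # GHz (variable frequency, up to)
CORES = 8
XSX_IO_RESERVE = 0.1  # ~10% of one core reserved for I/O on Series X

xsx = (CORES - XSX_IO_RESERVE) * XSX_CLOCK_SMT  # effective core-GHz
ps5 = CORES * PS5_CLOCK

advantage = xsx / ps5 - 1  # comes out under 2% in XSX's favor
```

Under these assumptions the nominal XSX advantage lands at roughly 1.6%, which is the "very negligible" gap the post refers to.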

It's possible that the way the PS5 "distributes" power between the GPU and CPU is more efficient, and the CPU on XSX is simply throttling under heavy load, such as in some scenes in this game.
 

winjer

Gold Member
Twitter, Alex from DF, and VGTech are saying Cyberpunk uses Tier 2 VRS and that it's included in the Xbox version.

How can that be true if the PS5 is performing better?

Probably higher API overhead on the Xbox.
Seems like MS is still way behind Sony in updating its software for this generation.
 

Riky

$MSFT
This can't be right; VRS is being talked up and positioned as the VRR replacement. It's the last hope before needing to resort to begging Phil for a mid-gen refresh.

Bad trolling.

VRR is giving Series X owners an 80-120fps version of Dying Light 2 instead of being stuck at 60fps.
DF talk about it in their weekly show; they are VERY impressed.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Twitter, Alex from DF, and VGTech are saying Cyberpunk uses Tier 2 VRS and that it's included in the Xbox version.

How can that be true if the PS5 is performing better?

I think you're looking at it from the wrong perspective.

If the SX version does indeed use Tier 2 VRS, its biggest improvement isn't against the PS5; it's against itself on the last patch.

[image: frame-rate comparison graph, patch 1.23 vs 1.50]


As far as SX vs PS5 and VRR goes, since everyone here hates Tom, let's see what John has to say about it.

According to him, the SX version with a VRR display takes point as the best console version:

 
Last edited:

Thirty7ven

Banned
Tier 2 VRS gives higher resolutions, which we see with this game; the drops, which are a whole 3% of play, let's not forget, are CPU-related.

Some of the drops are like 10 fps though. So weird. Does this mean the PS5 has more CPU power available? So weird. The rez differences are so minuscule too; it has to be because CDreject are no good.

RT mode drops below 30 FPS on Series X. So weird how RT shadows and higher-quality SSR are so taxing on the CPU. The more you know.
 

Arioco

Member
Tier 2 VRS gives higher resolutions, which we see with this game; the drops, which are a whole 3% of play, let's not forget, are CPU-related.


So the (slightly) higher average resolution is due to Tier 2 VRS and not the extra 2 Tflops? Interesting...

And what about the RT mode where Series X seems to have lower minimum rez than PS5 while performing slightly worse? Doesn't that mode use VRS?
 

adamsapple

Or is it just one of Phil's balls in my throat?
RT mode drops below 30 FPS on Series X. So weird how RT shadows and higher-quality SSR are so taxing on the CPU. The more you know.
And what about the RT mode where Series X seems to have lower minimum rez than PS5 while performing slightly worse? Doesn't that mode use VRS?

If we go by DF, they both run the same resolution and SX only had one noted drop to 29 in the video, and Tom verbally says the PS5 has one-off drops too.

If we go by VGTech, SX's lowest reading is 36p lower and both hold a 100% locked 30 FPS. A one-off lower reading is also not an indicator of average resolution.

SMH you guys ..
 

Thirty7ven

Banned
If we go by DF, they both run the same resolution and SX only had one noted drop to 29 in the video, and Tom verbally says the PS5 has one-off drops too.

If we go by VGTech, SX's lowest reading is 36p lower and both hold a 100% locked 30 FPS. A one-off lower reading is also not an indicator of average resolution.

SMH you guys ..

Don't you smh at me. It's a Tier 2 VRS game, running on a 2 TF more powerful machine, and the PS5, with no VRS and a lower-clocked CPU, is performing better.
 

Arioco

Member
Some of the drops are like 10 fps though. So weird. Does this mean the PS5 has more CPU power available?


It does. Or at least it could in some cases, but nowhere close to that difference. The Series X CPU is clocked slightly higher but uses part of a core for compression/decompression. That might leave the PS5 with a VERY small advantage in some instances, but of course there's no way that translates into a 10 fps difference. There's got to be another reason for those dips.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Don't you smh at me. It's a Tier 2 VRS game, running on a 2 TF more powerful machine, and the PS5, with no VRS and a lower-clocked CPU, is performing better.

Yeah, and it's also Cyberpunk, a game that needed a year's worth of patches to get into an acceptable state in the first place :messenger_tears_of_joy:

And I'll also post this, since some of y'all only care about PS5 vs SX. This is the actual improvement. Look at the frame graph on the 1.23 patch picture and then compare it with 1.50. This is the most important improvement from the patch, not the difference of ~100p in SX's favor in Performance mode or the fluctuating FPS improvement here and there on PS5.



[image: frame-rate comparison graph, patch 1.23 vs 1.50]
 
Last edited:

Riky

$MSFT
Some of the drops are like 10 fps though. So weird. Does this mean the PS5 has more CPU power available? So weird. The rez differences are so minuscule too; it has to be because CDreject are no good.

RT mode drops below 30 FPS on Series X. So weird how RT shadows and higher-quality SSR are so taxing on the CPU. The more you know.

The decompression blocks in the Series consoles aren't being used in last-gen games, just like the rest of VA with SFS etc. Other comparisons found no drops in the quality mode, so this might be an outlier; it's variable across runs.
 

Arioco

Member
If we go by DF, they both run the same resolution and SX only had one noted drop to 29 in the video, and Tom verbally says the PS5 has one-off drops too.

If we go by VGTech, SX's lowest reading is 36p lower and both hold a 100% locked 30 FPS. A one-off lower reading is also not an indicator of average resolution.

SMH you guys ..


And which one of those two scenarios do you consider "good" for Series X when it has 2 more Tflops and apparently makes use of VRS, which should in theory boost performance even further? Just curious.
 

Lysandros

Member
It does. Or at least it could in some cases, but nowhere close to that difference. The Series X CPU is clocked slightly higher but uses part of a core for compression/decompression. That might leave the PS5 with a VERY small advantage in some instances, but of course there's no way that translates into a 10 fps difference. There's got to be another reason for those dips.
Indeed, but there is still the lower interconnect latency of the PS5's CPU and the slightly lower API overhead to add to this picture. Depending on the situation, the difference can possibly be higher.
 

adamsapple

Or is it just one of Phil's balls in my throat?
And which one of those two scenarios do you consider "good" for Series X when it has 2 more Tflops and apparently makes use of VRS, which should in theory boost performance even further? Just curious.

I don't consider Cyberpunk an actual representation of either console's strengths or weaknesses, because that game has been a hot mess throughout the year and still has a lot of issues to iron out.

The game loads faster on SX, which shows it's not utilizing the PS5's SSD and I/O to their max either, if you think I'm only talking about it from an Xbox perspective.

So what? Are you implying there's no PS4 to PS5 improvement? The hell


No, but the improvement between the Xbox patches is bigger than between the PS5 patches. PS5 ran the game *much* better before; now, with the new patch, performance is neck and neck on both.

The Series version has seen the biggest improvement pre and post patch.
 
Last edited:

Lysandros

Member
It's possible that the way the PS5 "distributes" power between the GPU and CPU is more efficient, and the CPU on XSX is simply throttling under heavy load, such as in some scenes in this game.
The architecture is certainly more flexible in dealing with 'power'-related bottlenecks; that can possibly be one of the reasons, but I personally don't think it's the main one.
 