> I have always felt that the Jaguar CPUs in the PS4 held the GPU back in many ways. The GPU had to do more heavy lifting in 60 FPS games than it would with a similar GPU on a PC with a better CPU. Those Uncharted 4 hacks that had the game running at 60 fps only got there by reducing the resolution to 560p: a 1/4 resolution drop, instead of the roughly 1/2 drop it would take on a PC to double the FPS.
> The PS5 GPU also has 1.5x IPC gains compared to the GCN 1.0 PS4 GPU, but that still puts the PS5 at around 15 GCN 1.0 Tflops. I'd expect an 8x resolution boost. 16x is mind-boggling and is probably mostly due to the 8x more powerful CPU.

How can this game be CPU-bound if it runs on Switch, which has a much weaker CPU than the Jaguar? I think the PS5 version is simply more optimized than the PS4 one.
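The 1/4-vs-1/2 reasoning in that quote can be sketched with a toy frame-time model. All numbers here are illustrative assumptions, not measured profiles: if only the GPU part of the frame scales with pixel count while the CPU part is fixed, a large fixed CPU cost is exactly what forces a resolution drop to roughly a quarter, rather than half, to double the frame rate.

```python
# Toy model: frame time = fixed CPU cost + GPU cost that scales with pixel count.
# The 25 ms GPU / 8 ms CPU split is a made-up example, not a real UC4 profile.
def frame_time_ms(pixels, full_pixels=1920 * 1080, gpu_ms_at_full=25.0, cpu_ms=8.0):
    return cpu_ms + gpu_ms_at_full * (pixels / full_pixels)

full = 1920 * 1080
print(frame_time_ms(full))       # 33.0 ms  -> ~30 fps at 1080p
print(frame_time_ms(full / 2))   # 20.5 ms  -> half the pixels still misses 16.7 ms (60 fps)
print(frame_time_ms(full / 4))   # 14.25 ms -> ~1/4 the pixels finally clears 60 fps
```

On a PC with a faster CPU the fixed term shrinks, so halving the pixels alone gets you to 60 fps, which is the asymmetry the post is describing.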
> I have always felt that the Jaguar CPUs in the PS4 held the GPU back in many ways. [...]

From the last DF video, the Call of Duty 120Hz mode looks to be CPU-bound, and that shows in the 20-30 fps drops.
Yes, I know, but I also found a value of 3240p in another article.
6k to 8k is 2x the pixels... or 100% more pixels.
4k to 8k is 4x the pixels... or 300% more pixels.
I have no idea where you are getting 55.66%.
Well, when you compare the standard 8k with the standard 4k.
You need to use the 16:9 aspect ratio:
8k 16:9: 7680x4320
6k 16:9: 5568x3132 (it is not a standard... so maybe Series X uses another 6k 16:9 res)
> Yes I know but also found value of 3240p in other article

Yep, it is not a standard, but 8k should be close to 2x 6k even when choosing different resolutions.
> If you decrease the 8k pixels and increase the 6k pixels, I guess.

Thanks for the correction!
8k = 7680x4320 = 33,177,600 (standard 16:9 res, called 8K UHD)
6k = 5760x3240 = 18,662,400 (not a standard 16:9 res, but it is the midpoint between 8k and 4k, which are standards)
5k = 5120x2880 = 14,745,600 (standard 16:9 res, called 5K)
4k = 3840x2160 = 8,294,400 (standard 16:9 res, called 4K UHD)
BTW 8k = ~2x 6k = 4x 4k = 16x 2k (1080p).
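Those pixel counts are easy to sanity-check. A quick script over the same resolutions listed above shows the 4x and 16x figures are exact, while the "2x 6k" claim is only approximate:

```python
# Pixel counts for the 16:9 resolutions listed above.
resolutions = {
    "8k": (7680, 4320),
    "6k": (5760, 3240),   # not a standard; the midpoint pick used in the thread
    "5k": (5120, 2880),
    "4k": (3840, 2160),
    "2k": (1920, 1080),   # i.e. 1080p
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["8k"])                            # 33177600
print(pixels["8k"] / pixels["4k"])             # 4.0   -> exactly 4x 4k
print(pixels["8k"] / pixels["2k"])             # 16.0  -> exactly 16x 1080p
print(round(pixels["8k"] / pixels["6k"], 2))   # 1.78  -> "2x 6k" is approximate
```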
An RDNA2 teraflop is not the same thing as an old GCN / RX 480 architecture teraflop. You get a lot more out of a lot less with RDNA2... especially in certain types of workload. These systems are much more powerful than comparing teraflops between generations would indicate.
Wait until mesh shaders and other tricks are utilized to their full potential.
Devs told DF that higher clocks allowed them to hit 8K vs 6K on Series X.
> So it is not just the higher clock but the memory setup too.

This is pretty interesting.
Impressive. My PC chokes even on 8k video, and Dark Souls 3 runs at 12 fps at 8k. I have a 3080.
Very interesting.
PS5 can render at 8k thanks to the higher GPU clock speed, vs 6k on the Series X, according to the developer. A bigger difference than Hitman 3, I assume?
Cerny redeemed?
Lol Imagine if this video had come out last week when we were discussing the advantages the PS5 might have with its higher clock speeds.
I am guessing one of these things is helping this particular game hit higher resolutions. Though I still won't declare victory yet; not every game seems to benefit from higher clocks like this. At the end of the day, tflops are still the best metric for in-game performance.
> I remember John mentioning that only a native PS5 app could take full advantage of RDNA 2.0; if not, it would just be an overclocked PS4 Pro legacy mode.

Definitely, but that's unrelated to what I was talking about.
> Thanks for the correction!

To be more accurate, they shared the exact resolution...
> John does say this is native PS5, so is the Xbox version the Xbox One version? If so, then it shows how much better PS5 native games are compared to PS4 games running on the PS5.

Can non-native Series X games reach 6k resolution? I'm inclined to say this game is using the GDK (native Series game).
> I tried Dark Souls 3 and I get around 35 to 40 fps at 8k. 6k gives me 60 fps, which cleans up the chain mail. I still use 4k, though, as the latency goes down to less than 7 ms.

Maybe it was 12 fps on a 2070... and closer to 30 on a 3080? I must be missing something.
> Because the visuals are simple, what matters is the machine that can push more pixels, I think.

Forgive me, but this is pure nonsense. What about the XSX hardware specs enables it to render more "complex" rendering? I don't follow you. A faster GPU is faster at everything; the XSX has more CUs, and beyond that the GPU has no other advantage.
So this is one example where the higher pixel rate of the PS5 makes a concrete difference.
Always bet on Cerny, you fools!
But it must be said... the Series X wasn't running this game completely natively like the PS5 is, right?
That must make things a bit more difficult.
> Incredible how a 2yo indie game is suddenly relevant...
> Serious question: why go for resolutions that almost all screens can’t even display, and not aim for very high fps instead?
> Who am I kidding? I know the answer.

The game has a 120 fps mode...
They actually said that?
I can't watch the video right now so I wanted to make sure it wasn't a mistake.
Edit: NVM I just saw the article. Very interesting though.
Yes, of course it's an edge case and not applicable across the board, but I still can’t shake the feeling it can be considered relevant, since most games exhibit nigh-on identical res and performance on the two consoles despite the TFLOPs difference.
> Incredible how a 2yo indie game is suddenly relevant...
> Serious question: why go for resolutions that almost all screens can’t even display, and not aim for very high fps instead?
> Who am I kidding? I know the answer.

It has a 4K@120fps mode. What did you expect, 1080p@480fps? I'm sure a lot of screens out there can display that, right.
> The game has a 120 fps mode...

Why not aim higher than that too, since this feels like a “we did it because we could” thing?
> From the last DF video, the Call of Duty 120Hz mode looks to be CPU-bound, and that shows in the 20-30 fps drops.

No, it's GPU-bound. If last-gen PS4/XB1 could do near 60 fps average, then PS5/XSX (including XSS) have a ton of headroom for 120+ fps on the CPU side.
This one definitely is not.
> Is that just internal rendering atm, right? PS5 output is currently limited to 4k. So basically we're getting high-quality downsampling.

I was told VRR will roll out around December, once it is certified.
Sony needs to upgrade the PS5 video output with 8k and VRR. The VRR update for their TVs seems to be rolling out right now.
> True. It could be that they basically balance each other out somewhat in most games.

Yeah, exactly, there will always be the TFLOPs differentiator between the two, making exact comparisons when it comes to CU/clock speed inaccurate. I didn’t expect the difference between the two to be this small pre-launch, though.
> Why not aim higher than that too, since this feels like a “we did it because we could” thing?

What TVs support 120+ fps?