According to DF:
Below HD on PS3 vs 720p on 360 - Big difference
900p on X1 vs 1080p on PS4 - Small difference
1440p on PS4 Pro vs 4K on X1X - Big difference
1800p on PS5 vs 4K on XSX - Big difference
6K on XSX vs 8K on PS5 - Small difference
Those returns sure keep diminishing and growing all the time.
DF is something else.
This part below really bugs me too.
"Xbox Series X follows the same pattern. Its GPU runs at a slower clock, but should be more capable overall as it has many more compute units."
That's not entirely true. The X1X has more CUs than the PS5 and runs at a slower clock, but it isn't more powerful than the PS5, is it?
You need clocks AND CUs, or TFLOPs, to gauge whether or not a GPU is more powerful. In the XSX's case, its TFLOPs number indicates that it is more powerful. So for them to say that more CUs means more performance is silly.
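For reference, here's a quick back-of-the-envelope sketch of how clocks and CUs combine into TFLOPs, assuming the standard 64 shaders per CU and 2 FLOPs per shader per clock for these AMD parts (the CU counts and clocks are the published specs; the PS5's clock is variable, up to 2.23 GHz):

# Rough TFLOPs: CUs x 64 shaders/CU x 2 FLOPs per shader per clock x clock
# (clock in GHz gives GFLOPs, so divide by 1000 for TFLOPs)
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

print(f"X1X: {tflops(40, 1.172):.1f} TF")  # 40 CUs, slower clock -> ~6.0 TF
print(f"PS5: {tflops(36, 2.230):.1f} TF")  # 36 CUs, faster clock -> ~10.3 TF

More CUs on the X1X, yet far fewer TFLOPs than the PS5, because the clock term matters just as much.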
The X1X had only 4 more CUs than the PS4 Pro, about 10% more. By their logic, the performance advantage should've been around 10%, but the majority of the advantage the X1X enjoyed came from its much higher clocks: 28% higher. That, combined with the 10% more CUs and much higher memory bandwidth, allowed the X1X to push 44% more pixels on average, and 100% more in several cases like Shadow of the Tomb Raider, RDR2, Wolfenstein and Far Cry 5, which were all native 4K on the X1X.
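Running the same arithmetic on that comparison (published specs: PS4 Pro 36 CUs at 0.911 GHz, X1X 40 CUs at 1.172 GHz) shows how little the CU count alone explains:

def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

pro, x1x = tflops(36, 0.911), tflops(40, 1.172)  # ~4.2 TF vs ~6.0 TF
print(f"CU ratio:    {40 / 36:.2f}x")            # ~1.11x, the 10% more CUs
print(f"Clock ratio: {1.172 / 0.911:.2f}x")      # ~1.29x, the 28% higher clocks
print(f"TF ratio:    {x1x / pro:.2f}x")          # ~1.43x, near the ~44% pixel gap

The 10% CU bump on its own gets nowhere near the observed gap; multiply in the clocks and you land right around it.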
We have seen this with the XSS as well. Its GPU is only about a third as powerful as the XSX's, but it performs far worse than that, at around 1/4th the resolution in most games. You can chalk it up to RAM bandwidth, but the slower clocks might be a bottleneck too. The PS5 isn't even 3x more powerful than the XSS like the XSX is; it's roughly 2.5x more powerful, yet it pushes 4x the pixels in games like Metro. So clearly, the hardware is either outperforming its TFLOPs count due to higher clocks, or the XSS is underperforming due to slower clocks.
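Same sketch for the current consoles (published specs: XSS 20 CUs at 1.565 GHz, XSX 52 CUs at 1.825 GHz, PS5 36 CUs at up to 2.23 GHz), comparing the TFLOPs ratios to the pixel gap:

def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

xss, xsx, ps5 = tflops(20, 1.565), tflops(52, 1.825), tflops(36, 2.230)
print(f"XSX vs XSS: {xsx / xss:.2f}x TF")  # ~3.0x
print(f"PS5 vs XSS: {ps5 / xss:.2f}x TF")  # ~2.6x
# Metro-style gap: 2160p on PS5 vs 1080p on XSS
print(f"Pixels:     {(3840 * 2160) / (1920 * 1080):.0f}x")  # 4x

A roughly 2.6x TFLOPs gap turning into a 4x pixel gap is exactly the mismatch I'm talking about.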
Either way, saying more CUs equals more performance is simply inaccurate.