Nope. Bandwidth on PS5 is just enough for the pixel fill rate with L2 compression.
Link? There is no L2 "Infinity Cache"-like cache in PS5.
But it's still better, and going by reviews the controller makes a massive difference; it's a tangible difference the general public will feel, versus not feel as with the Xbox one, especially in a game like this.
If you asked 100 random people whether they would prefer their car with or without power steering, or with or without electric windows, which one do you think they would choose?
I am not sure why people are not taking this into consideration; it's a massive difference in how the game plays and feels. If it was a pure FPS I could see your point, but this is not: it's got a gazillion things to do besides just shooting.
Yep, and that's why my point stands: it's not just a gimmick, it's a whole better way of experiencing the game.
Microsoft will probably emulate it in a few years' time, and then people will be all over it like hot glue.
This is an evolution in gaming terms; it might only be 10-15% more, but it's still more.
People really need to start taking it into account; it's important to gamers' enjoyment.
> 2000MHz 20 CUs will have better performance than 1000MHz 40 CUs. That is true of any GPU. The only exceptions are when there are bandwidth differences between the cases. I have no idea where you found that claim lol. BTW, Series X and PS5 are different architectures.
Digital Foundry have already debunked that in real world testing.
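For what it's worth, the on-paper math behind the "2000MHz 20 CUs vs 1000MHz 40 CUs" comparison is easy to check. A quick sketch, assuming the usual RDNA figures of 64 shaders per CU and 2 FP32 ops per clock (the console clocks below are the commonly quoted peaks, not sustained numbers):

```python
# Peak FP32 throughput for an RDNA-style GPU:
# CUs x 64 shaders/CU x 2 ops/clock x clock (GHz), giving GFLOPs -> /1000 for TFLOPs.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

# The two hypothetical configs being argued about are identical on paper:
print(tflops(20, 2.0))    # 5.12 TFLOPs
print(tflops(40, 1.0))    # 5.12 TFLOPs

# Commonly quoted console peaks (the PS5 clock is a boost figure):
print(round(tflops(36, 2.23), 2))   # ~10.28 (PS5)
print(round(tflops(52, 1.825), 2))  # ~12.15 (Series X)
```

Identical paper TFLOPs is exactly why real-world tests like DF's matter: clocks and CU counts trade off differently once bandwidth, caches, and fixed-function units come into play.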
> Not sure if this has anything to do with rasterization. It is part of the IO.
No. It's part of the GPU. When specific data in the GPU caches is no longer needed, the cache scrubbers evict that unnecessary data in a more fine-grained manner instead of flushing the whole cache, which could hurt GPU perf; in other words, a full flush could tank the frame-rate.
Because it's a gimmick and a lot of people are turning it off, especially in multiplayer games.
> No. It's part of the GPU. When specific data in the GPU caches is no longer needed, the cache scrubbers evict that unnecessary data in a more fine-grained manner instead of flushing the whole cache, which could hurt GPU perf; in other words, a full flush could tank the frame-rate.
This is to keep the GPU fed from the IO. I don't believe that it has anything to do with fillrate.
Hence the addition of cache scrubbers, so that the frame-rate doesn't tank when stale data is removed from the caches.
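The distinction being argued is full-cache flush versus fine-grained eviction. A toy sketch of the concept only (this is not Sony's actual hardware design, just an illustration of scrubbing a stale address range instead of flushing everything):

```python
class ToyCache:
    """Toy model of a GPU cache: address -> cached data."""

    def __init__(self):
        self.lines = {}

    def fill(self, addr, data):
        self.lines[addr] = data

    def flush_all(self):
        # Naive invalidation: everything is lost, including still-hot lines,
        # so the GPU must refetch from memory and the frame-rate can tank.
        self.lines.clear()

    def scrub(self, start, end):
        # "Cache scrubber" idea: evict only lines whose address falls in the
        # stale range (e.g. data just overwritten by the IO unit).
        self.lines = {a: d for a, d in self.lines.items()
                      if not (start <= a < end)}

cache = ToyCache()
cache.fill(0x1000, "stale texture tile")
cache.fill(0x8000, "still-needed geometry")
cache.scrub(0x1000, 0x2000)   # only the stale tile is evicted
print(0x8000 in cache.lines)  # True: the hot line survives
```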
I don't know if MS will emulate it if it's not in from day one. I expect most devs will probably just ignore it unless it's easy to develop for.
> This is to keep the GPU fed from the IO. I don't believe that it has anything to do with fillrate.
Nowhere did I mention fillrate in that post. Cache scrubbers help GPU perf. They are part of the GPU, not part of the IO like you keep saying.
> Digital Foundry have already debunked that in real world testing.
Never lol
> Digital Foundry have already debunked that in real world testing.
DF didn't debunk anything. They concluded by saying they'd need a GPU with more than 40 CUs, and more games need to be tested for an accurate picture.
> Link? There is no L2 "Infinity Cache"-like cache in PS5.
Everything is compressed in RDNA GPUs, from L1 and L2 to GDDR6 (which wasn't the case with GCN).
In previous architectures, AMD introduced delta color compression to reduce bandwidth and save power. The RDNA architecture includes enhanced compression algorithms that will save additional bandwidth. Additionally, the texture mapping units can write compressed color data to the L2 cache and other portions of the memory hierarchy, whereas in earlier architectures, compressed data could only be written back to memory.
To maximize memory bandwidth, Navi also employs lossless color compression between L1, L2, and the local GDDR6 memory.
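The AMD excerpts above are about delta color compression. The core idea, as a toy sketch (not AMD's actual on-chip format): store one base pixel plus successive differences, which are small for neighboring pixels and therefore cheap to pack:

```python
def delta_encode(pixels):
    # Keep the first value, then store only successive differences.
    base = pixels[0]
    deltas = [b - a for a, b in zip(pixels, pixels[1:])]
    return base, deltas

def delta_decode(base, deltas):
    # Rebuild the row by accumulating the deltas back onto the base value.
    out = [base]
    for d in deltas:
        out.append(out[-1] + d)
    return out

row = [200, 201, 201, 203, 202]   # neighboring pixels are usually similar
base, deltas = delta_encode(row)
print(deltas)                      # [1, 0, 2, -1] -- small values, few bits needed
assert delta_decode(base, deltas) == row  # lossless round-trip
```

The point of "lossless" in the whitepaper text is exactly this round-trip property: bandwidth is saved without altering the pixel data.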
> Road to PS5 shows that it is part of the IO though. Watch it again.
Again, it's part of the GPU. They exist in the caches of the GPU. It's you who needs to watch it again.
And the whole argument is that the PS5 has the fillrate advantage despite having less bandwidth.
> DF didn't debunk anything. They concluded by saying they'd need a GPU with more than 40 CUs, and more games need to be tested for an accurate picture.
No, they didn't. They ran real-world tests with one GPU at higher frequency and lower CU count versus another at lower frequency and higher CU count. The GPU with the higher CU count performed better.
> And the whole argument is that the PS5 has the fillrate advantage despite having less bandwidth.
And yeah, the PS5 has 142.72 Gpix/s fillrate vs SX's 116 Gpix/s.
You mean "up to" 22% higher pixel fillrate. PS5 uses boost clocks so none of your metrics reflect sustained or average performance.
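For reference, the fillrate figures being thrown around fall straight out of ROP count times clock. A quick check, assuming the widely reported 64 ROPs on both consoles (and noting, per the post above, that the 2.23 GHz PS5 clock is a boost peak):

```python
# Peak pixel fillrate in Gpix/s: ROPs x clock (GHz), one pixel per ROP per clock.
def gpixels_per_s(rops: int, clock_ghz: float) -> float:
    return rops * clock_ghz

ps5 = gpixels_per_s(64, 2.23)    # 142.72 Gpix/s
xsx = gpixels_per_s(64, 1.825)   # ~116.8 Gpix/s
print(round(ps5 / xsx - 1, 3))   # 0.222 -> the "up to 22%" figure
```

So the 22% gap is real on paper, but only while the PS5 holds its boost clock, which is the whole dispute here.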
> No, they didn't. They ran real-world tests with one GPU at higher frequency and lower CU count versus another at lower frequency and higher CU count. The GPU with the higher CU count performed better.
Nope. Watch that vid again and then come back. Rich tested just one game. He then concluded by saying he'd need more than a 40 CU GPU and "more testing points" to fully conclude that higher CU count is better.
It is what it is. Your theory isn't matched in real world applications.
> Additionally, Sony effectively has a $399 console that can go toe to toe with a $499 console.
Yeah, this is getting overlooked and dismissed. And it's designed so that it can be built on the exact same production line: remove the drive, change the cover, and you have the DE.
It's not a gimmick. Sony designed it with many game titles in mind; just because you can turn its feature set off does not make it a needless function.
Take 30 fps modes with all the eye candy versus 60 fps with smooth performance: these can also be changed or turned off in a lot of games. Are those gimmicks? No, they are not.
Sony put millions of dollars into their controller research to deliver what they believe is better next-gen performance. Just because you don't like it does not mean millions of casual players out there won't.
The controller is a win-win, no amount of trying to rubbish it will change that.
> Never lol
It had nothing to do with overclocking a GPU.
I don't know where you guys get these false stories.
What DF tried was to simulate a PS5 using an RDNA card whose performance doesn't scale past 2000 MHz, and so it shows no gain in performance.
But hey, RDNA 2 does scale in performance up to 2500 MHz or even more.
I even necroed a thread to say I was right about them doing those misleading benchmarks lol
> No mention that it helps with rasterization though.
I didn't say it helps with rasterization though. You seem confused.
> These are the theoretical max.
You keep saying "limited to memory bandwidth", but so far nearly every game is performing better on PS5. Shouldn't it be limited by bandwidth, since it has 20% less bandwidth than SX?
L2 compression could help improve rasterization in both consoles though, so it is still limited to memory bandwidth.
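On the bandwidth point: the "20% less" figure comes from the stock GDDR6 setups (assuming the widely reported 256-bit/448 GB/s configuration on PS5 and the 560 GB/s fast pool on Series X; the split Series X memory layout is more complicated than one number):

```python
# Peak GDDR6 bandwidth: bus width in bytes x data rate per pin (Gbps).
def gddr6_bw_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

ps5_bw = gddr6_bw_gbs(256, 14)   # 448 GB/s (uniform across the whole 16 GB)
xsx_bw = gddr6_bw_gbs(320, 14)   # 560 GB/s (the fast 10 GB pool)
print(round(1 - ps5_bw / xsx_bw, 2))  # 0.2 -> PS5 has 20% less peak bandwidth
```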
> Nope. Watch that vid again and then come back. Rich tested just one game. He then concluded by saying he'd need more than a 40 CU GPU and "more testing points" to fully conclude that higher CU count is better.
Lol. So the results showed that lower frequency and higher CU count had better performance, and your take is that the opposite is true.
Higher frequency is better and the games on PS5 vs SX clearly show PS5 outperforming SX. It is what it is.
> Lol. So the results showed that lower frequency and higher CU count had better performance, and your take is that the opposite is true.
PS5 is outperforming XSX though.
You are plain wrong. The facts are the facts.
At 30 fps it's impossible to know how much extra performance is going to waste.
So once again we see 10 PS5 tittyflops = 12 XSX tflops.
PS5: most powerful per titflop.
> No, they didn't. They ran real-world tests with one GPU at higher frequency and lower CU count versus another at lower frequency and higher CU count. The GPU with the higher CU count performed better.
> It is what it is. Your theory isn't matched in real world applications.
"So I'd say that we'd need more testing points and an RDNA card with more than 40CUs to get much more in the way of meaningful data to be honest"
> Why can't DF get their videos to run in 4k on YT? Yeah, it's supposed to be a YT issue, but how come VGT and NXG can get their 4k videos uploaded just fine? I feel like something else is going on with the DF channel, because it's been 5 days now, and Dirt 5 is still showing 1080p. That's now the Dirt 5, WD, and COD comparison videos that have failed to encode in 4k. Meanwhile, the sponsored videos for COD and Godfall are both up in 4k after 2 and 4 days respectively.
> If it was just YT to blame, then those sponsored videos should have suffered delays as well, but they haven't. It's quite strange. I find these comparison videos to be useless when they're sub-4k.
Yeah, I was going to say, and the compression looks god-awful too.
Absolutely unbelievable that all hell has broken loose over some launch titles.. lol
Do people on either side think that these games represent the best of what is possible on these consoles?
> Absolutely unbelievable that all hell has broken loose over some launch titles.. lol
> Do people on either side think that these games represent the best of what is possible on these consoles?
No one is thinking that. However, the difference between the two consoles graphically in multi-platform titles will be minimal at best. Before these comparison videos started showing up, the narrative was that the XSX would have a clear advantage in multi-platform titles.

> No one is thinking that. However, the difference between the two consoles graphically in multi-platform titles will be minimal at best.
Both will have advantages and disadvantages.
The next gen Battlefield in 2021 will be interesting
> No.
It's like people miss what they don't want to see. He definitely said that, and that the puddles missing on PS5 were likely a bug too.
As Alex pointed out from the config files, they have the same value. Something isn't working properly in the Xbox version.
> Just watched the video. So, no differences. He sounded a little mystified why the XSX wasn't performing better than PS5.
> Even though it's a draw, I think PS5 is the winner, since it is exceeding expectations ("punching above its weight," as they say). Or, another way of seeing it is XSX is failing to live up to expectations. That 18% TF on-paper advantage is seemingly being nullified by ... something.
It's being nullified by lazy devs.
> Let's not forget that Richard recently said that the PlayStation 5 is punching above its weight, which suggests it's performing higher than he anticipated.
RDNA has nothing to do with the test. That's just him saying it would be good to get more info on different cards.
You can quote Richard and leave out an important part.
> RDNA has nothing to do with the test. That's just him saying it would be good to get more info on different cards.
That in no way changes the fact that real-world tests showed the opposite of what some here are claiming.
And as for Richard's comments: if you are agreeing with him, you are literally saying the PS5 shouldn't be able to run ACV as well as it does. I think the PS5 can easily run that, so I don't think the PS5 is doing things here that it technically shouldn't be able to.
missing puddles in reflections