
Digital Foundry The PS5 GPU in PC Form? Radeon RX 6700 In-Depth - Console Equivalent PC Performance?

Buggy Loop

Member
You can build a complete PC with a 6700 XT (better than the 6700 and the PS5 GPU) for $750


If you hunt for a used 6700 you can probably bring the total down to about $600.

So yeah, consoles are no longer such a great deal.

Had a similar build but with a 5600 and a dedicated cooler (not wraith) for roughly the same price.


This kind of build is something you plan over the span of months, hunting for bundles and deals. A mid-range build like this easily comes down to $600-650.

Or a 12400F build, which at least starts on a DDR5 footing with a motherboard that supports up to Intel 14th gen, while AMD's AM4 is a dead end on DDR4.


12400F pricing is a tad above average right now, but it often dips lower, almost every two weeks or so.

As Digital Foundry demonstrated, there's really no need to go big on the CPU; that video was done with a 12400F, and it's faster on average than a 5600X. It's a really good alternative, I think.

 
Last edited:

Topher

Gold Member
Yes and no. The benefit of consoles is that games will always be tailored to run decently on them, because they have to. PC games, however, even within the same console generation, will often be designed to run well only on the latest GPUs, and cards a few years old will not cope very well, even though those same older GPUs did cope with current-gen games a few years earlier. It's part of the PC tax: it's just a lot more expensive to keep up to date than it is with a console.

Depends on what you are calling running "decently". Several games on console only run at 30fps. Games like Avatar have to be upscaled from resolutions between 864p and 1260p to achieve 60fps in performance mode. That's the console "tax": you only get the modes the developer gives you. What are you basing this take on, that games are only designed to run "well" on the latest GPUs? Older GPUs get profile updates for newer games just like new GPUs do.
 

Elysium44

Banned
Depends on what you are calling running "decently". Several games on console only run at 30fps. Games like Avatar have to be upscaled from resolutions between 864p and 1260p to achieve 60fps in performance mode. That's the console "tax": you only get the modes the developer gives you. What are you basing this take on, that games are only designed to run "well" on the latest GPUs? Older GPUs get profile updates for newer games just like new GPUs do.

Basing it on how it's always been and how it's continuing to be. When this gen of consoles came out, a GTX 1060 or RTX 2060 was plenty to run the latest games at medium-high 1080p60. Good luck using those types of cards nowadays, or for the next five years, and expecting them to keep up. Devs typically look at the most commonly owned cards, like the RTX 3060, and that becomes the new minimum for a reasonable experience. Anyone who knows PC gaming knows it works like this in practice.

Another thing is that console games often run better than PC games these days, as a lot of PC games suffer stuttering problems regardless of hardware performance. There's also the need to compile shaders every time you update the driver, which can take a very long time in some games (TLOU Part 1, for example). Consoles don't have this problem.

I have a PC which is much more powerful than my Series S, at least on paper, yet some games perform better on the latter. I played NFS Unbound on both recently. It loads in a flash on the Xbox; on the PC, even on an NVMe drive, it doesn't. The PC version can also stutter, but the Xbox version never does. Forza Horizon 5 on Xbox lets you completely skip most of the loading splash screens; the PC version makes you sit through them. And it also recompiles shaders with new drivers.

FM2023 is another example: it runs much better on the Series S than on my PC, because the PC port is terrible. Lazy or incompetent devs expect you to have the latest GPUs to brute force their subpar work. But on the console version they can't do that; they HAVE to do a decent job.
 
Last edited:

JimboJones

Member
Without knowing the rest of the PC components the comparison is not that interesting. CPU and RAM speed can be huge differentiators for games' performance.
Not really. If someone is bottlenecked by their GPU, they now know roughly which GPU is going to give them console-equivalent settings. If they buy the GPU and are then limited by the CPU, they can upgrade around that. It's not rocket science.
 

Bojji

Member
Basing it on how it's always been and how it's continuing to be. When this gen of consoles came out, a GTX 1060 or RTX 2060 was plenty to run the latest games at medium-high 1080p60. Good luck using those types of cards nowadays, or for the next five years, and expecting them to keep up. Devs typically look at the most commonly owned cards, like the RTX 3060, and that becomes the new minimum for a reasonable experience. Anyone who knows PC gaming knows it works like this in practice.

The 1060 is from 2016, and it was a medium-settings card even back then.

The 2060 is not much slower than the 3060 when not VRAM limited:

[image: 2060 vs 3060 benchmark chart (1440p)]


It's true that new architectures usually get the best optimization, but if a GPU is not heavily cut down and has good hardware in it, it should perform well for many years. The GTX 780, for example, was (much) faster than the PS4, but that laughable 3GB of VRAM killed that GPU pretty quickly.
 

Topher

Gold Member
Basing it on how it's always been and how it's continuing to be. When this gen of consoles came out, a GTX 1060 or RTX 2060 was plenty to run the latest games at medium-high 1080p60. Good luck using those types of cards nowadays, or for the next five years, and expecting them to keep up. Devs typically look at the most commonly owned cards, like the RTX 3060, and that becomes the new minimum for a reasonable experience. Anyone who knows PC gaming knows it works like this in practice.

Older GPUs are able to use upscaling tech just like consoles can. Here is a GTX 1660 Super running Avatar at 1080p medium settings averaging frame rates in the 80s.

Timestamped


I have a PC which is much more powerful than my Series S, at least on paper, yet some games perform better on the latter. I played NFS Unbound on both recently. It loads in a flash on the Xbox; on the PC, even on an NVMe drive, it doesn't. The PC version can also stutter, but the Xbox version never does. Forza Horizon 5 on Xbox lets you completely skip most of the loading splash screens; the PC version makes you sit through them. And it also recompiles shaders with new drivers.

FM2023 is another example: it runs much better on the Series S than on my PC, because the PC port is terrible. Lazy or incompetent devs expect you to have the latest GPUs to brute force their subpar work. But on the console version they can't do that; they HAVE to do a decent job.

What are the specs of that PC of yours?
 
Last edited:

Elysium44

Banned
The 1060 is from 2016, and it was a medium-settings card even back then.
It's a 1080p card; a 1440p graph isn't really relevant.

In 2020-2021 the GTX 1060 was one of the most commonly owned cards on Steam, if not the most (and even now, most people there use 1080p). It played most games of that time quite well, or very well.
 

Elysium44

Banned
Older GPUs are able to use upscaling tech just like consoles can. Here is a GTX 1660 Super running Avatar at 1080p medium settings averaging frame rates in the 80s.

What are the specs of that PC of yours?

The 1660 is still a reasonable card, but it probably won't be doing so well by the time we get to the end of the gen. I could prove my point with benchmarks of games over the years on the same hardware, showing that a given card can't keep up as time goes by, even though the console version obviously does, because console devs can't just tell gamers to upgrade their CPU, GPU or RAM.

I built my PC in 2020 (with a 1660 then, but upgraded to a 12GB RTX 3060 since). Z490, i5-10600, 32GB 3600, various NVMe and SATA SSDs.
 

Bojji

Member
It's a 1080p card; a 1440p graph isn't really relevant.

In 2020-2021 the GTX 1060 was one of the most commonly owned cards on Steam, if not the most (and even now, most people there use 1080p). It played most games of that time quite well, or very well.

Same performance delta:

[image: benchmark chart]


The 2060 12GB is actually a cross between the 2060 Super (core) and the 2060 (narrower memory bus), but it's the same 2018 architecture.
 

Topher

Gold Member
The 1660 is still a reasonable card, but it probably won't be doing so well by the time we get to the end of the gen. I could prove my point with benchmarks of games over the years on the same hardware, showing that a given card can't keep up as time goes by, even though the console version obviously does, because console devs can't just tell gamers to upgrade their CPU, GPU or RAM.

I built my PC in 2020 (with a 1660 then, but upgraded to a 12GB RTX 3060 since). Z490, i5-10600, 32GB 3600, various NVMe and SATA SSDs.

The 1660 Super is less powerful than a 2060. I'd say it passes your test of 1080p60 at medium settings for a card in the 1060-2060 range with the newest, most demanding game.
 

Kerotan

Member
You can also buy a used full PC and it will be even cheaper, so no, this is not 2020 anymore.


You can't buy a new 6700 if that's what you're talking about.
A full PC for the price of a PS5, running games at the same level? Yeah, I don't think so. Buying all the parts new at RRP gets you nowhere close to a new PS5.

If you go scrounging for cheaper used or refurbished parts, then you need to factor in that a used or refurbished PS5 would still be cheaper.
 

SKYF@ll

Member
It's interesting to see how the PS5 benchmarks go up and down depending on the settings (resolution, RT enabled, etc.).
The PS5 (a $500 console) seems to be demonstrating performance of over 10 TFLOPs.
[image: PS5 benchmark comparison chart]
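For reference, that figure lines up with the published specs: peak FP32 throughput is just CUs × shaders per CU × 2 FLOPs per clock × clock speed. A minimal sketch (the RX 6700's ~2.45 GHz boost clock is the commonly quoted spec, included only for comparison):

```python
# Theoretical peak FP32 throughput from published GPU specs (not sustained performance).
def peak_tflops(compute_units: int, shaders_per_cu: int, boost_clock_ghz: float) -> float:
    # Each shader can retire one fused multiply-add (2 FLOPs) per clock.
    return compute_units * shaders_per_cu * 2 * boost_clock_ghz * 1e9 / 1e12

print(f"PS5 GPU: ~{peak_tflops(36, 64, 2.23):.2f} TFLOPs")  # 36 CUs @ 2.23 GHz -> ~10.28
print(f"RX 6700: ~{peak_tflops(36, 64, 2.45):.2f} TFLOPs")  # 36 CUs @ ~2.45 GHz boost -> ~11.29
```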
 

shamoomoo

Member
None of these GPUs are made for 4K.
The 4060 absolutely shits over the PS5, not even close: frame gen, DLSS, RT. But again, with its 8GB framebuffer and limited bandwidth, it's not made for 4K.

Complete dog shit comparison.
A newer GPU or PC component should be better than 3-4 year old tech.
 

SlimySnake

Flashless at the Golden Globes
None of the games tested at 60fps are CPU bound at those framerates on a 3600 or even a 2700. You really think Monster Hunter Rise (a Switch port) is CPU limited?
I don't think Monster Hunter is a good test for anything. Not sure why it was even included.

And Avatar is definitely a game that uses RTGI even in its 60fps mode on consoles. CPUs get hit hard in any game with RT. AW2 is also a game that pushes the CPU hard.

Alex used to do this all the time: dismiss the fact that these games were not CPU bound. Then they tested the PS5 and XSX CPUs, and they were complete trash in DF's own comparisons in modern games, especially with RT on. IIRC, Cyberpunk was something like 40% slower compared to the 3600, and Metro was even worse. I can't find that video right now, but it's a must-watch to see just how poor these CPUs really are and how much they hold back the games even at 30-50 fps.
 

SlimySnake

Flashless at the Golden Globes
Found the video where Rich reviewed the XSX CPU.



He uses the 7900 XTX in that video, which is absolutely hilarious. But in the other CPU tests you can see just how poorly this thing performs even compared to the 3600, which has 2 fewer cores and 4 fewer threads. The consoles reserve an entire CPU core for the OS and have to share VRAM with the GPU, which means the console CPU is even slower than what we are seeing here.

I really hope Richard used this CPU to test the GPU, and not the 3600 or, worse, the latest 7600, which outperforms the XSX CPU by 2.5-3x despite having the same number of cores and threads, thanks to architectural gen-on-gen improvements.
 

Gaiff

SBI’s Resident Gaslighter
I really hope Richard used this CPU to test the GPU, and not the 3600 or, worse, the latest 7600, which outperforms the XSX CPU by 2.5-3x despite having the same number of cores and threads, thanks to architectural gen-on-gen improvements.
Why would you hope that? Nobody has that desktop kit, so how would it make sense to use this?

It’s a GPU test my guy. That’s why he didn’t downclock the 6700 to PS5’s level. He wants to show how the card compares to the PS5’s GPU, not how a theoretical PS5 would fare in a PC environment.

As long as he uses a CPU that's at least on a level similar to the consoles, or slightly higher, one that cannot completely skew results, it's fine. The goal is to test GPU performance.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Why would you hope that? Nobody has that desktop kit, so how would it make sense to use this?

It’s a GPU test my guy. That’s why he didn’t downclock the 6700 to PS5’s level. He wants to show how the card compares to the PS5’s GPU, not how a theoretical PS5 would fare in a PC environment.

As long as he uses a CPU that's at least on a level similar to the consoles, or slightly higher, one that cannot completely skew results, it's fine. The goal is to test GPU performance.
If it's solely a GPU test then he might as well use a 7800x3D. Would you say that would be an accurate comparison against the PS5 GPU?

If he had mentioned what CPU he was using then fine, but the fact that he left it unsaid makes no sense.
 

Gaiff

SBI’s Resident Gaslighter
If it's solely a GPU test then he might as well use a 7800x3D. Would you say that would be an accurate comparison against the PS5 GPU?
No, and I said as much because the CPU is so much more powerful that it can actually completely change some results. That and it wouldn’t make sense because no one pairs a 7800X3D with a 6700.
If he had mentioned what CPU he was using then fine, but the fact that he left it unsaid makes no sense.
Yeah, I took issue with that. What makes me think he didn’t use a high-end CPU is his conclusion with the MHR and Hitman tests. If he was using a powerful CPU, then there’s no way he’d be dumb enough not to realize the massive disparity is due to the CPU. Something like a 3600 wouldn’t trounce the PS5 by 40% in CPU-bound scenes.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Something like a 3600 wouldn’t trounce the PS5 by 40% in CPU-bound scenes.
In the XSX CPU DF review, Hitman 3 is anywhere between 10-50% faster on the 3600. And again, it's clocked at 4.0 GHz, has access to all 8 cores and 16 threads (unlike the PS5 and XSX), and has access to its own dedicated RAM pool, so the PS5 CPU is probably worse than these results below.

[image: Hitman 3 CPU benchmark chart]


The 7600 just destroyed everything. It actually gives me hope for the PS5 Pro: if they use the 7600, the CPU upgrade would be a massive 2-3x, provided they don't skimp on the L3 cache.
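As a rough sanity check on that 2-3x figure, you can multiply AMD's quoted gen-on-gen IPC gains by the clock difference; a back-of-the-envelope sketch with ballpark public figures (the console clock and the uplift percentages are approximations, and the desktop 7600's full L3 cache plus dedicated DDR5 would push the real gap higher):

```python
# Back-of-the-envelope Zen 2 (console) -> Zen 4 (Ryzen 7600) uplift estimate.
# All factors are rough public ballpark figures used purely for illustration.
zen3_ipc_gain = 1.19        # AMD's quoted ~19% IPC uplift, Zen 2 -> Zen 3
zen4_ipc_gain = 1.13        # AMD's quoted ~13% IPC uplift, Zen 3 -> Zen 4
clock_ratio   = 5.1 / 3.6   # ~5.1 GHz boost (7600) vs ~3.6 GHz console clock

estimate = zen3_ipc_gain * zen4_ipc_gain * clock_ratio
print(f"Estimated single-thread uplift: ~{estimate:.1f}x")  # ~1.9x before cache/memory effects
```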


Honestly the gen on gen improvements AMD has made in the CPU space are pretty cool to see, but they also kind of fucked over the consoles by not doing them sooner.
 

SlimySnake

Flashless at the Golden Globes
Guys, I think I was able to Sherlock Holmes this thing and figure out what CPU he's using. It's the i5-12400F.

I looked at his 4060 review, which had some PS5 benchmark comparisons he made using the i5-12400F to get a more budget-oriented benchmark, and found the same scene from A Plague Tale that matches the results of his 4060 + i5-12400F benchmark.

4060 review/ 6700 review.

[images: A Plague Tale benchmark comparison screenshots]


The i5-12400F is way more powerful than the Ryzen 3600. Looks like 20-30% faster according to this video.



However, what it does not do is give us much of a meaningful sense of how the GPU is actually going to be used, what it can achieve with optimised settings nor how it sits within a more budget-orientated rig that matches its price-point. This is why I spent a weekend with the RTX 4060 plugged into a system consisting of a Core i5 12400F, paired with 3200MT/s CL18 DDR4 and running off a 1TB PCIe 3.0 Crucial P3 SSD. The aim was straightforward - to use gaming standards set by PlayStation 5 and to see if the RTX 4060 could get anywhere near to it.
 

Leonidas

Member
In the XSX CPU DF review, Hitman 3 is anywhere between 10-50% faster on the 3600. And again, it's clocked at 4.0 GHz, has access to all 8 cores and 16 threads (unlike the PS5 and XSX)
The 3600 is 6 cores / 12 threads.

It actually gives me hope for the PS5 Pro: if they use the 7600, the CPU upgrade would be a massive 2-3x, provided they don't skimp on the L3 cache.
The 7600 is 6 cores / 12 threads.
Consoles are based on APUs. Zen2-Zen4 APUs are all gimped on cache...
 

SABRE220

Member
The whole comparison becomes kind of weak when they don't mention which CPUs are being used, since we know modern CPUs trounce the console ones.
 

SlimySnake

Flashless at the Golden Globes
The 3600 is 6 cores / 12 threads.


The 7600 is 6 cores / 12 threads.
Consoles are based on APUs. Zen2-Zen4 APUs are all gimped on cache...
I was talking about the XSX CPU he was testing. It is clocked at 4.0 GHz and has all 8 cores and 16 threads available for games. The XSX does not; neither does the PS5.
 

Leonidas

Member
The whole comparison becomes kind of weak when they don't mention which CPUs are being used, since we know modern CPUs trounce the console ones.
The consoles and the weak GPUs used were most likely GPU limited. You only notice the CPU difference when you aren't limited by lower end GPUs.

I was talking about the XSX CPU he was testing. It is clocked at 4.0 GHz and has all 8 cores and 16 threads available for games. The XSX does not; neither does the PS5.
Ah, the 4800S. But he tested that with a 3090. That's a much more powerful GPU than what he tested against the PS5 in the OP video.

The 3090 is 1.8 to 2x faster than the 4060. With that card you can see that 10-50% CPU difference (at 1080p in that canned Hitman benchmark you pointed out), but with a 4060, 6700, and other PS5-tier cards you would not notice anywhere near that difference, as you will be GPU limited with lower-end GPUs.

4800S is also gimped by PCIe 4x4, which could have also hurt it in that test...
 
Last edited:

Zathalus

Member
I don't think Monster Hunter is a good test for anything. Not sure why it was even included.

And Avatar is definitely a game that uses RTGI even in its 60fps mode on consoles. CPUs get hit hard in any game with RT. AW2 is also a game that pushes the CPU hard.

Alex used to do this all the time: dismiss the fact that these games were not CPU bound. Then they tested the PS5 and XSX CPUs, and they were complete trash in DF's own comparisons in modern games, especially with RT on. IIRC, Cyberpunk was something like 40% slower compared to the 3600, and Metro was even worse. I can't find that video right now, but it's a must-watch to see just how poor these CPUs really are and how much they hold back the games even at 30-50 fps.
Avatar can easily do 60+ fps on a 2600 of all things; the game is not heavy on the CPU at all. Literally none of the games tested at the console settings are CPU bound.

Guys, I think I was able to Sherlock Holmes this thing and figure out what CPU he's using. It's the i5-12400F.

I looked at his 4060 review, which had some PS5 benchmark comparisons he made using the i5-12400F to get a more budget-oriented benchmark, and found the same scene from A Plague Tale that matches the results of his 4060 + i5-12400F benchmark.

4060 review/ 6700 review.

[images: A Plague Tale benchmark comparison screenshots]


The i5-12400F is way more powerful than the Ryzen 3600. Looks like 20-30% faster according to this video.




Those scores would have been the exact same had he used a 3600, a 12400, or even a 7800X3D. None of those CPUs drops under 50fps; the game is completely GPU bound at those framerates with those GPUs. It would have been a different story if he was using the 60fps mode, but he was not.
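The underlying logic is the usual per-frame bottleneck model: frame time is roughly the slower of the CPU's and the GPU's work per frame, so a faster CPU changes nothing while the GPU is the limiter. A toy sketch with made-up timings, just to illustrate the point:

```python
# Toy bottleneck model: frame time ~ max(CPU frame time, GPU frame time).
# The millisecond figures are illustrative assumptions, not measured data.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 28.0  # hypothetical GPU-bound workload (~36 fps ceiling)
for cpu_name, cpu_ms in [("3600-class CPU", 16.0), ("7800X3D-class CPU", 8.0)]:
    # Both CPUs finish their frame work well before the GPU does, so fps is identical.
    print(f"{cpu_name}: {fps(cpu_ms, gpu_ms):.1f} fps")
```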
 

SABRE220

Member
The consoles and the weak GPUs used were most likely GPU limited. You only notice the CPU difference when you aren't limited by lower end GPUs.


Ah, the 4800S. But he tested that with a 3090. That's a much more powerful GPU than what he tested against the PS5 in the OP video.

The 3090 is 1.8 to 2x faster than the 4060. With that card you can see that 10-50% CPU difference (at 1080p in that canned Hitman benchmark you pointed out), but with a 4060, 6700, and other PS5-tier cards you would not notice anywhere near that difference, as you will be GPU limited with lower-end GPUs.

4800S is also gimped by PCIe 4x4, which could have also hurt it in that test...
They literally did a CPU face-off where, with the same GPUs, nearly all games received massive improvements, 2x+ at least, with a simple 7600. A lot of these games have the headroom for considerable performance improvements outside the GPU.
 

hinch7

Member
lol, I don't ever want to hear how console fanboys are the worst.

PC gamers are just as bad as any console fanboy.



This mother fucker watched the entire video and ignored 6 other comparisons where the PS5 came out on top just to shill to his twitter fanbase. What a loser.

To be fair, if Richard matched the 6700 with any modern AMD/Intel CPU, it'd run circles around the PS5 in most situations, frame times especially in uncapped scenarios.

Dumb take though, I agree.
 
Last edited:

Elysium44

Banned
Avatar can easily do 60+ fps on a 2600 of all things; the game is not heavy on the CPU at all. Literally none of the games tested at the console settings are CPU bound.

Those scores would have been the exact same had he used a 3600, a 12400, or even a 7800X3D. None of those CPUs drops under 50fps; the game is completely GPU bound at those framerates with those GPUs. It would have been a different story if he was using the 60fps mode, but he was not.

CPU performance isn't just about the average but also the lows, which weaker CPUs struggle with.
 

winjer

Gold Member
In the XSX CPU DF review, Hitman 3 is anywhere between 10-50% faster on the 3600. And again, it's clocked at 4.0 GHz, has access to all 8 cores and 16 threads (unlike the PS5 and XSX), and has access to its own dedicated RAM pool, so the PS5 CPU is probably worse than these results below.

[image: Hitman 3 CPU benchmark chart]


The 7600 just destroyed everything. It actually gives me hope for the PS5 Pro: if they use the 7600, the CPU upgrade would be a massive 2-3x, provided they don't skimp on the L3 cache.


Honestly the gen on gen improvements AMD has made in the CPU space are pretty cool to see, but they also kind of fucked over the consoles by not doing them sooner.

That test was very flawed, and it has one huge issue that greatly affects the 3600's performance and invalidates the comparison.
The memory latency test DF ran on the 3600 showed over 90ns. A normal Zen 2 CPU would have a memory latency of about 70ns; with tweaked memory, mid-60s.
This is a huge difference that hurts performance on that CPU.
I don't know how DF screwed up the memory latency on the 3600 that badly, but it's a really bad result.
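To put that gap in perspective, extra DRAM latency can be expressed as core clock cycles lost per cache miss; a quick sketch (the 4.2 GHz boost clock is assumed, not something DF reported):

```python
# Converting DRAM latency into core clock cycles (illustrative only).
def stall_cycles(latency_ns: float, core_clock_ghz: float) -> float:
    return latency_ns * core_clock_ghz  # ns * (cycles per ns)

CORE_CLOCK_GHZ = 4.2  # assumed Ryzen 3600 boost clock
for label, latency_ns in [("DF's 3600 result", 90.0),
                          ("typical Zen 2", 70.0),
                          ("tuned memory", 65.0)]:
    print(f"{label}: ~{stall_cycles(latency_ns, CORE_CLOCK_GHZ):.0f} cycles per miss")
```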
 
Last edited:

Leonidas

Member
They literally did a CPU face-off where, with the same GPUs, nearly all games received massive improvements, 2x+ at least, with a simple 7600. A lot of these games have the headroom for considerable performance improvements outside the GPU.
That was a 3090 at 1080p with canned benchmarks designed to show the differences in CPU performance.

The 6700/PS5 video is different; it focuses on GPU bottlenecks. No CPU is going to save those weak GPUs.

It's two different situations; they are not the same thing...
 
Last edited:

SABRE220

Member
That was a 3090 at 1080p with canned benchmarks designed to show the differences in CPU performance.

The 6700/PS5 video is different; it focuses on GPU bottlenecks. No CPU is going to save those weak GPUs.

It's two different situations; they are not the same thing...
You're assuming every game there is perfectly GPU limited and that the CPU has no impact on performance compared to the console CPUs, and the onus to prove that is on you. Some of these games are more GPU limited, but to claim all of them are perfectly GPU limited is baseless; you can have a lower-mid-range GPU and still run into CPU chokepoints.
 
Last edited:

Leonidas

Member
You're assuming every game there is perfectly GPU limited and that the CPU has no impact on performance...
I never claimed it to be perfectly GPU limited, but the GPU is definitely the bottleneck on the PC, and no CPU is going to magically make those weak GPUs 1.5-2x faster, as one might erroneously conclude if one misunderstands the 3090 @ 1080p CPU benchmark video that is screencapped in this thread...
 
Last edited:

yamaci17

Member
I don't think Monster Hunter is a good test for anything. Not sure why it was even included.

And Avatar is definitely a game that uses RTGI even in its 60fps mode on consoles. CPUs get hit hard in any game with RT. AW2 is also a game that pushes the CPU hard.

Alex used to do this all the time: dismiss the fact that these games were not CPU bound. Then they tested the PS5 and XSX CPUs, and they were complete trash in DF's own comparisons in modern games, especially with RT on. IIRC, Cyberpunk was something like 40% slower compared to the 3600, and Metro was even worse. I can't find that video right now, but it's a must-watch to see just how poor these CPUs really are and how much they hold back the games even at 30-50 fps.
What are you even trying to prove here? The 12400F is a dirt-cheap CPU that most 3000/4000-series GPU owners will use as a baseline. I don't understand your obsession with console-equivalent CPUs being used in these comparisons, because most people who own modern GPUs won't even use console-equivalent CPUs.
Literally no one will pair a 4060 or 4060 Ti with a 3600 or the like. People will use a 13400F as a baseline because that is literally what is on the shelves, and it is hilariously faster than those antique CPUs (the Ryzen 3600 is 5 years old at this point; pairing it even with a 3070-class GPU was considered very wrong and was frowned upon by many folks).

AW2 pushes CPUs hard? What?



Even a low-end antique 2600 here pushes 60+ fps in this town.

And here's Avatar at a CPU-bound resolution but high settings on a 3600:



It's still heavily GPU bound at 60+ fps.

Case in point: a balanced RX 6700 / i5-12400F / i5-13400F / R5 5600 / R5 7600 build won't have any bottlenecks whatsoever. It is Sony's problem that they decided to release the console in 2020, when it was meaningless to do so, resulting in the use of an outdated CPU with a small cache.

They did this before with the Jaguar CPU / GCN GPU, and they repeated the mistake; they don't care. They just want you to go back to 30 fps: target 30 fps due to the anemic CPU, and push the excess GPU resources into resolution. I thought this was common knowledge. Performance modes will be a thing of the past going forward, and you know that, so consider yourself lucky we have this many performance modes on consoles to begin with.

Look, I can understand if you get angry when Alex uses a 7900X3D or a 13900K or whatever, but you can't be this angry if they pair these GPUs with run-of-the-mill CPUs like the 12400F or R5 5600.

DF did many tests with the 1060 versus the PS4, and they used regular CPUs. The worst CPU you could get was still 5x faster than the PS4 CPU. Do you wish they would have paired the 1060 with an antique Jaguar CPU specifically? What use would that have? The worst CPU I've seen people pair a 1060 with was an i5-2400, which was still leaps and bounds better than the PS4's. It is a console problem, and it is not something DF has to fix or adhere to. The 4700S, 2600, etc. are not realistic CPUs for this class of GPU anymore. The 3600, maybe. Even then, it is widely known that the 3600 is problematic in many games due to its dual-CCX structure, and most people have already moved on from it.
 
Last edited:

yamaci17

Member
Absolutely retarded comparison. He doesn't state what CPU he's using, but he's definitely not using that AMD 6700U chip from China that he knows for a FACT is the PS5 CPU. Then he wonders why the 60 fps and 120 fps modes are performing well. Well, why do you think, Sherlock?

God, I hate DF. Every time I stan for these fuckers they make me look like a fucking fool.

Oh, and he didn't even downclock the 6700 to match the PS5's TFLOPs. Not even for one test, just to see how well the PS5 GPU compares to its PC counterpart with more dedicated VRAM and Infinity Cache.

To think he had 3 years to come up with the testing scenarios and completely blew it.
?

You of all people know that the PS5 has those cache scrubbers or whatever, which probably help with its performance.

More dedicated VRAM? The RX 6700 has 10 GB of memory, and 0.6-1.2 GB of it will usually go to background services (which is why he himself says he had to reduce texture quality in The Last of Us to reduce stutters). The Last of Us requires exactly 10 GB of free memory (not total memory) for PS5-equivalent textures and smoothness, and you can't have 10 GB of free memory with 10 GB of total memory on PC; some of it will be used by the OS. This applies to your 10 GB 3080 all the same.

Infinity Cache is just a band-aid that can often fail to produce great results.

The RX 6700 only has 320 GB/s of bandwidth, whereas the PS5 has a total of 448 GB/s. Most likely the PS5 would use 50-60 GB/s for the CPU, which still leaves the PS5 a whopping ~400 GB/s or so. (The PS4 CPU barely used 10-15 GB/s of bandwidth out of a total of 176 GB/s, for example, and 50-60 GB/s of memory bandwidth is usually enough to get the most out of most Zen 2 CPUs on desktop.) The RX 6700 also has a gimped bus (160-bit vs 256-bit).

The test is fine as is. The RX 6700 has many cutbacks compared to the PS5, and vice versa; the PS5 has its advantages, and the RX 6700 has its own. How will you know how much Infinity Cache improves performance? How will you know how much the cache scrubbers improve performance? It is best to leave each as they are and not tweak any clocks on the GPU.

If you can't disable the cache scrubbers or take them out of the equation, there's no point taking the clock difference out of the equation or calling out Infinity Cache. These GPUs don't even use the same APIs anyway, which is the reason games show differing performance deltas (most notably The Last of Us Part I).
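For a rough sanity check on those bandwidth numbers, the console GPU's effective share is just total bandwidth minus whatever the CPU eats; a quick sketch (the CPU share is the 50-60 GB/s estimate from the post above, not a measured value):

```python
# Rough effective-bandwidth comparison in GB/s; the CPU share is an assumed estimate.
PS5_TOTAL_BW   = 448   # 256-bit GDDR6 @ 14 Gbps
PS5_CPU_SHARE  = 55    # assumed ~50-60 GB/s consumed by the CPU
RX6700_DRAM_BW = 320   # 160-bit GDDR6 @ 16 Gbps (plus on-die Infinity Cache)

ps5_gpu_bw = PS5_TOTAL_BW - PS5_CPU_SHARE
print(f"PS5 GPU effective bandwidth: ~{ps5_gpu_bw} GB/s")
print(f"RX 6700 raw DRAM bandwidth:  {RX6700_DRAM_BW} GB/s "
      f"(PS5 effective is ~{ps5_gpu_bw / RX6700_DRAM_BW:.2f}x the RX 6700's)")
```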
 
Last edited:

Three

Member
The vast majority of the tests seem to be GPU limited, so the comparison is valid and interesting; however, that Monster Hunter test might well be CPU limited, so Rich not mentioning the CPU is dumb.
I'm pretty sure that section with the swaying trees in Avatar is CPU limited too. Some of the tests don't make much sense and it's guesswork on his part.
 