
Digital Foundry The PS5 GPU in PC Form? Radeon RX 6700 In-Depth - Console Equivalent PC Performance?

Mr.Phoenix

Member
My problem with this train of thought is that no one besides warriors trying to prove PCs are so much better than consoles ever attempts to build a gaming PC for the price of a console.

I've never once been approached by a layman who told me they had a $500 budget for a gaming PC (at least not in the past 4 years). If that's their budget, they don't even think about a PC and go straight for a console. This constant back-and-forth I see between console and PC gamers on forums is so bizarre, and some PC enthusiasts insist you can build a decent rig for the price of a console, but that's just not possible. No one's gonna buy your shitty $500 build, just like no one would have bought that crappy $400 PC in 2014 that rivaled a PS4.

What is interesting is discussing how much you need to pay to get a decent machine that can play mainstream games on the same level as consoles. Besides that, you've also got DLSS if you go NVIDIA, and you can just tweak your settings to get better performance. Ultimately, you decide what you want.
I am in complete agreement with you. My issue is that I just find those "just go get a PC" posts, especially when someone is asking for better performance from their games on a console, disingenuous, as if the two are the same thing just because they both play games.

And I also have a gaming PC, and I know how much I spent on this thing. For the price of my GPU alone, I have bought two PS5s (one came with a game) and paid for two years of PS+ Extra. That's the reality of the difference between console and PC gaming.
 

Topher

Identifies as young
I am in complete agreement with you. My issue is that I just find those "just go get a PC" posts, especially when someone is asking for better performance from their games on a console, disingenuous, as if the two are the same thing just because they both play games.

And I also have a gaming PC, and I know how much I spent on this thing. For the price of my GPU alone, I have bought two PS5s (one came with a game) and paid for two years of PS+ Extra. That's the reality of the difference between console and PC gaming.

Sure, but the reality doesn't end there. Games can be bought for less money on PC, even brand new games. Accessories are also cheaper. There is no requirement for a $70 controller. Buy a cheapo $20 controller off of Amazon if you want, or spend a little more and get an 8BitDo for $30. Massive selection of headsets if you want one, and there are no limitations to "officially licensed" stuff.

On the flip side, the availability of physical games on consoles allows for renting and reselling games, as well as buying used. So the point of all this is there are a number of ways to look for value beyond the initial purchase price of the system. Both PC and console have advantages as far as value is concerned.
 
Last edited:

Senua

Gold Member
Sure, but the reality doesn't end there. Games can be bought for less money on PC, even brand new games. Accessories are also cheaper. There is no requirement for a $70 controller. Buy a cheapo $20 controller off of Amazon if you want, or spend a little more and get an 8BitDo for $30. Massive selection of headsets if you want one, and there are no limitations to "officially licensed" stuff.
Also you don't have to pay for online play, the biggest gyp going in the console space.
 

Topher

Identifies as young
Also you don't have to pay for online play, the biggest gyp going in the console space.

Yep. I forgot to mention that. That's $80 a year. Even if you get a steep sale and get it half off, forty-ish bucks over the course of a generation adds up to around $300. Something folks have to take into account if they are looking for value.
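For a rough sense of that math, here's a quick back-of-the-envelope sketch (the $80 and half-off figures come from the post above; the seven-year generation length is my assumption):

# Back-of-the-envelope PS Plus cost over a console generation (Python).
# Assumed: ~7-year generation; $80/yr list, ~$40/yr if you always catch a half-off sale.
full_price, sale_price, years = 80, 40, 7
print(f"At full price:   ${full_price * years}")  # $560
print(f"Always half off: ${sale_price * years}")  # $280, i.e. roughly $300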
 

Astray

Member
I think if you play online then it's basically a no-brainer to go PC; the extra expense at the beginning is much better spent on an asset that you can sell or repurpose later on than on a subscription that eventually ends and takes your benefits with it.

But if you play offline (like I do), then going with a console is by far the less costly option, especially if you go physical and make good use of the used market.
 

GoldenEye98

posts news as their odd job
I think if you play online then it's basically a no-brainer to go PC; the extra expense at the beginning is much better spent on an asset that you can sell or repurpose later on than on a subscription that eventually ends and takes your benefits with it.

But if you play offline (like I do), then going with a console is by far the less costly option, especially if you go physical and make good use of the used market.

Yeah, if you mainly play multiplayer, consoles are annoying in the sense that you always have to buy subscriptions. PS Plus Essential is now $95 a year in Canada.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Yeah, if you mainly play multiplayer, consoles are annoying in the sense that you always have to buy subscriptions. PS Plus Essential is now $95 a year in Canada.
It used to be a measly $50 when it was introduced, and you could get it for $25-30 during the holidays, which is what I always did.

The fact that the price almost doubled in about 10-15 years is wild.
 

JimboJones

Member
You don't even need Windows, btw. I've been gaming on Linux for quite some time without problems.
I was going to mention that, but apart from the Steam Deck I don't have much experience with Linux gaming. I think most people would prefer Windows, but it is a valid option too.
 

Topher

Identifies as young
Hot take. I actually think DF is a good YouTube channel.

And I'm a PS5 guy. :)

Good, yeah, but there are better. If I'm shopping for a CPU or GPU then I'll go to Hardware Unboxed or Gamers Nexus rather than DF. Those guys have no interest in the console war, whereas DF has made a living off console game comparisons.
 

hinch7

Member
Good, yeah, but there are better. If I'm shopping for a CPU or GPU then I'll go to Hardware Unboxed or Gamers Nexus rather than DF. Those guys have no interest in the console war, whereas DF has made a living off console game comparisons.
Speaking of HWU, they did a comparison of CPUs with different cache sizes. Massive gains from added L3 in a lot of games, and it will be especially noticeable in the 1% lows. Granted, they did test with a 4090 to avoid being GPU bound.
 

rofif

Can’t Git Gud
Sure, but the reality doesn't end there. Games can be bought for less money on PC, even brand new games. Accessories are also cheaper. There is no requirement for a $70 controller. Buy a cheapo $20 controller off of Amazon if you want or spend a little more and get an 8bitdo for $30. Massive selection in headsets if you want one and are no limitations to "officially licensed" stuff.

On the flip side, the availablity of physical games on consoles allow for renting and reselling games, as well as buying used. So point of all this is there are number of ways to look for value beyond the initial purchase price of the system. Both PC and console have advantages as far as value is concerned.
Ok let's go one by one with your points:
Games can be bought for less money on PC, even brand new games
Yes. Games are generally cheaper on PC. You can get games from CDKeys like I do, and other places. That said, PSN constantly has sales. If you follow Deku Deals, it's really nice and easy to see the prices, what the lowest price was, and so on. Plus there's the used market.
I've only paid full price for 2-3 games. Everything else I get on PS5 I preorder physical, already for $10 less than on PSN, plus a ton of goodies in the box. Elden Ring on PS5 was cheaper than on Steam, and it came with a ton of crap in the box at that.
Accessories are also cheaper. There is no requirement for a $70 controller
Yes. You can get shovelware accessories of any kind on PC, of course. But again, the PS5 already comes with the best controller ever and still allows you to use other controllers (limited, but it does) and any headset you want. Wireless out of the box if you connect any headphones to the DualSense. I just find the Pulse 3D excellent value, and I am about to get the new Elite.
And "but dualsense sucks and get a drift"... bitch my viper v2 pro 150$ mouse just developed scrolling problems. It goes up and down as it pleases when I scroll. Currently in contact with razer support for a week....
Physical is my main selling point. I value it a lot.
One thing I want to emphasize: while you can "just get a PC" for cheap, I find it pointless. If I am going to game on PC, I want the experience to be significantly better, and for me, somewhat higher settings and fps are not always enough. I value console gaming comfort, DualSense features and physical gaming a bit more.
But I also have a gaming PC, as you know, and I judge every game separately.

Also you don't have to pay for online play, the biggest gyp going in the console space.
Some people don't play online and don't need it. I don't play online on PS5.
And while I do agree that basic online should be 100% free, the fact is that if you pay for it you get a ton of games and other stuff in there. I would prefer a "just online" tier, but whatever. Microsoft started this BS and Sony ain't ending it.
 
Last edited:

Topher

Identifies as young
Ok let's go one by one with your points:

Yes. Games are generally cheaper on PC. You can get games from CDKeys like I do, and other places. That said, PSN constantly has sales. If you follow Deku Deals, it's really nice and easy to see the prices, what the lowest price was, and so on. Plus there's the used market.
I've only paid full price for 2-3 games. Everything else I get on PS5 I preorder physical, already for $10 less than on PSN, plus a ton of goodies in the box. Elden Ring on PS5 was cheaper than on Steam, and it came with a ton of crap in the box at that.

Yes. You can get shovelware accessories of any kind on PC, of course. But again, the PS5 already comes with the best controller ever and still allows you to use other controllers (limited, but it does) and any headset you want. Wireless out of the box if you connect any headphones to the DualSense. I just find the Pulse 3D excellent value, and I am about to get the new Elite.
And "but dualsense sucks and get a drift"... bitch my viper v2 pro 150$ mouse just developed scrolling problems. It goes up and down as it pleases when I scroll. Currently in contact with razer support for a week....
Physical is my main selling point. I value it a lot.
One thing I want to emphasize: while you can "just get a PC" for cheap, I find it pointless. If I am going to game on PC, I want the experience to be significantly better, and for me, somewhat higher settings and fps are not always enough. I value console gaming comfort, DualSense features and physical gaming a bit more.
But I also have a gaming PC, as you know, and I judge every game separately.

Sure.....waiting for a PSN sale is fine. The point is I don't have to wait for a sale on PC. You are going to pay $40 for Helldivers 2 right now. I'm going to pay $33 from CDKeys. Elden Ring on PS5 was cheaper than on Steam, but Steam isn't your only option on PC. I preordered Elden Ring off of GMG for 20% off. But yeah, physical gives consoles advantages as well. I said that in the post you quoted. That's great for people like me who prefer physical, but it doesn't apply to those who buy digital, obviously.

lol....why are you calling PC accessories "shovelware"? That doesn't even make any sense. I've got a couple of flight sticks I use for some games on PC that simply will not work on my PS5 and are absurdly superior in quality to the officially licensed crap. Your other options for a controller on PS5 consist of $200 Edge-wannabe options from Razer, Scuf and Nacon (all of which also work on PC). The best controller I own is the 8BitDo Ultimate, and yeah, I own a DualSense Edge as well. Either way, PC has a lot more controller options than any console. Not debatable. Simply a fact.

The same applies to headsets, although if you buy a USB-to-Bluetooth add-on for PS5 then you'll get some of the same functionality, as long as you don't mind a microphone jutting out of your controller. Or....yeah, using a wire to plug your wireless headset into your controller. lol

Why are you quoting nonsense about "DualSense sucks" and talking about drift? You are making up arguments I never made. Sucks about your mouse though.
 
Last edited:

DAHGAMING

Member
XSX needs to figure out how to "embarrass" PS5 first.

Already has. We got the power and the games; Sony is begging and giving backhanders to Lord and Saviour Big Phil to bring monster games like Pentiment and Grounded to PS5. Apparently they're trying for more, and it's like a scene from Oliver: "Please sir, can I have some more?"
 
Last edited:

Topher

Identifies as young
Already has. We got the power and the games; Sony is begging and giving backhanders to Lord and Saviour Big Phil to bring monster games like Pentiment and Grounded to PS5. Apparently they're trying for more, and it's like a scene from Oliver: "Please sir, can I have some more?"

If You Say So Wow GIF by Identity
 

CLW

Member
What's Richard Leadbetter's obsession with these GPU to console comparisons? He keeps doing them and they're completely useless.
Clicks. People are obsessed with how cheap a PC they can make to beat console performance.
 

SlimySnake

Flashless at the Golden Globes
Avatar can easily do 60+fps on a 2600 of all things, the game is not heavy on the CPU at all. Literally none of the games tested at the console settings are CPU bound.


Those scores would have been the exact same had he used a 3600, 12400, or even a 7800x3D. None of those CPUs drop under 50fps, the game is completely GPU bound at those framerates with those GPUs. It would have been a different story if he was using the 60fps mode but he was not.
I think you are forgetting 3 years of poor console 60 fps performance compared to quality modes. How many times have we done this over the years in every single DF thread, where we see a game run locked at native 4K 30 fps and then struggle to run at 1080p 60 fps AFTER downgrading visual settings?

How do you not remember the Guardians of the Galaxy threads where we tried to see why the game couldn't maintain 60 fps? We even found some 6600 XT benchmarks that were running the game at 100 fps at higher settings than the PS5 and XSX.

And it wasn't just third-party devs either. Sony devs weren't able to do much better. Miles in RT performance mode went down to 1080p despite having no issues running the game at native 4K 30 fps and later at 40 fps. Returnal ran at 1080p. HFW's performance mode was downgraded from the fidelity mode, but they couldn't do 4K CB at 60 fps and had to go all the way down to 1800p CB, which is basically 1296p upscaled, and it blew up in their face. Why would they settle for such a low resolution if the CPU could handle 60 fps?

Even the latest Spider-Man 2's performance mode stays mostly around 1080p in the 60 fps mode despite the native 4K mode running at a locked 30 fps. On PCs, doubling the framerate by halving the resolution is not an issue if you have a semi-decent CPU. On these consoles, we are seeing devs settle for a quarter of the resolution.
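For reference, the pixel math behind "a quarter of the resolution" (an illustrative sketch only, assuming exact 16:9 grids):

# Native 4K vs 1080p pixel counts.
native_4k = 3840 * 2160  # 8,294,400 pixels
p1080 = 1920 * 1080      # 2,073,600 pixels
print(native_4k / p1080)  # 4.0 -> 1080p is one quarter of 4K's pixel count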

If that doesn't tell you that the console is bottlenecked by the CPU even at 60 fps, then I don't know what to tell you.
 

rofif

Can’t Git Gud
Sure.....waiting for a PSN sale is fine. The point is I don't have to wait for a sale on PC. You are going to pay $40 for Helldivers 2 right now. I'm going to pay $33 from CDKeys. Elden Ring on PS5 was cheaper than on Steam, but Steam isn't your only option on PC. I preordered Elden Ring off of GMG for 20% off. But yeah, physical gives consoles advantages as well. I said that in the post you quoted. That's great for people like me who prefer physical, but it doesn't apply to those who buy digital, obviously.

lol....why are you calling PC accessories "shovelware"? That doesn't even make any sense. I've got a couple of flight sticks I use for some games on PC that simply will not work on my PS5 and are absurdly superior in quality to the officially licensed crap. Your other options for a controller on PS5 consist of $200 Edge-wannabe options from Razer, Scuf and Nacon (all of which also work on PC). The best controller I own is the 8BitDo Ultimate, and yeah, I own a DualSense Edge as well. Either way, PC has a lot more controller options than any console. Not debatable. Simply a fact.

The same applies to headsets, although if you buy a USB-to-Bluetooth add-on for PS5 then you'll get some of the same functionality, as long as you don't mind a microphone jutting out of your controller. Or....yeah, using a wire to plug your wireless headset into your controller. lol

Why are you quoting nonsense about "DualSense sucks" and talking about drift? You are making up arguments I never made. Sucks about your mouse though.
Fair enough. Especially the last sentence. You never made those points.
 

SlimySnake

Flashless at the Golden Globes
What are you even trying to prove here? The 12400F is a dirt-cheap CPU that most 3000/4000 GPU owners will use as a baseline. I don't understand your obsession with console-equivalent CPUs being used in these comparisons, because most people who own modern GPUs won't even use console-equivalent CPUs.
Literally no one will pair a 4060 or 4060 Ti with a 3600 or the like. People will use the 13400F as a baseline because that is literally what is on shelves, and it is hilariously faster than those old antique CPUs (the Ryzen 3600 is 5 years old at this point; pairing it even with a 3070-class GPU was considered very wrong and was frowned upon by many folks).
What is this video for? To give PC gamers PS5-equivalent performance, yes?

Then why in the world is something as crucial as the CPU not mentioned? He's saying the GPU is $280, which is great, but the CPU is still an extra $150. Still, had he mentioned what CPU he was using, I would not be this upset. But now that I know what CPU he used, I'm OK with it. He still should've mentioned it and referred back to his own testing of the 4800S for some more context. I shouldn't have to put my detective hat on to determine his testing setup.
Alan Wake 2 pushes CPUs hard? What?


I stand corrected. I must have been misremembering or confusing it with another game. In my experience last year, almost every game pushed my CPUs way harder than cross-gen games, which topped out around 10-15%. As more and more games use the CPUs beyond simply running Jaguar-era games at 60 fps, you will continue to see the PS5 performance modes struggle to keep up and require downgrades to 720p or below, like we saw with FF16, Immortals, Star Wars, and even Avatar at its lower bounds.
Even a low-end antique 2600 here pushes 60+ fps in this town.

And here's Avatar with CPU-bound resolution but high settings on a 3600:



It's still heavily GPU bound above 60 fps.

Avatar pushes CPUs hard. You can see the 3600 go up to 67% here. They even have a CPU benchmark in the game.
 

Elysium44

Banned
What are you even trying to prove here? The 12400F is a dirt-cheap CPU that most 3000/4000 GPU owners will use as a baseline. I don't understand your obsession with console-equivalent CPUs being used in these comparisons, because most people who own modern GPUs won't even use console-equivalent CPUs.
Literally no one will pair a 4060 or 4060 Ti with a 3600 or the like. People will use the 13400F as a baseline because that is literally what is on shelves, and it is hilariously faster than those old antique CPUs (the Ryzen 3600 is 5 years old at this point; pairing it even with a 3070-class GPU was considered very wrong and was frowned upon by many folks).

I bet there are loads of people using a CPU like a 3600 with a 4060; it's a lot easier to upgrade the GPU, and it often gives a more dramatic benefit than upgrading the CPU. The same can be said for 10th-gen Intel (or the 8700K); these are still in wide use, as for many people they're good enough. (Yes, I am aware that pairing a 4060 with an older CPU means it runs at PCIe 3.0 and suffers a performance penalty, but that isn't too significant.)
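For context, a rough sketch of the theoretical link bandwidth involved (my numbers; peak rates counting only the 128b/130b line encoding, so real-world transfers will land lower):

# Theoretical PCIe link bandwidth for an x8 GPU such as the 4060 / 4060 Ti.
def gb_per_s_per_lane(gt_per_s, encoding=128/130):
    return gt_per_s * encoding / 8  # GT/s -> GB/s per lane

lanes = 8  # the 4060 / 4060 Ti expose a x8 link
print(f"PCIe 3.0 x8: {gb_per_s_per_lane(8) * lanes:.1f} GB/s")   # ~7.9 GB/s
print(f"PCIe 4.0 x8: {gb_per_s_per_lane(16) * lanes:.1f} GB/s")  # ~15.8 GB/s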
 

Zathalus

Member
I think you are forgetting 3 years of poor console 60 fps performance compared to quality modes. How many times have we done this over the years in every single DF thread, where we see a game run locked at native 4K 30 fps and then struggle to run at 1080p 60 fps AFTER downgrading visual settings?

How do you not remember the Guardians of the Galaxy threads where we tried to see why the game couldn't maintain 60 fps? We even found some 6600 XT benchmarks that were running the game at 100 fps at higher settings than the PS5 and XSX.

And it wasn't just third-party devs either. Sony devs weren't able to do much better. Miles in RT performance mode went down to 1080p despite having no issues running the game at native 4K 30 fps and later at 40 fps. Returnal ran at 1080p. HFW's performance mode was downgraded from the fidelity mode, but they couldn't do 4K CB at 60 fps and had to go all the way down to 1800p CB, which is basically 1296p upscaled, and it blew up in their face. Why would they settle for such a low resolution if the CPU could handle 60 fps?

Even the latest Spider-Man 2's performance mode stays mostly around 1080p in the 60 fps mode despite the native 4K mode running at a locked 30 fps. On PCs, doubling the framerate by halving the resolution is not an issue if you have a semi-decent CPU. On these consoles, we are seeing devs settle for a quarter of the resolution.

If that doesn't tell you that the console is bottlenecked by the CPU even at 60 fps, then I don't know what to tell you.
I'm not really sure what you are trying to argue here. You mention the resolution dropping massively in the 60fps modes and then blame the CPU for that... but if the CPU were the bottleneck, then much higher resolutions should be possible.

Besides, your entire premise that dropping resolution increases FPS by a corresponding amount is incorrect. Dropping from 4K to 1080p has zero guarantee that your FPS will increase 4x, as there is more to the GPU rendering pipeline than just resolution.

I do agree that some games can be bottlenecked by the CPU (Flight Simulator and Starfield come to mind), but none of the games tested (at the FPS measured) are CPU bottlenecked. As I pointed out, you can get Avatar at a locked 60fps with a 2600, of all things.
 

SlimySnake

Flashless at the Golden Globes
I'm not really sure what you are trying to argue here. You mention the resolution dropping massively in the 60fps modes and then blame the CPU for that... but if the CPU were the bottleneck, then much higher resolutions should be possible.

Besides, your entire premise that dropping resolution increases FPS by a corresponding amount is incorrect. Dropping from 4K to 1080p has zero guarantee that your FPS will increase 4x, as there is more to the GPU rendering pipeline than just resolution.

I do agree that some games can be bottlenecked by the CPU (Flight Simulator and Starfield come to mind), but none of the games tested (at the FPS measured) are CPU bottlenecked. As I pointed out, you can get Avatar at a locked 60fps with a 2600, of all things.
Zero guarantee? PC GPUs are literally built and sold on that guarantee. More TFLOPs = more performance, as long as your CPU is strong enough and doesn't serve as a bottleneck.

That's not the case on consoles, where the CPU is clearly preventing their games from hitting their pixel budget. The pixel budget of a 4K CB game at 60 fps is the same as the pixel budget of a native 4K 30 fps game. If the game is unable to do 60 fps at 4.1 million pixels and has to run at 2.1 million pixels at reduced graphics settings, then by definition it's no longer GPU bound, because the GPU rendering budget should be the same, unless of course you bring in the VRAM requirements, which we were able to rule out when comparing with the 6600 XT, which has half the VRAM bandwidth.

You were saying that the tests he has done for Avatar, Cyberpunk and other games with 60 fps performance modes are not CPU bottlenecked. That's simply not true, and we have evidence from prior games struggling to run at 1080p, and more recent games running at 720p despite having zero issues running at 3x-4x higher resolutions in their 30 fps modes. Avatar goes up to 1800p in its 30 fps mode at times, and in this video struggles to hit 60 fps at 720p. That's 0.9 million pixels vs 5.6 million pixels. So the same GPU that had no issues rendering 5.6 million pixels is now all of a sudden struggling to render 0.9 million pixels at twice the frame rate?
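To make the pixel-throughput comparison explicit, a small sketch (my arithmetic; checkerboard is treated as half the native grid and 1800p as 3200x1800, so the figures are approximate):

# Pixels per frame and pixels per second for the modes discussed above.
def px(w, h):
    return w * h

modes = [
    ("native 4K @ 30 fps",       px(3840, 2160),      30),
    ("4K checkerboard @ 60 fps", px(3840, 2160) // 2, 60),
    ("1800p @ 30 fps",           px(3200, 1800),      30),
    ("720p @ 60 fps",            px(1280, 720),       60),
]
for name, pixels, fps in modes:
    print(f"{name}: {pixels / 1e6:.1f}M px/frame, {pixels * fps / 1e6:.0f}M px/s")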
 

SlimySnake

Flashless at the Golden Globes
Right on cue, as if they are reading this thread, the tech director at Massive just posted this.



Really good thread. Avatar is probably the most optimized game this gen, so these guys know what they are doing. The PS5 and XSX still have relatively slow-clocked CPUs at 3.5 GHz. Pitting them against PC CPUs going upwards of 4.5 GHz to 5 GHz is going to create some erroneous results when you're benchmarking GPUs.
 

Bojji

Member
If we assume that Richard knows what he is doing and tested games only in GPU-bottlenecked places, this CPU talk is completely irrelevant.
 

Gaiff

SBI’s Resident Gaslighter
Guys, I think I was able to Sherlock Holmes this thing and figure out what CPU he's using. It's the i5-12400F.

I looked at his 4060 review, which had some PS5 benchmark comparisons he made using the i5-12400F to get a more budget-build benchmark, and found the same scene from A Plague Tale that matches his results for his 4060 + i5-12400F benchmark.

4060 review/ 6700 review.

Plague2_DdnBbz5.jpg
5rMCEZE.jpg


The i5-12400F is way more powerful than the Ryzen 3600. Looks like 20-30% according to this video.




That moron used a 13900K.

1JggGha.png


I really gave him the benefit of the doubt, but it looks like I shouldn't have. While Hitman and MHR might still be GPU-limited in their 4K and 2700p modes, Rich seriously didn't even consider throwing in a weaker CPU just to make sure his conclusion was correct? At frame rates that high, of course I'd start questioning a CPU bottleneck.

Updated the OP.
 

SlimySnake

Flashless at the Golden Globes
That moron used a 13900K.

1JggGha.png


I really gave him the benefit of the doubt, but it looks like I shouldn't have. While Hitman and MHR might still be GPU-limited in their 4K and 2700p modes, Rich seriously didn't even consider throwing in a weaker CPU just to make sure his conclusion was correct? At frame rates that high, of course I'd start questioning a CPU bottleneck.

Updated the OP.
Now you know how I felt in my original post. I am laughed at every time I praise these guys for things they do get right; time and time again they sure do love to make a fool out of me.

If we assume that Richard knows what he is doing
Narrator: He doesn't.
 
Last edited:

shamoomoo

Member
That moron used a 13900K.

1JggGha.png


I really gave him the benefit of the doubt, but it looks like I shouldn't have. While Hitman and MHR might still be GPU-limited in their 4K and 2700p modes, Rich seriously didn't even consider throwing in a weaker CPU just to make sure his conclusion was correct? At frame rates that high, of course I'd start questioning a CPU bottleneck.

Updated the OP.
Oh, wow! I looked in the wrong box.
 

Gaiff

SBI’s Resident Gaslighter
Now you know how I felt in my original post. I am laughed at every time I praise these guys for things they do get right; time and time again they sure do love to make a fool out of me.


Narrator: He doesn't.
The man is willfully ignorant. He damn well knows what might be going on but goes, "curious". Curious that a 13900K is crushing a console-class CPU by 40%, really? Should have simply rerun that bench with a Ryzen 3600-based system. Hell, I wouldn't even be mad if he had run it on the much faster 12400F but a fucking 13900K?
 

Bojji

Member
The man is willfully ignorant. He damn well knows what might be going on but goes, "curious". Curious that a 13900K is crushing a console-class CPU by 40%, really? Should have simply rerun that bench with a Ryzen 3600-based system. Hell, I wouldn't even be mad if he had run it on the much faster 12400F but a fucking 13900K?

He could have run it on a Zen 5 internal engineering sample and it wouldn't make a difference in GPU-limited places.

Some of you guys are really sensitive about these PS5-vs-the-world comparisons; there is nothing indicating that the places he tested are CPU limited, so this talk about not matching CPUs is irrelevant.
 

Gaiff

SBI’s Resident Gaslighter
He could have run it on a Zen 5 internal engineering sample and it wouldn't make a difference in GPU-limited places.

Some of you guys are really sensitive about these PS5-vs-the-world comparisons; there is nothing indicating that the places he tested are CPU limited, so this talk about not matching CPUs is irrelevant.
I'm not talking about the obvious GPU-bound scenarios such as Alan Wake 2 at 4K. I'm talking about the instances of Hitman and MHR running at over 100fps on PS5 and 120-150fps on the PC. This could easily be due to the CPU, not the GPU.

The PS5 gets mauled by 30-40%. Do you seriously think that's a GPU problem?
 
Last edited:

Bojji

Member
I'm not talking about the obvious GPU-bound scenarios such as Alan Wake 2 at 4K. I'm talking about the instances of Hitman and MHR running at over 100fps on PS5 and 120-150fps on the PC. This could easily be due to the CPU, not the GPU.

The PS5 gets mauled by 30-40%. Do you seriously think that's a GPU problem?

A 2600X is running this at 166fps:

lm7EKhg.jpg
 

shamoomoo

Member
He could have run it on a Zen 5 internal engineering sample and it wouldn't make a difference in GPU-limited places.

Some of you guys are really sensitive about these PS5-vs-the-world comparisons; there is nothing indicating that the places he tested are CPU limited, so this talk about not matching CPUs is irrelevant.
Stop exaggerating things; we know the current-gen consoles are CPU limited, less than the previous gen but limited nonetheless. The Intel chip used is 2-3x faster than the PS5's and costs about the same or more for the CPU alone.

Richard has a Ryzen 4800S, so he could've simulated the PS5 to see whether the CPU or the GPU was the limit, within reason.
 

Gaiff

SBI’s Resident Gaslighter
A 2600X is running this at 166fps:

lm7EKhg.jpg
Doesn't matter unless you can compare the scenes like for like. The 2600X could run that scene at 110fps for all we know.
In the supersampling mode the game is for sure 110% GPU limited:



You can blame CPU all you want, lol.
Again, there's no way to know this unless we use a CPU comparable to what the PS5 has. If all Rich wanted to do was compare them at 4K 40fps or less, I wouldn't have batted an eye because the CPU is pretty much a non-factor. At those frame rates, anyone applying proper methodology would have gone out of their way to eliminate the CPU disparity as much as possible instead of wondering why it's happening.

As I said earlier:
While Hitman and MHR might still be GPU-limited in their 4K and 2700p modes, Rich seriously didn't even consider throwing in a weaker CPU just to make sure his conclusion was correct? At frame rates that high, of course I'd start questioning a CPU bottleneck.

A proper analysis would have had him cast away any doubt. Throw in a 3600. Is the PC still leading by 40%? Yes? Then we got our answer.
 
Last edited:

Zathalus

Member
As pointed out, A Plague Tale: Requiem was benched on an i5-12400F with a 4060, and guess how much the FPS differed when using an i9-13900K? Absolutely nothing, as one would expect when GPU limited.

This is not fucking rocket science, people: when you are fully GPU limited, putting in a faster CPU (even one 3x as fast) will achieve absolutely nothing.
 

Bojji

Member
Again, there's no way to know this unless we use a CPU comparable to what the PS5 has. If all Rich wanted to do was compare them at 4K 40fps or less, I wouldn't have batted an eye because the CPU is pretty much a non-factor. At those frame rates, anyone applying proper methodology would have gone out of their way to eliminate the CPU disparity as much as possible instead of wondering why it's happening.

What are you talking about?

The scene you showed was in the 2160p mode and there was a 30% difference; you said "maybe it's the CPU". I showed you a scene from the same place but running at 2700p internal and it's still a 30% difference. It was running at over 100fps at the lower resolution, so it can't be CPU limited...

s6bCmmf.jpg
Y7534Ou.jpg


84hos3.gif
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
As pointed out, A Plague Tale: Requiem was benched on an i5-12400F with a 4060, and guess how much the FPS differed when using an i9-13900K? Absolutely nothing, as one would expect when GPU limited.

This is not fucking rocket science, people: when you are fully GPU limited, putting in a faster CPU (even one 3x as fast) will achieve absolutely nothing.
We're not discussing A Plague Tale. We're talking about MHR and Hitman 3. Y'know, games where the 6700 has an inexplicable 30-40% lead at high frame rates using a 13900K. Instead of being befuddled at the result, anyone would have tried a different method.

Rich goes, "I wonder if it's the Infinity Cache, AMD driver optimizations, or simply it being an early port." Maybe try another CPU? You know, just to eliminate that variable.

What annoys me is the fact that he didn't even attempt to answer the question with something so obvious.
 
Last edited:

Bojji

Member
We're not discussing A Plague Tale. We're talking about MHR and Hitman 3. Y'know, games where the 6700 has an inexplicable 30-40% lead at high frame rates using a 13900K. Instead of being befuddled at the result, anyone would have tried a different method.

Rich goes, "I wonder if it's the Infinity Cache, AMD driver optimizations, or simply it being an early port." Maybe try another CPU? You know, just to eliminate that variable.

What annoys me is the fact that he didn't even attempt to answer the question with something so obvious.

MHR is not CPU limited and you have proof above your post ^
 

Zathalus

Member
We're not discussing A Plague Tale. We're talking about MHR and Hitman 3. Y'know, games where the 6700 has an inexplicable 30-40% lead at high frame rates using a 13900K. Instead of being befuddled at the result, anyone would have tried a different method.

Rich goes, "I wonder if it's the Infinity Cache, AMD driver optimizations, or simply it being an early port." Maybe try another CPU? You know, just to eliminate that variable.

What annoys me is the fact that he didn't even attempt to answer the question with something so obvious.
But neither of those games is anywhere near CPU limited? Hitman 3 does 100fps+ (80fps minimums) while MHR does almost 200fps (160fps minimums), both with a 3600. Hitman can even do 70fps with an 1800X.
 

yamaci17

Member
I bet there are loads of people using a CPU like a 3600 with a 4060; it's a lot easier to upgrade the GPU, and it often gives a more dramatic benefit than upgrading the CPU. The same can be said for 10th-gen Intel (or the 8700K); these are still in wide use, as for many people they're good enough. (Yes, I am aware that pairing a 4060 with an older CPU means it runs at PCIe 3.0 and suffers a performance penalty, but that isn't too significant.)
It will become too significant, and borderline unusable in certain cases going forward. Some of the new games in 2023 rely too much on using a bit of shared VRAM to make up for the lack of total VRAM. Ratchet & Clank, Spider-Man and The Last of Us are some examples. These games make heavy use of shared VRAM on 8 GB GPUs, and you can see extreme amounts of PCIe transfer with them (and no, it is not related to DirectStorage or streaming, as it does not happen if you have enough VRAM, such as on a 12 GB 3060).

The 4060 having a PCIe 4.0 x8 interface is a real bummer for PCIe 3.0-based systems. I would not advise such pairings, or getting that GPU at all, if you're going to stay on PCIe 3.0 for a while, especially looking forward. You wouldn't even be able to save that GPU there with upgrades if you're on something like a B450 board (which most Ryzen 3600 users are on).

In the games I listed above, performance tanks heavily with PCIe 3.0 x8. And actually, Ratchet & Clank creates a rare scenario where using a 3070 on PCIe 4.0 x16 gives a massive performance advantage over PCIe 3.0 x16, with a massive boost in 1% lows.

With 8 GB GPUs still being a mainstay, more and more games will rely on PCIe transfers and use some amount of system RAM for caching purposes. So pairing a 4060 with a B450/PCIe 3.0 system will probably end up with bad results eventually. Looking back, sure, most games will run fine. But you buy a 4060 mostly to play future games... Even then, I would still not recommend anyone play on Zen 2 CPUs in 2024. Game developers have abandoned CCX-specific optimization on desktop, which means games will now hop threads randomly between CCXes and won't take CCX coherency into consideration. They cannot be bothered to cater to this specific group of people, who can easily upgrade to a $150 5600 that has a unified six-core cluster.



What is this video for? To give PC gamers PS5-equivalent performance, yes?

Then why in the world is something as crucial as the CPU not mentioned? He's saying the GPU is $280, which is great, but the CPU is still an extra $150. Still, had he mentioned what CPU he was using, I would not be this upset. But now that I know what CPU he used, I'm OK with it. He still should've mentioned it and referred back to his own testing of the 4800S for some more context. I shouldn't have to put my detective hat on to determine his testing setup.

I stand corrected. I must have been misremembering or confusing it with another game. In my experience last year, almost every game pushed my CPUs way harder than cross-gen games, which topped out around 10-15%. As more and more games use the CPUs beyond simply running Jaguar-era games at 60 fps, you will continue to see the PS5 performance modes struggle to keep up and require downgrades to 720p or below, like we saw with FF16, Immortals, Star Wars, and even Avatar at its lower bounds.

Avatar pushes CPUs hard. You can see the 3600 go up to 67% here. They even have a CPU benchmark in the game.

While they pushed hard, even with my unbalanced combo of a 2700X and a 3070 I found myself in GPU-bottlenecked situations more often than not, and I had to use extreme DLSS upscaling presets even at 1440p to get more out of my GPU. Games have also gotten much, much heavier on GPUs, and this applies to the PS5 as well as a result.

"You were saying that the tests he has done for Avatar, cyberpunk and other games with 60 fps performance modes are not CPU bottlenecked. Thats simply not true and we have evidence from prior games struggling to run at 1080p, and more recent games running at 720p despite having zero issues running at 3x-4x higher resolutions in their 30 fps modes. Avatar goes up to 1800p in its 30 fps mode at times, and in this video struggles to hit 60 fps at 720p. thats 0.9 million pixels vs 5.6 million pixels. so the same gpu that had no issues rendering 5.6 million pixels is now all of a sudden struggling to render 0.9 million pixels twice per second?"

I will have to make a correction here. Upscaling is not really that light, even at low resolutions, and after a while the performance return you get stops correlating with internal resolution.
Let's start here:



So 1440p DLSS Quality is a 67% scale, i.e. 960p internal, and by extension about 1.6 million pixels. And it renders around 52 frames there.
Then he has 1440p DLSS Performance, which is 720p internal, at that magical 0.9-million-pixel count. And frames? 67.

So it gives you 29% more performance over the Quality mode for roughly a 44% further cut in pixel count (about 75% fewer pixels than native 1440p). I hope this sheds some light on why some games need extreme amounts of upscaling to hit certain targets, because at some point upscaling itself starts to stop... scaling. Do REMEMBER that even with upscaling, the game still reconstructs 1440p worth of pixels and uses a lot of native 1440p buffers to MAKE upscaling happen with decent results. Believe me, if they did not do that (in other words, if upscaling were lighter than it currently is), that "720p" internal resolution game would look multiple times worse.
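For what it's worth, here is that scaling math spelled out (my arithmetic on the figures quoted above; the frame rates are read off the video, so treat them as approximate):

# How much the frame rate gains versus how much the internal pixel count drops.
def px(w, h):
    return w * h

native_1440p = px(2560, 1440)  # 3.69M px output grid
dlss_quality = px(1706, 960)   # ~1.64M px internal (67% scale)
dlss_perf = px(1280, 720)      # ~0.92M px internal (50% scale)
fps_quality, fps_perf = 52, 67

print(f"fps gain, Quality -> Performance: {fps_perf / fps_quality - 1:.0%}")    # ~29%
print(f"pixel cut vs Quality internal:    {1 - dlss_perf / dlss_quality:.0%}")  # ~44%
print(f"pixel cut vs native 1440p:        {1 - dlss_perf / native_1440p:.0%}")  # ~75%
# Either way the frame-rate gain is far smaller than the pixel cut: reconstructing
# to a fixed 1440p output is a large, resolution-independent cost.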

But as you can see in the video... it does not look like it has been cut from 1.6 million pixels down to 0.9 million (there's some DLSS magic there, but that is another topic; you can find some FSR benchmarks, and I'm sure the difference will be more noticeable there, but my point still stands).

So using extreme statements like "it is rendering 0.9 million pixels, how can it not hit the performance target" rests on a bit of a... wrong assumption. It is clear that upscaling in Avatar is extremely heavy if you look at the example I'm giving, in the specific game we've been talking about.

And one more perspective: DLSS Ultra Quality renders at 1280p, which is about 2.7 million pixels, and gets you 42 frames in that scene. And with 0.9 million pixels, you get 67 fps. So roughly 1.6 times more frames for a 3x reduction in pixels.

See the pattern here? GPU performance is not that easy to determine, and with upscaling things get even more confusing and complex. All I hope is to give you a different perspective on these internal resolution counts. I'd say you need to look at them from a different angle at times to realize a game might still be HEAVILY GPU bound there too.

And this is why 40 fps modes are precious and should be requested more by players. If you want real, actual performance scaling, you have to change the UPSCALED output resolution altogether. Remember Guardians of the Galaxy, the game that renders at native 1080p on PS5 to hit 60 fps? And how people hated it because of how blurry it looked? That's because it was outputting to 1080p.

Now imagine if that game used a 1440p or 4K output. Do you realize the internal resolution it would need to hit the same 60 fps target on the GPU would now be much, much lower than 1080p? This is why some developers target 1080p or 1440p outputs, so that they can avoid the backlash of "muh low internal resolution". The only problem here is how horrible FSR looks despite the upscaling cost. XeSS is much better, and it is just FSR being a bad upscaler; it does not even have to do with how many pixels it is working with. It is plain bad and needs fixing.

In my own benchmarks, some games at 4K DLSS Ultra Performance have the same performance as native 1440p DLAA. It is crazy, but it is true. In Spider-Man: Miles Morales, 4K DLSS Ultra Performance is LESS performant than native 1440p, for example. One is "rendering at 0.9 million pixels" and the other is rendering 3.6 million pixels natively. The problem is, those 0.9 million pixels are accompanied by a frame buffer composed of 8 million pixels.

1440p DLAA vs 4K DLSS Ultra Performance vs 4K DLSS Performance
MFBkZSy.png
3BEseVB.png


As you can see here, technically the one that is rendered with 1.9 million pixels (4K DLSS Performance) runs much slower than native 1440p, which is rendered with 3.6 million pixels. None of it makes sense until you consider that the end result has nothing to do with 720p or 1080p. With FSR the problem is more complex, as it just cannot get the basics of upscaling right. God of War's TAAU and Spider-Man's IGTI both look better than FSR 2. FSR 2 is a horribly cheap attempt at giving developers an easy excuse not to develop their own upscalers. I will reiterate this time and time again: Sony and Microsoft should've come prepared. Then that 720p internal would look good enough that you wouldn't care.
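A small sketch of the internal-versus-output comparison being made here (illustrative arithmetic only; it assumes 4K DLSS Performance renders internally at 1080p and Ultra Performance at 720p):

# Internal render resolution vs the output frame each path has to produce.
def px(w, h):
    return w * h

paths = [
    ("native 1440p DLAA",         (2560, 1440), (2560, 1440)),
    ("4K DLSS Performance",       (1920, 1080), (3840, 2160)),
    ("4K DLSS Ultra Performance", (1280, 720),  (3840, 2160)),
]
for name, internal, output in paths:
    print(f"{name}: {px(*internal) / 1e6:.1f}M px rendered, "
          f"{px(*output) / 1e6:.1f}M px output frame")
# The two 4K paths pay for an ~8.3M-pixel output pipeline regardless of how few
# pixels they render internally, which is why they can trail native 1440p.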

I personally don't care when I play at 1440p DLSS Performance or 4K DLSS Ultra Performance, and even XeSS at 4K Performance often looks surprisingly good. It is really FSR 2 that is bad here, despite having a similar cost to the other upscalers.
 
Last edited:

Elysium44

Banned
It will become too significant, and borderline unusable in certain cases going forward. Some of the new games in 2023 rely too much on using a bit of shared VRAM to make up for the lack of total VRAM. Ratchet & Clank, Spider-Man and The Last of Us are some examples. These games make heavy use of shared VRAM on 8 GB GPUs, and you can see extreme amounts of PCIe transfer with them (and no, it is not related to DirectStorage or streaming, as it does not happen if you have enough VRAM, such as on a 12 GB 3060).

The 4060 having a PCIe 4.0 x8 interface is a real bummer for PCIe 3.0-based systems. I would not advise such pairings, or getting that GPU at all, if you're going to stay on PCIe 3.0 for a while, especially looking forward. You wouldn't even be able to save that GPU there with upgrades if you're on something like a B450 board (which most Ryzen 3600 users are on).

I said 4060 as an example; it could just as easily be a 4060 Ti 16GB, which I expect would be good for the rest of this gen at 1080p or better, even when paired with a 5-year-old CPU and PCIe 3.0. I just think it's silly calling them antiques; it isn't the early 2000s any more, where a five-year-old CPU quickly became obsolete. Ideally a newer CPU is better, but swapping a motherboard and CPU is a big hassle (and expense) compared to just the GPU, which takes 30 seconds to swap. The last PC I built was in 2020, and I don't intend to do it again until my 10th-gen i5 can no longer cut it. I don't foresee that being for many years, and I will upgrade the GPU again before then.
 