
Marvel's Spider-Man Remastered - PC Features & Specs Revealed

yamaci17

Member
A PC-grade 3700X is way more powerful than the CPU in the PS5 though, so what difference would that make? Especially since most games are GPU bound anyway.
Things are complicated by the low-level API advantage consoles have. I remember that, especially in terms of draw calls, consoles have a massive advantage over PCs, even with DX12. Even with DX12/Vulkan, consoles still have better draw call performance (I think draw calls on consoles are practically free, i.e. they barely use the CPU, or something like that). Also, Nvidia GPUs combined with any CPU practically perform 20% slower than their AMD counterparts (Nvidia driver overhead is a thing, and extra threads/cores do not help it; it still decreases single-thread-bound performance compared to AMD GPUs. In some CPU-limited games, you get 15-20% more frames with a similar AMD GPU than with an Nvidia GPU).

When these things are considered, you can say the PS5 CPU punches 35-50% above the equivalent CPU on desktop (a 4700S or 2700X or the like). Being 35-50% over a 2700X easily puts the PS5 CPU way above a 3700X (considering the 3700X is only 15-25% faster than a 2700X).

So it's really hard to do an exact comparison in terms of CPU. GPUs are cool, they're directly comparable. But in terms of CPUs, something is definitely off with PC games and their performance.
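Here's a rough back-of-the-envelope sketch of what those percentages imply (the numbers are just the rough figures above, illustrative assumptions rather than measurements):

# Rough sketch of the claim above: a console CPU gains ~35-50% effective
# throughput from its low-level API, while an Nvidia card costs a
# CPU-bound PC ~15-20% versus a comparable AMD card. All numbers are
# the rough figures from this post, not measurements.

def effective_fps(base_fps, api_bonus=0.0, driver_penalty=0.0):
    """CPU-bound fps after an API efficiency bonus and a driver-overhead
    penalty, both expressed as fractions."""
    return base_fps * (1.0 + api_bonus) * (1.0 - driver_penalty)

base = 60.0                                      # hypothetical 2700X-class CPU-bound fps
print(effective_fps(base, api_bonus=0.40))       # ~84 fps: the console "punching above"
print(effective_fps(base, driver_penalty=0.20))  # ~48 fps: same CPU behind an Nvidia GPU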

https://gpuopen.com/learn/porting-detroit-1/

"The CPU of the PlayStation® 4 is an AMD Jaguar with 8 cores. It is obviously slower than some recently-released PC hardware; but the PlayStation® 4 has some major advantages, such as very fast access to the hardware. We find the PlayStation® 4 graphics API to be much more efficient than all PC APIs. It is very direct and has very low overhead. This means we can push a lot of draw calls per frame. We knew that the high number of draw calls could be an issue with low-end PCs.

For example. This stuff will also be relevant for next-gen consoles. Draw calls in some games can heavily bottleneck a single thread, to the point that you don't even see a reasonable difference between a 5600X and a 5950X. This is the cause most of the time. I don't know if anything will improve it or not.
 
Last edited:

Dream-Knife

Banned
Also, Nvidia GPUs combined with any CPU practically perform 20% slower than their AMD counterparts (Nvidia driver overhead is a thing, and extra threads/cores do not help it; it still decreases single-thread-bound performance compared to AMD GPUs. In some CPU-limited games, you get 15-20% more frames with a similar AMD GPU than with an Nvidia GPU).
Have a video of that?
 

yamaci17

Member
Have a video of that?


This goes way back in history.


Nvidia at some point (the 600 series, I think) cut out a specific chip block that handled a specific job (a kind of scheduling. This has nothing to do with "HAGS", the hardware-accelerated GPU scheduling released for W10/W11; that's just a general scheduling feature. The kind of scheduling we're talking about is related to how the GPU handles the distribution of game render workloads).

They did this for a couple of reasons:
- It was useless back then. APIs were high-level; they didn't even take advantage of such hardware.
- They instead wrote a software scheduler that did the trick. Said software scheduler runs on the CPU, causing more overhead/CPU cycle cost per frame.
- It was consuming power. Nvidia wanted to push more power but was limited by this additional hardware. They removed it and reduced overall power consumption.

And it stayed like that. The software scheduler was actually beneficial for DX11: the driver could practically multithread even games that were lightly threaded. This actually put AMD at a disadvantage; AMD carried on with their existing scheduling hardware.
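To make the trade-off concrete, here's a toy model (all per-draw costs are made up, purely illustrative):

# Toy model of the scheduler trade-off described above. A software
# scheduler adds CPU work per draw call, but under DX11 it can spread
# submission across worker threads, which helped Nvidia in lightly
# threaded games. Per-draw costs below are invented for illustration.

def cpu_frame_ms(draws, per_draw_us, game_ms, threads=1):
    """CPU time per frame: fixed game logic plus draw submission,
    with submission spread across `threads` workers."""
    return game_ms + (draws * per_draw_us / 1000.0) / threads

draws, game_ms = 10_000, 8.0
print(cpu_frame_ms(draws, 0.5, game_ms))             # hardware scheduler: cheap submission
print(cpu_frame_ms(draws, 1.5, game_ms))             # software scheduler, 1 thread: slower
print(cpu_frame_ms(draws, 1.5, game_ms, threads=4))  # software scheduler spread over 4 threads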
 

yamaci17

Member
It's more of a low-end CPU/high-end GPU thing. Hardware Unboxed did a video or two about it a while ago.
The 3700X/3600, and Zen 2 in general, are low-ish end for anything 3060 Ti and above in the Ampere lineup. They didn't focus on Zen 2 products so as not to cause a big eruption on the internet, since most users didn't want to upgrade from their Zen 2 CPUs at that time. The issue affects all CPUs; it's not a low-end thing. The 3700X drops frames below 60 in Death Stranding, and on a counterpart AMD GPU you practically get 20% more frames, which would easily put you back into the 60+ fps region.

This issue is relevant wherever you're coming across CPU bottlenecks. And the 3600/3700X tend to bottleneck a 3060 Ti and above even at 1440p in certain games, or certain scenes. It's just not a good enough matchup. Even at 4K, if the CPU drops frames below 60, it's sad.

example 1

The 3700X drops frames way below 50 in certain parts of Night City. With similar AMD GPUs, it performs 20-25% better. That will easily put you above the targeted 60 fps for Zen 2.

You can easily target a locked 60 fps with a 3060 Ti in Cyberpunk, but if the CPU drops frames here and there, it's simply not going to be a pleasant experience. And that just tells us the hardware is mismatched.

I'm not saying it's unplayable or anything, but anything above a 3060 Ti deserves a 5600X, regardless of resolution. If one is able to upgrade, they should.
 
The 3700X/3600, and Zen 2 in general, are low-ish end for anything 3060 Ti and above in the Ampere lineup. …

Thank you for definitely saying EVERYTHING that I've been saying since the beginning of the thread. Like I said, it's a colossal failure that they went Zen 2 over Zen 3 with the PS5. Even low-end Zen 3 would be so much better than what we got, and I hope the Pro models do not make this same mistake.
 

Dream-Knife

Banned
Thank you for definitely saying EVERYTHING that I've been saying since the beginning of the thread. Like I said, it's a colossal failure that they went Zen 2 over Zen 3 with the PS5. Even low-end Zen 3 would be so much better than what we got, and I hope the Pro models do not make this same mistake.
PS5 isn't 3060ti or above though.
The 3700X/3600, and Zen 2 in general, are low-ish end for anything 3060 Ti and above in the Ampere lineup. …

Thank you for the detailed info.
 

Tqaulity

Member
Ok Guys, lots of talking in circles and avoiding the facts. Let me try to move things forward a bit...

First a couple of facts:
  1. Next-gen console GPU perf has been within a range of PC GPUs between an RTX 2070 and an RTX 3070. This is because every game engine is different: some are more/less optimized for AMD hardware, and some are more/less optimized for consoles. In the worst cases the consoles are right around RX 5700 XT/2070 levels, but in the best cases (AMD-favoring workloads) we're seeing perf approach RTX 2080 Ti/RTX 3070 levels.
  2. Everyone always tries to compare PC vs consoles by looking at PC versions of games that were straight ports to consoles (not optimized for consoles). But the better way to actually compare the console perf differences, with all of the deltas in HW, OS, APIs, and SW, is to look at games that were designed with consoles in mind and then ported back to PC. In practice we have precious few of those to date, but it's not a coincidence that many of them are the ones people label as "poor" or "weird" ports.
    1. Death Stranding - built on an engine designed to harness the PS4 hardware, with optimizations for unified RAM, additional ACE engines, etc. This isn't just a weird PC port but a console engine port to PC, which isn't trivial. While people love to dismiss it, this is probably the BEST example we have of console-to-PC perf since it's the only game available that has native versions on last gen, current gen, and PC. This shows the potential upper limit of console perf when a game/engine is optimized for the platform. BTW, the perf delta here isn't to say that the PC HW is somehow less powerful but that porting that console game over is suboptimal (i.e. the SW is suboptimal, not the HW).
    2. Other Sony PC ports such as Days Gone, Horizon, and God of War cannot be used to draw meaningful conclusions since there are no PS5-native versions of those. Yet you can look at PC performance and still see that "equivalent" PC GPUs tend to perform considerably worse than their console counterparts. For example, in Horizon Zero Dawn on PC, it takes a GTX 1050, or something above an R9 390, to match base PS4 settings at 1080p/30fps! (Link) A PS4 is supposed to be ~a 7850, yet a 1050 is evenly matched with a 7950. An R9 390 is several generations ahead of a PS4 and is generally nearly 2x faster than a 7950! (Link)
    3. The Spider-Man PC port is again a fine example of a console-optimized game and the difficulty of replicating that perf on a PC. The min/rec chart reflects this in spades, as it takes more PC HW grunt to overcome the more efficient and optimized console hardware. It'll be interesting to see the perf comparison since, again, we have native PS4/PS5 and PC versions to compare. Don't be surprised if the PlayStation consoles perform well above their weight in this title.

Also, lots of circular questions around examples of the PS5 performing closer to an RTX 3070 aside from Death Stranding and Horizon. Some mention of a Call of Duty game. OK, here you go:

Call of Duty Cold War: PS5 is within 8% of a 3070 according to Digital Foundry's test:


Assassin's Creed Valhalla: PS5 is only about 12-15% below a 2080Ti (which is close to a 3070):


Both of those are games that ran better on AMD than on Nvidia, so it makes sense the consoles will run better. There aren't a lot of examples yet, but trust that you will see more as more true console games are ported to PC and developers have more time with the new hardware.
 
Last edited:
Ok Guys, lots of talking in circles and avoiding the facts. …
For Spider-Man, since we have uncapped VRR modes, it will be a perfect way to compare the full potential vs a PC. That's why I'm so happy this game is coming to PC.
 
It just again shows that when I build my first PC, I should only get the best of the best CPUs, or I will eventually get bottlenecked.
You will always hit a bottleneck eventually.
The 3700X/3600, and Zen 2 in general, are low-ish end for anything 3060 Ti and above in the Ampere lineup. …

I beat Cyberpunk on my PC at 1440p, high+ settings with RT, DLSS Quality, and never once did I drop below 60 fps. 90ish average, I would say.
Side note: I got my 3700X several years ago for $40. She's a beast at that price.
 

yamaci17

Member
Thank you for definitely saying EVERYTHING that I've been saying since the beginning of the thread. Like I said, it's a colossal failure that they went Zen 2 over Zen 3 with the PS5. Even low-end Zen 3 would be so much better than what we got, and I hope the Pro models do not make this same mistake.
okay but here's the kicker



I just checked this video. It's a perfect locked 60, it seems. This place was a mess on my 2700X, often dropping to the 45s.

The exact same place puts the 5800X3D, CPU bound, at 90 FPS.



That's... massive. This is a CPU with 96 MB of cache, a whopping 25-30% IPC gain over Zen 2, and a whopping 4.5 GHz of frequency. (I know the settings in said video are ultra. It does not even matter: I put my settings to ultra, turn back to default, and I still get the exact same CPU-bound performance.) Compare that to the puny PS5 with its measly 8 MB of cache and its Zen 2 CPU at 3.6 GHz getting a rock-solid 60 fps in that place.

Now let's do some simple math. We already have a whopping 22% frequency advantage in our hands. Then comes another 25% due to Zen 3. Then comes another, most likely 25-30%, advantage thanks to the enormously large cache (do note that this increased cache makes the 5800X3D outperform the regular 5800X by a reasonable margin in most games). A whopping total of ~100% more CPU performance: yes, in theory, the 5800X3D should be, core for core, 2 times faster than a PS5. Yet what we see here is the 5800X3D barely pushing 90-95 FPS in this scene, a heavily CPU-bound situation. It's just comical at this point. It's like Thanos vs the Avengers, all that for a drop of blood. And mind you, this is with an AMD GPU; the situation would be 15-20% worse if this video had an Nvidia GPU in it. It would be CPU bound at a freaking 80-85 FPS, most likely. It's just comical. Imagine brute-forcing that much power over a console, only to get barely 33-50% more FPS in return in certain games.
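The compounding math, spelled out (the uplifts are the rough figures above, assumptions rather than benchmarks; they multiply rather than add):

# The "simple math" above, spelled out. The uplifts are the rough
# figures from this post (assumed, not benchmarked), and they compound
# multiplicatively rather than adding up.

clock = 1.22      # stated frequency advantage, ~3.6 GHz -> ~4.5 GHz boost
ipc   = 1.25      # assumed Zen 2 -> Zen 3 IPC uplift
cache = 1.275     # assumed V-Cache uplift (midpoint of 25-30%)

theoretical = clock * ipc * cache
print(f"theoretical per-core advantage: {theoretical:.2f}x")  # ~1.94x

observed = 92.5 / 60.0    # ~90-95 fps CPU bound vs the PS5's locked 60
print(f"observed advantage in this scene: {observed:.2f}x")   # ~1.54x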

I too firmly believe this is really a case of bad optimization, but this kind of bad CPU scaling happens across many games. On GPUs you can get pretty exact performance scaling nowadays, but the CPU side of things is still f***ed up.

Also do note that I said I was getting 45-50 fps in this place with my puny old 2700X; at least the 5800X3D manages to get 90-100 fps. And when you actually check the benchmarks, you will see the 5800X3D outperforming my CPU by two times at times, but usually it beats it by 75-80%. The extra 20% comes from the Nvidia overhead.

So yes, matching consoles with a console-equivalent CPU is not going to do it this gen; even stronger CPUs will falter. I bet that in this scene a 3700X would easily drop frames below 60. People with 3700X hardware can go and try the ruined factory in Death Stranding Director's Cut and see if they can get that smooth, PS5-equivalent locked 60 fps experience in that place. Just go and test it. Chances are, if the 5800X3D drops to the 90s, the 3700X will easily drop below 60.
 
Last edited:

yamaci17

Member
You will always hit a bottleneck eventually.

I beat Cyberpunk on my PC at 1440p, high+ settings with RT, DLSS Quality, and never once did I drop below 60 fps. 90ish average, I would say.
Side note: I got my 3700X several years ago for $40. She's a beast at that price.
Go to Jig Jig Street, run around, and record a video of you not dropping below 60.
I will be waiting.

I have video proof that the 3700X drops heavily below 50 in crowded sections of the city while merely driving. Either you didn't notice or you didn't care enough.

Here is my SECOND SOLID piece of evidence that the 3700X drops heavily below 60 in Cyberpunk.

 
Last edited:

Dream-Knife

Banned
okay but here's the kicker …

How are you CPU bound when nothing is maxed out there?
 

yamaci17

Member
How are you CPU bound when nothing is maxed out there?
You will get, let's say:
45 fps CPU bound with a Ryzen 1600 at 60% usage
55 fps CPU bound with a Ryzen 2600 at 60% usage
then
80+ fps CPU bound with a Ryzen 5600 at the same exact 60% usage

CPU usage doesn't tell you anything useful anymore. A single thread being maxed out will make performance heavily CPU bound regardless of how many cores you have. Windows shuffling threads around randomly also doesn't help when it comes to detecting CPU bottlenecks.
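A tiny illustration of why the aggregate number hides this (the per-core loads are hypothetical):

# Why aggregate CPU usage hides a single-thread bottleneck: one core
# pegged at 100% caps the frame rate even while the overall number
# looks comfortable. Per-core loads below are hypothetical.

per_core = [100, 70, 55, 40, 35, 30, 25, 20]    # 8 cores, main thread maxed
aggregate = sum(per_core) / len(per_core)
print(f"aggregate usage: {aggregate:.0f}%")     # ~47% -- looks fine

main_thread_ms = 22.0                           # hypothetical main-thread cost per frame
print(f"fps cap from the maxed thread: {1000 / main_thread_ms:.0f}")  # ~45 fps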

Same thing in Cyberpunk.



Let's observe the video again:

The 3700X drops frames to the 45s with a mere 45-50% usage.

In the exact same place, the 5900X gracefully pushes a healthy 64 fps.

It also sits at a mere 34-35% usage. The CPU bottleneck is alleviated and now you get 60+ frames.
 
Last edited:
Go to Jig Jig Street, run around, and record a video of you not dropping below 60. …

I looked up that street and yeah I remember that place. It ran like shit there. I guess I just remember it more fondly.
 

Dream-Knife

Banned
You will get, let's say, 45 fps CPU bound with a Ryzen 1600 at 60% usage … The CPU bottleneck is alleviated and now you get 60+ frames.

Yeah I know that about the default number, but does that also apply to individual threads? In that video you posted, none of the threads even hit 90%.
 

PaintTinJr

Member
okay but here's the kicker …

It doesn't sound unoptimized to me. It just reads like the difference between a console that can dereference data by changing a pointer to an area of unified memory to deliver all its draw calls instantaneously, and a PC setup with RAM and separate VRAM, plus PCIe bus bandwidth and GPU cache latency to contend with when turning those draw calls into rendering.

edit:
An easy test would be to limit the GPU's PCIe bandwidth to half - in the BIOS, or by masking lanes on the card - and see if the CPU limit gets worse.
 
Last edited:

yamaci17

Member
I looked up that street and yeah I remember that place. It ran like shit there. I guess I just remember it more fondly.
No problem. That place also drops to the 50s-55s on PS5.

Though that's the problem: logically I would expect it to drop to the 40s... The PS5 really performs like a 3700X, and as I said, I think it largely has to do with Nvidia's driver overhead. Situations like this make me hate Nvidia...

Dream-Knife, I will go to that place again and see if any of my threads max out.
 
Last edited:
No problem. That place also drops to the 50s-55s on PS5. …
Keep in mind that RT stresses the CPU more. Consoles only have a little RT, for shadows and nothing else.
 

yamaci17

Member
Keep in mind that RT stresses the CPU more. Consoles only have a little RT, for shadows and nothing else.
We will see what the situation is in the coming months, however. We already know how the PS5 performs with RT in Spider-Man. Considering it manages upwards of 75 frames in the VRR mode, let's see what the 3700X will be able to do with RT in that game.
 
We will see what the situation is in the coming months, however. We already know how the PS5 performs with RT in Spider-Man. Considering it manages upwards of 75 frames in the VRR mode, let's see what the 3700X will be able to do with RT in that game.
If only we could match the low RT of the PS5.

In Watch Dogs: Legion, with higher RT settings, an RTX 2060 comfortably beats the PS5.
 

yamaci17

Member
Yeah I know that about the default number, but does that also apply to individual threads? In that video you posted, none of the threads even hit 90%.

Similar situation, same factory: no threads hitting max utilization, yet a profound bottleneck, which a 5800X would largely solve.

https://imgsli.com/MTE4MjIz

And this is proof that the 5800X3D, even with very low settings, would still get 90-95 fps CPU bound. As you can see, going from ultra to very low does not even change the CPU-bound framerate.
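That's the classic bottleneck model: frame rate is set by whichever side is slower, so GPU settings stop mattering once the CPU is the limiter. A minimal sketch with illustrative numbers:

# Minimal bottleneck model: frame rate is set by the slower of the CPU
# and GPU per-frame costs, so lowering GPU settings stops helping once
# you hit the CPU wall. All millisecond figures are illustrative.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 10.8                                   # fixed CPU cost -> ~92 fps ceiling
for preset, gpu_ms in [("ultra", 14.0), ("medium", 9.0), ("very low", 5.0)]:
    print(f"{preset:9s}: {fps(cpu_ms, gpu_ms):5.1f} fps")
# ultra is GPU bound (~71 fps); medium and very low both land on the
# same ~92 fps CPU ceiling, matching the imgsli comparison above.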
 
Last edited:
I'm not talking about GPUs though, I'm merely talking about CPUs :D
Oh, OK. You will still need to use the same low RT settings as on the PS5, as even when not GPU bound, CPU utilization is much higher with RT. It's easy to observe in CP77. [Try a low resolution with RT so that you're not GPU bound, and then RT off.]
 

Stuart360

Member
No problem. That place also drops to the 50s-55s on PS5. …
I have a 3700X, and in certain parts of the city (Cyberpunk) it does drop into the 50s at ultra settings. It's rare, but it happens. But I changed my refresh rate to 50 Hz through the Nvidia control panel (as the game doesn't have a 50 Hz option in the settings), turned on vsync, and the game didn't once drop under 50 fps. I was literally testing the game an hour ago, for around 3 hours. At ultra settings it's a locked 50 fps with vsync. So if those vids are showing drops into the 40s, there is something else going on.

My specs: 3700X, 32 GB RAM, GTX 1080 Ti.

EDIT: I forgot to mention that Cyberpunk has a bug with AMD CPUs where the game doesn't utilize the CPU properly and uses fewer cores (even though RivaTuner will show all cores being used, as it does with every game, even games that only use 8 threads or fewer). I did the fix and definitely saw way higher core usage than before, so maybe that has something to do with those vids you are referencing.
 
Last edited:

yamaci17

Member
I have a 3700X, and in certain parts of the city (Cyberpunk) it does drop into the 50s at ultra settings. …
That video is with ray tracing, now that I notice.
Still, it kind of proves my point, considering you can actually get a 60 fps RT experience with DLSS on a 3060 Ti.
Yours is a more balanced system, though. The 3700X is a great fit for a 1080 Ti/2070/2070 Super.
For anything above that, you need to be in 4K/GPU-heavy-settings territory. Some games at 1440p will also do fine, but the 3060 Ti/3070 and above are so strong for 1440p that even there, bottlenecks will sometimes be unavoidable.
 
Last edited:
okay but here's the kicker …

People lied to me and said the CPU doesn't really matter at 4K, and here I am thinking that, besides some games like Crysis that scale on the CPU with higher resolutions, having a better CPU can always help your 1% lows. I honestly almost wish the PS5 could be tested with a Zen 3 chip, like if someone made a custom model. I think it would be revelatory and confirm what I've been saying on other forums like Reddit and YouTube since launch.
 

Stuart360

Member
That video is with ray tracing, now that I notice.
Still, it kind of proves my point, considering you can actually get a 60 fps RT experience with DLSS on a 3060 Ti.
Well, that's why I mentioned I was using ultra settings. If I stick everything on 'low', the game barely drops under 100 fps at 1080p, even in the rare spots where the game dropped into the high 50s at ultra.
And by the way, I'm pretty sure the console versions of Cyberpunk, even the next-gen versions, are not running anywhere close to ultra. Not in the vids I have watched, anyway.
 

yamaci17

Member
People lied to me and said the CPU doesn't really matter at 4K …
The 5800X3D actually improves 1% lows HUGELY over the base 5800X. In some games it only provides a mere 10-15% uplift, but in most scenarios it provides a healthy 25-35% uplift to 1% lows. It's really a great CPU, probably the CPU that will provide the smoothest unlocked-framerate experience in the industry.
 
The 5800X3D actually improves 1% lows HUGELY over the base 5800X. …
This shows that it’s good I’ve done way more digging into the importance of the cpu and it mattered in the end. It’s strange I hear more from the console side how important the cpu is than the pc players who have given me consistent misinformation. Im trying to imagine how a custom ps5 with the exact same internals but now with a 5800x3d would run the gpu would now never be bottlenecked
 

Stuart360

Member
This shows that it’s good I’ve done way more digging into the importance of the cpu and it mattered in the end. It’s strange I hear more from the console side how important the cpu is than the pc players who have given me consistent misinformation. Im trying to imagine how a custom ps5 with the exact same internals but now with a 5800x3d would run the gpu would now never be bottlenecked
I don't think you were given bad info, to be honest. The fact is, outside of a very small selection of PC games (Cyberpunk being one of them, and the worst CPU-wise), the vast majority of PC games will easily run at 100+ fps if you have a good CPU and GPU. And if you're like me and are happy with a locked 60 fps (or play on a 60 Hz screen), then CPUs way lower-end than a 3700X will hit 60 fps in pretty much every game, alongside a decent GPU.
Cyberpunk is just one of those very rare games that is super demanding on both the CPU and the GPU.
 
I don't think you were given bad info, to be honest. …
In this case, though, we were trying to judge the max potential of a GPU, and it's important to know if the CPU is holding it back. Do you know if there will be a 6800X3D in the upcoming Zen 4 line? It sounds like the perfect CPU for me.
 

01011001

Banned
Same thing in Cyberpunk. … The CPU bottleneck is alleviated and now you get 60+ frames.


Yeah, I just recorded a quick video with and without RT and using several different DLSS settings.
This game is severely CPU limited.

I did have the Nvidia performance overlay on, but somehow Nvidia's own screen recording app doesn't play nice with it... ironic...
Thankfully you can still see the Steam overlay in the top left, but that doesn't give the whole picture, of course.

At the end, when I turned off RT and ran at 1440p, the GPU utilization was at 95%, basically showcasing how even without RT my CPU is the limiting factor here.

Ryzen 5600X / RTX 3060ti / 16GB DDR4 3200
I am running Digital Foundry's optimized settings



(also Nvidia's recording thingy has really shitty quality... damn... this is the highest bitrate available too)
 
Last edited:

Stuart360

Member
Yeah, I just recorded a quick video with and without RT and using several different DLSS settings. …


There isn't a PC game like it in terms of CPU usage. It's the only game that I couldn't lock to 60 fps (although it was at 60 fps like 95+% of the time) on my 3700X. Hell, it's the only game I couldn't get 60 fps in when I had my 2700X, too.
All the older CPU benchmark games, like AC Odyssey and Watch Dogs 2 and Legion: no problem getting 60 fps.

Cyberjunk is Cyberjunk (although it is a bit of a sight to behold at ultra settings lol).
 

01011001

Banned
There isn't a PC game like it in terms of CPU usage. …

It really is a big outlier.

I rerecorded with better quality, because god damn that other video looked like crap...
The Nvidia performance overlay still didn't wanna play nice, so the Steam FPS overlay it still is.

Interestingly, layering more and more RT effects on top only slightly changes the performance. It seems the BVH generation is one of the big performance hogs here, and at some point your performance will more or less plateau for each resolution you run at.

At the end I show it without RT, running first at DLSS Ultra Performance (which is 480p internal, I think) and then again without DLSS at native 1440p, and both hover at around 80 FPS.
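That plateau is what you'd expect if the BVH build is a roughly fixed per-frame CPU cost while each extra RT effect mostly adds GPU work. A sketch with made-up costs:

# Sketch of the plateau described above, assuming (not measured) that
# BVH maintenance is a fixed per-frame CPU cost once any RT is on,
# while each additional RT effect mostly adds GPU work.

def frame_ms(rt_effects, base_cpu=8.0, bvh_cpu=4.5, base_gpu=9.0, per_effect_gpu=1.2):
    cpu = base_cpu + (bvh_cpu if rt_effects else 0.0)   # BVH cost paid once
    gpu = base_gpu + rt_effects * per_effect_gpu
    return max(cpu, gpu)                                # the slower side wins

for n in range(4):                                      # 0..3 RT effects layered on
    print(f"{n} RT effects: {1000 / frame_ms(n):5.1f} fps")
# ~111 fps with RT off, then ~80 fps whether one or three effects are
# stacked: the CPU-side BVH floor dominates and performance plateaus.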
 
Last edited:

Stuart360

Member
It really is a big outlier. …

Ha, you have been doing a lot of testing on this game, haven't you? That area, and a main road nearby, are the most demanding areas in the game, and the two areas where the game would drop into the high 50s at ultra settings for me. The rest of the game was 60 fps.
 

01011001

Banned
Ha, you have been doing a lot of testing on this game, haven't you? That area, and a main road nearby, are the most demanding areas in the game, and the two areas where the game would drop into the high 50s at ultra settings for me. The rest of the game was 60 fps.

It's mostly that area that is an issue, yes. I usually run DF's optimised settings + RT with DLSS and am usually above 60 fps, often in the 70s, but that fucking part of the city is ridiculous.
 
Last edited:

Stuart360

Member
It's mostly that area that is an issue, yes. I usually run DF's optimised settings + RT with DLSS and am usually above 60 fps, often in the 70s, but that fucking part of the city is ridiculous.
You know, the annoying thing for me is that I had always used Intel CPUs, always, until I switched to AMD with the 2700X and now the 3700X, simply because of AMD's price-to-performance compared to Intel. The annoying thing with all my Intel CPUs, though, was that they would always get to high-90s percent usage on the CPU cores before they would start bottlenecking my GPUs, often at 99%. With AMD, the cores get to high-70s/low-80s percent usage and then start bottlenecking the GPU.
Is that like an AMD thing?
 
Last edited:

01011001

Banned
You know, the annoying thing for me is that I had always used Intel CPUs … Is that like an AMD thing?

Not sure. Could be a driver thing; could simply be an issue with games being mainly optimised for Intel. All I know is that the 2000 series was not great for gaming in general compared to Intel, AMD became competitive again with Zen 2, and with my current Zen 3 I basically have no issues with any game; I am almost always GPU limited. It's really only Cyberpunk and some really badly optimized games like the OG Crysis.

The jump from Zen+ (a 2600 in my case) to Zen 3 (5600X) was a big breath of fresh air... Zen+ really was just not good :pie_expressionless:
I don't even want to know what would happen if you tried running Cyberpunk on a Zen+ CPU, especially with ray tracing, in that exact part of the city.
 
Last edited:

Stuart360

Member
Not sure. Could be a driver thing; could simply be an issue with games being mainly optimised for Intel. …
To be fair, I never had a problem getting to 60 fps at max settings in any game I played when I had my 2700X, except Cyberpunk of course :messenger_smiling_with_eyes:, and the highest core usage I saw at 60 fps in any game was 45-55%. In fact, the only reason I even have a 3700X now is that the retailer I always buy my PC stuff from sent me an 'exclusive' offer of a 3700X for £50. The price was too good to pass up; otherwise I would still be on the 2700X.
 
Last edited:

01011001

Banned
To be fair, I never had a problem getting to 60 fps at max settings in any game I played when I had my 2700X, except Cyberpunk of course …

60 fps was no issue, no, but some games I play require more than that. At the time I played Valorant a lot, and I ran into a CPU bottleneck there.
 

Stuart360

Member
60 fps was no issue, no, but some games I play require more than that. At the time I played Valorant a lot, and I ran into a CPU bottleneck there.
I think that's where I feel I'm a lucky PC gamer, because I play on a 60 Hz big-screen TV. I only ever 'need' 60 fps; I don't play at unlocked framerates, so PC parts can last me years. I wouldn't be surprised if I'm still getting 60 fps in the vast majority of games 5 or 6 years from now with my 3700X and 1080 Ti. :messenger_sunglasses:
 
Last edited: