
Forspoken PC Requirements

Mister Wolf

Gold Member
LoL @ PS5 = 3060ti. At this rate, by next year the PS5 will be a 3080 at GAF, heh.

PS5 is not a 3060ti. Not even close.

Watch all the head-to-head videos from Digital Foundry where they compare PC settings to the PS5. Even in games that favor AMD, like Assassin's Creed, the PS5 couldn't top more than a 2070S.

Yeah, it's always been understood to be in the 2070S/2080 range.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I know, I recall having these discussions with you. And now that it's transpiring there are cries of "unoptimised".

This is why I said what I said.
What's transpired?
6-core CPUs are still absolutely eating through games.

Is there a game where the 12600K is dying?
Hell, the 10600K is still a more than viable CPU for any modern title, matching or beating the 8-core 3700X.
The CPU you were championing as the minimum to match next-gen consoles.

Don't even get me started on the 7600X and 13600K.

For this very game, the rec spec is a lowly R5 3600... a 6-core CPU.

So what exactly has transpired?
[chart: relative performance in games, 2560x1440]



Are you joking? You can do it but unless you configure it correctly you can run into issues. It's not plug and play in the same way that you'd get if you just got 2 (or 4) matching sticks.

There's a reason why they sell kits in 16GB and 32GB.

Ohh, you are one of those builders.
20 years, you say?

Even mismatched kits will revert to, or rather start at, the JEDEC standard, just like any kit that uses XMP or EXPO.
Match their timings and you are solid.
Is it an extra few minutes in the BIOS? Yes.
But you were entering the BIOS anyway to enable XMP/EXPO profiles and to OC.
And if you aren't entering the BIOS with your builds, you might as well just get a prebuilt.
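If you want to check what speed your mismatched sticks actually settled on, here's a rough sketch (Windows only, assuming the third-party wmi package; the fields queried are standard WMI properties, nothing board-specific):

```python
# Rough sketch (assumes Windows and "pip install wmi"): list each installed
# stick's rated speed vs the speed it's actually configured to run at, so you
# can confirm a mismatched pair fell back to a common JEDEC/XMP speed.
import wmi

for stick in wmi.WMI().Win32_PhysicalMemory():
    part = (stick.PartNumber or "?").strip()
    print(f"{stick.BankLabel}: {part} "
          f"rated {stick.Speed} MT/s, running {stick.ConfiguredClockSpeed} MT/s")
```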
 

GHG

Member
LoL @ PS5 = 3060ti. At this rate, by next year the PS5 will be a 3080 at GAF, heh.

PS5 is not a 3060ti. Not even close.

Watch all the head-to-head videos from Digital Foundry where they compare PC settings to the PS5. Even in games that favor AMD, like Assassin's Creed, the PS5 couldn't top more than a 2070S.

It's like people don't realise consoles aren't running at native resolution, nor are they running at PC max settings.

People see these requirements and panic. It's the 4K/60fps max-settings config that people are talking about the most, and consoles will not be running the game at those settings.
 

ANDS

King of Gaslighting
Same goes for you.

We haven't played the game, but based on those system requirements it's safer to say that it's a shit port rather than a competently made one.

. . .which is why I'm not making any statements on the game being optimized based solely on my interest in playing the game.
 

GHG

Member
What's transpired?
6-core CPUs are still absolutely eating through games.

Is there a game where the 12600K is dying?
Hell, the 10600K is still a more than viable CPU for any modern title, matching or beating the 8-core 3700X.
The CPU you were championing as the minimum to match next-gen consoles.

Don't even get me started on the 7600X and 13600K.

For this very game, the rec spec is a lowly R5 3600... a 6-core CPU.

So what exactly has transpired?
[chart: relative performance in games, 2560x1440]





Ohh, you are one of those builders.
20 years, you say?

Even mismatched kits will revert to the JEDEC standard.
Match their timings and you are solid.
Is it an extra few minutes in the BIOS? Yes.
But if you aren't entering the BIOS with your builds, you might as well just get a prebuilt.

What has transpired is that 6-core CPUs will start to get hammered, and it will only get worse as the generation develops. As before, you're bringing up graphs containing a bunch of last-gen and cross-gen games as if that means anything. I've been there before, and the one time I didn't take my own advice (and settled for a 4-core/4-thread CPU instead of a 4/8 or even 6/12 CPU at the time), my PC very quickly started to struggle, all for the sake of "saving" £50. I didn't cry unoptimised, I just realised my mistake. It's always better to go "overkill"; expecting id-level optimisation for every game is a fool's errand.

Most people don't know what they are doing in the BIOS beyond enabling XMP. Expecting the average or beginner PC builder to faff around with RAM timings is foolish, especially when you have to purposefully go out of your way to create a configuration like 24GB.
 

clampzyn

Member
LoL @ PS5 = 3060ti. At this rate, by next year the PS5 will be a 3080 at GAF, heh.

PS5 is not a 3060ti. Not even close.

Watch all the head-to-head videos from Digital Foundry where they compare PC settings to the PS5. Even in games that favor AMD, like Assassin's Creed, the PS5 couldn't top more than a 2070S.
Yeah, it's funny that no games have been built solely for the PS5's hardware yet; all of those games were cross-gen, and most titles built only for the PS5 are exclusives. It's also funny that these 8GB cards you are comparing it to will not last much longer, since the 2070S is already hitting VRAM limitations on Spider-Man Remastered. Later this gen, games will be more demanding on VRAM, and these cards will struggle at the higher resolutions/textures built for current-gen (PS5/Series X) consoles.

 

Spyxos

Member
just install w11 man. It's free and it's the same fucking thing as w10. Just as bad :p
Every time I install a new Windows, something goes really wrong. My update to Win 10 killed my GPU, so I will wait as long as I can. And I have some weird Windows version right now; I don't think I can update.
 

rofif

Can’t Git Gud
Every time I install a new Windows, something goes really wrong. My update to Win 10 killed my GPU, so I will wait as long as I can.
That's strange; probably something else happened.
Installing Windows is a 10-minute job, really. A fresh install is even faster than an update.
 

rofif

Can’t Git Gud
LoL @ PS5 = 3060ti. At this rate, by next year the PS5 will be a 3080 at GAF, heh.

PS5 is not a 3060ti. Not even close.

Watch all the head-to-head videos from Digital Foundry where they compare PC settings to the PS5. Even in games that favor AMD, like Assassin's Creed, the PS5 couldn't top more than a 2070S.
Well, it is close. It's about 2070S-2080 but with a bit worse RT performance, and the 3060 Ti is very close to the 2080.
And that's on cross-gen games. We've not seen crazy optimized stuff like Uncharted 4 was on PS4 just yet... and we might never see it, with dev times taking 5-6 years... by then new consoles and a stupid Pro will be out.
 

Nvzman

Member
I think the most fascinating thing here is that the 6800 XT, an RDNA2 GPU, is recommended, but the Lovelace 4080 is also recommended.

The game was optimized heavily for RDNA2, which makes sense given it's the PS5's generation of hardware. Wonder how that stacks up.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
What has transpired is that 6-core CPUs will start to get hammered, and it will only get worse as the generation develops. As before, you're bringing up graphs containing a bunch of last-gen and cross-gen games as if that means anything. I've been there before, and the one time I didn't take my own advice (and settled for a 4-core/4-thread CPU instead of a 4/8 or even 6/12 CPU at the time), my PC very quickly started to struggle, all for the sake of "saving" £50. I didn't cry unoptimised, I just realised my mistake. It's always better to go "overkill"; expecting id-level optimisation for every game is a fool's errand.
6 cores will start to get hammered???
Hammered as in how? Cuz a 4/8 CPU can still handle 60fps with relative ease today.
6 cores aren't gonna fail to hit 60 anytime soon.
And by "as the generation develops" do you mean when the PS6 drops?
Or during the generation of the PS5/XSX?


Cuz if you mean during the PS5/XSX generation, I'm willing to avatar bet you that whatever 6-core CPU drops in the penultimate year of the PS5/XSX will still eat through 90% of whatever games you throw at it.
There will be outliers like Anno/Skylines and the like, but for 90% of games, 6 cores will float through the entire PS5/XSX generation.


Games aren't utilizing those extra cores.
So going for 10 or 12 or 16 doesn't make sense.
We are still stuck in the master-thread generation.
High single-core performance, and maybe in the future large cache, matters more than loading up on cores.
Games still aren't being designed with super high parallelization, and there are no inklings that devs are even planning on going that route.
It's still easier to have a master thread and a few worker threads than to heavily spread jobs.
 

Zathalus

Member
LoL @ PS5 = 3060ti. At this rate, by next year the PS5 will be a 3080 at GAF, heh.

PS5 is not a 3060ti. Not even close.

Watch all the head-to-head videos from Digital Foundry where they compare PC settings to the PS5. Even in games that favor AMD, like Assassin's Creed, the PS5 couldn't top more than a 2070S.
The PS5 performs anywhere between a 2060 Super and a 3060ti. Death Stranding, Valhalla, and Uncharted 4 for example are quite close to the 3060ti on PS5. It really depends on the game.
 

ACESHIGH

Banned
This game looks generic AF. 9th gen has been a disappointment so far, especially for PC gamers with massive backlogs like me looking for reasons to upgrade.

On 8th gen you had games doing amazing shit from day one, like Killzone SF, Dead Rising 3, Ryse, or Infamous SS. If you were a console gamer you felt those improvements from the get-go and across all games: you could clearly see the resolution boost, overall IQ, higher-res textures, and smoother framerates (solid 30 vs an average 20-25). If you were a PC gamer you were already gaming at 1080p60 or more, but at least the eye test, aka "more shit on screen," told you that those games were next gen.

The most next-gen game we've had so far was released even before the consoles: Flight Simulator. All the others were disappointing from a technical standpoint and, most of all, they were plain bad games for the most part.
 

clampzyn

Member
The PS5 performs anywhere between a 2060 Super and a 3060ti. Death Stranding, Valhalla, and Uncharted 4 for example are quite close to the 3060ti on PS5. It really depends on the game.
Yeah, that's the problem: we never know when the PS5 hardware is being pushed hard, because consoles usually run at a fixed resolution and fps, so we almost never know how much headroom is left. Remember Spider-Man Remastered, where performance mode was just locked at 60, then when they enabled VRR it could go into the higher 80-100s?

People are so fixated on the idea that because a console only shows 1440p60 it can only do 1440p60, without thinking about what fps it could achieve if devs unlocked the caps they put in.

 

//DEVIL//

Member
Yeah, it's funny that no games have been built solely for the PS5's hardware yet; all of those games were cross-gen, and most titles built only for the PS5 are exclusives. It's also funny that these 8GB cards you are comparing it to will not last much longer, since the 2070S is already hitting VRAM limitations on Spider-Man Remastered. Later this gen, games will be more demanding on VRAM, and these cards will struggle at the higher resolutions/textures built for current-gen (PS5/Series X) consoles.



The PS5 performs anywhere between a 2060 Super and a 3060ti. Death Stranding, Valhalla, and Uncharted 4 for example are quite close to the 3060ti on PS5. It really depends on the game.

Wow... people here are actually serious when they compare the PS5 to a 3060 Ti... what a joke... I am out. I am not even gonna bother trying, because honestly, if I start, I will end up cussing these people to oblivion for their stupidity and probably get banned. Good luck to anyone who takes on that task.

And no, the fucking PS5 is not close to a 3060 Ti.

ffs.
 

Gaiff

SBI’s Resident Gaslighter
Black_Stride GHG My bet is you two aren't even talking about the same thing. Black_Stride is clearly talking about a CPU that gets you by for 60fps and mid-tier gaming, which is sensible.

GHG, on the other hand, I bet is a guy like me who gets an i7/i9 and guns for 100fps+, because when he saw his six-core struggle to remain above 70, he concluded that it wasn't enough. I see that you have a 5800X3D and a 4090, so by your standards, no, a 6-core won't cut it. By mainstream standards though, Black_Stride is 100% correct and you will be able to get a 60fps experience with a 6-core in 99% of the games out there. GHG, I think, is talking about high-end PC gaming, which a 6-core won't be able to feed.

Who the fuck buys a 4090/5800X3D for 60fps? That's a 4K/120 combo. A 6-core will bottleneck the fuck out of a 4090 or other high-end GPUs.
 

clampzyn

Member
Wow... people here are actually serious when they compare the PS5 to a 3060 Ti... what a joke... I am out. I am not even gonna bother trying, because honestly, if I start, I will end up cussing these people to oblivion for their stupidity and probably get banned. Good luck to anyone who takes on that task.

And no, the fucking PS5 is not close to a 3060 Ti.

ffs.
Yeah, soon games will be more demanding and the 3060 Ti will be stuck at lower res/textures thanks to Nvidia cheaping out on VRAM.
 

ToTTenTranz

Banned
These "minimum requirements" lists are usually pretty bad at showing AMD and Nvidia equivalents, but it's usually biased towards the other way. I.e. "for medium settings we recommend either a Geforce GTX 1050 2GB Laptop or a Radeon RX 6900XT 16GB".
Either way, I wouldn't trust these recommendations anymore than I trust all the others, which is no trust at all. Just wait for pc performance comparisons.


This game looks generic AF. 9th gen has been a disappointment so far.

The best answer I have for that statement is this in-game screenshot I took yesterday of Forbidden West on my PS5 in Performance mode.


[in-game Horizon Forbidden West screenshot]


In case you're wondering, it looks 10x better on my C9 OLED without internet compression.
 

Metnut

Member
Wow, that’s a pretty demanding game.

Very happy PS5 owner here who is looking forward to digging into this beast. Glad Square has focused on optimizing the game for the PS5 experience. Nice to be able to just insert the disc and play and not have to worry about mixing and matching hardware.
 

Gaiff

SBI’s Resident Gaslighter
what ?



At 4K gaming with the 4090, it doesn't even matter. Look at the difference between a $250 CPU and a $650 CPU.

Your CPU is no longer the problem.

On what planet is the 13600K a 6-core CPU? It has 6 P-cores and 8 E-cores. I'm talking about something like a 12400 with 6 P-cores and nothing else.
 

diffusionx

Gold Member
It's hilarious how we keep having these same conversations every generation. Once games target better console hardware, they will need higher PC requirements lol. There is no PS4 version of this game designed to run on a netbook CPU.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Black_Stride GHG My bet is you two aren't even talking about the same thing. Black_Stride is clearly talking about a CPU that gets you by for 60fps and mid-tier gaming, which is sensible.

GHG, on the other hand, I bet is a guy like me who gets an i7/i9 and guns for 100fps+, because when he saw his six-core struggle to remain above 70, he concluded that it wasn't enough. I see that you have a 5800X3D and a 4090, so by your standards, no, a 6-core won't cut it. By mainstream standards though, Black_Stride is 100% correct and you will be able to get a 60fps experience with a 6-core in 99% of the games out there. GHG, I think, is talking about high-end PC gaming, which a 6-core won't be able to feed.

Who the fuck buys a 4090/5800X3D for 60fps? That's a 4K/120 combo. A 6-core will bottleneck the fuck out of a 4090 or other high-end GPUs.
Even if we talk about super high-end gaming, 4K120 and the like:
Park 2 of the cores in the 5800X3D and tell me the performance loss.
If you have an i9, park 2 of the P-cores and tell me the perf difference you see.

Hell, when the 7800X3D drops, build your own "7600X3D" by parking two cores.

I can all but guarantee you won't be seeing a perf loss worth mentioning.
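If anyone actually wants to try that, here's a rough sketch of the same idea in software (it assumes Windows, the third-party psutil package, and a made-up process name; how logical CPUs map to physical/P-cores varies by chip, so treat the indices as a guess, not gospel):

```python
# Rough sketch (assumes "pip install psutil" and a hypothetical process name):
# restrict a running game to the first N cores to roughly mimic a smaller CPU.
import psutil

GAME_EXE = "game.exe"   # hypothetical; use your game's actual process name
CORES_TO_KEEP = 6       # pretend we only bought the 6-core part

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == GAME_EXE:
        # Assumes logical CPUs 0..2N-1 belong to the first N cores (SMT pairs);
        # on hybrid Intel parts, check the P-core/E-core layout first.
        allowed = list(range(CORES_TO_KEEP * 2))
        proc.cpu_affinity(allowed)   # pin the process to those CPUs only
        print(f"Pinned {GAME_EXE} (pid {proc.pid}) to CPUs {allowed}")
```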

Hell, the 13600K, a 6-core CPU, already matches the 5800X3D on average and matches its higher-core older brothers; the fundamental difference in gaming comes down to single-core performance, from i9s being binned better.
I know GHG is gonna say that's with last-gen/cross-gen titles and not future titles.
I'm gonna keep saying that for at least this generation of consoles, 6 cores will not only be enough, but with similar binning the 6-core will match the 8-, 10-, 12-, and 16-core CPUs in 90% of games.


I can't, off the top of my head, think of a game that actually gets bottlenecked by 6 cores and isn't just straight up bottlenecked by the whole arch.
Are there really that many games that do significantly better on a 13900K vs a 13600K?


I don't play many RTS/sim games; I know Anno and Skylines are known to load up as many cores as are available, but those are outliers.
 

//DEVIL//

Member
On what planet is the 13600K a 6-core CPU? It has 6 P-cores and 8 E-cores. I'm talking about something like a 12400 with 6 P-cores and nothing else.
Oh my GOD, what is wrong with people today???

Again... while there is a difference, since the 12400 is a banana CPU, it's still not as much of a gap as you're thinking at 4K with a 4090.



20 fps or so? (Note the video is not even running at full 4K, just ultrawide 2K, and it's still 20 fps and sometimes less.) At 4K the gap will be even smaller.

At 4K? If your CPU is decent enough, YOU ARE FINE!!
 
Last edited:
Wow, that’s a pretty demanding game.

Very happy PS5 owner here who is looking forward to digging into this beast. Glad Square has focused on optimizing the game for the PS5 experience. Nice to be able to just insert the disc and play and not have to worry about mixing and matching hardware.

Same, minus inserting a disc lol
 

Gaiff

SBI’s Resident Gaslighter
Hell, the 13600K, a 6-core CPU, already matches the 5800X3D on average and matches its higher-core older brothers; the fundamental difference in gaming comes down to single-core performance, from i9s being binned better.
Because it's not really a 6-core CPU. Those 8 E-cores aren't doing nothing, and it has 20 threads. Compare that to a 12400, an actual 6-core, and the difference is pretty big. Furthermore, it's better to look at minimums, which is where a high-end CPU really makes a difference. You might average 90fps with the budget CPU and 120fps with the high-end one, but if the 1% lows are 55fps vs 100fps, it will impact the experience substantially.
Oh my GOD, what is wrong with people today???

Again... while there is a difference, since the 12400 is a banana CPU, it's still not as much of a gap as you're thinking at 4K with a 4090.



20 fps or so? (Note the video is not even running at full 4K, just ultrawide 2K, and it's still 20 fps and sometimes less.) At 4K the gap will be even smaller.

At 4K? If your CPU is decent enough, YOU ARE FINE!!

You have to be joking.

[benchmark chart]

A 50fps gap or 55% advantage to the 13900K.

[benchmark chart]

A 26fps advantage to the 13600K or 41%.

This is where high-end CPUs shine. People have to stop looking at fps averages and calling it a day. What matters most is consistency, and while a weak CPU might average something close to a high-end one in certain scenarios, you have to look at the frame time graphs as well to get the real picture. The high-end one will be much more consistent. I had a 9900K and upgraded to a 13900K, and the difference in smoothness is pretty damn enormous. In extremely heavy scenarios, the 9900K sometimes struggled to maintain 60fps whereas the 13900K kept above 90, but by and large, the 9900K got by pretty decently.

Spider-Man, Miles Morales, A Plague Tale: Requiem, Flight Simulator 2020, and Cyberpunk 2077 are just a few examples where my 8-core 9900K got hammered badly while my 4090 was sitting at 80% usage or less.
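If you've got a frametime capture from any of those (CapFrameX, PresentMon, anything that exports frame times in milliseconds), here's a quick sketch of why the average hides the problem. The numbers are made up for illustration, and "1% low" here uses one common convention (average of the slowest 1% of frames):

```python
# Quick sketch: average fps vs "1% low" fps from a list of frame times (ms).
# The frame times below are made-up illustrative numbers, not real benchmarks.
def average_fps(frametimes_ms):
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low_fps(frametimes_ms):
    # One common convention: average the slowest 1% of frames.
    worst = sorted(frametimes_ms, reverse=True)[:max(1, len(frametimes_ms) // 100)]
    return 1000 * len(worst) / sum(worst)

# 990 smooth ~8 ms frames plus 10 hitches at 25 ms: the average barely moves,
# but the 1% low craters, which is exactly the stutter you feel.
frames = [8.0] * 990 + [25.0] * 10
print(round(average_fps(frames)), "fps average")         # ~122
print(round(one_percent_low_fps(frames)), "fps 1% low")  # 40
```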
 
These requirements are utter nonsense. A 6800 XT is roughly 30-35% faster than a 6700 XT. Yet we are supposed to believe it will push more than TWICE the pixel count (4K vs 1440p) at TWICE the frame rate (60 vs 30fps)???

Yeah, yet another PC port that shouldn't be touched with a 10-foot pole.

Edit: Even the comparison with the PS5 is weird. DF says the PS5 RT mode runs at around 1000p (pushing about half as many pixels as 1440p) at 30fps. But a 5700 XT (ballpark PS5-level GPU) is only 30% slower than a 6700 XT. So, theoretically at least, the PS5 should be performing much better here. This looks like bad optimization all around.
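The back-of-envelope math for anyone who wants it (the ~30-35% gap between the two cards is the figure quoted above, not something measured here):

```python
# Back-of-envelope check on the two requirement tiers quoted above.
rec_pixels_per_s  = 2560 * 1440 * 30   # 1440p30 tier (6700 XT)
high_pixels_per_s = 3840 * 2160 * 60   # 4K60 tier (6800 XT)

print(f"{high_pixels_per_s / rec_pixels_per_s:.1f}x the pixel throughput")  # 4.5x
print("asked from a card that's only ~1.3-1.35x faster")
```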
 

Tqaulity

Member
Well, it is close. It's about 2070S-2080 but with a bit worse RT performance, and the 3060 Ti is very close to the 2080.
And that's on cross-gen games. We've not seen crazy optimized stuff like Uncharted 4 was on PS4 just yet... and we might never see it, with dev times taking 5-6 years... by then new consoles and a stupid Pro will be out.

The PS5 performs anywhere between a 2060 Super and a 3060ti. Death Stranding, Valhalla, and Uncharted 4 for example are quite close to the 3060ti on PS5. It really depends on the game.

Wow... people here are actually serious when they compare the PS5 to a 3060 Ti... what a joke...
OK, here's the thing. Let me articulate the claim in a way that makes it clearer which parts are really true:
  • Fact: The PS5's GPU is NOT equal to an RTX 3060Ti from a purely hardware perspective
  • Also Fact: It is possible for a PS5 system to outperform (i.e. run at higher, more stable framerates at equivalent raster settings) a particular PC system that has an RTX 3060Ti GPU in it (i.e. NOT ALL PCs ARE CREATED EQUAL... EVEN WITH THE SAME GPU)
This is a classic case of theory vs reality. Theoretically, no, the PS5's GPU is not a 3060Ti and should not perform on par with it. However, in reality it's much easier for a developer to optimize a game for a PS5 than for dozens to hundreds of PC configurations, meaning it is much more likely for a PC SKU to have a bottleneck or non-optimal code path somewhere in the chain. At a system level, the PS5 is a fixed entity that is more balanced and easier to utilize, so YES, there are countless cases of folks running tests at roughly equivalent settings (Deathloop, Far Cry 6, Uncharted 4, Death Stranding, Spider-Man, and others) where a PC with a 3060Ti is not able to hit a locked 60fps with the same consistency as the PS5. This doesn't mean that the PS5 is better than a 3060Ti; it just speaks to the core nature of a console vs a PC and the realities that come with that. Who knows what a person has on THEIR PC, both in terms of other HW and other SW running, that could impact performance in ways a developer can avoid on a PS5. CPU issues, HDD limitations, memory bandwidth, caching issues, OS and driver overhead, and conflicting user software (OBS, anti-malware, GeForce Experience, etc.) can all negatively impact the PC experience (and they do, frequently).

This is the true nature of this (never-ending) discussion, and it's not really an apples-to-apples comparison of equivalent HW in a console and a PC. Sure, by this logic, if the PS5 had a GPU with the size and spec of a 3060Ti in it, then it would be performing much better than it currently is. But real talk: you guys can't miss the fact that a console and a PC are NOT THE SAME and don't function the same. Truth be told, I can easily build a PC with an RTX 3080 in it that can't dream of matching a PS5 or Series X in performance (hint: I actually do have a PC like that). How about you try to play Forza Horizon 5 with an old Intel Core i7 6700 CPU, a standard HDD, and a 3080 GPU at 1440p Med-High settings (never mind Ultra) and tell me if it matches a Series X in performance mode. (Hint: you'll be lucky if the game even runs properly for more than a few minutes, even though the min spec is met.)
 

rodrigolfp

Haptic Gamepads 4 Life
OK, here's the thing. Let me articulate the claim in a way that makes it clearer which parts are really true:
  • Fact: The PS5's GPU is NOT equal to an RTX 3060Ti from a purely hardware perspective
  • Also Fact: It is possible for a PS5 system to outperform (i.e. run at higher, more stable framerates at equivalent raster settings) a particular PC system that has an RTX 3060Ti GPU in it (i.e. NOT ALL PCs ARE CREATED EQUAL... EVEN WITH THE SAME GPU)
This is a classic case of theory vs reality. Theoretically, no, the PS5's GPU is not a 3060Ti and should not perform on par with it. However, in reality it's much easier for a developer to optimize a game for a PS5 than for dozens to hundreds of PC configurations, meaning it is much more likely for a PC SKU to have a bottleneck or non-optimal code path somewhere in the chain. At a system level, the PS5 is a fixed entity that is more balanced and easier to utilize, so YES, there are countless cases of folks running tests at roughly equivalent settings (Deathloop, Far Cry 6, Uncharted 4, Death Stranding, Spider-Man, and others) where a PC with a 3060Ti is not able to hit a locked 60fps with the same consistency as the PS5. This doesn't mean that the PS5 is better than a 3060Ti; it just speaks to the core nature of a console vs a PC and the realities that come with that. Who knows what a person has on THEIR PC, both in terms of other HW and other SW running, that could impact performance in ways a developer can avoid on a PS5. CPU issues, HDD limitations, memory bandwidth, caching issues, OS and driver overhead, and conflicting user software (OBS, anti-malware, GeForce Experience, etc.) can all negatively impact the PC experience (and they do, frequently).

This is the true nature of this (never-ending) discussion, and it's not really an apples-to-apples comparison of equivalent HW in a console and a PC. Sure, by this logic, if the PS5 had a GPU with the size and spec of a 3060Ti in it, then it would be performing much better than it currently is. But real talk: you guys can't miss the fact that a console and a PC are NOT THE SAME and don't function the same. Truth be told, I can easily build a PC with an RTX 3080 in it that can't dream of matching a PS5 or Series X in performance (hint: I actually do have a PC like that). How about you try to play Forza Horizon 5 with an old Intel Core i7 6700 CPU, a standard HDD, and a 3080 GPU at 1440p Med-High settings (never mind Ultra) and tell me if it matches a Series X in performance mode. (Hint: you'll be lucky if the game even runs properly for more than a few minutes, even though the min spec is met.)
TL;DR: There are good and bad ports; it's not that the same GPU is more or less powerful depending on the game.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Because it's not really a 6-core CPU. Those 8 E-cores aren't doing nothing, and it has 20 threads. Compare that to a 12400, an actual 6-core, and the difference is pretty big. Furthermore, it's better to look at minimums, which is where a high-end CPU really makes a difference. You might average 90fps with the budget CPU and 120fps with the high-end one, but if the 1% lows are 55fps vs 100fps, it will impact the experience substantially.

You have to be joking.
A 50fps gap or 55% advantage to the 13900K.
A 26fps advantage to the 13600K or 41%.

This is where high-end CPUs shine. People have to stop looking at fps averages and calling it a day. What matters most is consistency, and while a weak CPU might average something close to a high-end one in certain scenarios, you have to look at the frame time graphs as well to get the real picture. The high-end one will be much more consistent. I had a 9900K and upgraded to a 13900K, and the difference in smoothness is pretty damn enormous. In extremely heavy scenarios, the 9900K sometimes struggled to maintain 60fps whereas the 13900K kept above 90, but by and large, the 9900K got by pretty decently.

Spider-Man, Miles Morales, A Plague Tale: Requiem, Flight Simulator 2020, and Cyberpunk 2077 are just a few examples where my 8-core 9900K got hammered badly while my 4090 was sitting at 80% usage or less.
Not this E-cores-for-gaming thing again.

We've seen the benchmarks.
Across a plethora of titles the average differential is less than 5%.
Yes, there are highs of ~10%, but those are clearly outlier results.


And as I said, the 13900K vs 13600K difference is almost entirely down to binning, not more threads, in 90% of games.
You can't compare a poorly binned 12400 to a highly binned 13600K.
Get the 13600K and disable the E-cores to get a like-for-like with exactly the same binning.
Do some gaming benchmarks for yourself with the 13900K.
Start with the thing fully loaded.
Now park 2 P-cores... tell me the differentials.
[benchmark chart: 3840x2160 results]


[benchmark chart]





As for a true 6-core vs 8-core:
We've got those benchmarks too.
Or, as I said, you can do them yourself by parking cores and see what happens.
[benchmark chart: Cyberpunk 2077 CPU scaling]
 

T4keD0wN

Member
This is the true nature of this (never-ending) discussion, and it's not really an apples-to-apples comparison of equivalent HW in a console and a PC. Sure, by this logic, if the PS5 had a GPU with the size and spec of a 3060Ti in it, then it would be performing much better than it currently is. But real talk: you guys can't miss the fact that a console and a PC are NOT THE SAME and don't function the same. Truth be told, I can easily build a PC with an RTX 3080 in it that can't dream of matching a PS5 or Series X in performance (hint: I actually do have a PC like that). How about you try to play Forza Horizon 5 with an old Intel Core i7 6700 CPU, a standard HDD, and a 3080 GPU at 1440p Med-High settings (never mind Ultra) and tell me if it matches a Series X in performance mode. (Hint: you'll be lucky if the game even runs properly for more than a few minutes, even though the min spec is met.)
A 3080 that's running at 100% usage will beat the Series X/PS5 any day.
What you're suggesting we compare is not a 3080; it's a 3080 limited by other hardware and running at nowhere near 100% usage. It's pretty obvious that an RTX 3080 running at 30% would not win against a Series X GPU that's running at 100%.
Why not compare a 3080 at 30% usage against 30% of the Series X's performance, to be fair? I can come up with stupid and unfair limitations to push an agenda, like comparing a 13900K + 3080 with DLSS in Ultra Performance mode and saying it's 5 times stronger than some console, or turning on ray tracing and laughing that the PS5 is weaker than an RTX 2060. These arguments would be just as ridiculous and unfair as yours.
I don't know if you've noticed, but most people moved on from HDDs a long time ago (even low-end prebuilts had SSDs long before consoles did). Who would pair a 3080 with an i7 6700 anyway? That's an unrealistic and very badly balanced build.
 