
Forspoken PC Requirements

TrebleShot

Member
Once again I would like to say it, chest out, loud and proud, so the dunces in the back can hear.

PS5 hardware is NOT one-to-one "equal" to any PC.
The APIs are different.
The hardware is custom.

To match certain low-level efficiencies on PS5 you need to vastly over-spec your gaming PC, which of course can achieve better results, but you cannot say "oh, it's like a 3060" etc., because the PS5's hardware configuration does not exist in desktop form, and neither do the systems and APIs driving it.
 

Kenpachii

Member
Just install W11, man. It's free and it's the same fucking thing as W10. Just as bad :p



Windows 10 resistance!

Why?

Can't be arsed to install 11, that's why.
 

Fbh

Member
PC has been getting a ton of shitty ports lately. Makes me glad I'm just a lowly console gamer.

To me it's not so much a PC issue as it is about some devs (especially Japanese ones) seemingly using the extra power of this new gen to ignore optimization and just brute-force everything.
We've gotten a lot of games like Elden Ring, Stranger of Paradise, Star Ocean 6, Gotham Knights, etc. that aren't really showing a generational leap compared to what we had last gen (I'd argue some, like Stranger of Paradise, look closer to a PS3-era title), and yet they really struggle with performance.

I don't consider myself a graphics whore but I can accept either amazing visuals with performance issues or dated visuals with solid performance. But a lot of these new games are giving us dated visuals with bad performance
 

Gaiff

SBI’s Resident Gaslighter
Not this e-cores for gaming thing again.

We've seen the benchmarks.
Across a plethora of titles the average differential is less than 5%.
Yes, there are highs of ~10%, but those are clearly outlier results.


And as I said, the 13900K vs 13600K gap is almost entirely down to binning, not to having more threads, in 90% of games.
You can't compare a poorly binned 12400 to a highly binned 13600K.
Get the 13600K and disable the e-cores to get a like-for-like with exactly the same binning.
Do some gaming benchmarks for yourself with the 13900K.
Start with the thing fully loaded.
Now park 2 P-cores... tell me the differentials.
[benchmark chart: 3840x2160]
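For anyone who actually wants to run that experiment, here's a minimal sketch of one way to approximate "parking" cores from software, using Python's psutil to restrict a game process's CPU affinity. The process name and the list of allowed logical CPUs are placeholders; on a hybrid chip like the 13900K you'd have to check which logical CPU indices map to P-cores on your own system, and disabling e-cores in the BIOS remains the cleaner like-for-like test.

```python
# Minimal sketch (not a definitive method): restrict a running game to a
# subset of logical CPUs to approximate "parking" cores, then re-run the
# same benchmark scene and compare averages and 1% lows.
# Assumes psutil is installed; "game.exe" and ALLOWED_CPUS are placeholders.
import psutil

GAME_EXE = "game.exe"            # hypothetical process name
ALLOWED_CPUS = list(range(12))   # e.g. logical CPUs 0-11 (6 cores + SMT)

for proc in psutil.process_iter(["name"]):
    name = proc.info["name"]
    if name and name.lower() == GAME_EXE:
        before = proc.cpu_affinity()       # current affinity
        proc.cpu_affinity(ALLOWED_CPUS)    # pin to the allowed subset
        print(f"{name}: affinity {before} -> {proc.cpu_affinity()}")
```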


There is one fatal flaw with your example: you may disable the 13600K's e-cores, but this won't change the price you're paying for it. We're talking real-life scenarios, not theoretical ones. Disabling the e-cores on the 13600K doesn't suddenly make it a 6-core CPU. If someone approaches you asking for a 6-core CPU, would you tell them to get a 13600K and disable the e-cores? Of course not.

In the current market, the real 12th-gen 6-core is the 12400, not the 12600K, which has 4 e-cores. You can't seriously tell me the 12400 won't severely bottleneck the 4090.
As for true 6-core vs 8-core:
We've got those benchmarks too.
Or, as I said, you can do them yourself by parking cores and seeing what happens.
[benchmark chart: Cyberpunk 2077]
Again, not what's being talked about. The argument is whether or not 6-core CPUs will be a bottleneck for high-end PC gaming. Judging by the performance of the 12400 vs the 13600K, it absolutely will be, as it's sometimes outperformed by over 50% by the 13600K.
 

ToTTenTranz

Banned
Fact: The PS5's GPU is NOT equal to an RTX 3060Ti from a purely hardware perspective

Of course it's not equal, but the rasterization performance is pretty similar.
The 3060 Ti has dual-pumped ALUs doing ~7 TFLOPS x2 (the x2 part makes it hard to reach full potential because there aren't 2x the registers and 2x the caches), whereas the PS5 does 10 TFLOPS. The PS5 has a much higher texel fillrate (closer to an RTX 3080, actually). The 3060 Ti has higher available memory bandwidth but a lower amount of VRAM available.

In the end the PS5's GPU is more or less an RX 6700 non-XT, which does seem to place very close to the 3060 Ti at 1440p (that's the pre-upscale native render target in most console games anyway).
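For what it's worth, the rough FP32 numbers being quoted here fall out of a simple formula: shader count × 2 ops per clock × clock speed. The sketch below uses the commonly listed specs (PS5: 2304 shaders at up to 2.23 GHz; 3060 Ti: 4864 FP32 units at ~1.665 GHz boost, which already counts Ampere's doubled FP32 path), so treat the outputs as paper figures, not measured performance.

```python
# Paper FP32 throughput: shaders * 2 ops/clock * clock(GHz) / 1000 = TFLOPS.
# Spec values are the commonly listed ones, not measured results.
def tflops(shader_units: int, clock_ghz: float) -> float:
    return shader_units * 2 * clock_ghz / 1000

ps5 = tflops(2304, 2.23)           # 36 CUs * 64 ALUs, variable clock up to 2.23 GHz -> ~10.3
rtx_3060_ti = tflops(4864, 1.665)  # dual-issue FP32 counted, boost clock -> ~16.2 on paper

print(f"PS5      ~{ps5:.1f} TFLOPS")
print(f"3060 Ti  ~{rtx_3060_ti:.1f} TFLOPS on paper (the 'x2' number that's hard to sustain)")
```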





It's not like the PS5's iGPU needs any favors like better optimization to be comparable to the 3060 Ti. If anything, such optimizations will probably make it punch well above a 3060 Ti on Windows.
 

GymWolf

Member
I can't think of any reason why you wouldn't have at least 32GB of system RAM now.

Those people who told you 6-core CPUs and 16GB of RAM would serve you well once the next-gen games started arriving were lying to you.
Does that matter a lot if I play at 4K with a 13600K??
 

rofif

Banned
As a PC gamer I’m not understanding what I’m supposed to be mad about.

The recommended specs are for a low-to-mid-end computer.
PCMR is either all proud of their rigs or all angry about the high specs required for some reason... because they lied about their rigs.

IMO none of this is important. People are too focused on technicalities and will quickly boycott and ditch the game because it's not 4K 144fps but only 4K 113fps. These people are not gamers.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
There is one fatal flaw with your example: you may disable the 13600K's e-cores, but this won't change the price you're paying for it. We're talking real-life scenarios, not theoretical ones. Disabling the e-cores on the 13600K doesn't suddenly make it a 6-core CPU. If someone approaches you asking for a 6-core CPU, would you tell them to get a 13600K and disable the e-cores? Of course not.

In the current market, the real 12th-gen 6-core is the 12400, not the 12600K, which has 4 e-cores. You can't seriously tell me the 12400 won't severely bottleneck the 4090.

Again, not what's being talked about. The argument is whether or not 6-core CPUs will be a bottleneck for high-end PC gaming. Judging by the performance of the 12400 vs the 13600K, it absolutely will be, as it's sometimes outperformed by over 50% by the 13600K.
If the 6-core 7600X is bottlenecking a 4090, then the 8-core 7700X will also bottleneck the 4090.

There's a ~5% performance differential between the 6-core 7600X and the 16-core 7950X.
At 100+fps we are talking about a difference of maybe 5fps... literally negligible.

Even in heavy games like Spider-Man with high RT, the fps and 1% lows are separated by sub-5%.

One of the biggest gaps I've seen is in Shadow of the Tomb Raider and Cyberpunk, where the average is pretty much equal but 1% lows are 136 vs 126.
A 10fps gap in 1% lows at well over 100fps is the definition of negligible and could easily be attributed to binning.
At 4K these numbers close up even more because now the GPU is doing the heavy lifting.
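As a quick sanity check on "negligible", here are the same gaps expressed as percentages of the slower result; this is just arithmetic on the numbers quoted above, nothing newly measured.

```python
# Express the fps gaps quoted above as a percentage of the slower result.
def gap_pct(fast: float, slow: float) -> float:
    return (fast - slow) / slow * 100

# 1% lows in Shadow of the Tomb Raider / Cyberpunk: 136 vs 126 fps
print(f"136 vs 126 fps -> {gap_pct(136, 126):.1f}% gap")   # ~7.9%
# "a difference of maybe 5fps" at 100+ fps
print(f"105 vs 100 fps -> {gap_pct(105, 100):.1f}% gap")   # 5.0%
```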

[benchmark chart: average FPS]


Even here, the true 6-core Intel 12400 is only 10fps behind the 12600K at around 166fps.
A better-binned 12400 could easily make up that 10fps.
 
It's a game that looks worse than some open-world last-gen games and somehow still requires much higher specs.

It's just dumb.
Forspoken definitely looks like ass, but it's still a new, modern game engine at the end of the day.

I'm all for developers continuing to push the tech on PC 👍🏾
 
LOL @ PS5 = 3060 Ti. At this rate, by next year the PS5 will be a 3080 at GAF, heh.

The PS5 is not a 3060 Ti. Not even close.

Watch all the head-to-head Digital Foundry videos where they compare PC settings to the PS5. Even in games that favor AMD, like Assassin's Creed, the PS5 couldn't top more than a 2070 Super.
Yeah, this cracks me up every time I see it. Games at the moment are just better optimised for consoles, that's literally all it is. Developers are clearly putting their money where they believe the player base is.

Edit: And yet, as I say this, this game runs below 1080p. Optimised indeed!
 

octiny

Banned
Wait a fucking moment... the instant loading times only work with a Samsung SSD? Is this gonna be a thing with DirectStorage in the future? Because I bought a WD SSD...

That's just a paid plug (ad). I can see more companies paying for this type of advertisement in regard to DirectStorage going forward, to "trick" people into buying their drives.
 

RickSanchez

Member
Remind me again why PC gaming is the way to go? Nothing but poorly optimized console ports these days.
Because other developers are making good, well-optimised ports which run great even on slightly older hardware.

My 1080 Ti runs Elden Ring, Death Stranding DC, Sniper Elite 5, Doom Eternal, Hitman 3, etc. at 50-60 fps, at 4K, on high or very high settings. Heck, even Ubisoft managed better optimisation in Far Cry 6 and AC Valhalla (again, I got 4K 60fps on high settings on the 1080 Ti).

Square Enix is shit, and we have other games to play.
 

Tqaulity

Member
A 3080 that's running at 100% usage will beat the Series X/PS5 any day.
What you're suggesting we compare is not a 3080, it's a 3080 limited by other hardware and running at nowhere near 100% usage. It's pretty obvious that an RTX 3080 running at 30% would not win against a Series X GPU that's running at 100%.
Yes but you're missing the point. The point is that on PC, it's very rare that a developer can write code that will run optimally on a given GPU and not be bottlenecked somewhere else. People love to attribute that to just "lazy developers" and "bad PC ports" and they have no idea just how difficult it is to make a game run optimally and smoothly on the PC platform due to just how much variance there is in the supported systems. The reality is that you can write the same code that will run "100%" on a console, but that same code will run at a wide range of sub-100% values on PCs depending on the particular PC they test on.
Why not compare a 3080 at 30% usage against 30% of the Series X's performance, to be fair? I can come up with stupid and unfair limitations to push an agenda, like comparing a 13900K + 3080 with DLSS in ultra performance mode and saying it's 5 times stronger than some console, or turning on ray tracing and laughing that the PS5 is weaker than an RTX 2060. These arguments would be just as ridiculous and unfair as yours.
That's not at all a fair comparison. The point, again, is that a developer will never write code that will run at only 30% on a console, because there is a lot more in their control in terms of how their code will run. On a PC, there are always so many unknowns, and no developer in the world can test every HW and SW permutation to ensure that the code runs optimally in all cases. The goal here isn't to compare theoretical performance. Limiting a console to 30% utilization, when that will never be the case in an actual game, is just silly and only serves to say, hey, this console is only X% of the performance of this PC card in theory. But in reality, with real app code, how that code is run and executed is going to vary greatly between console and PC. That is the point. Of course the dev would want that code to run at 100% on every platform it runs on, but they can generally only guarantee that on a console, by virtue of it being a fixed, known config. They can never say that for "the PC" as a platform, only for certain specific configs they may have access to for testing.

@T4keD0wN :
I don't know if you've noticed, but most people moved on from HDDs a long time ago (even low-end prebuilts had SSDs long before consoles did). Who would pair a 3080 with a 6700 anyway? That's an unrealistic and very badly balanced build.
LOL, I've been working in the PC gaming industry for over 12 years now. I'm well aware of the trends and growth of SSDs, and of how pairing a 3080 with a 6700 doesn't make sense in practical terms. But again... you're missing the point. The reality is that the vast majority of the 2 billion global gamers don't have the time, resources (i.e. $$), or education to make "optimal", balanced, and sensible PC builds. That's why the majority of gamers at the start of 2023 are still gaming on 1080p 60Hz monitors, with fewer than 8 CPU cores, and GPU performance that's well below 2070 level. I could spend hours telling you about all the crazy configurations folks have in their PCs, and again, the point is that as a PC developer you HAVE to account for this, as unfortunate and silly as it is. You have to account for the fact that people who are less knowledgeable in the PC space may upgrade their GPUs without updating the rest of the system, that they may still be using standard HDDs, that they may not have an RT-capable GPU (despite the fact that Nvidia's 2xxx series came out over 4 years ago with RT and we're on our 3rd generation of RT hardware), etc. It's the reality of the world, and while NeoGaf members can choose to ignore it and live in their bubble of PC dominance and multi-thousand-$$ rigs, the average developer cannot afford to do the same (if they want to make any money, that is).
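To illustrate the kind of defensive defaults being described, here is a hypothetical sketch; every field name, preset, and threshold below is made up for the example and isn't taken from any real engine or from the game in question.

```python
# Hypothetical sketch: pick conservative defaults from detected hardware.
# All names and thresholds are illustrative, not from a real engine.
from dataclasses import dataclass

@dataclass
class SystemInfo:
    vram_gb: int
    cpu_cores: int
    has_rt_gpu: bool
    storage_is_ssd: bool

def default_preset(info: SystemInfo) -> dict:
    return {
        "texture_quality": "high" if info.vram_gb >= 8 else "medium",
        "ray_tracing": info.has_rt_gpu,                     # off on GTX-class and older cards
        "streaming_pool": "large" if info.storage_is_ssd else "small",
        "worker_threads": max(2, min(info.cpu_cores - 2, 12)),
    }

# A very common real-world config: 6 cores, 6GB VRAM, HDD, no RT support.
print(default_preset(SystemInfo(vram_gb=6, cpu_cores=6, has_rt_gpu=False, storage_is_ssd=False)))
```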
 

Mozzarella

Member
Crazy spec requirements. Fortunately for me, I'm skipping this, at least for half a year (if it turns out good); if not, I'll probably just skip it altogether. I feel it's not my type of game.
 

Gaiff

SBI’s Resident Gaslighter
If the 6-core 7600X is bottlenecking a 4090, then the 8-core 7700X will also bottleneck the 4090.

There's a ~5% performance differential between the 6-core 7600X and the 16-core 7950X.
At 100+fps we are talking about a difference of maybe 5fps... literally negligible.

Even in heavy games like Spider-Man with high RT, the fps and 1% lows are separated by sub-5%.

One of the biggest gaps I've seen is in Shadow of the Tomb Raider and Cyberpunk, where the average is pretty much equal but 1% lows are 136 vs 126.
A 10fps gap in 1% lows at well over 100fps is the definition of negligible and could easily be attributed to binning.
At 4K these numbers close up even more because now the GPU is doing the heavy lifting.

[benchmark chart: average FPS]


Even here, the true 6-core Intel 12400 is only 10fps behind the 12600K at around 166fps.
A better-binned 12400 could easily make up that 10fps.
The 7600X is a new-gen CPU released just a few months ago. Remember how the argument between you and GHG started: he claimed that those who said 6 cores were enough back then for high-end gaming are proven wrong now. That's why I used the 12400 as an example, not the 7600X. For instance, back in 2018, the 9900K and 9600K were dead even, but a few years later the 9600K is looking much worse.

[benchmark chart: 9900K vs 9600K]

Here the 9900K can be as much as 20% faster on lows and 15% faster on average.

In RDR2, the 9600K suffers immensely (likely because it only has 1 thread per core).
[benchmark chart: RDR2]
Or how about the 3600X vs the 3700X? Back then, people said the 3700X was a waste of money because the 3600X was cheaper and performed almost the same.

[benchmark charts: 3600X vs 3700X]

Point is, historically, 6-core CPUs haven't aged well compared to their bigger cousins, and we've seen initially negligible performance differentials balloon into substantial differences later on. Do they become unusable and belong in the trash? Of course not, they remain serviceable, but their performance decrease over time is more pronounced than that of the higher models, and since CPUs are kept for much longer than GPUs, it's often worth it to get a better one to squeeze another 1-2 years out of your system.

tl;dr: Are you wrong? Not necessarily. 6-core CPUs tend to perform very well for the majority of their lives for high-end gaming, but since people tend to keep their CPUs for quite a bit longer than their GPUs, it isn't exactly the best solution to get a mid-tier CPU if you plan on sticking to top-tier GPUs for a while. They will often be a bottleneck.
 

lukilladog

Member
What is the drama about? Isn't it always the same thing?

Yeah, the guy or girl at some publisher's office who has never touched a PC game in their life must be wondering whether they copy-pasted the requirements wrong... I remember one of the Avalanche guys saying that the actual programmers often don't have much say in this (he said that the requirements for Renegade Ops made no sense compared to the Just Cause games).
 

Zathalus

Member
Wow... people here are actually serious when they compare the PS5 to a 3060 Ti... what a joke... I am out. I am not even gonna bother trying, because honestly, if I start, I will end up cussing these people to oblivion for their stupidity and probably get banned. Good luck to anyone who takes on that task.

And no, the fucking PS5 is not close to a 3060 Ti.

FFS.
Look dude, if you think the PS5 can't perform similarly to a 3060 Ti in some games, take it up with Digital Foundry and not me. I didn't measure the FPS, they did. I also clearly stated this is not always the case, as the PS5 often performs around 2060 Super level.

I'm sorry that facts trigger you.
 

01011001

Banned
Looks like a PS3 game, runs like a PS3 game too, and the PC port is bad like many of the PS360 console ports of the era... I got it, the game was made as a giant tribute to the 7th generation of games!!! Square is trying to take us on a trip back to 2008 :messenger_beaming:

Very nostalgic indeed.

And I bet the unfathomably awful input lag is meant to simulate bad mid-2000s HD TVs that didn't have game modes and had lots of image processing.
 

Sleepwalker

Member
Once again I would like to say it, chest out, loud and proud, so the dunces in the back can hear.

PS5 hardware is NOT one-to-one "equal" to any PC.
The APIs are different.
The hardware is custom.

To match certain low-level efficiencies on PS5 you need to vastly over-spec your gaming PC, which of course can achieve better results, but you cannot say "oh, it's like a 3060" etc., because the PS5's hardware configuration does not exist in desktop form, and neither do the systems and APIs driving it.

Insane low-level efficiency by lowering the resolution to 720p.
 