
God of War on PC - Digital Foundry Tech Review

Cryio

Member
Found this benchmark of all PC GPUs at PS4 settings.



Those numbers look great until you realize the 6800 XT is a 20 TFLOPS card with a 50% IPC gain over the original PS4, offering essentially 15x more power than the original PS4's GCN GPU.

The boost you are actually getting is less than 6x. The 2070 Super versus the 5700 XT shows just how poorly this game performs on AMD cards, but even the 2070 Super should be able to do far better than just 115 fps at PS4 settings at 1080p.

NX Gamer's 2070 sits at 70% utilization at higher framerates because the game is CPU bound and his ryzen 2700 is a shit CPU, but that is a very interesting comparison because that CPU is roughly equivalent to the PS5 CPU in terms of performance.
1 TF from one uArch isn't equal to 1 TF from a different uArch.

Let's say a PS4 is roughly equivalent to an HD 7850. An RX 580 is a bit over twice as fast. An RX 5700 XT is about twice as fast as a 580. A 6900 XT is about twice as fast as a 5700 XT at 4K.

So the 6900XT is a bit over 8X as fast as a base PS4. Maybe 9-10x, but that's stretching it. Nowhere near 15-16x, lol.

Besides raw horsepower, due to various uArch improvements, the final performance number can vary. Is the game more geometry heavy? More tessellation heavy? More compute heavy? More bandwidth heavy? Not all things scale linearly the higher up you go, so that's why you don't always see a 1:1 scale or BETTER than 1:1. It's usually much, much worse than 1:1
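That chain of rough doublings works out to single-digit scaling, not 15x. A quick sketch, where the step factors are the rough estimates from the post above rather than measured data:

```python
# Relative performance chain, base PS4 (~HD 7850) as the 1.0 baseline.
# Step factors are the post's rough "about twice as fast" estimates.
ps4 = 1.0
rx_580 = ps4 * 2.2           # "a bit over twice" an HD 7850
rx_5700xt = rx_580 * 2.0     # about 2x an RX 580
rx_6900xt = rx_5700xt * 2.0  # about 2x a 5700 XT at 4K

print(f"6900 XT vs base PS4: ~{rx_6900xt:.1f}x")  # ~8.8x, nowhere near 15-16x
```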
 

ACESHIGH

Banned
The game is a CPU hog even at original settings and 30 FPS. Why would I need a Ryzen CPU that runs circles around Jaguar cores just to run this game at such low settings? Tech reviewers should be pointing this out and pushing for fixes.
It hits my CPU worse than CP 2077 or FS 2020. And it's a wide linear game with a couple of enemies on screen.

DX11 was NOT the right choice for this game. But as always, devs expect PC gamers to brute force through the issues. Stellar port my arse. Decent at best.

The Doom games, Gears 4 and 5, GTA V, FH4 and 5, Sekiro, DS3, Days Gone, Death Stranding: those are great PC versions of games.
 

yamaci17

Member
The game is a CPU hog even at original settings. Tech reviewers should be pointing this out and pushing for fixes.
It hits my CPU worse than CP 2077 or FS 2020. And it's a wide linear game with a couple of enemies on screen.

DX11 was NOT the right choice for this game. But as always, devs expect PC gamers to brute force through the issues. Stellar port my arse. Decent at best.

The Doom games, Gears 4 and 5, GTA V, FH4 and 5, Sekiro, DS3, Days Gone, Death Stranding: those are great PC versions of games.

interesting, my 2700x tells another tale

i got a consistent 80 fps in my 3 hr gameplay. i can't get a consistent 80 fps in either cyberpunk or fs 2020 with this cpu lmao what the hell
 

ACESHIGH

Banned
interesting, my 2700x tells another tale

i got a consistent 80 fps in my 3 hr gameplay. i can't get a consistent 80 fps in either cyberpunk or fs 2020 with this cpu lmao what the hell

Your 2700X trades blows with the CPUs in the next gen consoles and runs circles around the PS4's Jaguar cores. It should be running God of War at that frame rate and then some.

FS2020 is one of the few true next gen games and runs at 30 FPS or inconsistent 60 on next gen consoles. It would melt the Jaguar CPU of the Xbox one.
I am not saying they should all run at the same frame rate. Just saying God of war should run better all things considered.
 
On the optimization side, the game isn't too hard to run maxed out. Then again, it was designed to run on a Walmart tablet CPU.

Hope to see hair fx, RT or some other special features when Rag releases on steam.
 

yamaci17

Member
Your 2700X trades blows with the CPUs in the next gen consoles and runs circles around the PS4's Jaguar cores. It should be running God of War at that frame rate and then some.

FS2020 is one of the few true next gen games and runs at 30 FPS or inconsistent 60 on next gen consoles. It would melt the Jaguar CPU of the Xbox one.
I am not saying they should all run at the same frame rate. Just saying God of war should run better all things considered.

i see what you're getting at, but i just lowered the resolution and i can get a consistent 100+ fps as well. what do you want, a 240 fps god of war experience? which GPU will provide that? rtx 3090 at 720p? (im sceptical still). game runs and looks like a dream even at a locked 60 fps.



even the 8+ year old i7 has no trouble hitting a consistent 60+ fps. im sure the 5800x and co will push a consistent 150+ fps. but there are no GPUs that are up to the task unless you play at very low resolutions

yeah, that i7 will also run circles around the ps4, but so what? at 1080p medium, the gtx 1080 is the limiting factor, so i guess you understand what I'm getting at.



i dont understand this argument, or why gow should be any different from other console ports. the entire gen the games were developed around those Jaguar cores, so why would GoW be any different? there are way worse offenders than GoW that look and perform worse. Control, Star Wars Jedi: Fallen Order and many more recent titles come to mind. but you do you i guess.
 

rofif

Can’t Git Gud
Found this benchmark of all PC GPUs at PS4 settings.



Those numbers look great until you realize the 6800 XT is a 20 TFLOPS card with a 50% IPC gain over the original PS4, offering essentially 15x more power than the original PS4's GCN GPU.

The boost you are actually getting is less than 6x. The 2070 Super versus the 5700 XT shows just how poorly this game performs on AMD cards, but even the 2070 Super should be able to do far better than just 115 fps at PS4 settings at 1080p.

NX Gamer's 2070 sits at 70% utilization at higher framerates because the game is CPU bound and his ryzen 2700 is a shit CPU, but that is a very interesting comparison because that CPU is roughly equivalent to the PS5 CPU in terms of performance.
Yep exactly. The 3080 is a top of the line, 700 USD card from the future, just like the CPUs in that system compared to the ps4 :p
So if the ps4 at 1080p original settings was doing about 30-35fps (presumably not just 30 always because there is a cap), then it's like you said:
2 TF vs 30 TF (3080) + huge IPC gains on the cpu. So we should expect 450 fps or more on pc :p
That is absolutely crazy when it comes to just pure math haha
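Taking that pure math literally (naive TFLOPS scaling, which the thread goes on to debunk) gives exactly that absurd number; a sketch using the post's rough figures:

```python
# Naive TFLOPS scaling: assume fps scales linearly with GPU compute.
# Figures are the post's rough numbers (PS4 ~2 TF, RTX 3080 ~30 TF).
ps4_fps = 30                  # base PS4 at 1080p original settings
ps4_tf, rtx3080_tf = 2.0, 30.0
naive_fps = ps4_fps * rtx3080_tf / ps4_tf
print(f"naive expectation: ~{naive_fps:.0f} fps")  # ~450 fps, before any CPU gains
```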
 

SlimySnake

Flashless at the Golden Globes
1 TF from one uArch isn't equal to 1 TF from a different uArch.

Let's say a PS4 is roughly equivalent to an HD 7850. An RX 580 is a bit over twice as fast. An RX 5700 XT is about twice as fast as a 580. A 6900 XT is about twice as fast as a 5700 XT at 4K.

So the 6900XT is a bit over 8X as fast as a base PS4. Maybe 9-10x, but that's stretching it. Nowhere near 15-16x, lol.

Besides raw horsepower, due to various uArch improvements, the final performance number can vary. Is the game more geometry heavy? More tessellation heavy? More compute heavy? More bandwidth heavy? Not all things scale linearly the higher up you go, so that's why you don't always see a 1:1 scale or BETTER than 1:1. It's usually much, much worse than 1:1
But thats literally how IPC gains are measured. They take the same game running on the same specs, but on two different uArch cards and if there is a performance increase on the latest one they know there is an IPC gain. We have seen this with Polaris compared to GCN1.0 and then again when they went to RDNA 1.0. 25% gains in performance every time when clocks and CUs are the same.

Digital Foundry ran some tests and found the IPC gains to be around 40 to 60 percent in games, and in synthetic benchmarks like Firestrike, Timespy and GFX benchmarks they are roughly 45-70% higher going from GCN 1.0 to RDNA. When AMD advertised their 25% improvement figure over Polaris, I remember they gave a list of games and averaged out the performance increases. This lines up with DF's findings.

And I have been gaming on PC for the last 3 years; I had a GTX 570 in the PS3 gen, and even built one in the PS2 gen, and this is just what my experience has been. There is absolutely no need to defend a port that performs this badly on a 580 or on any other cards, RDNA and Nvidia RTX and Pascal cards included.

You can look at other 580 vs 1060 benchmarks; they run other games at high or max settings at 60 fps. Sometimes over, sometimes under. The worst performing games are Flight Sim and Cyberpunk, and this game is worse than them lol.



Even Horizon which was widely considered a bad port runs at 45 fps on average at max settings.
Death Stranding at max settings runs at 70 fps.
GOW at max settings runs at 26 fps.

I rest my case.

 

Guilty_AI

Member
Even Horizon which was widely considered a bad port runs at 45 fps on average at max settings.
Death Stranding at max settings runs at 70 fps.
GOW at max settings runs at 26 fps.
Many ultra/max settings in PC games are highly unoptimized, often there just because they can be. RDR2 for example had ultra settings that would murder your fps for practically no visual gain.

Its really not just a matter of numbers or how powerful X hardware is; some engines just aren't cut out to do certain types of rendering efficiently, and that seems to be the case for GOW's engine and its shadow mapping. Try to make a large populated open world with the old Source engine and watch your performance tank even on the best of modern hardware.

With a gtx 1060 for example, GOW at max settings runs below 30, but at high it runs at 40+ fps.
 

SlimySnake

Flashless at the Golden Globes
Many ultra/max settings in PC games are highly unoptimized, often there just because they can be. RDR2 for example had ultra settings that would murder your fps for practically no visual gain.

Its really not just a matter of numbers or how powerful X hardware is; some engines just aren't cut out to do certain types of rendering efficiently, and that seems to be the case for GOW's engine and its shadow mapping. Try to make a large populated open world with the old Source engine and watch your performance tank even on the best of modern hardware.

With a gtx 1060 for example, GOW at max settings runs below 30, but at high it runs at 40+ fps.
I think you should read my post again. The video in the post shows several games running at max settings and their average FPS is way higher than the GOW average. GOW is no different than any other game being compared.

Also listed in the comparison is the 1060 which averages 23 fps at max settings.

The same is true for medium or PS original settings, which do not offer the performance we have come to expect from other console games on PC running on medium settings. Calling this port superb or excellent like DF has done here is ridiculous when benchmarks show Horizon performing way better. A game that was eviscerated by the same so-called experts.
 

Guilty_AI

Member
I think you should read my post again. The video in the post shows several games running at max settings and their average FPS is way higher than the GOW average. GOW is no different than any other game being compared.
I said many games, not all. This isn't a black and white thing; there'll naturally be some games whose ultra or max settings won't perform as badly as others. It's highly dependent on the settings the dev makes available to the users, how the engine works, etc. Some games like Witcher 3 have 'hidden' ultra-ultra settings for example, ones that allow you to render grass and shadows at longer distances, and when you turn those up you'll get severe performance losses.

And GOW is different in the sense that it uses a completely different engine, with different things it can do well or not in comparison with others, as well as different in the sense of what the devs have allowed you to tweak. If they had simply not made that ultra setting available and made it so high was the 'max setting', you wouldn't be saying any of that.

The same is true for medium or PS original settings, which do not offer the performance we have come to expect from other console games on PC running on medium settings. Calling this port superb or excellent like DF has done here is ridiculous when benchmarks show Horizon performing way better. A game that was eviscerated by the same so-called experts.
Again, highly dependent on the game. They might all be ex-PlayStation exclusive games, but ultimately they're different games, made by different devs, often running on different engines (among the ports we have UE4, Decima and an in-house engine with no name for GOW).

GOW can run smoothly on the most popular PCs at original and high settings at 40-50 fps, more powerful PCs can scale the resolution and get solid 60fps, that already makes it a fairly solid port - unlike HZD that had extreme frame rate variations and stutters, not to mention crashes, and even physics issues caused by them being tied to frame rate.
 

ACESHIGH

Banned
I think you should read my post again. The video in the post shows several games running at max settings and their average FPS is way higher than the GOW average. GOW is no different than any other game being compared.

Also listed in the comparison is the 1060 which averages 23 fps at max settings.

The same is true for medium or PS original settings, which do not offer the performance we have come to expect from other console games on PC running on medium settings. Calling this port superb or excellent like DF has done here is ridiculous when benchmarks show Horizon performing way better. A game that was eviscerated by the same so-called experts.

Yep, my thoughts exactly. Something smelled funny after seeing the PC specs. They basically said "Meh, let's make sure this runs, PC gamers will brute force the game with their HW"

An FX6300 should easily be running this game at 30 FPS original settings. I compare my PC (FX6300, RX 580 8GB, 256 GB SSD) vs the PS4 Pro/Xbox One X; performance should be in the ballpark, and this was the case through the generation. I can understand something like Cyberpunk, Flight Simulator or a big RTS being too much for the CPU, but a third person action game with a few enemies on screen? This PC can run FH5, a big-ass open world game, at 60 FPS most of the time. All those Ubisoft open world games at 30 FPS as well. God of War, just like Halo Infinite, is a terrible PC port. It does not matter if it runs well on a 3090; it should run well on low end hardware, considering it's designed for 8 year old consoles that were mid range/low end when released.
 

Buggy Loop

Member
But thats literally how IPC gains are measured. They take the same game running on the same specs, but on two different uArch cards and if there is a performance increase on the latest one they know there is an IPC gain. We have seen this with Polaris compared to GCN1.0 and then again when they went to RDNA 1.0. 25% gains in performance every time when clocks and CUs are the same.

Digital Foundry ran some tests and found the IPC gains to be around 40 to 60 percent in games, and in synthetic benchmarks like Firestrike, Timespy and GFX benchmarks they are roughly 45-70% higher going from GCN 1.0 to RDNA. When AMD advertised their 25% improvement figure over Polaris, I remember they gave a list of games and averaged out the performance increases. This lines up with DF's findings.

And I have been gaming on PC for the last 3 years; I had a GTX 570 in the PS3 gen, and even built one in the PS2 gen, and this is just what my experience has been. There is absolutely no need to defend a port that performs this badly on a 580 or on any other cards, RDNA and Nvidia RTX and Pascal cards included.

You can look at other 580 vs 1060 benchmarks; they run other games at high or max settings at 60 fps. Sometimes over, sometimes under. The worst performing games are Flight Sim and Cyberpunk, and this game is worse than them lol.



Even Horizon which was widely considered a bad port runs at 45 fps on average at max settings.
Death Stranding at max settings runs at 70 fps.
GOW at max settings runs at 26 fps.

I rest my case.



What resolution was ComputerBase using for the benchmarks? Just odd that the 1060 vs 580 findings are the opposite of DF's, the 1060 being a good 10 fps lower.
 

SlimySnake

Flashless at the Golden Globes
What resolution was ComputerBase using for the benchmarks? Just odd that the 1060 vs 580 findings are the opposite of DF's, the 1060 being a good 10 fps lower.
They tested all resolutions. I chose their 1080p benchmark in order to compare against the base ps4.
 

ACESHIGH

Banned
GOW can run smoothly on the most popular PCs at original and high settings at 40-50 fps, more powerful PCs can scale the resolution and get solid 60fps, that already makes it a fairly solid port - unlike HZD that had extreme frame rate variations and stutters, not to mention crashes, and even physics issues caused by them being tied to frame rate.

It's in a better state than HZD, Batman Arkham Knight or AC Unity at release, I'll give you that.

But we should not have such low standards with PC ports. Just running at a decent frame rate on popular configs (it does not) and not crashing is not enough. (And the game does stutter thanks to using the ancient DX11 API instead of a more modern one because it was too much work)

As customers we should only care about the end results; how they get there is for them to figure out. They should scratch their heads until they make the game run at 30 fps on PCs that are capable.
Like Santa Monica did to make the game run and look as well as it did on Jaguar cores at 1.6 GHz and a souped up 7850.
 

SlimySnake

Flashless at the Golden Globes
So just for shits and giggles I decided to compare the 2070S's 115 fps average using the PS4 settings (Medium preset) against all of the games currently installed on my PC with an RTX 2080. Now I know every game's medium settings are different, and unlike Horizon and GOW, medium settings in other PC ports don't necessarily translate into console settings, but I just wanted to bench my RTX 2080, which is roughly 5-10% faster than the 2070 Super. That computerbase.de benchmark is using an i9-12900K, which is roughly 20% faster than my i7-11700K, so let's just call it even, since at higher fps the CPU does tend to make a difference.

What surprised me was how similar almost all of the results were. Nearly all games hit 120 fps (my max on my LG CX) with just 65-75% GPU utilization, meaning I could've easily pushed the FPS by a good 30%.

Medium Settings. 1080p.

- Forza - 120 fps with just 65-70% utilization, bumped to around 75% when using High settings.
- Avengers - 120 fps with just 65-70% utilization.
- Kena - 120 fps with 75% utilization.
- Batman Arkham Knight - 90 fps (no option to push it beyond 90) at just 40% utilization. Increased to 45% using high settings.
- Mafia - 140 fps while driving around, the game doesnt cap its framerate to my display and goes up to 170 fps indoors.

Now drumroll for the bad ports.
- Halo Infinite - 90 fps
- Horizon - 105 fps

Horizon was infuriating because it actually scales up just fine till around 60 fps and then it simply stops scaling. You can drop resolution, setting, pick DLSS all you want and it just would not go up to 120 fps. Halo does the same thing.

As a final test, I ran Cyberpunk on medium settings. This game with maxed out ray tracing runs at 6 fps on my PC at 1440p. Turn those off and I was surprised to see it hit 100 fps at medium settings. And unlike the other games that look like shit at 1080p medium, it still looks absolutely stunning.

These GPUs might be mid range today but they are top of the line GPUs for last gen games. And I am used to getting native 4K 60 fps at high, or 1440p 120 fps at a mixture of high-medium settings. I have never had to drop down to 1080p on this card, let alone medium settings, with the exception of Control, but I wanted ray traced debris so something had to give. Now they are telling me I have to settle for 1440p 60 fps on medium PS4 settings? gtfo. That's some Uncharted 4 PS5 port level of bullshit. We were just talking about how the PS5 should be able to run Uncharted at way better than 1440p 60 fps, and well, now we know it's a quick and dirty port.
 

ACESHIGH

Banned
These GPUs might be mid range today but they are top of the line GPUs for last gen games. And I am used to getting native 4K 60 fps at high, or 1440p 120 fps at a mixture of high-medium settings. I have never had to drop down to 1080p on this card, let alone medium settings, with the exception of Control, but I wanted ray traced debris so something had to give. Now they are telling me I have to settle for 1440p 60 fps on medium PS4 settings? gtfo. That's some Uncharted 4 PS5 port level of bullshit. We were just talking about how the PS5 should be able to run Uncharted at way better than 1440p 60 fps, and well, now we know it's a quick and dirty port.

And if I did the same exercise with my PC, most AAA games released this gen should run at 1800p 30 FPS console settings, or whatever settings the Xbox One X uses. Folks think these old cards should be running games at 720p or something because an RX 580 is basically 6 year old tech, but they don't understand how large the gap vs last gen consoles is, especially on the CPU front.
Halo Infinite (SP and MP) is one of the most unoptimized games I have seen. Very disappointing, especially after playing the excellent FH5. That game runs at 60 FPS medium/high settings on almost any CPU you throw at it and it looks great to boot. It shows all the optimization the engine has had since FH3 on PC, which was a CPU hog, maxing one core for the most part.

Halo infinite ancient 4 vs 4 or 6 vs 6 MP matches peg the CPU like a Prime 95 run. And don't get me started on the campaign once you get to the open world. If they are so bad at optimizing PC games now that there's a huge gap vs last gen consoles, things are gonna get real bad once we get next gen only games.
 

SlimySnake

Flashless at the Golden Globes
And if I did the same exercise with my PC, most AAA games released this gen should run at 1800p 30 FPS console settings, or whatever settings the Xbox One X uses. Folks think these old cards should be running games at 720p or something because an RX 580 is basically 6 year old tech, but they don't understand how large the gap vs last gen consoles is, especially on the CPU front.
Halo Infinite (SP and MP) is one of the most unoptimized games I have seen. Very disappointing, especially after playing the excellent FH5. That game runs at 60 FPS medium/high settings on almost any CPU you throw at it and it looks great to boot. It shows all the optimization the engine has had since FH3 on PC, which was a CPU hog, maxing one core for the most part.

Halo infinite ancient 4 vs 4 or 6 vs 6 MP matches peg the CPU like a Prime 95 run. And don't get me started on the campaign once you get to the open world. If they are so bad at optimizing PC games now that there's a huge gap vs last gen consoles, things are gonna get real bad once we get next gen only games.
Yeah, Halo is a really bad port. It's clear that they took the same engine that worked for small corridor areas in Halo 4 and 5 and then tried to make it work in an open world.

I played GOW A LOT on my Pro. Beat the game on NG+ GMGOW difficulty not once but twice. Have maybe 200 hours in the game just doing the Muspelheim and Niflheim challenges over and over again. Almost exclusively on the Pro's 1080p 60 fps version.

Yes, it dropped frames but it really wasnt that bad during action. I know John over at DF made a huge deal about it but I really didnt notice it. DF as always focused on the worst case scenario while NX Gamer ran common sense benchmarks during gameplay and boss fights that showed the game averaged around 56 fps.



This was taken at the end of the first boss fight. Shows the fight averaged 56 fps.

Other sections show a similar fps.



So this 4.2 tflops GPU with an awful jaguar CPU running at just 2.1 ghz with just 6 cores available for gaming with no additional threads is running the game at a better framerate than a GPU thats 44% more powerful with more bandwidth, a frigging 16 core 24 thread CPU that can hit 5.2 ghz? What?

Even the CPU that Alex used for his testing has 6 core 12 threads and goes up to 4.2 ghz. Unable to match PS4 Pro performance.
And yet:

God of War on PC is a simply sensational port

Face-off by Alex Battaglia, Video Producer, Digital Foundry
Updated on 14 January 2022
 

yamaci17

Member
mate what? a 3600 at 4.2 ghz will push a consistent 100+ fps in this game. i don't understand how you're reaching those conclusions or how it supposedly cannot match a ps4 pro

my 3.7 ghz 2700x pushes a consistent 100 fps as long as the scene is not heavily GPU bound.

i just did a 1.1 ghz 6c/6t performance test to check your claims. given that zen+ has roughly 2 times the IPC of Jaguar, i decided 1.1 ghz would be a good point. despite high inter-ccx latency being even more pronounced at such a frequency (inter-ccx latency scales upwards and downwards with CPU frequency in zen/zen+/zen 2 CPUs), I still got 40-50 fps, which was still playable. if this was an Intel CPU at 1.1 ghz, it would perform better for the aforementioned reasons. the stutters here and there are mostly related to the inter-ccx architecture being reliant on higher clocks. it still performed way above my expectations.

start of the video is a bit more laggy than actual gameplay due to the start of recording.




you're looking at this from the wrong direction. hardware does not scale the way you expect it to. this will also be the case with the zen 2 found in consoles. you don't magically get 4 times more FPS with 4 times the single thread performance. this is why games that run at a rock solid 30 fps on ps4 may run at 85-100 fps on ps5. its not necessarily because of backwards compatibility or anything, its just that performance does not scale the way you expect. see, i can get 40-45 fps at 1.1 ghz, yet i'm not getting 150-160 fps at 4 ghz; instead i'm getting 90-110 fps.
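That sublinear scaling can be put into numbers; the fps figures below are the poster's rough observations, not benchmarks:

```python
# CPU clock vs observed fps in GoW (rough figures from the post above).
clock_low_ghz, fps_low = 1.1, 42     # ~40-45 fps observed at 1.1 GHz
clock_high_ghz, fps_high = 4.0, 100  # ~90-110 fps observed at ~4 GHz

clock_ratio = clock_high_ghz / clock_low_ghz  # ~3.6x the clock speed
fps_ratio = fps_high / fps_low                # but only ~2.4x the frame rate
print(f"{clock_ratio:.1f}x clock -> only {fps_ratio:.1f}x fps")
```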

and this example is an extreme case. its a literal modern CPU forced to run at 1.1 ghz and the game is still playable somehow well above 40+ fps. i made my point. my point stands. this port is damn GOOD. i don't know who told you that 4.2 ghz 3600 or that 5.2 ghz intel CPU runs the game poorly. this game is highly playable on what can be considered lowend CPUs.

by the way, i've checked several other videos and the ps4 pro has no business averaging 56 fps. you're also overpraising ps4 pro performance (i don't question your own experience; its not like 40-50 fps is bad, and you may or may not notice it. even in that video I captured, I didn't care about what FPS I got; the game was playable and smooth to me and I beat the valkyrie). in most videos i've seen, the PS4 Pro's performance ranges from 40 to 55, is highly variable, and is nowhere near the constant 60 you claim it to be. but that's not the point anyway, given that I've shown that a console level CPU on desktop provides 40-45 fps, which is not far from the ps4 pro's performance. you could say there's a 30-35% CPU bound performance gap between platforms, which is fantastic given that the ps4/pro uses a specialized API that is specifically tailored for such games. (by the way, correct me if i'm wrong, but the ps4 pro actually uses 7 cores/7 threads for games. i also know that the ps4/pro has an additional ARM CPU that handles various other tasks. so my test is actually invalid and should be done with one extra core lmao, but im not gonna bother any further)

but yeah: https://www.tweaktown.com/news/4868...s4s-seventh-core-boost-performance/index.html

if you want to see bad ports, steam reviews tell you that. more than half of the bad reviews Cyberpunk got were because of its bad optimization. flight simulator is excusable and is not targeted towards casuals anyway. horizon zero dawn, despite being a good game, started off pretty rough with steam reviews because of its horrible optimization and lagginess across the majority of systems. after it was patched, reviews got more positive. gow however started off fantastic at a 90%+ score, not only because it is legit a great game, but also because it performs well for the majority of the PC player base. it practically runs 60+ fps on your average gtx 1660s + i5/r5 build.



finally, sry for noob plays. it has been 3 years since i've played the game.
 

Kenpachii

Member
Yeah, Halo is a really bad port. It's clear that they took the same engine that worked for small corridor areas in Halo 4 and 5 and then tried to make it work in an open world.

I played GOW A LOT on my Pro. Beat the game on NG+ GMGOW difficulty not once but twice. Have maybe 200 hours in the game just doing the Muspelheim and Niflheim challenges over and over again. Almost exclusively on the Pro's 1080p 60 fps version.

Yes, it dropped frames but it really wasnt that bad during action. I know John over at DF made a huge deal about it but I really didnt notice it. DF as always focused on the worst case scenario while NX Gamer ran common sense benchmarks during gameplay and boss fights that showed the game averaged around 56 fps.



This was taken at the end of the first boss fight. Shows the fight averaged 56 fps.

Other sections show a similar fps.



So this 4.2 tflops GPU with an awful jaguar CPU running at just 2.1 ghz with just 6 cores available for gaming with no additional threads is running the game at a better framerate than a GPU thats 44% more powerful with more bandwidth, a frigging 16 core 24 thread CPU that can hit 5.2 ghz? What?

Even the CPU that Alex used for his testing has 6 core 12 threads and goes up to 4.2 ghz. Unable to match PS4 Pro performance.
And yet:

Not sure if serious. The 3600 steamrolls the PS4 CPU and probably beats the PS5 CPU on every front; that cpu will run this game at 140+ fps at 1080p original settings.

Also 580 seems to run this game perfectly fine at 60 fps with FSR on. So again not really sure what the problem is here.

U need to realize that both DF and NX Gamer have preview copies that probably don't include the day 1 patch + the GPU driver update that u get on day one, which makes performance far worse in general. U need to take this into account.
 

GHG

Gold Member
SlimySnake SlimySnake

In all honesty it just looks like the game underperforms slightly on AMD GPUs compared to Nvidia hardware:




And even still, when you look at the example above the performance discrepancy makes sense when you take into account the respective power draws of the two cards.
 

Cryio

Member
It's in a better state than HZD, Batman Arkham Knight or AC Unity at release, I'll give you that.

But we should not have such low standards with PC ports. Just running at a decent frame rate on popular configs (it does not) and not crashing is not enough. (And the game does stutter thanks to using the ancient DX11 API instead of a more modern one because it was too much work)

As customers we should only care about the end results; how they get there is for them to figure out. They should scratch their heads until they make the game run at 30 fps on PCs that are capable.
Like Santa Monica did to make the game run and look as well as it did on Jaguar cores at 1.6 GHz and a souped up 7850.
It's in a better state at release than those 3 games were, but it's in a worse state than those games patched.

The fact the game port is called superb is insanity to me.
 

winjer

Gold Member
This explains the limited performance of this game, compared to consoles.
Using a high level API like DX11 in a game like this was a stupid idea.

 

ACESHIGH

Banned
Also, the 580 seems to run this game perfectly fine at 60 FPS with FSR on. So again, not really sure what the problem is here.

Why would you need FSR to run it at 60 FPS at original settings? The RX 580 should be more than capable of doing it without it. I can understand a few drops from 60 here and there, but the thing is that the game is nowhere close to delivering a locked 60 FPS experience with that card or the GTX 1060. That's how we've slowly been lowering our standards for PC ports. Devs and publishers will use anything to slack off when putting together the PC version of a game and ask the player to brute-force it. DLSS and FSR are perfect for this: "Tetris runs at 45 FPS on my 1080 Ti." Dev: "Just enable FSR and you're good to go, no patches required."

Hopefully the game gets patched to improve performance on AMD cards and CPU utilization. If that happens, it's going to be a great PC version.
 

yamaci17

Member
Wait, people are complaining about being able to lock the game to 120 FPS? How many frames do you need in a single-player game?

it is actually possible, with a 6c/6t 3.7 ghz zen+ CPU. zen 2 and onwards should easily be capable of locking to 144 and above

zmVSDJb.jpg



another video is in order. i forgot to enable the extra cores and SMT, so it's 6 cores / 6 threads at 3.7 ghz

its just that they dont like the gpu performance. i dont either. but i dont understand their fixation with the CPU. somehow the examples they give are GPU bound, yet they still say stuff like "5.2 ghz intel cpu and 4.2 ghz zen 2 cpu cannot match ps4 pro". im unable to comprehend their logic in terms of the game's CPU performance

then again, i've been observing gtx 1060/rx 580 performance for the last 2 years, since one of my close friends has a 1060. this is not the first time the gtx 1060/580 has struggled to push 2x over PS4 at PS4 settings. this is what he gets wrong: he somehow thinks this game is an exception, whereas it is not. for the last 2-3 years, the gtx 1060 has been unable to push a consistent 60 fps in the majority of AAA titles at console-like settings. its a brilliant GPU, but its not what it used to be. i don't know what happens with the rx 580, since it should naturally benefit from GCN ports, but stuff happens I guess. this game could've used async compute support. yet async support wrecks the rx 580's performance in halo infinite. so there is some very different stuff going on for consoles that cannot be translated over to PC

i accepted that consoles simply have a better, more efficient, more optimized API. this will never happen on PC. every part is so separate and desynced. cpu, gpu, ram... everything on its own. consoles? everything is held together with special bonds. maybe if they could comprehend that...

kena, medium, 50-60 fps, just like gow



deathloop medium 45-60 fps, just like gow



re village medium 70-80 fps (ps4 has a 45 fps mode so in his "perfect optimization" world, 1060 should've been able to push 90+ frames)



outriders medium 55-60 fps



rdr 2 similar story. you just have to get along with 50 fps



its simple: for the last 2-3 years, neither the 580 nor the 1060 has been able to push a consistent 60 fps even at console-like medium settings, whether they look bad, good, or console-like. gow is no exception, and this won't change. the only way to get a consistent 60 fps with these GPUs is to brutally murder the graphics to the point where they look worse than the ps4/xbox one versions.

as a matter of fact, GoW at the medium (original) preset LOOKS a million times better than the aforementioned titles that are unable to run at 60 fps on the 1060, yet GoW performs the same as those games do. that's a testament to their SOFTWARE capability. its a brilliant game and this is a brilliant port.
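for anyone wanting to sanity-check the frame-rate caps being discussed, the CPU budget per frame is just 1000 / target fps in milliseconds. a trivial sketch (nothing game-specific assumed, just arithmetic):

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds of CPU/GPU time available per frame at a given fps cap."""
    return 1000.0 / target_fps

# If the game's CPU work per frame finishes inside this budget,
# the cap holds; if not, you get drops regardless of GPU headroom.
for fps in (30, 60, 120, 144):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```

so a 144 fps lock leaves under 7 ms per frame for all CPU-side work, which is why a CPU that "only" manages a locked 120 is still doing its job in well under 8.4 ms.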
 
Last edited:

ZywyPL

Banned
Also, the 580 seems to run this game perfectly fine at 60 FPS with FSR on. So again, not really sure what the problem is here.

FSR is terrible in this game; even at the Ultra Quality setting it blurs the image too much. If you don't have a DLSS-capable GPU, you're better off using the game's integrated resolution scaling, which gives much better IQ for the same framerate. Or, if FSR's image quality doesn't bother you that much, you can use an even lower rendering res with TAA for the same IQ but a much better framerate. Strongly recommend.
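For context on why Ultra Quality still blurs: FSR 1.0's modes correspond to fixed per-axis scale factors (the factors below are AMD's published FSR 1.0 values; the rounding to whole pixels is my assumption), so you can work out the internal render resolution yourself:

```python
# AMD's published FSR 1.0 per-axis scale factors.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Approximate internal render resolution for a given output size and
    FSR mode (nearest-pixel rounding is an assumption, not AMD's spec)."""
    scale = FSR_SCALE[mode]
    return (round(out_w / scale), round(out_h / scale))

print(internal_resolution(1920, 1080, "Ultra Quality"))  # ~1477x831
print(internal_resolution(1920, 1080, "Quality"))        # 1280x720
```

So even Ultra Quality at a 1080p output is upscaling from well under 900p, which is why a plain lower render res plus TAA can look comparable.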
 
Once I saw those "complaints" I tried capping at 120 FPS, and it held what seems like a completely stable lock throughout all of that midget fight. But I made sure not to be GPU bound. How many games let you get a locked 120 FPS? Not many.
 