Starfield PC Performance Thread

Bojji

Member
Alright, what's the best DLSS mod to use? I downloaded the PureDark mod that was posted here earlier today, but I don't have a 40-series card, so I'm good with any DLSS 2 mod.

Also, I don't want to use the neutral lighting mod, but what's the best HDR mod?

The PureDark mod from Nexus is good; it looks better than FSR for sure.

I'm hoping my CPU isn't too weak. I just recently put in an RTX 4070.

What CPU?

Are they gonna release a day one patch to fix the shit performance?

Hahahaha, it will probably take them months, or they'll never do it.
 

CrustyBritches

Gold Member
Low Moral Fibre

Good news! PureDark updated the mod a few times to address some of the issues, including the skybox issue and crashing. I can confirm that the flickering sky/star cluster issue I was having no longer happens. I also didn't have any crashes tonight, even without using the 'Disable in Menu' option. Couldn't be happier. This dude is really going the extra mile to support his mod.

Patch notes:
**Changelog**
*Beta 02 Hotfix1*
1. Fixed image not showing in photo mode.
2. Fixed sky flickering by downgrading the FG version to 3.1.1
**Currently the 'Disable FG only in Loading Menu' option is not working for the XBOX APP version!**
**Switching to All Menus before I fix it!**
---
**Starfield Upscaler FG Build Beta 03**
This mod enables DLSS 3 in Starfield.
**Changelog**
*Beta 03*
1. Improved stability
2. Fixed scope freezing
3. Added the option to disable FG only in the loading screen.
**Installation**
0. Remember to delete d3d12.dll AND dxgi.dll in the game's root folder if you have installed any other version of the mod.
1. Extract to the game's exe folder. If you have been using ReShade, don't overwrite ReShade.ini, so that you can keep your own preset.
2. If you want the Steam Overlay, or your controller is not working, rename d3d12.dll to dxgi.dll.
3. Enable FSR2 and disable Dynamic Resolution in the game settings.
**4. Disable in-game VSync! It might crash when the setting is saved.**
5. Press END to open the menu, then authenticate for the first time. No continuous subscription is needed; if you have subscribed, you can pass the authentication after the subscription expires.
6. Enable DLSS2 and DLSS FG in the mod menu.
7. There are hotkeys for toggling them, and you can change them in mods/StarfieldUpscaler.ini.
---
**Changelog**
**Beta 03 Hotfix1**
1. Fixed the new 'disable FG only in Loading Menu' option not working for the Game Pass version
---
**Changelog**
**Beta 04**
1. Fixed loading freezes due to wrong menu event detection
2. Defaults to disabling FG in All Menus
**Installation**
1. Unchanged; read above.
---
**Changelog**
**Beta 05**
1. Re-added loading menu detection and other things from Beta 02, since it might be the most stable version
2. Added a new Fix Scope View option which, if checked, should fix all scope freezes
3. Fixed the ship menu freezes
**Installation**
1. Unchanged; read above.
 

ZehDon

Gold Member
i9-10900K, RTX 3090, 64GB RAM, installed on a PCIe Gen4 SSD, ultrawide 3440x1440, all settings on Ultra.

Vanilla performance was mostly fine, but FSR2 looks like poop. Interiors ran perfectly and most exteriors were really solid, but the shadow-heavy forests and New Atlantis definitely had noticeable drops.

Using the DLSS2 mod and enabling ReBAR via the Nvidia Profiler has this running pretty much perfectly across the board: it never drops below 60, with G-Sync smoothing it all out. The IQ from DLSS2 is noticeably better than FSR2, too. Hopefully DLSS gets patched in, and ReBAR gets set via a driver update, so we can have this additional performance out of the box. Overall, I'm happy with performance right now, but it's clear there's still optimisation work to be done that would help lower-specced rigs.
 
Last edited:

M1987

Member
i9-10900K, RTX 3090, 64GB RAM, installed on a PCIe Gen4 SSD, ultrawide 3440x1440, all settings on Ultra.

Vanilla performance was mostly fine, but FSR2 looks like poop. Interiors ran perfectly and most exteriors were really solid, but the shadow-heavy forests and New Atlantis definitely had noticeable drops.

Using the DLSS2 mod and enabling ReBAR via the Nvidia Profiler has this running pretty much perfectly across the board: it never drops below 60, with G-Sync smoothing it all out. The IQ from DLSS2 is noticeably better than FSR2, too. Hopefully DLSS gets patched in, and ReBAR gets set via a driver update, so we can have this additional performance out of the box. Overall, I'm happy with performance right now, but it's clear there's still optimisation work to be done that would help lower-specced rigs.
Is using ReBAR via the Nvidia Profiler different from enabling it in the BIOS?
 

ZehDon

Gold Member
You're wasting frames for zero visual benefit. Drop everything to High and enjoy at least 20+ frames for free.
The only setting that's given any meaningful improvement for me was shadows: dropping to High gave a good performance boost. But hitting 60 was the goal for me, so I'm happy enough with that for now.
Is using ReBAR via the Nvidia Profiler different from enabling it in the BIOS?
I'm not sure, sorry. With the profiler, it's enabled on a per-title basis; I imagine the BIOS is a more blanket setting. It's easy enough to set up; there's a great post a couple of pages back with the info.
 

StueyDuck

Member
Man... I'm excited to start tonight after work, but I'm also dreading how shit this game is going to run.

I'm on an i9-10850K and a 3080; sounds like I'm going to be maxing out at 40 fps at 1080p 🤣 What a shit show. I really don't want this to sour my opinion on the game.
 
Last edited:

Virex

Banned
Dorsia?

Nobody goes there anymore.

By the way, next time you are at Tex-Aracana try the pork loin with lime jello... it's to die for.
 

winjer

Gold Member
Using a 5800X3D, DDR4-3800, and a 6800 XT.

After playing the game for a bit, I have to say that performance really sucks.
It wasn't terrible most of the time, and GPU usage was ranging from 80-99% on Ultra with FSR2 at 70%.
But in New Atlantis, performance dropped off a cliff. Even with HUB's optimized settings, it's really bad: probably averaging 60 fps, but with drops into the 40s.
And this level doesn't even look good: graphics are average at best and NPC quality is really lacking. And still, it pushes performance really low.
And I saw GPU usage dropping to as low as 38%. Seriously, wtf is wrong with this crap game engine?

I'm using the PC Game Pass version, which probably has worse performance than the Steam version. In several other games, the Game Pass version would often perform around 5-10% worse.
There aren't stutters like in UE4 games, so that's a positive.
But still, this is probably the worst optimized game of the year.
 
Last edited:

S0ULZB0URNE

Member
LOL, zero optimization for Nvidia.

The best thing is that this is 1080p. This game doesn't have any RT, Lumen (or any form of GI), Nanite, etc. NOTHING justifies this performance; even SSR is absent, just ancient cube map reflections.

What the fuck, Bethesda?
It's 1440p...
 

SlimySnake

Flashless at the Golden Globes
Using a 5800X3D, DDR4-3800, and a 6800 XT.

After playing the game for a bit, I have to say that performance really sucks.
It wasn't terrible most of the time, and GPU usage was ranging from 80-99% on Ultra with FSR2 at 70%.
But in New Atlantis, performance dropped off a cliff. Even with HUB's optimized settings, it's really bad: probably averaging 60 fps, but with drops into the 40s.
And this level doesn't even look good: graphics are average at best and NPC quality is really lacking. And still, it pushes performance really low.
And I saw GPU usage dropping to as low as 38%. Seriously, wtf is wrong with this crap game engine?

I'm using the PC Game Pass version, which probably has worse performance than the Steam version. In several other games, the Game Pass version would often perform around 5-10% worse.
There aren't stutters like in UE4 games, so that's a positive.
But still, this is probably the worst optimized game of the year.
The game favors AMD GPUs and Intel CPUs.

Your 5800X3D is worse than my 11700K in this game and gets roundly trounced by every 7000 series chip.

[CPU benchmark chart]


I would recommend going from Ultra to High. It made a difference for me.
 

winjer

Gold Member
The game favors AMD GPUs and Intel CPUs.

Your 5800X3D is worse than my 11700K in this game and gets roundly trounced by every 7000 series chip.

[CPU benchmark chart]


I would recommend going from Ultra to High. It made a difference for me.

No point in playing this game until it's fixed. I have too many games in my backlog.

I tried a couple of things to see why the performance is so low.
In New Atlantis, I tried looking at the sky, but performance remained almost the same. In other games, performance jumps a lot.
Here, it's like there is no frustum culling, and the game continues to render everything around the player.
I also tried looking at a wall and moving to the side. In most games, that would cull the geometry beyond the wall, and performance would increase. But not here.
 
Man... I'm excited to start tonight after work, but I'm also dreading how shit this game is going to run.

I'm on an i9-10850K and a 3080; sounds like I'm going to be maxing out at 40 fps at 1080p 🤣 What a shit show. I really don't want this to sour my opinion on the game.
Got the same spec, and the frame rate was bad at 4K until I installed the DLSS mod. There's a link in this thread, I think.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
No point in playing this game until it's fixed. I have too many games in my backlog.

I tried a couple of things to see why the performance is so low.
In New Atlantis, I tried looking at the sky, but performance remained almost the same. In other games, performance jumps a lot.
Here, it's like there is no frustum culling, and the game continues to render everything around the player.
I also tried looking at a wall and moving to the side. In most games, that would cull the geometry beyond the wall, and performance would increase. But not here.
Because the game is CPU bound in Atlantis. Frustum culling won't do anything; your GPU load is already at 38%.

But yeah, probably best to hold off.
 

winjer

Gold Member
Because the game is CPU bound in Atlantis. Frustum culling won't do anything; your GPU load is already at 38%.

But yeah, probably best to hold off.

Frustum culling and HZB culling would drastically reduce the amount of unseen geometry and pixels being rendered.
Having a ton of geometry, actors, NPCs, pixels, etc. culled when the rendering pass starts would save a ton of processing power, both on the CPU and GPU.
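Roughly, the idea is this (a toy sketch with invented names, not Creation Engine code; HZB, a hierarchical Z-buffer, additionally rejects objects hidden behind closer geometry):

```python
# Minimal frustum + occlusion culling sketch (illustrative only).
from dataclasses import dataclass

@dataclass
class Plane:
    # Plane equation n·p + d = 0; the positive side counts as "inside".
    nx: float
    ny: float
    nz: float
    d: float

    def distance(self, x: float, y: float, z: float) -> float:
        return self.nx * x + self.ny * y + self.nz * z + self.d

def sphere_visible(planes, cx, cy, cz, radius) -> bool:
    """Cull an object whose bounding sphere is fully outside any of the
    six frustum planes."""
    return all(p.distance(cx, cy, cz) >= -radius for p in planes)

def build_draw_list(objects, frustum, is_occluded):
    draw = []
    for o in objects:
        if not sphere_visible(frustum, *o["center"], o["radius"]):
            continue        # outside the camera's view volume
        if is_occluded(o):
            continue        # HZB-style test: hidden behind a wall
        draw.append(o)      # only these generate draw calls
    return draw
```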
 

SlimySnake

Flashless at the Golden Globes
Frustum culling and HZB culling would drastically reduce the amount of unseen geometry and pixels being rendered.
Having a ton of geometry, actors, NPCs, pixels, etc. culled when the rendering pass starts would save a ton of processing power, both on the CPU and GPU.
From what I understand, the CPU needs to instance each NPC as you load the map and keep track of them at all times. Frustum culling only saves on GPU performance; the NPC logic can't be culled, so to speak. The game has to keep everything instanced the whole time you are there.

I am sure it saves GPU cycles, but your GPU is not even close to being maxed out here. There is only one bottleneck here, and it's the CPU. We can argue whether or not the game is optimized, but it's scaling well with newer Ryzen CPUs and older Intel CPUs. The more powerful your CPU, the better performance you will get in cities.

Here is my run in Atlantis, on a 3080, which is typically on par with the 6800 XT in standard rasterization; here, though, the 6800 XT performs more like a 3090 Ti, roughly 40% better. And yet, because your CPU isn't able to keep up with the demands of rendering a city full of NPCs, you are getting massive drops. Meanwhile, I'm averaging pretty much 48-50 fps with none of the massive 1% stutters and drops we saw on Koboh in Star Wars Jedi: Survivor. It's a steady framerate, dropping only once to 40 fps when I went into the cafe. My GPU is at 98% most of the time. I would've expected 99% if it was fully GPU bottlenecked, but honestly 98% is not bad, and definitely not as bad as your 38%.



[screenshot: New Atlantis benchmark run]
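To put the argument in code: a toy frame loop (all names invented) where per-frame NPC logic runs on the CPU whether or not an NPC is drawn, so shrinking the draw list through culling can't speed up a frame that's limited by the update loop:

```python
import math

class NPC:
    """Hypothetical stand-in for an engine actor."""
    def __init__(self, x: float, y: float):
        self.x, self.y = x, y

    def update_ai(self, dt: float):
        # Schedules/pathing must advance even off screen, or the world
        # would freeze whenever the player looks away.
        self.x += 1.5 * dt

    def on_screen(self, cam_x, cam_y, view_radius=50.0) -> bool:
        return math.hypot(self.x - cam_x, self.y - cam_y) < view_radius

def frame(npcs, cam_x, cam_y, dt):
    for npc in npcs:              # CPU cost: every NPC, every frame
        npc.update_ai(dt)
    # GPU cost: only the culled-down draw list gets rendered.
    return [n for n in npcs if n.on_screen(cam_x, cam_y)]

# If the update loop takes longer than the GPU needs to draw the list,
# the GPU idles and reported GPU usage drops (e.g. to 38%).
```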
 

winjer

Gold Member
From what I understand, the CPU needs to instance each NPC as you load the map and keep track of them at all times. Frustum culling only saves on GPU performance; the NPC logic can't be culled, so to speak. The game has to keep everything instanced the whole time you are there.

I am sure it saves GPU cycles, but your GPU is not even close to being maxed out here. There is only one bottleneck here, and it's the CPU. We can argue whether or not the game is optimized, but it's scaling well with newer Ryzen CPUs and older Intel CPUs. The more powerful your CPU, the better performance you will get in cities.

Here is my run in Atlantis, on a 3080, which is typically on par with the 6800 XT in standard rasterization; here, though, the 6800 XT performs more like a 3090 Ti, roughly 40% better. And yet, because your CPU isn't able to keep up with the demands of rendering a city full of NPCs, you are getting massive drops. Meanwhile, I'm averaging pretty much 48-50 fps with none of the massive 1% stutters and drops we saw on Koboh in Star Wars Jedi: Survivor. It's a steady framerate, dropping only once to 40 fps when I went into the cafe. My GPU is at 98% most of the time. I would've expected 99% if it was fully GPU bottlenecked, but honestly 98% is not bad, and definitely not as bad as your 38%.

Of course NPCs can be culled. They are culled in every game to save on performance.
 
Crazy that I feel 'happy' about not dipping under 30 fps with a 2070S/2700X in New Atlantis at 3440x1440. High settings with the DLSS mod on Performance.
 

SlimySnake

Flashless at the Golden Globes
Of course NPCs can be culled. They are culled in every game to save on performance.
Nope, just their rendering load. The actual CPU logic needs to be there at all times; they are not instantiating a new NPC simulation every time you turn around. That happens when you load into the level.
 
Last edited:

winjer

Gold Member
Nope, just their rendering load. The actual CPU logic needs to be there at all times; they are not instantiating a new NPC simulation every time you turn around.

That's not how games work. You don't need to simulate NPCs that are not visible.

For example, Starfield does have a distance culling system.
Its settings are Ugrids=5 and uExterior=36.
These control the number of cells loaded and rendered around the player.
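As a rough illustration of what such a setting implies (the values above are quoted from the game's ini; the code and constants below are invented), cells are loaded by distance from the player's cell regardless of view direction, which is why looking at the sky changes so little:

```python
CELL_SIZE = 128.0   # world units per exterior cell (made-up value)
UGRIDS = 5          # load a 5x5 block of cells centred on the player

def cells_to_load(player_x: float, player_y: float) -> set:
    """Every cell within the uGrids square stays loaded and simulated."""
    cx, cy = int(player_x // CELL_SIZE), int(player_y // CELL_SIZE)
    half = UGRIDS // 2
    return {(cx + dx, cy + dy)
            for dx in range(-half, half + 1)
            for dy in range(-half, half + 1)}

# Distance culling happens at the cell level: cells outside this set are
# unloaded, but everything inside it is live whether you look at it or not.
```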
 
Last edited:
The DLSS Frame Gen mod (the Nexus one) keeps crashing, so I have stopped using it. When it works, it works well, but the crashing is not good. I may try the paid-for one if it's not fixed.

*Edit: Set the framerate in the config file to 120 and all seems well now (just in case anyone is having the same problem).
 
Last edited:

Xcell Miguel

Gold Member
The DLSS Frame Gen mod (the Nexus one) keeps crashing, so I have stopped using it. When it works, it works well, but the crashing is not good. I may try the paid-for one if it's not fixed.
It's crashing for a lot of users; the modder is on it, and he posts updates regularly in the "Posts" tab of the mod.

I went back to his previous mod, the DLSS2 one. I don't really need Frame Gen in the end; with G-Sync the game is really smooth, and in big cities I get 55-80 FPS on Ultra.
 

Mr Hyde

Member
I have a Radeon 6850 XT and I'm running the game at 1080p with FSR enabled and a mix of high and medium settings. I'm getting about 65-75 fps indoors and about 35-45 outdoors. I was under the impression I could do a stable 60 fps with this setup, but it fluctuates like crazy, which makes me worried that performance will tank in cities. Does anyone have recommended settings for more stable framerates?

I think it's strange that Bethesda doesn't let us cap the framerate; I would much rather have a locked 30 than the framerate fluctuating anywhere from 35 to 75. Also, the picture looks kind of washed out: very bright and not so good looking. I tried reducing the brightness, and even adjusting the settings on my TV, but it still doesn't look good. I'd like a bit more color to it.
 

SlimySnake

Flashless at the Golden Globes
That's not how games work. You don't need to simulate NPCs that are not visible.

For example, Starfield does have a distance culling system.
Its settings are Ugrids=5 and uExterior=36.
These control the number of cells loaded and rendered around the player.
Yes, the game streams in NPCs as you go, but just because you look up at the sky, it is not going to stop the CPU from doing whatever it needs to do to keep the game logic running… be it NPCs, physics or whatever else the CPU is doing in Bethesda games.

If anything, that setting shows that they did attempt to optimize it, but CPU bottlenecks simply can't be removed by frustum culling.

How can a AAA studio not know this, or not care? Those performance figures in the graph above are diabolical.
Diabolical how? Aside from the Nvidia cards not performing as well as AMD cards, the performance is scaling well with better GPUs and CPUs. This isn't the first game to favor AMD cards, and it's possible that the game was designed around Xbox hardware.

The game is pushing a lot of NPCs, fancy physics, and gorgeous lighting and asset quality in spots. Of course it's going to be heavy on the GPU and CPU. You guys can't continue to expect last-gen performance out of your cards. The Xbox runs this at 1440p 30 fps on a 12 tflops GPU; you will need 24 tflops to run this at 60 fps at the same resolution and settings, on top of a CPU that can handle the big CPU load of a game like this. It's hardly diabolical, and it only seems out of the ordinary because we have gone 3 years without doing anything remotely taxing with GPUs and CPUs beyond forced ray tracing.
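For what it's worth, that 24 tflops figure is just linear scaling of the post's own numbers, which real GPUs only approximate:

```python
# Back-of-the-envelope scaling behind the "24 tflops for 60 fps" claim.
# Assumes fps scales linearly with GPU throughput, which is optimistic.
xbox_tflops, xbox_fps, target_fps = 12.0, 30.0, 60.0
needed = xbox_tflops * (target_fps / xbox_fps)
print(f"~{needed:.0f} tflops at the same resolution and settings")  # ~24
```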
 

Elysium44

Banned
Diabolical how? Aside from the Nvidia cards not performing as well as AMD cards, the performance is scaling well with better GPUs and CPUs. This isn't the first game to favor AMD cards, and it's possible that the game was designed around Xbox hardware.

The game is pushing a lot of NPCs, fancy physics, and gorgeous lighting and asset quality in spots. Of course it's going to be heavy on the GPU and CPU. You guys can't continue to expect last-gen performance out of your cards. The Xbox runs this at 1440p 30 fps on a 12 tflops GPU; you will need 24 tflops to run this at 60 fps at the same resolution and settings, on top of a CPU that can handle the big CPU load of a game like this. It's hardly diabolical, and it only seems out of the ordinary because we have gone 3 years without doing anything remotely taxing with GPUs and CPUs beyond forced ray tracing.

Doing 20-30 fps, on cards faster than what most people on Steam have, to render a basic-looking game like this is obviously silly. You think it looks good; that's your opinion. I and others think it looks unremarkable relative to its very high system requirements. It isn't doing anything amazing, it's just working very, very inefficiently.
 
Doing 20-30 fps, on cards faster than what most people on Steam have, to render a basic-looking game like this is obviously silly. You think it looks good; that's your opinion. I and others think it looks unremarkable relative to its very high system requirements. It isn't doing anything amazing, it's just working very, very inefficiently.

This is not a basic-looking game. Y'all are kidding yourselves.

I hate that people simultaneously whine about games not looking next-gen enough, while at the same time whining about games trying to push the bar (and therefore requiring more capable hardware).
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Doing 20-30 fps, on cards faster than what most people on Steam have, to render a basic-looking game like this is obviously silly. You think it looks good; that's your opinion. I and others think it looks unremarkable relative to its very high system requirements. It isn't doing anything amazing, it's just working very, very inefficiently.
I'm sorry, but that's nonsense. Most people are rocking a 1060 on Steam, a 4.5 tflops card. No one in their right mind should be expecting devs to hold themselves back to appease those users.

No one is asking you to admit that the game looks good, but it isn't doing Fallout 4 graphics at these framerates. There is a clear leap in visual fidelity over their last-gen efforts, and it is bound to come at a cost.
 

Elysium44

Banned
It's doing shitty fps on a 13th-gen i9 CPU and on GPUs that 95% of the market has, which are not remotely outdated. People defending this are embarrassing themselves. Look at the chart above: even the recommended RTX 2080 Ti can barely crack 30 fps, even when paired with that top-of-the-line CPU.
 
Last edited:

blastprocessor

The Amiga Brotherhood
Nvidia 4070 with an AMD 7700, high detail: I tend to see over 60 fps at 1440p native indoors, with lows of 47 fps outside. G-Sync really helps.
 
Last edited:

winjer

Gold Member
Yes, the game streams in NPCs as you go, but just because you look up at the sky, it is not going to stop the CPU from doing whatever it needs to do to keep the game logic running… be it NPCs, physics or whatever else the CPU is doing in Bethesda games.

If anything, that setting shows that they did attempt to optimize it, but CPU bottlenecks simply can't be removed by frustum culling.

The only thing the game has to do is keep track of the important NPCs, and that is just a pointer to their position.
There is no reason to render an NPC, or any object or geometry, that is not in the player's field of view.
Proper game engines cull both occluded geometry and geometry outside the frustum.
 

GymWolf

Member
Yes, the game streams in NPCs as you go, but just because you look up at the sky, it is not going to stop the CPU from doing whatever it needs to do to keep the game logic running… be it NPCs, physics or whatever else the CPU is doing in Bethesda games.

If anything, that setting shows that they did attempt to optimize it, but CPU bottlenecks simply can't be removed by frustum culling.


Diabolical how? Aside from the Nvidia cards not performing as well as AMD cards, the performance is scaling well with better GPUs and CPUs. This isn't the first game to favor AMD cards, and it's possible that the game was designed around Xbox hardware.

The game is pushing a lot of NPCs, fancy physics, and gorgeous lighting and asset quality in spots. Of course it's going to be heavy on the GPU and CPU. You guys can't continue to expect last-gen performance out of your cards. The Xbox runs this at 1440p 30 fps on a 12 tflops GPU; you will need 24 tflops to run this at 60 fps at the same resolution and settings, on top of a CPU that can handle the big CPU load of a game like this. It's hardly diabolical, and it only seems out of the ordinary because we have gone 3 years without doing anything remotely taxing with GPUs and CPUs beyond forced ray tracing.
A lot of NPCs... fancy physics... are we playing the same game?

Dude, the gif with the potatoes, or the one where he suspends a lot of objects in the air, is just basic ragdoll physics, nothing more; the number of objects does not make it fancier in any way, shape or form.

The NPCs are there in good numbers, but absolutely no more than in other big open worlds with cities, and their AI/animation system/environmental interactions are PS2-tier...

They are like zombies, and we have had a lot of games full of zombies in the streets... (except the zombies in Dying Light 1 have animations and interactions light years better than Starfield's)


[gif: "Nextgen"]
 
Last edited:
I'm struggling, and I know my specs are on the outdated side, but the slowdown is immense in this game. Honestly, I think I'm done with it until it gets patched or the GOTY edition comes, or better yet, until I upgrade my system. Ugh.

Ryzen 7 3700x
16 GB DDR4
RTX 2060
 
Last edited:

SlimySnake

Flashless at the Golden Globes
This is not a basic-looking game. Y'all are kidding yourselves.

I hate that people simultaneously whine about games not looking next-gen enough, while at the same time whining about games trying to push the bar (and therefore requiring more capable hardware).
It's so idiotic and completely delusional. I feel like gamers are losing their minds. I get that we have had a very long cross-gen period with some very shoddy ports this year, but this ain't a Hogwarts Legacy or TLOU Part 1 or Star Wars situation. This game is actually doing shit that's well beyond what their previous games were doing. TLOU running like shit while looking fucking identical to TLOU2 didn't make sense. This looks nothing like Fallout 4 or Fallout 76. I can even get behind calling the Matrix demo unoptimized, because UE5 is CPU bound with poor multithreading. But that ain't the case here.

Expecting games to look better while running on last-gen hardware is beyond bizarre; it makes me question just how smart these gamers really are. Better graphics = higher graphics requirements. It's PC gaming 101. This is a next-gen game running at 1440p 30 fps, sometimes sub-30 fps, on the most powerful console this gen. Expecting 60 fps on GPUs half or one third as powerful is absolutely idiotic.
A lot of NPCs... fancy physics... are we playing the same game?

Dude, the gif with the potatoes, or the one where he suspends a lot of objects in the air, is just basic ragdoll physics, nothing more; the number of objects does not make it fancier in any way, shape or form.
lol, you can't give examples of a game utilizing the CPU and then use them to claim it is doing nothing with the CPU. We are discussing why the CPU utilization is high; I couldn't care less whether it's next gen or last gen. I am explaining why we are seeing such high CPU usage.

Judging this game's NPCs against other open-world action-adventure games like Spider-Man is foolish. Bethesda games aren't just RPGs; they are doing far more under the hood than other RPGs like Mass Effect, let alone basic-ass games like Spider-Man. If this game had 10% CPU utilization then fine, I'd concede it's poorly optimized, but it's regularly above 70% in those cities. It's not sitting there doing nothing like we saw in Star Wars. The CPU is always pegged.

The only thing the game has to do is keep track of the important NPCs, and that is just a pointer to their position.
There is no reason to render an NPC, or any object or geometry, that is not in the player's field of view.
Proper game engines cull both occluded geometry and geometry outside the frustum.
Again, you keep confusing rendering with what the CPU is doing. Rendering stuff like geometry is completely different from having the CPU track NPCs, and no, it will need to track everyone, because when you look down from the sky you would expect an NPC to have continued walking his path and to be in a different place than he was before. Whereas a plant, or a rich geometric asset like a fountain or statue, will always be in the same spot, so the GPU can simply forget about it when it goes out of view and load it back in when you look at it again.

Yes, the GPU won't have to actually render those NPCs or those sandwiches, but the CPU has to keep track of them. Maybe it's poorly optimized in the backend and their code can be made better, but it's not like the CPU is sitting there idle like we see in UE5 games and Star Wars. The CPU is doing something, and GOOD CPUs are able to handle it just fine at 60 fps. You just bought the wrong CPU, one that didn't age. Blame AMD. Their latest lineup is destroying that CPU, so clearly the 5800X3D was trash, seeing how it has aged so poorly while its Intel equivalents haven't.
 

GymWolf

Member
Their NPC system does jack shit under the hood, especially in Starfield, where NPCs are completely brain-dead and their reactions even simpler than in their past games. If their internal tech does much more, it is not shown in-game AT ALL.

It's probably their engine, sucking ass as usual, that can't handle a bigger scope.

Not everything that is heavy has a gameplay reason for it; unoptimized games/tech EXIST.

The video with a lot of potatoes is gamers creating an unnatural situation that you never encounter in the game. There is never an instance where you move all those objects at once, not even when you throw a grenade in a room full of objects (they just move around with basic Havok, no damage or anything), so the CPU is never stressed that way. I didn't give you the reason why the CPU is stressed; I just showed you the extent of Starfield's incredible physics... something an X360 could already do in Oblivion.
 
Last edited:

winjer

Gold Member
Again, you keep confusing rendering with what the CPU is doing. Rendering stuff like geometry is completely different from having the CPU track NPCs, and no, it will need to track everyone, because when you look down from the sky you would expect an NPC to have continued walking his path and to be in a different place than he was before. Whereas a plant, or a rich geometric asset like a fountain or statue, will always be in the same spot, so the GPU can simply forget about it when it goes out of view and load it back in when you look at it again.

Yes, the GPU won't have to actually render those NPCs or those sandwiches, but the CPU has to keep track of them. Maybe it's poorly optimized in the backend and their code can be made better, but it's not like the CPU is sitting there idle like we see in UE5 games and Star Wars. The CPU is doing something, and GOOD CPUs are able to handle it just fine at 60 fps. You just bought the wrong CPU, one that didn't age. Blame AMD. Their latest lineup is destroying that CPU, so clearly the 5800X3D was trash, seeing how it has aged so poorly while its Intel equivalents haven't.

I'm talking about culling geometry and pixels in the early part of the pipeline. That is the biggest performance hog in games.
Tracking the position of an NPC is easy stuff in comparison.
Most important NPCs have a fixed position or a set path to walk, but these are only calculated when the player spawns nearby, in this case inside a ugrid.
The rest of the NPCs are just randomized; they don't need tracking.

You keep insisting that culling techniques that have been used in many game engines don't exist.
The reality is that almost all modern game engines do these things to save on performance, but Starfield seems not to.

And the 5800X3D performs very well in all games; Starfield is one of the rare exceptions, and the reason is that it's coded by a company that has always been very bad at this job.
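A sketch of the simulation model being described here (names invented; real engines layer several such tiers): actors in loaded cells get full per-frame updates, important NPCs elsewhere are re-placed from their schedule on demand, and ambient NPCs cost nothing until their cell loads:

```python
from dataclasses import dataclass

@dataclass
class Actor:
    cell: tuple           # which world cell the actor occupies
    quest_relevant: bool  # named NPC with a schedule vs. ambient filler
    pos: float = 0.0
    speed: float = 1.5

def schedule_position(actor: Actor, world_time: float) -> float:
    # Cheap tier: derive "where they'd be by now" from the schedule,
    # with no per-frame AI or physics cost.
    return (actor.speed * world_time) % 100.0

def update_actors(actors, loaded_cells, world_time, dt):
    for a in actors:
        if a.cell in loaded_cells:
            a.pos += a.speed * dt                     # full simulation tier
        elif a.quest_relevant:
            a.pos = schedule_position(a, world_time)  # catch-up tier
        # ambient actors outside loaded cells get no CPU time at all
```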
 

T4keD0wN

Member
Yes, the game streams in NPCs as you go, but just because you look up at the sky, it is not going to stop the CPU from doing whatever it needs to do to keep the game logic running… be it NPCs, physics or whatever else the CPU is doing in Bethesda games.

If anything, that setting shows that they did attempt to optimize it, but CPU bottlenecks simply can't be removed by frustum culling.


Diabolical how? Aside from the Nvidia cards not performing as well as AMD cards, the performance is scaling well with better GPUs and CPUs. This isn't the first game to favor AMD cards, and it's possible that the game was designed around Xbox hardware.

The game is pushing a lot of NPCs, fancy physics, and gorgeous lighting and asset quality in spots. Of course it's going to be heavy on the GPU and CPU. You guys can't continue to expect last-gen performance out of your cards. The Xbox runs this at 1440p 30 fps on a 12 tflops GPU; you will need 24 tflops to run this at 60 fps at the same resolution and settings, on top of a CPU that can handle the big CPU load of a game like this. It's hardly diabolical, and it only seems out of the ordinary because we have gone 3 years without doing anything remotely taxing with GPUs and CPUs beyond forced ray tracing.
It's so idiotic and completely delusional. I feel like gamers are losing their minds. I get that we have had a very long cross-gen period with some very shoddy ports this year, but this ain't a Hogwarts Legacy or TLOU Part 1 or Star Wars situation. This game is actually doing shit that's well beyond what their previous games were doing. TLOU running like shit while looking fucking identical to TLOU2 didn't make sense. This looks nothing like Fallout 4 or Fallout 76. I can even get behind calling the Matrix demo unoptimized, because UE5 is CPU bound with poor multithreading. But that ain't the case here.

Expecting games to look better while running on last-gen hardware is beyond bizarre; it makes me question just how smart these gamers really are. Better graphics = higher graphics requirements. It's PC gaming 101. This is a next-gen game running at 1440p 30 fps, sometimes sub-30 fps, on the most powerful console this gen. Expecting 60 fps on GPUs half or one third as powerful is absolutely idiotic.

lol, you can't give examples of a game utilizing the CPU and then use them to claim it is doing nothing with the CPU. We are discussing why the CPU utilization is high; I couldn't care less whether it's next gen or last gen. I am explaining why we are seeing such high CPU usage.

Judging this game's NPCs against other open-world action-adventure games like Spider-Man is foolish. Bethesda games aren't just RPGs; they are doing far more under the hood than other RPGs like Mass Effect, let alone basic-ass games like Spider-Man. If this game had 10% CPU utilization then fine, I'd concede it's poorly optimized, but it's regularly above 70% in those cities. It's not sitting there doing nothing like we saw in Star Wars. The CPU is always pegged.


Again, you keep confusing rendering with what the CPU is doing. Rendering stuff like geometry is completely different from having the CPU track NPCs, and no, it will need to track everyone, because when you look down from the sky you would expect an NPC to have continued walking his path and to be in a different place than he was before. Whereas a plant, or a rich geometric asset like a fountain or statue, will always be in the same spot, so the GPU can simply forget about it when it goes out of view and load it back in when you look at it again.

Yes, the GPU won't have to actually render those NPCs or those sandwiches, but the CPU has to keep track of them. Maybe it's poorly optimized in the backend and their code can be made better, but it's not like the CPU is sitting there idle like we see in UE5 games and Star Wars. The CPU is doing something, and GOOD CPUs are able to handle it just fine at 60 fps. You just bought the wrong CPU, one that didn't age. Blame AMD. Their latest lineup is destroying that CPU, so clearly the 5800X3D was trash, seeing how it has aged so poorly while its Intel equivalents haven't.
While I think Starfield is a very good looking game, there are better looking or comparable games which perform way better than Starfield while having the disadvantage of not being instanced and being fully open world. The quality of visuals and performance varies pretty wildly (in my observation, very dependent on the lighting), but the game is very clearly poorly GPU optimized: I get 99% GPU usage while the card pulls way less power (~30% less) than in most other games under full load. I had a 3070 Ti using 126W on one planet while sitting at 99% usage, when it would usually be above 200W.

Curiously, my performance is consistently at its worst when flying a spaceship, which is where I would expect it to run way better than in those (imo impressive) cities, but it's either the same or even worse for me.
It's bizarre that even in relatively empty locations (space, lol), where there is nearly nothing going on on screen besides the ship, it still runs poorly. I am on Alder Lake, so I am pretty much GPU bound 100% of the time, even in cities, but this is far from well GPU optimized.

(I have no complaints on the CPU side of things, which also seems abnormal, but I think those demands are at least somewhat justified.)
 
Last edited:
Guys, I watched a tutorial on how to install the DLSS2 mod and followed it correctly, but to bring up the menu to turn DLSS on, it says to hit the End key, and my keyboard doesn't have one. Is there another way to bring it up?

PureDark mod, btw.
I had another issue, and while I was reading through the posts on that mod, someone mentioned your same issue. There is a way to remap the "End" key to something else in the mod configuration file. Can't recall exactly how off-hand. Maybe you can find the post I'm talking about on Nexus.
 

SlimySnake

Flashless at the Golden Globes
I'm talking about culling geometry and pixels in the early part of the pipeline. That is the biggest performance hog in games.
Tracking the position of an NPC is easy stuff in comparison.
Most important NPCs have a fixed position or a set path to walk, but these are only calculated when the player spawns nearby, in this case inside a ugrid.
The rest of the NPCs are just randomized; they don't need tracking.

You keep insisting that culling techniques that have been used in many game engines don't exist.
The reality is that almost all modern game engines do these things to save on performance, but Starfield seems not to.

And the 5800X3D performs very well in all games; Starfield is one of the rare exceptions, and the reason is that it's coded by a company that has always been very bad at this job.
Didn't you say in your original post that the game was performing well indoors and even outdoors, until you got to Atlantis and it started dropping below 40 fps? So what do you think is different? It's obviously the NPCs and whatever else their engine needs to keep track of in big cities: vendors, quests, whatever. That's why you see GPU usage drop to 38% when in the open world it probably sits around 99%. In the cities there are more CPU tasks the engine needs to do.

If anything, I am pissed that they downgraded the game's outdoor areas so heavily. The comparisons I've seen against the 2022 demo show a clear downgrade in terrain geometry, textures, and draw distance. Lighting is more or less the same, but I am the complete opposite of you, in that I want them to push the visuals more even if it comes at the expense of my 3080 not being able to run it at 4K DLSS 60 fps. We shouldn't be chastising devs for trying to push visual fidelity.

P.S. I recently went back to play The Witcher 3 on PC, and that game's remaster has several settings related to NPCs. My CPU simply can't handle it: it goes from 60-80 fps in the wilderness to 30-40 fps in cities. I checked the settings, and they really upped the NPC count in the remaster. Lowering the setting reduces the CPU load, which lets my framerate go up. The same thing is happening here. They could probably add or remove more NPCs than they do with the crowd density setting, because in The Witcher going from Ultra+ to Ultra or High is a massive 50% reduction in NPC count; here it hardly feels different.
 