
DF Direct Special dedicated to Starfield

JimboJones

Member
Intentionally targeting DF to "make them pro Xbox" and yes, I have receipts and have proved to the authorities here that Xbox brass made that direct statement I quoted

Yeah, but we need receipts to show that they are pro Xbox. I'm sure every publisher wants DF to be pro whatever they're selling.
Healthy skepticism is good, but what some people are accusing them of needs much harder evidence than that.
 

Draugoth

Gold Member
I'm not watching it, I just read the summaries on here lmao

TL;DR:
  • Running at 4K with FSR 2 in Quality mode, the game had mixed results. In enclosed areas, such as laboratories, it reached 60fps. In space, Starfield varied between 35fps and 50fps. In the rest of the game, such as on planet surfaces, it sat between 30fps and 40fps. (Quick internal-resolution math below.)
  • Considering the game was built to run at 30fps, it was expected that it would not reach 60fps at the current quality settings. The channel then lowered the game's resolution while keeping the same graphical settings.
  • They managed to hold 60fps for much longer, with a worst case of around 40fps. The highlights are Neon city and space combat, both running at a locked 60fps.
  • In the end, while the game can't consistently maintain 60fps, Digital Foundry argues there's room for a 40fps mode for 120Hz screens, or even an unlocked framerate mode for players with VRR displays.
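(For context on the numbers above: FSR 2's Quality preset renders at roughly 1/1.5 of the output resolution per axis, so a "4K" Quality-mode test is really about 1440p internal. A quick sketch of that arithmetic; the helper name is just illustrative, the preset factor is AMD's published one.)

# Back-of-envelope: FSR 2 Quality renders at 1/1.5 of output resolution per axis.
def fsr2_internal_resolution(out_w, out_h, per_axis_factor=1.5):
    return round(out_w / per_axis_factor), round(out_h / per_axis_factor)

print(fsr2_internal_resolution(3840, 2160))  # -> (2560, 1440), i.e. ~1440p internal at "4K"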
 

SlimySnake

Flashless at the Golden Globes
DF has become the game developer defense force as of late. They claimed Horizon FW was a technical masterpiece when it took the developers months to polish the performance mode, and tried to excuse the latest Star Wars game's performance mode as too next gen for players to expect a solid framerate. I expect them to continue giving game developers the benefit of the doubt to keep their access.
I have only seen the first 25 minutes or so but I don't see much developer defending here. They acknowledge the fact that this is a CPU heavy game and even brought receipts to show the XSX-equivalent GPU simply being bottlenecked by the XSX CPU. It is what it is. The game is simply too heavy to run at 60 fps in all scenarios.

And 30 fps is a solid framerate. 99.9% of GOTY winners have been 30 fps on consoles. I just played FF16 at 30 fps and it felt smooth as butter. The fact that they had to drop to 720p to hit 60 fps is proof that these consoles are the problem, not the developers. Maybe DF could be more critical of MS and Sony, but honestly I would not blame the developers for this.

There is a time and a place for criticism. Blame developers for trash like Gotham Knights, Star Wars and TLOU Part 1 on PC. Which they did. Loudly. But blaming devs for the fact that in order to hit 60 fps in these next gen games you need to go all the way down to 720p, like Immortals, FF16 and Remnant 2? Eh, that's not on the devs.

I will watch the rest of the video before I comment, but one thing I didn't like was them mentioning then ignoring the fact that the game only uses 5 GB of VRAM. Clearly not pushing textures and geometry. It is clearly a Series S limitation which is affecting both the XSX and PC versions. Why? Textures are essentially free as long as you have enough VRAM, so give us better textures if you can't find enough GPU headroom for better geometry and level of detail. And they do have more headroom in the open world, as the framerates can go up drastically when you step out of the busy streets and into smaller indoor levels.

In an age where we have 16-24 GB of VRAM on GPUs, a 2023 game maxing out at 5 GB should piss off any tech channel.
 

Denton

Member
Interesting video (especially the part where they test a PC based on the Xbox CPU and PS5 GPU, and the DLSS comparison part), so leave it to some GAFers to keep whining about it for pages. Some of you people really are dumbass morons.
 
u mean 85 ?

Well an 85 is still a good score. I've played games with that rating and they were pretty good.
 

zeldaring

Banned
Let’s run that through the Neogaf translator:

I have absolutely zero evidence that they are paid by Nvidia, but I definitely believe they are.


Jeez, why do you think so many people thought that FSR 3 and DLSS were gonna give a 100% boost? Because of the way DF advertised them as the second coming, and never really mentioned any of the negatives until this video.
 
Well an 85 is still a good score. I've played games with that rating and they were pretty good.
Goes to show how inconsistent and downright wrong reviews are. This is a prime example of significantly underscoring a game. To put it into context, this is the same score as The Outer Worlds! A decent game, but Starfield is literally 10 times better, go figure 🤷
 
Goes to show how inconsistent and downright wrong reviews are. This is a prime example of significantly underscoring a game. To put it into context, this is the same score as The Outer Worlds! A decent game, but Starfield is literally 10 times better, go figure 🤷

I wouldn't say the reviews are wrong because they scored the game below a 9.0. You really can't expect everyone to score it above a 90. Some won't enjoy the game as much and they are not trying to trash or bring down the Xbox brand.
 
I wouldn't say the reviews are wrong because they scored the game below a 9.0. You really can't expect everyone to score it above a 90. Some won't enjoy the game as much and they are not trying to trash or bring down the Xbox brand.
Explain The Outer Worlds score then? Similar game, same genre? Doesn't make sense, does it?
 
Not really, you can't expect everyone to rate it over a 9. Some will like it more and some will like it less. No conspiracy here.
I never said anything about rating it over a 9, I just pointed out it's currently rated the same as The Outer Worlds, which is absurd, but I'm not losing any sleep over it.
 
I never said anything about rating it over a 9, I just pointed out it's currently rated the same as The Outer Worlds, which is absurd, but I'm not losing any sleep over it.

I don't think that's an issue really. Nothing to worry about since it's still a solid title. Unless you're worried about it, that is, but I'm not.
 
Why are you guys making fun of Digital Foundry? This is only their 4th Starfield video. I'm expecting at least 6 more from them by Sunday night! :messenger_tears_of_joy:
 

adamsapple

Or is it just one of Phil's balls in my throat?
Solid video with caveats.

Hopefully they're able to add a 40hz mode with future patches.
 

TrueGrime

Member
Solid video with caveats.

Hopefully they're able to add a 40hz mode with future patches.

I'm more than certain that mods will give it a 60 FPS mode on consoles (And DF did a nice video on that as well for Skyrim).

In any case, I've got 23 hours in the game and haven't even done the main quest yet, so back to it!
 

Bojji

Member
I have only seen the first 25 minutes or so but I don't see much developer defending here. They acknowledge the fact that this is a CPU heavy game and even brought receipts to show the XSX-equivalent GPU simply being bottlenecked by the XSX CPU. It is what it is. The game is simply too heavy to run at 60 fps in all scenarios.

And 30 fps is a solid framerate. 99.9% of GOTY winners have been 30 fps on consoles. I just played FF16 at 30 fps and it felt smooth as butter. The fact that they had to drop to 720p to hit 60 fps is proof that these consoles are the problem, not the developers. Maybe DF could be more critical of MS and Sony, but honestly I would not blame the developers for this.

There is a time and a place for criticism. Blame developers for trash like Gotham Knights, Star Wars and TLOU Part 1 on PC. Which they did. Loudly. But blaming devs for the fact that in order to hit 60 fps in these next gen games you need to go all the way down to 720p, like Immortals, FF16 and Remnant 2? Eh, that's not on the devs.

I will watch the rest of the video before I comment, but one thing I didn't like was them mentioning then ignoring the fact that the game only uses 5 GB of VRAM. Clearly not pushing textures and geometry. It is clearly a Series S limitation which is affecting both the XSX and PC versions. Why? Textures are essentially free as long as you have enough VRAM, so give us better textures if you can't find enough GPU headroom for better geometry and level of detail. And they do have more headroom in the open world, as the framerates can go up drastically when you step out of the busy streets and into smaller indoor levels.

In an age where we have 16-24 GB of VRAM on GPUs, a 2023 game maxing out at 5 GB should piss off any tech channel.

I can agree that UE5 games at least are doing something heavy in the tech department: Nanite, Lumen or both.

Funnily enough, Gotham Knights has ray traced reflections all the time and the game is NATIVE 4K most of the time, so it's pushing stuff; 30fps is justified from a tech perspective.

TLoU Part 1 works great on PS5, so only the PC port was a mess at launch; the game also looks better than 90% of games released right now.

But FFXVI and Starfield... Both of those games are not pushing any heavy tech: no RT, no advanced real-time GI, nothing. Both of them are unnecessarily heavy and completely unoptimized. FFXVI, aside from some cutscenes, also looks like a PS4 game; at least Starfield has high quality objects in closed locations and great materials.

And I agree, Starfield was fucking downgraded to fit on the Series S. They could easily have made two sets of textures, one for each version; they probably have GBs of unused memory on the SX.

In conclusion, developers are the problem. After COVID the majority of games launch in an unfinished state with terrible or no optimization. The Zen CPUs in the consoles are a few times faster than the last gen Jaguars, and developers were making really fucking beautiful games on those machines with such constraints; now they make games that look barely better and require a fucking 13900K to run above 60FPS. Something went wrong.
 

SlimySnake

Flashless at the Golden Globes
I can agree that UE5 games at least are doing something heavy in the tech department: Nanite, Lumen or both.

Funnily enough, Gotham Knights has ray traced reflections all the time and the game is NATIVE 4K most of the time, so it's pushing stuff; 30fps is justified from a tech perspective.

TLoU Part 1 works great on PS5, so only the PC port was a mess at launch; the game also looks better than 90% of games released right now.

But FFXVI and Starfield... Both of those games are not pushing any heavy tech: no RT, no advanced real-time GI, nothing. Both of them are unnecessarily heavy and completely unoptimized. FFXVI, aside from some cutscenes, also looks like a PS4 game; at least Starfield has high quality objects in closed locations and great materials.

And I agree, Starfield was fucking downgraded to fit on the Series S. They could easily have made two sets of textures, one for each version; they probably have GBs of unused memory on the SX.

In conclusion, developers are the problem. After COVID the majority of games launch in an unfinished state with terrible or no optimization. The Zen CPUs in the consoles are a few times faster than the last gen Jaguars, and developers were making really fucking beautiful games on those machines with such constraints; now they make games that look barely better and require a fucking 13900K to run above 60FPS. Something went wrong.
UE5 is doing realtime GI with Lumen, which is basically what Starfield is doing; they just didn't bother giving it a name. Lumen has software GI, which is what we have here, and hardware-accelerated GI which uses the RT cores on consoles and RDNA GPUs. It's more accurate, but the tech is the same. Realtime GI is realtime GI no matter what you call it, and Starfield is using it.

FF16 also uses ray traced shadows, or something close to ray traced shadows, which is why at times it can look amazing. It's also pushing very impressive setpieces that even SSM gave up on this gen. I'd say both Starfield and FF16 are doing things that explain why they run at 30 fps.

Gotham Knights on consoles makes sense at 30 fps because they ran into the same issue Larian ran into with Baldur's Gate 3: there just isn't enough CPU for two players, and Gotham Knights needed to keep track of two players in one big open world. My issue with Gotham Knights is on PC. It launched with massive stutters and had a bug with ray tracing on where it would freeze and stutter at 2 fps every 20-30 minutes or so. This after they had patched out the stutter issues and other bugs.

I am all for blaming devs for poorly optimized games. Like I said, I have no issues with blaming devs for being lazy, for shipping broken PC releases like Star Wars and TLOU, but we honestly cannot look at something like Starfield, which is utilizing CPU cores and threads more than ANY other game out there, and say it is unoptimized. There are no stutters here. No 40 minute shader compilations. No crashes. Only performance that is scaling literally 1:1 with consoles that utilize every last bit of their tflops and CPU horsepower. Of course this game is going to be heavy; it's the first true next gen game on PC.
 

Bojji

Member
UE5 is doing realtime GI with Lumen, which is basically what Starfield is doing; they just didn't bother giving it a name. Lumen has software GI, which is what we have here, and hardware-accelerated GI which uses the RT cores on consoles and RDNA GPUs. It's more accurate, but the tech is the same. Realtime GI is realtime GI no matter what you call it, and Starfield is using it.

FF16 also uses ray traced shadows, or something close to ray traced shadows, which is why at times it can look amazing. It's also pushing very impressive setpieces that even SSM gave up on this gen. I'd say both Starfield and FF16 are doing things that explain why they run at 30 fps.

Gotham Knights on consoles makes sense at 30 fps because they ran into the same issue Larian ran into with Baldur's Gate 3: there just isn't enough CPU for two players, and Gotham Knights needed to keep track of two players in one big open world. My issue with Gotham Knights is on PC. It launched with massive stutters and had a bug with ray tracing on where it would freeze and stutter at 2 fps every 20-30 minutes or so. This after they had patched out the stutter issues and other bugs.

I am all for blaming devs for poorly optimized games. Like I said, I have no issues with blaming devs for being lazy, for shipping broken PC releases like Star Wars and TLOU, but we honestly cannot look at something like Starfield, which is utilizing CPU cores and threads more than ANY other game out there, and say it is unoptimized. There are no stutters here. No 40 minute shader compilations. No crashes. Only performance that is scaling literally 1:1 with consoles that utilize every last bit of their tflops and CPU horsepower. Of course this game is going to be heavy; it's the first true next gen game on PC.

But Starfield is not a next gen game. It functions the same way as any other previous Bethesda game: it's based on small blocks for environments, they track objects in all of them, and the number of NPCs is fairly large. They have had all of those things since Oblivion at least. The only thing that is different is graphical fidelity.

There is no logical explanation for the massive CPU requirements in this game, other than that it's unoptimized; it also is not well multithreaded. After 6 cores (and that's 6 threads, not 12 with HT) it doesn't show any gains. Hardware Unboxed also said that even the 4-core 3300 is faster than 6 or 8 core Zen 2 chips (and very close to the 3600!).

The only thing this game seems to care about is clock speed, and probably some other things that Intel CPUs are very good at.

The global illumination in this game is super simple compared to Lumen. I doubt it's hardware intensive, and the PC settings test for it showed no difference between low and ultra (in both performance and visuals). There are many games with some form of real time GI, but we rarely hear about them because the effects aren't very impressive. That's why RTGI and Lumen bring such an improvement.
 

sankt-Antonio

:^)--?-<
I want you guys to keep this same energy on the next gigantic release that has multiple threads, no matter the publisher or developer.

Edit: Since everyone is reading into this post incorrectly let me stop it here: My issue is with the whiners, not the Starfield threads. There, now rewrite your posts please.
People have been bitching about GoW Ragnarok looking like a PS4 game while running at a locked 60fps.

These big games always get shat on. SF is no different.
 

SlimySnake

Flashless at the Golden Globes
But Starfield is not a next gen game. It functions the same way as any other previous Bethesda game: it's based on small blocks for environments, they track objects in all of them, and the number of NPCs is fairly large. They have had all of those things since Oblivion at least. The only thing that is different is graphical fidelity.
Well, you said it yourself. Graphical fidelity is next gen, and it comes with all the Bethesda object persistence, physics, and large numbers of NPCs. So they upgraded at least one thing, and that's why the game is so heavy and needs a 12 tflops GPU like the XSX running at 1440p 30 fps. Of course you and I will need a 3080 or a 6800 XT to double its framerate. That's all I am saying, and what Todd is saying.

I am actually with you on the fucking constant load screens. I have been bitching about going through loading screen after loading screen just to get into ships and small buildings. I am on your side on this; they should've implemented a streaming system. If No Man's Sky could do it with 20 devs, Bethesda with its 400-500 dev team and 8 year long dev cycle should as well. But I think that's a different discussion and doesn't pertain to how heavy this game is. Streaming has been around since the PS3 era. For whatever reason, Bethesda just won't let us enter interiors without loading, even when the loading is now down to just 1 second. It's not a next gen or last gen CPU or GPU limitation, just Bethesda being Bethesda.
There is no logical explanation for the massive CPU requirements in this game, other than that it's unoptimized; it also is not well multithreaded. After 6 cores (and that's 6 threads, not 12 with HT) it doesn't show any gains. Hardware Unboxed also said that even the 4-core 3300 is faster than 6 or 8 core Zen 2 chips (and very close to the 3600!).

The only thing this game seems to care about is clock speed, and probably some other things that Intel CPUs are very good at.
I will have to rewatch that video, but I remember either them or some other channel showing CPU threads, and all 16 threads were running at 70%, which is crazy high utilization for CPUs that typically top out at 10% in last gen games with only 1-2 threads hitting over 70%. I think there is something going on with Zen 2 or Zen 3 chips that might be a bug or something to do with their design, because if you look at Zen 4 and ANY of the Intel chips from 11th gen onwards, this is not an issue.

I will try and enable some logging on all cores and threads and see what I find, but this game makes my CPU go absolutely mental. Even Cyberpunk with its ray tracing and its higher 100 fps non-RT modes doesn't make it go that high, and Cyberpunk is the best game to benchmark CPUs because of its fantastic threading and scaling with higher core counts. If my results show poor CPU thread utilization in Atlantis then I will concede this point, but I've been under the impression that the game is doing proper multithreading based on the benchmarks I saw at launch.
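(If anyone else wants to do the same kind of logging outside MSI Afterburner, here's a minimal sketch of one way to do it, assuming you have Python and the psutil package installed; run it in a second window while playing and see whether load is genuinely spread across logical cores or pinned to a few.)

import time
import psutil  # pip install psutil

DURATION_S = 60  # how long to sample while the game is running

end = time.time() + DURATION_S
while time.time() < end:
    # one sample per second, one percentage per logical core
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(time.strftime("%H:%M:%S"), " ".join(f"{p:5.1f}" for p in per_core))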
The global illumination in this game is super simple compared to Lumen. I doubt it's hardware intensive, and the PC settings test for it showed no difference between low and ultra (in both performance and visuals). There are many games with some form of real time GI, but we rarely hear about them because the effects aren't very impressive. That's why RTGI and Lumen bring such an improvement.
As for their global illumination, I think it looks pretty good. Low vs ultra might not be scaling that well, but even their low is way better than what they had last gen.



 

Bojji

Member
Well, you said it yourself. Graphical fidelity is next gen, and it comes with all the Bethesda object persistence, physics, and large numbers of NPCs. So they upgraded at least one thing, and that's why the game is so heavy and needs a 12 tflops GPU like the XSX running at 1440p 30 fps. Of course you and I will need a 3080 or a 6800 XT to double its framerate. That's all I am saying, and what Todd is saying.

I am actually with you on the fucking constant load screens. I have been bitching about going through loading screen after loading screen just to get into ships and small buildings. I am on your side on this; they should've implemented a streaming system. If No Man's Sky could do it with 20 devs, Bethesda with its 400-500 dev team and 8 year long dev cycle should as well. But I think that's a different discussion and doesn't pertain to how heavy this game is. Streaming has been around since the PS3 era. For whatever reason, Bethesda just won't let us enter interiors without loading, even when the loading is now down to just 1 second. It's not a next gen or last gen CPU or GPU limitation, just Bethesda being Bethesda.

I will have to rewatch that video, but I remember either them or some other channel showing CPU threads, and all 16 threads were running at 70%, which is crazy high utilization for CPUs that typically top out at 10% in last gen games with only 1-2 threads hitting over 70%. I think there is something going on with Zen 2 or Zen 3 chips that might be a bug or something to do with their design, because if you look at Zen 4 and ANY of the Intel chips from 11th gen onwards, this is not an issue.

I will try and enable some logging on all cores and threads and see what I find, but this game makes my CPU go absolutely mental. Even Cyberpunk with its ray tracing and its higher 100 fps non-RT modes doesn't make it go that high, and Cyberpunk is the best game to benchmark CPUs because of its fantastic threading and scaling with higher core counts. If my results show poor CPU thread utilization in Atlantis then I will concede this point, but I've been under the impression that the game is doing proper multithreading based on the benchmarks I saw at launch.

As for their global illumination, I think it looks pretty good. Low vs ultra might not be scaling that well, but even their low is way better than what they had last gen.




Alex proved in his latest video that the game doesn't scale past 6 cores at all; the high utilization of all threads is just an illusion:

[chart: CPU core-scaling results from Alex's video]


I think Starfield looks quite good in many places while dated in others. It isn't groundbreaking tech in any way; a game like this could easily have been done on PS4 with F4 graphics and long loading times.

I think the poor performance relative to what the game is rendering is mostly down to the decades old engine (heavily tweaked, for sure); core elements were built ages ago and are not running efficiently on modern hardware at all.
 

SlimySnake

Flashless at the Golden Globes
Alex proved in his latest video that the game doesn't scale past 6 cores at all; the high utilization of all threads is just an illusion:

[chart: CPU core-scaling results from Alex's video]


I think Starfield looks quite good in many places while dated in others. It isn't groundbreaking tech in any way; a game like this could easily have been done on PS4 with F4 graphics and long loading times.

I think the poor performance relative to what the game is rendering is mostly down to the decades old engine (heavily tweaked, for sure); core elements were built ages ago and are not running efficiently on modern hardware at all.
Am I reading the graph wrong, or does the scaling continue until 8 threads, not 6?

Look at this video and tell me what other games are doing this. I think Bethesda is at fault for not making physics front and center, but their engine is so heavy because it's doing physics, or has to be able to do physics, like this.

 

Bojji

Member
Am I reading the graph wrong, or does the scaling continue until 8 threads, not 6?

Look at this video and tell me what other games are doing this. I think Bethesda is at fault for not making physics front and center, but their engine is so heavy because it's doing physics, or has to be able to do physics, like this.



I remember Crysis videos like that from 200x:



The physics engine is impressive, but I think there has been nothing new here since Half-Life 2. Havok or PhysX can handle stuff like this in many games; of course, players would have to have editors to enable that many objects on screen.

Alex found that the game doesn't like too many threads, so:

- 6C/6T runs "ok"
- 6C/12T runs worse than that
- 8C/8T runs the best
- 8C/16T runs worse
There is some difference between 6 and 8 cores when HT is disabled.

[chart: 6-core vs 8-core results with HT disabled]


There are games where HT makes performance worse, but that usually means there is something wrong with how they work. My point is that this game is not utilizing many threads well, despite overlays showing that all threads are heavily saturated.
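(To illustrate the "saturated but not scaling" point in the abstract, and not as a claim about what Creation Engine actually does: if worker processes spin-wait on a shared lock instead of sleeping, a monitoring overlay will show every core pegged even though the work is serialized and extra cores barely help. A minimal Python sketch of that effect:)

import multiprocessing as mp
import time

TOTAL_ITERS = 200_000  # fixed amount of "real" work, split across workers

def worker(lock, iters):
    for _ in range(iters):
        # spin-wait instead of blocking: this is what makes idle cores look 100% busy
        while not lock.acquire(block=False):
            pass
        try:
            sum(range(50))  # tiny serialized task, the only useful work
        finally:
            lock.release()

def run(n_workers):
    lock = mp.Lock()
    procs = [mp.Process(target=worker, args=(lock, TOTAL_ITERS // n_workers))
             for _ in range(n_workers)]
    t0 = time.perf_counter()
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return time.perf_counter() - t0

if __name__ == "__main__":
    for n in (1, 2, 4, 8):
        # wall time stays roughly flat (or worsens) while CPU usage climbs with n
        print(f"{n} workers: {run(n):.2f}s")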
 

SlimySnake

Flashless at the Golden Globes
Alex found that the game doesn't like too many threads, so:

- 6C/6T runs "ok"
- 6C/12T runs worse than that
- 8C/8T runs the best
- 8C/16T runs worse
There is some difference between 6 and 8 cores when HT is disabled.

[chart: 6-core vs 8-core results with HT disabled]


There are games where HT makes performance worse, but that usually means there is something wrong with how they work. My point is that this game is not utilizing many threads well, despite overlays showing that all threads are heavily saturated.
I just saw that. That is absolutely insane. Yeah, clearly something is wrong there. How do I turn off hyperthreading lol

What's weird is that his video shows even the so-called trash-performing 3060 setup utilizing all 12 threads, so just what exactly is it doing??




I think part of the reason why I'm not seeing this stuff on my setup is that I'm GPU bound at 4K DLSS Quality and high/ultra settings. I get 97-99% GPU utilization in cities.

This is one of my runs from the other day in Atlantis. I am getting higher temps and wattage in Starfield at 4K than I did in Cyberpunk at 1080p 120 fps, where the CPU really gets taxed. I will run some tests with all 16 threads showing on MSI Afterburner and maybe figure out how to disable hyperthreading, but I'm getting great performance right now with DLSS and a mix of high and ultra settings, even in cities.

Atlantis
[screenshot]


Akila
[screenshot]


Yes, it's not a locked 60 fps, but I have a G-Sync monitor, and because the framerate doesn't fluctuate too much it is hard to tell that it's not a locked 60. It feels consistent even at 50 or 55 fps.
 

Bojji

Member
I just saw that. That is absolutely insane. Yeah, clearly something is wrong there. How do I turn off hyperthreading lol

What's weird is that his video shows even the so-called trash-performing 3060 setup utilizing all 12 threads, so just what exactly is it doing??




I think part of the reason why I'm not seeing this stuff on my setup is that I'm GPU bound at 4K DLSS Quality and high/ultra settings. I get 97-99% GPU utilization in cities.

This is one of my runs from the other day in Atlantis. I am getting higher temps and wattage in Starfield at 4K than I did in Cyberpunk at 1080p 120 fps, where the CPU really gets taxed. I will run some tests with all 16 threads showing on MSI Afterburner and maybe figure out how to disable hyperthreading, but I'm getting great performance right now with DLSS and a mix of high and ultra settings, even in cities.

Atlantis
[screenshot]


Akila
[screenshot]


Yes, it's not a locked 60 fps, but I have a G-Sync monitor, and because the framerate doesn't fluctuate too much it is hard to tell that it's not a locked 60. It feels consistent even at 50 or 55 fps.

You can disable HT in the CPU settings in your BIOS/UEFI; you could do that so we'd know how it affects 11th gen Intel. But it shouldn't really affect your experience when you are GPU limited most of the time (like me).

With the DLSS mod set to 4K Balanced (58%), ReBar enabled and optimized (HUB) settings, I'm OK with the performance I'm getting; it doesn't drop below 50FPS and the LG supports G-Sync perfectly.
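(A rough way to approximate "HT off" for just the game, without a BIOS trip, is to pin its process to one logical CPU per physical core. A minimal sketch using psutil; it assumes even-numbered logical CPUs are the first sibling of each core, which is the usual Windows/Intel layout but worth checking on your machine, and that the process is named Starfield.exe.)

import psutil  # pip install psutil

physical_cores = psutil.cpu_count(logical=False)
one_thread_per_core = [i * 2 for i in range(physical_cores)]  # 0, 2, 4, ... (assumed HT sibling layout)

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "Starfield.exe":  # assumed process name
        proc.cpu_affinity(one_thread_per_core)
        print(f"Pinned PID {proc.pid} to logical CPUs {one_thread_per_core}")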
 