
Todd Howard says Starfield is already optimized: "You might need to upgrade your PC."

Krathoon

Member
The 7XXXs and 13XXXs by default intentionally rev themselves up to 95c and stay there full time at load. The 11700 is current enough that I would assume it's spec'ed to target at least 85c, if not 95. Modern CPUs are spec'ed to run much hotter than older ones.
Yeah. Modern CPUs are designed to run hot. That is why gaming laptop fans are so loud.
 

b0uncyfr0

Member
How it still has an 88 on OpenCritic is beyond me. I took a long look at the trees in New Atlantis and said 'Nope'.

Looks like they came out of a PS3.
 

Elysium44

Banned
The 7XXXs and 13XXXs by default intentionally rev themselves up to 95c and stay there full time at load. The 11700 is current enough that I would assume it's spec'ed to target at least 85c, if not 95. Modern CPUs are spec'ed to run much hotter than older ones.

CPUs don't set a temperature to 'target', they run at whatever load the software you're running needs them to. Usually gaming doesn't put a particularly high load on them.

SlimySnake, 75c isn't anything to be concerned about at all, especially on 11th gen, which is known to run quite hot and guzzle power compared to 10th gen (which I have). If you want to keep a lid on temps or power consumption, then use Intel XTU or your BIOS to set power limits as you wish. It can't damage itself no matter what, because it will throttle down before that happens, or worst case scenario shut itself down entirely (but that should never happen unless your heatsink isn't making proper contact).
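For what it's worth, the power-limit approach described above can also be done outside XTU (which is Windows-only). On Linux the same cap is exposed through the RAPL powercap sysfs interface; this is just a sketch assuming the standard `intel-rapl` path, and it needs root to actually write:

```python
import os

# RAPL expresses power limits in microwatts.
def watts_to_microwatts(watts: float) -> int:
    return int(watts * 1_000_000)

# Standard powercap path for the package-level sustained power limit (PL1).
RAPL_PL1 = "/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw"

def set_package_power_limit(watts: float) -> None:
    """Cap sustained CPU package power (Intel CPUs with RAPL; requires root)."""
    with open(RAPL_PL1, "w") as f:
        f.write(str(watts_to_microwatts(watts)))

if __name__ == "__main__" and os.path.exists(RAPL_PL1):
    set_package_power_limit(125)  # e.g. hold an 11700K to its stock 125 W PL1
```

The CPU will clock down to stay under whatever limit you write, which is exactly why it can't cook itself: throttling kicks in long before damage.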
 

SlimySnake

Flashless at the Golden Globes
CPUs don't set a temperature to 'target', they run at whatever load the software you're running needs them to. Usually gaming doesn't put a particularly high load on them.

SlimySnake, 75c isn't anything to be concerned about at all, especially on 11th gen, which is known to run quite hot and guzzle power compared to 10th gen (which I have). If you want to keep a lid on temps or power consumption, then use Intel XTU or your BIOS to set power limits as you wish. It can't damage itself no matter what, because it will throttle down before that happens, or worst case scenario shut itself down entirely (but that should never happen unless your heatsink isn't making proper contact).
Thanks. Yeah, I spent weeks reconfiguring my PC after buying it because it would sit at 45 on idle. Got it down to around 32 degrees after buying a new AIO and some case fans, so I get a bit antsy when I see it jump above 70 degrees during games.

I can always get it down to a 22-degree idle if I put my tower next to my AC vent, but I always thought that was a bit of overkill. lol
The 7XXXs and 13XXXs by default intentionally rev themselves up to 95c and stay there full time at load. The 11700 is current enough that I would assume it's spec'ed to target at least 85c, if not 95. Modern CPUs are spec'ed to run much hotter than older ones.
I have seen it hit 85 during Cinebench runs, and 90 when I had just gotten it and was running it off a cheap AIO. TLOU Part 1's 40-minute shader building also had it running at 100% and hitting 85 degrees. My concern was mostly about sustained 80-degree temps for a game that I will likely play for 100 hours.

But that doesn't seem to be the case. Temps only increase in major cities and, for some reason, inside my ships. The rest of the time, I'm in the low 60s.
 

SlimySnake

Flashless at the Golden Globes
It's one component in an excessively overhyped 'next gen PC' game that's decent looking at best. Nothing that justifies its performance either.
Every game has ugly levels, including the best looking games like Horizon Forbidden West and A Plague Tale: Requiem. Judging the graphics based on one tree is weird. Yes, the game is uneven, but lots of games are uneven.

FMOkGR5WUAckE6J



Now we all know Horizon doesn't always look like that, but compared to that, this city with ugly trees looks pretty good.

F5eGSVeWIAA19wS
 

yamaci17

Member
75? Dude, my i5-2500K stayed at 90+ for like 5 years, and it's still working in my nephew's PC. It could be over 10 years old already.
That's because, through the lifespan of the 2500K, devs had to EXTREMELY optimize their code to hit a 40+ fps average on 1.6 GHz Jaguar cores (so that you could get a playable 30 fps with tight 1% lows). And the i5-2500K, core for core, was 4-5x faster than a 1.6 GHz Jaguar core. Jaguar already had horrible IPC, and it ran at super low clocks on console.

And only 6 cores were used for games. So 4 cores, each 4-5x faster than those super slow console cores, managed to get by all those years.

Now the consoles are rocking 3.6 GHz Zen 2 CPUs, and even the strongest CPU right now is AT BEST 2-2.5x faster than that, core for core.

Now imagine that.
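That back-of-envelope comparison can be written out explicitly. The multipliers below are the post's rough estimates, not measured benchmarks:

```python
# Rough per-core speed multipliers relative to the console CPU of each era
# (the post's estimates, not benchmark data).

# PS4/Xbox One era: 1.6 GHz Jaguar, ~6 cores usable for games.
jaguar_cores_for_games = 6
i5_2500k_cores = 4
i5_2500k_per_core_vs_jaguar = 4.5   # midpoint of the "4-5x" estimate

# PS5/Series X era: 3.6 GHz Zen 2; today's best desktop cores are
# maybe 2-2.5x faster per core.
best_desktop_per_core_vs_zen2 = 2.25  # midpoint of the "2-2.5x" estimate

# Aggregate throughput of a 2500K vs the ~6 Jaguar cores games could use.
old_aggregate_headroom = (i5_2500k_cores * i5_2500k_per_core_vs_jaguar) / jaguar_cores_for_games

print(f"2011 i5-2500K vs console CPU: ~{old_aggregate_headroom:.1f}x total throughput")
print(f"2023 flagship vs console CPU: ~{best_desktop_per_core_vs_zen2:.2f}x per core")
# The per-core gap shrank from ~4.5x to ~2.25x, which is the whole argument:
# a console 30 fps CPU target no longer leaves PCs the headroom it used to.
```

Under those assumptions, a 2011 mid-range CPU had roughly 3x the consoles' total throughput, while today's flagships have barely over 2x per core, so doubling a console frame rate on the CPU side is far less of a given.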
Good of you to take care of your nephew's heating needs.
See, that's the problem lol. It will be very hard in the upcoming years. I told everyone this would happen 2 years ago. Told everyone that consoles SHOULD OR HAVE TO target 60 fps if "PC folks" are to be kept "entertained" with high-framerate ideas.

Either devs will have to go back to 60 fps as a baseline (impossible, console users have no desire for it, aside from the 'vocal minority' in forums like this), or this is what is going to happen.
 

Erkuza

Banned
It runs better than I was expecting. On a 3080ti the only drops I'm getting are in New Atlantis, everywhere else is pretty smooth.
 

SmokSmog

Member
The game is broken on Nvidia GPUs; the GPUs are underutilised (low power consumption for the clocks and given voltages). Both the 3000 and 4000 series. My 3080 at 1080p can't hold 60 fps even with optimised high settings. I'm not CPU bottlenecked.
12600K 4100cl17 manually tuned

3080 1700mhz 750mv 20Gb/s
1080P= 52-53fps in a spot

3080 1875mhz 850mv 20Gb/s
1080P= 58-60fps same spot

540p (a quarter of 1080p's pixels)
1700mhz = 73-74fps
1875mhz = 80fps

Imagine that: the GPU is the bottleneck even at 540p. First time I've seen something like that.
It's like something in the Nvidia GPU is bottlenecking the rest of the chip, causing underutilisation of the cores.

8b394451853c3db7821727804281f577b1842da12b94c8320f92d3fbd633fd4d.png
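One way to read those numbers: if frame rate rises roughly in proportion to the core clock, the GPU (not the CPU) is the limiter in that spot. A quick sanity check on the figures from the post, using the midpoints of the quoted fps ranges:

```python
def scaling_ratio(before: float, after: float) -> float:
    """How much the second measurement improved over the first."""
    return after / before

clock_ratio = scaling_ratio(1700, 1875)   # ~1.10x core clock bump

# fps at each clock, taken from the post (midpoints of the quoted ranges)
fps_1080p = scaling_ratio(52.5, 59.0)     # 52-53 fps -> 58-60 fps
fps_540p  = scaling_ratio(73.5, 80.0)     # 73-74 fps -> 80 fps

# Frame rate tracking the clock change this closely at BOTH resolutions
# suggests the GPU pipeline, not the CPU, is the limiter in that spot.
for name, ratio in [("clock", clock_ratio), ("1080p fps", fps_1080p), ("540p fps", fps_540p)]:
    print(f"{name}: {ratio:.3f}x")
```

The ~10% clock bump buys a ~9-12% fps gain at both resolutions, which is consistent with a GPU-side limit (even if the shaders themselves are underfed, something on the GPU is pacing the frame).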
 

Eotheod

Member
It's doing two things that nobody else really does. One is object persistence and the other is mod support.

But overall it's super dated, especially with the constant loading and animations.
On PC, loading screens are very much still a thing, especially in open world games. Looking at the sheer content available without hitting a loading screen, it's clear why there is a need for them. Granted, many console players have had it easy with little to no loading screens, because the hardware is guaranteed to have an NVMe drive, so it inherently loads faster BECAUSE developers know the hardware is exactly that. Even the PS5 with its customisable drives still has a specific minimum spec.
 

SlimySnake

Flashless at the Golden Globes
The game is broken on Nvidia GPUs; the GPUs are underutilised (low power consumption for the clocks and given voltages). Both the 3000 and 4000 series. My 3080 at 1080p can't hold 60 fps even with optimised high settings. I'm not CPU bottlenecked.
12600K 4100cl17 manually tuned

3080 1700mhz 750mv 20Gb/s
1080P= 52-53fps in a spot

3080 1875mhz 850mv 20Gb/s
1080P= 58-60fps same spot

540p (a quarter of 1080p's pixels)
1700mhz = 73-74fps
1875mhz = 80fps

Imagine that: the GPU is the bottleneck even at 540p. First time I've seen something like that.
It's like something in the Nvidia GPU is bottlenecking the rest of the chip, causing underutilisation of the cores.

8b394451853c3db7821727804281f577b1842da12b94c8320f92d3fbd633fd4d.png
The game favors AMD GPUs, but it's not the first game to do so. Plenty of big AAA franchises do this because they are developed primarily on consoles, which use AMD GPUs. Starfield just has a larger difference than those games, probably because it is extremely CPU bound and there is a known Nvidia driver CPU overhead.

You are CPU bottlenecked if you are seeing those low power consumption numbers. I don't know why you are testing this at 1080p; go to 4K and you will see proper GPU utilization and power consumption. This is me in CPU-bound cities and on Mars. My 3080 is a 320-watt card; it is consistently at 97-98% and hovers around 300 watts all the time unless I max out at 60 fps indoors. My CPU is an 11700K. Your numbers show that you are CPU bound.
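The rule of thumb being applied here (high GPU utilisation plus power draw near the card's board limit means GPU-bound) can be sketched as a simple check. The thresholds below are illustrative assumptions, not fixed values:

```python
def likely_bottleneck(gpu_util_pct: float, gpu_power_w: float, gpu_tdp_w: float) -> str:
    """Crude heuristic: a fully fed GPU sits near 100% utilisation and close
    to its board power limit; big gaps suggest the CPU can't keep it fed."""
    near_full_util = gpu_util_pct >= 95          # assumed threshold
    near_power_limit = gpu_power_w >= 0.85 * gpu_tdp_w  # assumed threshold
    if near_full_util and near_power_limit:
        return "GPU-bound"
    return "likely CPU-bound (or another limiter upstream of the GPU)"

# The 3080 numbers from this post in GPU-heavy scenes: ~97% util, ~300 W on a 320 W card
print(likely_bottleneck(97, 300, 320))   # GPU-bound
```

By the same check, a card showing high clocks but low power draw and patchy utilisation, as described a few posts up, would land in the "likely CPU-bound" bucket, though driver overhead can blur the picture.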

liafWtg.jpg

F5eGSVeWIAA19wS

F5eGSVbXwAETHF4


sPYgJYC.jpg


BHsBqdQ.jpg
 
Shitty PC ports as usual.
It's like the 2000s, the worst era for PC ports.

Grand Theft Auto IV and Saints Row 2 were considered the worst PC ports when they came out.
 

Cyberpunkd

Member
sheer-fucking-hubris-star-trek.gif

But seriously, about time someone said it.
There is no penalty for hype. Some developers have more shame and humility than others, but Todd is not one of them - he will hype the shit out of it and laugh all the way to the bank.
I always appreciated Americans' ability to sell the shit out of everything without outright lying, just by being vague enough.

"It takes 100 hours to really get into Starfield… since 95 of them I left it on idle" - statements and logic like that.

Snl Fool Me Twice GIF by Saturday Night Live
 