
Phil Spencer: Starfield being 30fps is a "creative choice", not a hardware issue.

YeulEmeralda

Linux User
Just because you have a "creative vision or choice" does not mean it's not a hardware issue preventing that choice from also being 60fps. It's both. I am sure Bethesda would love for it to be 60fps on the consoles with the same visual vision. Will the PC lose that "creative vision" once it exceeds 30fps? Didn't think so.
I mean, if the PC game has 60fps in the options menu, we know he's bullshitting. If we have to hack/mod it in, I'll believe him.
 

Helghan

Member
Oh of course. It still is indeed a hardware issue for their vision as well. If they could get 60fps out of that fidelity on the consoles, I am sure they would be ecstatic and this narrative would not exist. The hardware can't run what they want their visual presentation to be at 60fps. Simple.
Indeed, so what’s the problem then?
 
If the intention was to avoid having a mode with massive cut-backs to reach 60, then yes of course it's a creative choice.


30 FPS: [image]

60 FPS: [image]
The thread was over with this excellent post, and yet here we are on page 15.
 
I mean if the PC game has 60fps in the options menu we know he's bullshitting. If we have to hack/mod it in I believe him.
lol, no. The creative choice here was to go with fidelity rather than framerate. The reason 60fps would be available on PC is because PC has better hardware.

EDIT: Did anyone catch that Real-time Global Illumination is a graphical element to Starfield? That's probably where the limitation comes from, honestly.
 
If the intention was to avoid having a mode with massive cut-backs to reach 60, then yes of course it's a creative choice.


30 FPS: [image]

60 FPS: [image]
Agreed if I'm walking through a forest looking for a quest item, but for combat and space flight I'd much prefer higher frames.
 

MarkMe2525

Member
In every Final Fantasy 16 thread, there are people shitting on the game.

In every HFW thread, there are people shitting on the game.

In every God of War thread, there are people shitting on the game.

You Xbox fans aren't special in the way you think you are.
I think people are taking issue with the fact that the 30fps cap is being used, as you mentioned, not to shit on the game but to shit on the Series X hardware.

Not everyone of course, just broadly speaking.
 

DeepEnigma

Gold Member
I think people are taking issue with the fact that the 30fps cap is being used, as you mentioned, not to shit on the game but to shit on the Series X hardware.

Not everyone of course, just broadly speaking.
This comment is why people are like, okay, sure, whatever. It's both.

"creative choice", not a hardware issue.​

The hardware limits the framerate of their creative choice/vision. No such limits will exist on the PC.
 

Helghan

Member
Supposedly it's a CPU-bound game: there isn't enough CPU for what they want to do in order to reach 60fps. If they could, they would.
I get that, but I don't understand the problem that some people are making this out to be. The hardware isn't strong enough to support Bethesda's creative vision at 60fps, so they dropped it to 30fps to still be able to support their creative vision of how they think console players should play the game.
 

DeepEnigma

Gold Member
I get that, but I don't understand the problem that some people are making this out to be. The hardware isn't strong enough to support Bethesda's creative vision at 60fps, so they dropped it to 30fps to still be able to support their creative vision of how they think console players should play the game.
I don't see a problem; I understand the reason. Some who are accustomed to 60fps don't like it. Everyone is different.

For Phil to say it's not a hardware issue, uh, yes it is. The hardware limits the framerate on their "creative choice."

He wants to have his cake and eat it too since they marketed 60/120 to the hardcore performance/power fanbase over the years.
 

MarkMe2525

Member
This comment is why people are like, okay, sure, whatever. It's both.

The hardware limits the framerate of their creative choice/vision. No such limits will exist on the PC.
That is correct, but that is the very nature of any closed-box computer. Computing limitations do not equal "hardware issues". The Series X is perfectly capable of playing some other version of Starfield built for 60fps on the console.

But of course, we can twist and misrepresent the meaning of his statements because it is indeed Phil saying them. It is clear what he is implying: there came a stage in development where they were testing the game on the hardware and it became apparent that they had a decision to make. They could have made changes to the game and visuals to ensure a 60fps experience, or focused on 30fps to get more out of the hardware. [BOLD=This is the creative decision [/BOLD]

Edit: lol, my attempt to make my last sentence "bold" on my phone failed. I don't understand why the hotkeys don't work for me on mobile.

Edit: Edit: I just don't understand the attempt to paint this as some hardware problem, as the fact that God of War 2018 can't run on a SNES would never be attributed to a "hardware issue".
 

DeepEnigma

Gold Member
That is correct, but that is the very nature of any closed-box computer. Computing limitations do not equal "hardware issues". The Series X is perfectly capable of playing some other version of Starfield built for 60fps on the console.

But of course, we can twist and misrepresent the meaning of his statements because it is indeed Phil saying them. It is clear what he is implying: there came a stage in development where they were testing the game on the hardware and it became apparent that they had a decision to make. They could have made changes to the game and visuals to ensure a 60fps experience, or focused on 30fps to get more out of the hardware. [BOLD=This is the creative decision [/BOLD]

Edit: lol, my attempt to make my last sentence "bold" on my phone failed. I don't understand why the hotkeys don't work for me on mobile.
I get the explanation.

It shouldn't even have to be explained to such an extent, but they marketed themselves into damage control. Especially that Greenberg clown.
 

adamsapple

Or is it just one of Phil's balls in my throat?
The hardware limits the framerate of their creative choice/vision. No such limits will exist on the PC.

There will definitely be limits on PC depending on someone's rig. On the consoles, they just chose to favor fidelity over performance rather than cut back on their visual/fidelity target.

This has become a much bigger issue than it ought to be, lol.
 

DeepEnigma

Gold Member
There will definitely be limits on PC depending on someone's rig. On the consoles, they just chose to favor fidelity over performance rather than cut back on their visual/fidelity target.

This has become a much bigger issue than it ought to be, lol.
I get the explanation.

It shouldn't even have to be explained to such an extent, but they marketed themselves into damage control. Especially that Greenberg clown.
 

Wooxsvan

Member
It's sad they just can't be more direct. They didn't want the narrative to be that getting this game running at 60 would make it look "bad"; instead the narrative is that the game is so robust and expansive it can only run at 30 on console.
 

Dirk Benedict

Gold Member
Absolutely. That's literally what I'm saying. Hell, some PCs will realistically be able to achieve beyond 60fps. I edited my previous post, but I'm wondering if RTGI is the reason for the limitation.
Given the engine it's running on, the games that came from it were usually CPU bound. I wonder if this means I will have to compromise, given that I also stream games.
 

CatLady

Selfishly plays on Xbox Purr-ies X
lol, no. The creative choice here was to go with fidelity rather than framerate. The reason 60fps would be available on PC is because PC has better hardware.

EDIT: Did anyone catch that Real-time Global Illumination is a graphical element of Starfield? That's probably where the limitation comes from, honestly.

I had no idea, and I don't see how they can possibly include RTGI in this game in addition to everything else it does. It's a massive open world RPG that remembers every piece of junk in the game, remembers all your choices and how they affect your game, has better side quests than most games' main quests, plus awesome random encounters, ship building, base building and more. Are you sure it includes RTGI? I will say the lighting does look really good.
 

YeulEmeralda

Linux User
lol, no. The creative choice here was to go with fidelity rather than framerate. The reason 60fps would be available on PC is because PC has better hardware.

EDIT: Did anyone catch that Real-time Global Illumination is a graphical element of Starfield? That's probably where the limitation comes from, honestly.
It's called a hardware limitation; nothing creative about it. Pure PR spin.
 

DeepEnigma

Gold Member
lol, no. The creative choice here was to go with fidelity rather than framerate. The reason 60fps would be available on PC is because PC has better hardware.

EDIT: Did anyone catch that Real-time Global Illumination is a graphical element of Starfield? That's probably where the limitation comes from, honestly.
Yeah, it looks good, but someone found bug #1 🤭

"Our next-gen lighting model uses real time global illumination to light the world based on the type of star and the planets' atmosphere" - Todd

[image]
 

Mister Wolf

Member
lol, no. The creative choice here was to go with fidelity rather than framerate. The reason 60fps would be available on PC is because PC has better hardware.

EDIT: Did anyone catch that Real-time Global Illumination is a graphical element of Starfield? That's probably where the limitation comes from, honestly.

I'm looking forward to DF's breakdown of their Real-time GI system.
 

The Alien

Banned
It's so creative to have a 30fps game with PS3 era NPC models.
I believe the correct excuse response was "filmic".

These incredibly linear games are "filmic"...
so it's OK for them to be 30fps.
But your huge open-world space exploration game with branching choices, etc. at 30fps is utter shit and a sad joke.
 
They did this way back in the PS3/360 era, two generations back. You're telling me they magically suffer the same performance woes today using the same engine? Because if I read this correctly, they could literally run this on the 360 if they stripped out the graphics.
They increase the complexity with every game.
If you have an open world like Rage 2, almost everything is static. You shoot a garbage bin and all that happens is some generic bullet holes appear on it. Starfield and Bethesda games are on a different level. Every single object is individually simulated and reacts to being shot, picked up, moved somewhere else, etc. If you shoot a bin on a planet today and send it flying to the other side of the room, you can come back to that exact same place six months later and the bin will be exactly where you shot it to. It's stuff like that which makes their games so CPU heavy. So yes, they could cut all that back to be like Rage 2 and get 60fps, but it would not be the open world they want to create.
It's on another level, and I'm glad they didn't wind back the game.

This has nothing to do with resolution, shadows, lighting etc; it's all CPU.
Making the game 1440p won't give you any real increase in frame rate.

Maybe they can work on a 40fps performance upgrade down the track.
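To put that in crude pseudo-code (a totally simplified sketch of the idea, not Bethesda's actual code; every name here is made up):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ObjectState:
    """Where a world object ended up after the player disturbed it."""
    x: float
    y: float
    z: float
    cell: str  # which interior/planet cell it sits in

class PersistentWorld:
    def __init__(self):
        # Only objects the player has disturbed get an entry; everything
        # else stays at its authored default position.
        self.moved: dict[str, ObjectState] = {}

    def on_object_moved(self, object_id: str, state: ObjectState):
        self.moved[object_id] = state  # remembered forever

    def position_of(self, object_id: str, default: ObjectState) -> ObjectState:
        # Six months later, the bin is still where you shot it to.
        return self.moved.get(object_id, default)

    def save(self, path: str):
        with open(path, "w") as f:
            json.dump({k: asdict(v) for k, v in self.moved.items()}, f)

world = PersistentWorld()
world.on_object_moved("garbage_bin_042",
                      ObjectState(12.5, 3.0, -7.2, cell="NewAtlantis01"))
print(world.position_of("garbage_bin_042",
                        ObjectState(0, 0, 0, cell="NewAtlantis01")))
```

Multiply that by every fork, mug and bin in the game and you can see why the simulation, not the pixels, eats the budget.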
 

01011001

Banned
They increase the complexity with every game.
If you have an open world like Rage 2, almost everything is static. You shoot a garbage bin and all that happens is some generic bullet holes appear on it. Starfield and Bethesda games are on a different level. Every single object is individually simulated and reacts to being shot, picked up, moved somewhere else, etc. If you shoot a bin on a planet today and send it flying to the other side of the room, you can come back to that exact same place six months later and the bin will be exactly where you shot it to. It's stuff like that which makes their games so CPU heavy. So yes, they could cut all that back to be like Rage 2 and get 60fps, but it would not be the open world they want to create.
It's on another level, and I'm glad they didn't wind back the game.

This has nothing to do with resolution, shadows, lighting etc; it's all CPU.
Making the game 1440p won't give you any real increase in frame rate.

Maybe they can work on a 40fps performance upgrade down the track.

We will see how CPU bound it is when the PC tests come in.
I absolutely bet that the game will be GPU bound.

If it is CPU bound, that would mean their minimum PC requirements would run below 30fps, which would be ridiculous.
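For what it's worth, the standard way those PC tests show it (generic benchmarking logic, nothing specific to Starfield; the figures are invented for illustration): drop the resolution and see if the frame rate moves. If it barely does, the CPU is the wall.

```python
# Generic benchmarking rule of thumb: scale resolution and watch fps.
# The figures here are invented purely for illustration.

def bottleneck(fps_by_resolution: dict[str, float]) -> str:
    rates = list(fps_by_resolution.values())
    spread = max(rates) - min(rates)
    # If fps barely changes as resolution drops, the GPU isn't the
    # limiting factor; the CPU (simulation, draw submission) is.
    return "CPU bound" if spread < 0.1 * max(rates) else "GPU bound"

print(bottleneck({"4K": 42.0, "1440p": 44.0, "1080p": 45.0}))  # CPU bound
print(bottleneck({"4K": 40.0, "1440p": 62.0, "1080p": 80.0}))  # GPU bound
```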
 

LordOfChaos

Member
There's decent-feeling 30fps and there's garbage-feeling 30fps. More recent games have felt like the latter for a bunch of reasons one guy explained here once.

I hope this is at least a decent-feeling 30fps game. It's not a dealbreaker on its own, though I would always have opted for 60 and lower fidelity somewhere else.
 

Mister Wolf

Member
They increase the complexity with every game.
If you have an open world like Rage 2, almost everything is static. You shoot a garbage bin and all that happens is some generic bullet holes appear on it. Starfield and Bethesda games are on a different level. Every single object is individually simulated and reacts to being shot, picked up, moved somewhere else, etc. If you shoot a bin on a planet today and send it flying to the other side of the room, you can come back to that exact same place six months later and the bin will be exactly where you shot it to. It's stuff like that which makes their games so CPU heavy. So yes, they could cut all that back to be like Rage 2 and get 60fps, but it would not be the open world they want to create.
It's on another level, and I'm glad they didn't wind back the game.

This has nothing to do with resolution, shadows, lighting etc; it's all CPU.
Making the game 1440p won't give you any real increase in frame rate.

Maybe they can work on a 40fps performance upgrade down the track.

On top of that, Starfield is using real-time global illumination with a real-time day/night cycle. 60fps open world games like Horizon Forbidden West and Spiderman use baked lighting. Forbidden West cycles through 12 different instances of baked lighting to make up its day/night cycle; Spiderman doesn't even have a day/night cycle. Dynamic lighting is something that gets cut to achieve 60fps in an open world game on consoles.
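If that 12-instance detail is right, the cost difference looks roughly like this (a toy sketch of the two approaches, not either engine's real code; every name here is invented):

```python
# Toy comparison of baked time-of-day lighting vs. real-time GI.

BAKED_SETS = 12  # e.g. one pre-computed lighting set per 2 hours

def baked_lighting(hour: float) -> str:
    """Cheap: pick the two nearest pre-baked sets and blend between them."""
    slot = hour / 24 * BAKED_SETS
    a = int(slot) % BAKED_SETS
    b = (a + 1) % BAKED_SETS
    t = slot - int(slot)
    # A real engine would blend lightmaps/probes; indices stand in here.
    return f"blend(set_{a}, set_{b}, {t:.2f})"

def realtime_gi(sun_dir: tuple, scene: list) -> None:
    """Expensive: bounce light is recomputed from the live scene every
    frame, so it reacts to any star, atmosphere, or moved object."""
    for surface in scene:
        # Stand-in for the actual (costly) GI math.
        surface["bounce"] = surface["albedo"] * max(0.0, sun_dir[1])

print(baked_lighting(13.7))
realtime_gi((0.0, 0.8, 0.6), [{"albedo": 0.5}])
```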
 
No sympathy for these devs. They trained players over the last three years to expect 60fps while they phoned in last-gen game after game with 60fps modes. Now those three years of laziness are blowing up in their faces, and they are finally getting their comeuppance.

Wait till some dumb marketing exec gets wind of this massive pushback against 30fps games. They ARE going to start interfering with the development process and shove 60fps down our throats again, causing games to once again become cross-gen trash instead of ambitious, systems-driven games like Starfield and Avatar.

Are you trying to rewrite history? 60fps and higher at max settings has always been the rallying cry from gamers, and it almost always comes from the PC side. Did you not live through the era of the so-called "PC Master Race", "Lord Gaben" and "Steam"?

No console executive or game studio has ever prioritised framerate over graphics. If they did achieve 60fps, it's because their tech overhead allowed them to, as in games like COD.

Clearly you must come from a different dimension where 60fps was the standard back in the day. Try going back to PS3/360-era threads and see how the 30fps supporters were primarily console players, marketing execs and even the games media. Now you're inventing an imaginary oppression by the big bad 60fps movement. The fact that we are having this argument at all today just underlines the irony.


They increase the complexity with every game.
If you have an open world like Rage 2, almost everything is static. You shoot a garbage bin and all that happens is some generic bullet holes appear on it. Starfield and Bethesda games are on a different level. Every single object is individually simulated and reacts to being shot, picked up, moved somewhere else, etc. If you shoot a bin on a planet today and send it flying to the other side of the room, you can come back to that exact same place six months later and the bin will be exactly where you shot it to. It's stuff like that which makes their games so CPU heavy. So yes, they could cut all that back to be like Rage 2 and get 60fps, but it would not be the open world they want to create.
It's on another level, and I'm glad they didn't wind back the game.

This has nothing to do with resolution, shadows, lighting etc; it's all CPU.
Making the game 1440p won't give you any real increase in frame rate.

Maybe they can work on a 40fps performance upgrade down the track.


This is a much better response than the other nonsense replies out there. Interesting how you use Rage 2 as a comparison when we have Horizon: Forbidden West, unless that game is doing more CPU work at 60fps. But going back to Starfield, this still doesn't look different from what Skyrim is doing; how is tracking an item in a house different from tracking one on a planet? Those co-ordinates must be stored the same way, since they use the same base engine.

Secondly, these "interactive" items don't really come into play until you yourself interact with them, so how are they using CPU in the meantime? And even when you do cause a reaction, aren't they just updating the co-ordinates? Unless they need to keep BOTH the default origin location and the new location's data; if so, that's some poor memory management. It's one thing to give an item to an NPC who dynamically uses it and has to be tracked, but if an item just moves from one static location to another, you shouldn't be using more CPU overhead than required.
 
Are you trying to rewrite history? 60fps and higher at max settings has always been the rallying cry from gamers, and it almost always comes from the PC side. Did you not live through the era of the so-called "PC Master Race", "Lord Gaben" and "Steam"?

No console executive or game studio has ever prioritised framerate over graphics. If they did achieve 60fps, it's because their tech overhead allowed them to, as in games like COD.

Clearly you must come from a different dimension where 60fps was the standard back in the day. Try going back to PS3/360-era threads and see how the 30fps supporters were primarily console players, marketing execs and even the games media. Now you're inventing an imaginary oppression by the big bad 60fps movement. The fact that we are having this argument at all today just underlines the irony.





This is a much better response than the other nonsense replies out there. Interesting how you use Rage 2 as a comparison when we have Horizon: Forbidden West, unless that game is doing more CPU work at 60fps. But going back to Starfield, this still doesn't look different from what Skyrim is doing; how is tracking an item in a house different from tracking one on a planet? Those co-ordinates must be stored the same way, since they use the same base engine.

Secondly, these "interactive" items don't really come into play until you yourself interact with them, so how are they using CPU in the meantime? And even when you do cause a reaction, aren't they just updating the co-ordinates? Unless they need to keep BOTH the default origin location and the new location's data; if so, that's some poor memory management. It's one thing to give an item to an NPC who dynamically uses it and has to be tracked, but if an item just moves from one static location to another, you shouldn't be using more CPU overhead than required.
I haven't played Horizon, so I can't really compare to that.

To have the ability to interact with every single object, you have to reserve capacity for it on the CPU. You can't have a situation where, when nothing much is happening, you can go and pick up a packet of cigarettes and move it to another table, but when there is a ton of stuff going on that requires the CPU, you suddenly can't interact with that same packet of cigarettes.
So is it safe to say their game might be inefficient because there's a heap of CPU overhead sitting unused just in case you need it? Maybe, but that's the way they want their world to be, so it's the price they have to pay.
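A crude way to picture that reserved headroom (purely illustrative; this is not how the Creation Engine actually schedules work, and the numbers are made up):

```python
# A fixed frame budget where simulation headroom is reserved for
# worst-case object interaction, whether or not it gets used.

FRAME_BUDGET_MS = 33.3   # frame time at 30fps
RESERVED_SIM_MS = 10.0   # worst-case physics/interaction headroom

def frame_fits(render_ms: float, active_sim_ms: float) -> bool:
    """Check a frame against the budget.

    The reservation means a quiet scene can't spend its unused
    simulation time on rendering: the engine still has to be able
    to absorb a sudden pile of flying cigarette packets.
    """
    usable_for_render = FRAME_BUDGET_MS - RESERVED_SIM_MS
    return render_ms <= usable_for_render and active_sim_ms <= RESERVED_SIM_MS

print(frame_fits(render_ms=22.0, active_sim_ms=1.0))   # quiet scene: True
print(frame_fits(render_ms=25.0, active_sim_ms=1.0))   # render too heavy: False
```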
 
We will see how CPU bound it is when the PC tests come in.
I absolutely bet that the game will be GPU bound.

If it is CPU bound, that would mean their minimum PC requirements would run below 30fps, which would be ridiculous.
PC is a different beast, and generally does not get the same level of optimisation consoles get.
This game will be hard on RAM and CPU. If you look at the game you can tell it isn't using ultra-high textures or crazy polygon counts.

One interesting thing: if Bethesda can get the biggest open world game to run fine on the XSS with its RAM, it makes Baldur's Gate look inept.
 

SlimySnake

Flashless at the Golden Globes
No console executive or game studio has ever prioritised framerate over graphics. If they did achieve 60fps, it's because their tech overhead allows to do so such as games like COD.
Times have changed. We are in the third year of this generation, about to hit the midway point, and everyone is still making cross-gen trash. Those same execs forced talented first-party devs to slave away on last-gen machines to maximize profits. Are you telling me the execs prioritized graphics over framerate in Diablo, SF6, RE4, Harry Potter?

What about Spiderman 2? Why didn't execs step in and say, hey, this game barely looks better than the PS4 game, wtf are you guys doing?

What they do care about is online controversies. It's why Phil and Jimbo have both done 180s on several of their BS moves. They don't give a shit about pushing fidelity anymore; they know their games will sell.
 
I haven't played Horizon, so I can't really compare to that.

To have the ability to interact with every single object, you have to reserve capacity for it on the CPU. You can't have a situation where, when nothing much is happening, you can go and pick up a packet of cigarettes and move it to another table, but when there is a ton of stuff going on that requires the CPU, you suddenly can't interact with that same packet of cigarettes.
So is it safe to say their game might be inefficient because there's a heap of CPU overhead sitting unused just in case you need it? Maybe, but that's the way they want their world to be, so it's the price they have to pay.

So marking objects as interactive still means reserving data to be called up by the CPU, but until they're actually touched, that data could sit in a table in RAM or on the fast SSD. That makes it look like an engine problem, one they should've rebuilt from the ground up.

The only REAL argument for CPU overhead is the real-time global illumination (RTGI), as light is constantly being tracked across planets and indoors, along with shadows. Interestingly, none of these 30fps bozos ever considered that possibility. If Bethesda decided that RTGI cannot be disabled in favour of baked lighting, then I can believe they deliberately capped the game to maintain a minimum stable frame rate, which is 30fps.
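For what it's worth, that table idea would look something like this (a hypothetical sketch, not the actual engine; the schema and names are invented): moved-object records sit on disk and only the current cell's entries get pulled into the live simulation.

```python
import sqlite3

# Hypothetical: moved-object records live in an on-disk table and only
# the entries for the cell being loaded enter the live simulation.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE moved_objects (
    object_id TEXT PRIMARY KEY,
    cell TEXT, x REAL, y REAL, z REAL)""")
db.execute("INSERT INTO moved_objects VALUES ('bin_042', 'Jemison01', 1, 2, 3)")

def load_cell(cell: str) -> dict:
    """On cell load, fetch only that cell's disturbed objects."""
    rows = db.execute(
        "SELECT object_id, x, y, z FROM moved_objects WHERE cell = ?",
        (cell,))
    return {oid: (x, y, z) for oid, x, y, z in rows}

print(load_cell("Jemison01"))  # {'bin_042': (1.0, 2.0, 3.0)}
```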
 