
Todd Howard says Starfield is already optimized: "You might need to upgrade your PC."

ChoosableOne

ChoosableAll
Perhaps if you hadn't placed a piece of paper we could interact with in every cupboard, this wouldn't have happened.

It doesn't perform that badly, though.
 

GHG

Member
I know exactly what you are saying.

The simpler your game looks, the easier it is to build advanced physics for it. In reality, trees don't break into 3 perfect parts, and 90% of the silly stuff would look ridiculous in a realistic-looking game; I've said the same thing multiple times in multiple topics in the past.

Noita looks like shit and its physics system makes Zelda look like the most static game in the world (or any game, really)

I'd say the only things out there comparable to Noita are Teardown and Instruments of Destruction:



 

Danknugz

Member
Runs fine here with no DLSS or whatever. Only playing on a 60Hz panel though.

13900KF / 32GB @ 6400 / 4090 / Ryujin II 360

CPU temps hover around 50°C; GPU usage can get up to 70%.
 

Sorcerer

Member
Todd Howard: we did... you might need to upgrade your PC

100+ patches in the future disagree with you, Todd!!!
 

AJUMP23

Member

BASED!
 

GymWolf

Member
I'd say the only things out there comparable to Noita are Teardown and Instruments of Destruction:




Comparable but still way inferior. Noita simulates every damn pixel on screen, so everything is single-pixel precise, from destruction to gore to details like getting a single pixel of blood on your robe; shit, you don't even need that level of precision in a video game.
And on top of that, it has something like 3x the number of element/material interactions that Zelda has.


But that Instruments of Destruction does look super cool.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Like always, PC gamers overestimating their PCs and crying when next-gen-only games don't run that well...
lol exactly. People bought $1,649-2,000 4090s, then paired them with cheap $300 CPUs that max out at 4.45 GHz and 65 watts, and then wonder why their game is bottlenecked. Yes, the 5800X3D was great when games were being designed around a 1.6 GHz Jaguar CPU from 2013. But everyone who shat on mid-range Intel CPUs maxing out at 5.0 to 5.1 GHz and consuming over 120 watts is now wondering why those CPUs run the game better.

This is on those idiots who run PC review channels and continuously played up AMD's power consumption and low cooling requirements over actual performance. Hell, it got so bad that I was able to buy Intel CPUs at a $100 discount over the equivalent AMD products, because everyone wanted the low-TDP, cooler-running CPU.

Well, now you're fucked. Still, if you can afford a $1,200 4080 or a $1,650 4090, you should be able to go out there and upgrade to a $400 Ryzen 7800X3D, which should give you a massive 40% performance increase over the 5800X3D.

Hell, even AMD's latest $299 CPUs ensure that they don't bottleneck your GPU. I get 98% GPU utilization in every city, town, ship and outdoor area, and I have a $300 Intel CPU from 3 years ago.
 

Mister Wolf

Member
lol exactly. People bought $1,649-2,000 4090s, then paired them with cheap $300 CPUs that max out at 4.45 GHz and 65 watts, and then wonder why their game is bottlenecked. Yes, the 5800X3D was great when games were being designed around a 1.6 GHz Jaguar CPU from 2013. But everyone who shat on mid-range Intel CPUs maxing out at 5.0 to 5.1 GHz and consuming over 120 watts is now wondering why those CPUs run the game better.

This is on those idiots who run PC review channels and continuously played up AMD's power consumption and low cooling requirements over actual performance. Hell, it got so bad that I was able to buy Intel CPUs at a $100 discount over the equivalent AMD products, because everyone wanted the low-TDP, cooler-running CPU.

Well, now you're fucked. Still, if you can afford a $1,200 4080 or a $1,650 4090, you should be able to go out there and upgrade to a $400 Ryzen 7800X3D, which should give you a massive 40% performance increase over the 5800X3D.

Hell, even AMD's latest $299 CPUs ensure that they don't bottleneck your GPU. I get 98% GPU utilization in every city, town, ship and outdoor area, and I have a $300 Intel CPU from 3 years ago.

Nvidia has already provided a way around CPU bottlenecks with Frame Generation. Worked for Potter's bottleneck, worked for Jedi Survivor's bottleneck, and it works for this too.
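Rough back-of-the-envelope on why frame generation sidesteps a CPU limit: the inserted frames are interpolated on the GPU, so the CPU only has to simulate every other presented frame. A minimal sketch; the fg_presented_fps helper and the 10% overhead figure are illustrative assumptions, not measured numbers:

```python
def fg_presented_fps(cpu_limited_fps: float, overhead: float = 0.10) -> float:
    """Estimate presented fps with frame generation when a game is CPU-bound.

    One interpolated frame is inserted per rendered frame, so the presented
    rate is roughly double the CPU-limited rate, minus some GPU interpolation
    overhead. The 10% overhead default is an illustrative assumption.
    """
    return 2 * cpu_limited_fps * (1 - overhead)

print(fg_presented_fps(45))  # a 45 fps CPU limit presents at roughly 81 fps
```

Input latency still tracks the CPU-limited rate, though, which is why it helps smoothness more than responsiveness.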
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Nvidia has already provided a way around CPU bottlenecks with Frame Generation. Worked for Potter's bottleneck, worked for Jedi Survivor's bottleneck, and it works for this too.
Yeah, the problem is that Jedi and this game don't have DLSS at launch, which renders that feature useless. Nvidia should've matched whatever the fuck AMD was offering to get DLSS removed from these games; I doubt it was more than a few million dollars. Nvidia makes billions in profit every quarter, so they should be going out there and at least making sure that games ship with DLSS support.
 

Elysium44

Banned
lol exactly. People bought $1,649-2,000 4090s, then paired them with cheap $300 CPUs that max out at 4.45 GHz and 65 watts, and then wonder why their game is bottlenecked. Yes, the 5800X3D was great when games were being designed around a 1.6 GHz Jaguar CPU from 2013. But everyone who shat on mid-range Intel CPUs maxing out at 5.0 to 5.1 GHz and consuming over 120 watts is now wondering why those CPUs run the game better.

This is on those idiots who run PC review channels and continuously played up AMD's power consumption and low cooling requirements over actual performance. Hell, it got so bad that I was able to buy Intel CPUs at a $100 discount over the equivalent AMD products, because everyone wanted the low-TDP, cooler-running CPU.

Well, now you're fucked. Still, if you can afford a $1,200 4080 or a $1,650 4090, you should be able to go out there and upgrade to a $400 Ryzen 7800X3D, which should give you a massive 40% performance increase over the 5800X3D.

Hell, even AMD's latest $299 CPUs ensure that they don't bottleneck your GPU. I get 98% GPU utilization in every city, town, ship and outdoor area, and I have a $300 Intel CPU from 3 years ago.

Hi Todd.

Seriously, are you unironically saying the 5800X3D isn't good enough any more? It isn't a 65W CPU either.

The game recommends an AMD Ryzen 5 3600X or Intel i5-10600K.
 

SlimySnake

Flashless at the Golden Globes
Hi Todd.

Seriously, are you unironically saying the 5800X3D isn't good enough any more? It isn't a 65W CPU either.

The game recommends an AMD Ryzen 5 3600X or Intel i5-10600K.
I looked at several benchmarks of the game running on a 5800X3D. It stays in the 60-watt range. I saw it jump to 75 watts in one video, but it's definitely not like the unlocked K/KF CPUs from Intel that can hit up to 120 watts in Cyberpunk and Starfield.
 

Clear

CliffyB's Cock Holster
I know exactly what you are saying.

The simpler your game looks, the easier it is to build advanced physics for it. In reality, trees don't break into 3 perfect parts, and 90% of the silly stuff would look ridiculous in a realistic-looking game; I've said the same thing multiple times in multiple topics in the past.
But super realistic physics in Zelda would probably break the gameplay, because you expect a certain result when you do something; real physics would be too chaotic and imprecise for what Zelda wants to achieve. It's already a bitch to drive what you build in TotK, because the whole system is completely physics-based, and most of the time it's not "fun" to drive those things unless you're precise to the millimeter with Ultrahand when you build them.

Noita looks like shit and its physics system makes Zelda look like the most static game in the world (or any game, really)

The reality is that every process costs memory space and bandwidth, so there's an almost "hidden" cost on top of what each process (be it on GPU or CPU) is doing. It'll definitely be interesting to see how Nintendo's engineers work around this, because, make no mistake, Zelda is extremely well built. I do think Nintendo's visual language helps, but I wonder whether they'll be satisfied with relatively small improvements in detail and complexity alongside better frame rates and resolutions.
 

Elysium44

Banned
I looked at several benchmarks of the game running on a 5800X3D. It stays in the 60-watt range. I saw it jump to 75 watts in one video, but it's definitely not like the unlocked K/KF CPUs from Intel that can hit up to 120 watts in Cyberpunk and Starfield.

The Ryzen 7 5800X3D has a 105W TDP rating and maxed out at 130W in our tests


This is not some budget CPU, nor is it very old.

I have seen the Starfield benchmarks where AMD CPUs are underperforming relative to where you'd expect them to be versus Intel ones. But it doesn't seem to be the fault of the AMD CPUs; certainly not that they're power limited, I mean.
 
Last edited:

winjer

Gold Member
lol exactly. People bought $1,649-2,000 4090s, then paired them with cheap $300 CPUs that max out at 4.45 GHz and 65 watts, and then wonder why their game is bottlenecked. Yes, the 5800X3D was great when games were being designed around a 1.6 GHz Jaguar CPU from 2013. But everyone who shat on mid-range Intel CPUs maxing out at 5.0 to 5.1 GHz and consuming over 120 watts is now wondering why those CPUs run the game better.

This is on those idiots who run PC review channels and continuously played up AMD's power consumption and low cooling requirements over actual performance. Hell, it got so bad that I was able to buy Intel CPUs at a $100 discount over the equivalent AMD products, because everyone wanted the low-TDP, cooler-running CPU.

Well, now you're fucked. Still, if you can afford a $1,200 4080 or a $1,650 4090, you should be able to go out there and upgrade to a $400 Ryzen 7800X3D, which should give you a massive 40% performance increase over the 5800X3D.

Hell, even AMD's latest $299 CPUs ensure that they don't bottleneck your GPU. I get 98% GPU utilization in every city, town, ship and outdoor area, and I have a $300 Intel CPU from 3 years ago.

Give me a break. Just because the 5800X3D doesn't run so well in the worst-optimized game of the year doesn't mean it's obsolete.
A game as poorly optimized as Starfield is not a good measure of the performance of any CPU, or even GPU.
The reality is that the 5800X3D still performs very well, better than other CPUs from the same generation.

BTW, the 5800X3D is not a 65W part.

WoF97GA.png


Here are more benchmarks with CPUs.
N0JIRLF.png
 
Last edited:

Mister Wolf

Member
Yeah, the problem is that Jedi and this game don't have DLSS at launch, which renders that feature useless. Nvidia should've matched whatever the fuck AMD was offering to get DLSS removed from these games; I doubt it was more than a few million dollars. Nvidia makes billions in profit every quarter, so they should be going out there and at least making sure that games ship with DLSS support.

Now that Nvidia knows we'll just patch it in ourselves, they really won't spend the money. I'm not even fussed about it; as long as I get my desired outcome, it's all good. Survivor was a great experience with the unofficial DLSS and Frame Generation mod. They had it up and running for Starfield before the official release, haha. Honestly, the only game that ever pissed me off on PC is Elden Ring, with its shit shader stutters and From's insistence on capping their games at 60fps on the logic that the game breaks if you forcefully unlock it.
 

SlimySnake

Flashless at the Golden Globes
Give me a break. Just because the 5800X3D doesn't run so well in the worst-optimized game of the year doesn't mean it's obsolete.
A game as poorly optimized as Starfield is not a good measure of the performance of any CPU, or even GPU.
The reality is that the 5800X3D still performs very well, better than other CPUs from the same generation.

BTW, the 5800X3D is not a 65W part.

WoF97GA.png


Here are more benchmarks with CPUs.
N0JIRLF.png
Here you go. Like I said, I checked several videos before I made those comments.





Maybe overclocking gets them into the 70s and 80s, whereas the K/KF models from Intel just go above 100-120 watts, no problem.
 

GymWolf

Member
The reality is that every process costs memory space and bandwidth, so there's an almost "hidden" cost on top of what each process (be it on GPU or CPU) is doing. It'll definitely be interesting to see how Nintendo's engineers work around this, because, make no mistake, Zelda is extremely well built. I do think Nintendo's visual language helps, but I wonder whether they'll be satisfied with relatively small improvements in detail and complexity alongside better frame rates and resolutions.
It depends on how powerful their next console is. If it's really better than a PS4, with a better CPU, they can't just release another DLC-looking/feeling game.

They need a big step forward.
 

winjer

Gold Member
Here you go. Like I said, I checked several videos before I made those comments.





Maybe overclocking gets them into the 70s and 80s, whereas the K/KF models from Intel just go above 100-120 watts, no problem.


And you insist that Starfield is a good example of CPU performance.

Well, here is another poorly optimized game, though not as bad as Starfield.
And look at that. The 5800X3D is the 3rd best CPU ever made. It even beats the 13900K by a good margin.
See, two can play that game.
YVCA40d.png
 
Last edited:


15 years ago. 15 years.


Still a mightily impressive game (visually).

Kinda blows my mind that this is the first game that says it needs an SSD, and it actually kinda does. Borderline unplayable on my HDD lol. Weird, cause it doesn't seem to be doing anything previous games don't. Oh well, SSDs are dirt cheap lol

The SSD is there to help brute force load times. Cannot imagine how this game runs off a platter drive.

It's incredibly liberating to have absolutely no interest in playing this game right now.

The more I'm seeing, the less interested I am in giving it a try. And that's with me not paying out of pocket for it; my gaming time is limited these days and I'd rather put it towards surefire fun times. There are still plenty of RPGs for me to catch up on.

But, I'll still give it a shot, just to see some things (and maybe do some comparisons).

The game can look decent at times with a direct light source like a sun, but without one, just using ambient lighting, like New Atlantis at night, it can look very flat and dull.


2tAsdU1.jpeg

Basically the same problem as Halo Infinite.
 

Elysium44

Banned
Here you go. Like I said, I checked several videos before I made those comments.

Maybe overclocking gets them into the 70s and 80s, whereas the K/KF models from Intel just go above 100-120 watts, no problem.

The AMD CPUs are a lot more power efficient, so they would rightly use less power to get the same sort of performance; I think the wattage is a red herring here. Look at one of the comments on the first video:

I got similar and a little better performance on my RTX 4080 at 1440p native. I pair it with a 13700K; it's about the same usage as the 5800X3D. It's a ZOTAC OC version of the card. My averages at the same spot you're showing are over 70, and the card is at 2775MHz, not 2820. I can overclock it to 2950 and gain 1 or 2 fps, it doesn't matter much. This game is demanding for no particular reason.

(This page below shows the relative power efficiency of the 13700K versus the 5800X3D.)

 
Last edited:

SlimySnake

Flashless at the Golden Globes
And you insist that Starfield is a good example of CPU performance.

Well, here is another poorly optimized game, though not as bad as Starfield.
And look at that. The 5800X3D is the 3rd best CPU ever made. It even beats the 13900K by a good margin.
See, two can play that game.
YVCA40d.png
Completely different kind of game. It's a turn-based, top-down game with a completely different workload compared to Starfield.

I view game performance through the same lens we've had all year, with games like Star Wars, Gotham Knights, Immortals and Hogwarts that have trash CPU usage: awful multithreading, which was a UE4 feature and has somehow managed to become a bottleneck in UE5 games 10 years later.

I am not seeing that here. The game is utilizing all 16 threads. CPU usage in CPU-bound cities is almost 75%, which is insane; not even Cyberpunk, which scales really well with cores and threads, goes that high. I was hitting 75 degrees in-game, which I normally only hit in CPU benchmarks. Hell, I honestly don't want it going over that, because I don't want my CPU running at 80 degrees for hours.


Could it be improved? Sure. But I can only look at the results. The results show proper scaling as you go from last-gen Intel CPUs to current gen, and proper scaling across current-gen AMD CPUs. There is none of the CPU idle time we saw in Gotham Knights and Star Wars. The 5800X3D is performing better than every other 5000-series CPU, so it's just an issue with the AMD CPUs that were maxing out at 4.45 GHz and didn't allow the power draw to go over 100 watts like Intel CPUs do.
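If anyone wants to sanity-check their own rig, here's a rough sketch (assuming an NVIDIA card with nvidia-smi on the PATH and Python's psutil package installed; the gpu_util_percent helper is just something thrown together for illustration) that logs GPU utilization next to per-core CPU load. A GPU sitting well under ~95% while one or two cores are pegged is the classic sign of a CPU bottleneck:

```python
import subprocess
import psutil  # pip install psutil

def gpu_util_percent() -> int:
    """Read GPU utilization (%) via nvidia-smi; assumes an NVIDIA GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])

while True:
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks for ~1 second
    gpu = gpu_util_percent()
    busiest = max(per_core)
    avg = sum(per_core) / len(per_core)
    # Low GPU usage with a pegged core points at a CPU (or engine) bottleneck.
    print(f"GPU {gpu:3d}% | busiest core {busiest:5.1f}% | avg CPU {avg:5.1f}%")
```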
 

StereoVsn

Member
Todd… you don't have ray tracing, Lumen, or all this other crazy visual shit in your game. To boot, you took a blood deal to drop DLSS for cash, so you prayed modders would help with that.

At least it doesn't have shader compilation stutter, or one CPU core doing all the work with no GPU usage, like the games of the last 3 years have been plagued with.

It can absolutely be better optimized.
And at least no Denuvo. So that's another plus.

That said, Todd has brass balls just to BS like that, lol.
 

SlimySnake

Flashless at the Golden Globes
Now that Nvidia knows we'll just patch it in ourselves, they really won't spend the money. I'm not even fussed about it; as long as I get my desired outcome, it's all good. Survivor was a great experience with the unofficial DLSS and Frame Generation mod. They had it up and running for Starfield before the official release, haha. Honestly, the only game that ever pissed me off on PC is Elden Ring, with its shit shader stutters and From's insistence on capping their games at 60fps on the logic that the game breaks if you forcefully unlock it.
I didn't pay for the Starfield DLSS mod, because fuck paying for mods, and the DLSS mod for Starfield crashed my PC last night, so no dice.

I blame this on Microsoft, Nvidia and Bethesda. These are trillion-dollar companies begging each other for a few million while getting rich off their userbases. But that's not on Bethesda's engineers.
 

LordCBH

Member
The SSD is there to help brute force load times. Cannot imagine how this game runs off a platter drive.

Not good. The frame rate is fine on my hardware, but the game hard-freezes for seconds at a time every minute or so, and dialogue takes up to 30 seconds to play when you talk to NPCs. Sound jitters and cuts in and out too.

I'm convinced it runs like this on an HDD because they didn't have to bother optimizing it for HDDs, since consoles have SSDs now. In terms of what's going on, it really doesn't do anything FO4 didn't, and it still has to load when you go into any building or location.

Load times on the HDD aren't bad, except for the initial load.
 

DeaDPo0L84

Member
Bloomberg audience Q: Why did you not optimize Starfield for PC?

Todd Howard: we did... you might need to upgrade your PC


As much of a dick answer as it is, I laughed, cause the guy's pretty much saying "fuck off, figure it out on your own". Also, I have a 4090 + 13700K, which runs the game great with DLSS 3.0. With that said, the game itself has underlying issues: if you have too many things in your inventory and switch between weapons with a lot of mods, the fps will drop by 20-30 for a second, and the same happens with the scanner. This isn't a hardware issue on the user's side; it's something limited by the game's own engine.

I did dump a ton of stuff last night out of my inventory and it *seemed* to fix it, but I didn't test it extensively.
 

SlimySnake

Flashless at the Golden Globes
The AMD CPUs are a lot more power efficient, so they would rightly use less power to get the same sort of performance; I think the wattage is a red herring here. Look at one of the comments on the first video:



(This page below shows the relative power efficiency of the 13700K versus the 5800X3D.)

Nah, the 13700K destroys the 5800X3D: 67 vs 99 fps. No idea what this guy is talking about.

e6y3Ycj.jpg
d8lPY6x.jpg
 

winjer

Gold Member
Completely different kind of game. It's a turn-based, top-down game with a completely different workload compared to Starfield.

They are both rendering 3D environments. Just because they use different POVs doesn't mean the tech is all that different.
And being turn-based doesn't mean the game stops rendering 3D environments.

I view game performance through the same lens we've had all year, with games like Star Wars, Gotham Knights, Immortals and Hogwarts that have trash CPU usage: awful multithreading, which was a UE4 feature and has somehow managed to become a bottleneck in UE5 games 10 years later.

I am not seeing that here. The game is utilizing all 16 threads. CPU usage in CPU-bound cities is almost 75%, which is insane; not even Cyberpunk, which scales really well with cores and threads, goes that high. I was hitting 75 degrees in-game, which I normally only hit in CPU benchmarks. Hell, I honestly don't want it going over that, because I don't want my CPU running at 80 degrees for hours.

Being multithreaded is not the only way a game can be optimized.
What does it matter if Starfield can spawn a lot of threads, if it then looks like a PS4 game and runs like a PS3 game?

Could it be improved? Sure. But I can only look at the results. The results show proper scaling as you go from last-gen Intel CPUs to current gen, and proper scaling across current-gen AMD CPUs. There is none of the CPU idle time we saw in Gotham Knights and Star Wars. The 5800X3D is performing better than every other 5000-series CPU, so it's just an issue with the AMD CPUs that were maxing out at 4.45 GHz and didn't allow the power draw to go over 100 watts like Intel CPUs do.

You have got so much wrong about the 5800X3D, it's quite impressive.
The 5800X3D is a 105W part, and it can use that and a bit more if required.
In games, the 5800X3D is never limited by power constraints. Never.
Games do not require as much power as complex apps that hit all cores on a CPU.
That is why in games the 13900K "only" uses 100-120W, but in heavy applications it can get close to 300W.
 

DeaDPo0L84

Member
Also, what is next-gen about this game? It's a big game and it has scale, but so do a lot of other games. Like others, it can look really good at times but then also like absolute dog shit at others. It has next to no AI implementation, no ray tracing, loading screens for nearly everything, whether you're just leaving your ship or entering a building, archaic NPC interaction where apparently the game can't fit more than one person on screen at a time, and space is extremely limited and again requires loading-screen transitions to land on planets.

I say all that and I'm still enjoying the game, and I even look forward to playing it when I'm at work, but nothing I'm doing in Starfield has come across as "holy shit, I've never seen this or been able to do this before in a video game!"
 
Last edited:

SlimySnake

Flashless at the Golden Globes
They are both rendering 3D environments. Just because they use different POVs doesn't mean the tech is all that different.
And being turn-based doesn't mean the game stops rendering 3D environments.



Being multithreaded is not the only way a game can be optimized.
What does it matter if Starfield can spawn a lot of threads, if it then looks like a PS4 game and runs like a PS3 game?



You have got so much wrong about the 5800X3D, it's quite impressive.
The 5800X3D is a 105W part, and it can use that and a bit more if required.
In games, the 5800X3D is never limited by power constraints. Never.
Games do not require as much power as complex apps that hit all cores on a CPU.
That is why in games the 13900K "only" uses 100-120W, but in heavy applications it can get close to 300W.
Come on, dude. The GOW PS3 games were top-down games and a completely different beast from PS4-era games, because they could just choose what to load and what not to. I know BG3 lets you go into an over-the-shoulder cam at times, but it's not even remotely close to what Starfield is doing.

And I'm sorry, but this doesn't look like a PS4 game to me. Yes, it's ugly outdoors, but indoors it's the best-looking graphics I've seen all gen.

I stand corrected on the 5800X3D wattage, but I did look at benchmarks which show it essentially running like a 65-watt CPU, so I guess this particular CPU isn't hitting its full potential here. But if you look at the benchmarks, it's the case for all 3000- and 5000-series CPUs, so it's probably the engine simply preferring higher-clocked CPUs and whatever AMD is doing with their 7000-series lineup.
 

winjer

Gold Member
Come on, dude. The GOW PS3 games were top-down games and a completely different beast from PS4-era games, because they could just choose what to load and what not to. I know BG3 lets you go into an over-the-shoulder cam at times, but it's not even remotely close to what Starfield is doing.

And I'm sorry, but this doesn't look like a PS4 game to me. Yes, it's ugly outdoors, but indoors it's the best-looking graphics I've seen all gen.

Just because a game is top-down view doesn't mean it stops rendering 3D stuff.
Both games are rendering 3D worlds, with the big difference that the open areas in BG3 look good, while Starfield looks like crap while also running like crap.

Yes, Starfield is pretty indoors, but indoors it doesn't hammer the CPU.

I stand corrected on the 5800X3D wattage, but I did look at benchmarks which show it essentially running like a 65-watt CPU, so I guess this particular CPU isn't hitting its full potential here. But if you look at the benchmarks, it's the case for all 3000- and 5000-series CPUs, so it's probably the engine simply preferring higher-clocked CPUs and whatever AMD is doing with their 7000-series lineup.

No, the 5800X3D is not power limited in games. It never is.
The limit it hits in games is the clock speed of 4.45 GHz. Always.
Even in heavy applications it is never power limited; it becomes temperature limited sooner than power limited.
I know that; I tested it extensively with y-cruncher and Prime95 small FFTs, and with undervolting.
Guess what: I can get higher performance in y-cruncher if I power limit the 5800X3D by tweaking EDC, PPT and TDC.
 

LiquidMetal14

hide your water-based mammals
For the game to have some of these technical issues while being a next-generation exclusive, minus the gimped Series S version, shows that Bethesda is nowhere near the top studios on a technical or game-design level. I don't think he's even lying, in his own eyes, but what you see is what you get, so be glad it isn't a complete mess.
 

Ovech-King

Gold Member
There is enough proof in New Atlantis that they can do better. Just as one example, I found one specific spot (and there are probably many) where I can stand and get a stable 52fps (useful for playing around with the resolution scale to make sure I get 60fps EVERYWHERE), while everywhere else around or close to that spot, and the rest of the city, sits between 58-60 on average (capped at 60fps in the Nvidia Control Panel).

I'm not eating this statement, Todd, no sir!
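For anyone trying the same trick, here's the rough arithmetic, assuming the worst-case spot is GPU-bound and frame time scales roughly linearly with pixel count (an assumption, not a measurement; the render_scale_for_target helper is just for illustration):

```python
import math

def render_scale_for_target(measured_fps: float, target_fps: float) -> float:
    """Per-axis resolution scale needed to reach target_fps from measured_fps,
    assuming a GPU-bound scene where frame time scales linearly with pixel count."""
    pixel_fraction = measured_fps / target_fps   # fraction of the pixels we can afford
    return math.sqrt(pixel_fraction)             # applied to both width and height

# Worst-case spot: 52 fps measured, 60 fps target
print(f"{render_scale_for_target(52, 60):.0%}")  # ~93% render scale
```

If the spot is partly CPU-bound, dropping the render scale buys less than this, so treat it as an upper bound on the scale you can keep.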
 

SlimySnake

Flashless at the Golden Globes
Also, what is next-gen about this game? It's a big game and it has scale, but so do a lot of other games. Like others, it can look really good at times but then also like absolute dog shit at others. It has next to no AI implementation, no ray tracing, loading screens for nearly everything, whether you're just leaving your ship or entering a building, archaic NPC interaction where apparently the game can't fit more than one person on screen at a time, and space is extremely limited and again requires loading-screen transitions to land on planets.

I say all that and I'm still enjoying the game, and I even look forward to playing it when I'm at work, but nothing I'm doing in Starfield has come across as "holy shit, I've never seen this or been able to do this before in a video game!"
Watch Fallout 4, then Starfield. It's a massive upgrade.

Bethesda engines have always been heavy on CPUs. This time around they upgraded their lighting model to real-time GI, so basically what RT and Lumen do but without using RT cores. It's a clear step above last-gen games. Not a single last-gen game I know of used real-time GI; I think some racing games like Driveclub and GT7 did, but that's about it. UE4 had support for it, but it was pulled after launch.

They need to ditch that engine, but there is stuff here that is way beyond what they did last gen and a clear upgrade over games this gen. It just doesn't show all the time, because they downgraded the shit out of the open world, probably because they were worried about the backlash even higher performance requirements would cause.
 
For the game to have some of these technical issues while being a next-generation exclusive, minus the gimped Series S version, shows that Bethesda is nowhere near the top studios on a technical or game-design level. I don't think he's even lying, in his own eyes, but what you see is what you get, so be glad it isn't a complete mess.
Fallout 76 looked very outdated in 2018.
Bethesda has never been good on the technical side.
 

twilo99

Member
They won't. For $500-600, you get what you pay for.

Very true, but using such archaic architecture is not ideal for gaming in general.

Microsoft used the same old CPU in a $300 and a $500 console, so there is some margin there. I know the GPU is different, but that can't be worth $200. Also, get rid of that disc drive!

Sony could've spent the extra margin from the disc-drive version on a slightly better CPU.
 