
Star Wars Jedi Survivor PC Performance Discussion OT: May The Port Be With You

Braag

Member


19GB @4:45 in the video and 22GB @7:07

Funny GIF

LMAO
There's literally no optimization at all. What game eats fucking 22GB of VRAM at 1440p? No way the day one patch can fix something this broken. It will take them 6-12 months to maybe fix this, if even then. So save your money, I guess.
Also, Steam reviews are gonna rip this to shreds, followed by a DF video that will call them out for this shitshow. Gonna be fun to watch.
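For anyone who wants to sanity-check VRAM numbers like these on their own machine, here's a minimal Python sketch that shells out to nvidia-smi (NVIDIA cards only, nvidia-smi must be on PATH; the helper function names are mine, not from any tool mentioned in the thread):

```python
import subprocess

def parse_vram_mib(csv_line: str) -> int:
    """Parse one line of nvidia-smi's CSV output, e.g. '19234 MiB' -> 19234."""
    return int(csv_line.strip().split()[0])

def gpu_vram_used_mib() -> int:
    """Query the first GPU's current VRAM usage in MiB via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_vram_mib(out.splitlines()[0])
```

Note this reports total allocation across all processes, so the OS and background apps inflate the number a bit versus what the game alone is using.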
 

SmokSmog

Member
Next-gen-only game made for consoles with gimped Zen 2. Zen 3 isn't that much faster, maybe 50% at best, so you have 30 FPS on consoles and 45 FPS with drops on PC with Zen 3.
No DirectStorage to help the CPU with texture streaming.

Welcome to next gen gaming.

You need 24GB VRAM + 64GB RAM + super fast single-core performance, like a 5.5GHz 13900K/13700K or a 7800X3D.

Cope or seethe.

 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Next-gen-only game made for consoles with gimped Zen 2. Zen 3 isn't that much faster, maybe 50% at best, so you have 30 FPS on consoles and 45 FPS with drops on PC with Zen 3.
No DirectStorage to help the CPU with texture streaming.

Welcome to next gen gaming.

You need 24GB VRAM + 64GB RAM + super fast single-core performance, like a 5.5GHz 13900K/13700K or a 7800X3D.

Cope or seethe.

It's just a bad port.
Not indicative of anything other than these guys not bothering to QA the PC version as it is right now.
Will wait for the day one update to see what it's actually like... then wait a month for the update that actually makes the game playable.
 

kingyala

Banned
Game is phenomenal, GOTY contender, but performance even on high end PCs is awful.

Saw this review posted in the review thread, and figured I'd post a PSA here since a lot of us got burned on TLOU and Hogwarts recently. Also, it's probably a good idea to keep this discussion out of the OT/Review threads. The game is truly great, and it's a shame that the performance discussion might drown out everything else the game is doing right.





Source: https://www.pcmrace.com/2023/04/26/star-wars-jedi-survivor-review/

Summary:
- No DLSS support, since this is an AMD-sponsored title.
- VRAM issues plaguing even the 12GB 3080 Ti
- Prolonged stuttering and plentiful pop-in

EDIT: The FextraLife guy destroys the game's performance on PC. Calls it abysmal. Does not recommend getting it on PC. The Xbox version is fine. Single-digit dips throughout the game's 20-25 hour campaign on a 3080 Ti at 1440p high settings.



Here is a 4090 running the game at just 1440p and experiencing prolonged frame drops below 40 fps.



Seeing as TLOU ran rather well on 16GB cards, let alone the 24GB 4090, this is arguably even worse.

P.S. Reports say the Xbox 60 fps mode got a big boost with the day one patch, but no such report for PC. So buy at your own risk!

and lord alex convinced people that a 2070 Super was enough for this generation :messenger_tears_of_joy::messenger_tears_of_joy::messenger_tears_of_joy::messenger_tears_of_joy:. I told people 8GB cards are dead. This happens every generation, why did people think that this time it'll be any different... and even worse, ''Mark Cerny'' warned fools, but they called him a fool instead. The more games push decompression on consoles, the more problematic PC ports are going to be... because not only do these consoles have more unified memory, they also don't need to hold as much data in memory, since they can simply decompress it on the fly without you noticing. But let me wait for the comparisons first...
 

kingyala

Banned
Or if it's done incompetently on PC. Don't we have a bunch of next-gen games that look and run much better than this? A Plague Tale: Requiem, FS 2020, Returnal (a port of a PS5 game), Dead Space Remake. Even Forspoken which admittedly looks bad, still runs much better on PC. But the moment we have a game that performs horribly on PC, hur hur, I/O secret sauce.
the problem is texture decompression. Plague Tale is simply a compute problem, which is easy for PC GPUs to solve; Flight Simulator was designed for PC by a Microsoft studio, so it should scale well; the Dead Space remake doesn't contain any special assets or require any special texture decompression, it uses repeated assets. You can argue Returnal, but again, Returnal isn't a graphics hog, it's more of a CPU/compute hog with all those particles and simulations going on, and it was designed as a 60fps game, so it should scale well with enough optimization on PC... Jedi Survivor, though, I'm now convinced could be using a lot of unique textures, similar to The Last of Us, and requires a lot of CPU decompression on PC
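Whether or not CPU decompression is actually the culprit here, it's easy to see why doing it per-frame is expensive. A toy Python sketch, with zlib as a stand-in codec I picked purely for illustration (real engines use formats like Oodle or BCn, and the game's actual codec is unknown):

```python
import time
import zlib

# Stand-in for one streamed texture asset: 32 MiB of semi-compressible data.
asset = bytes(range(256)) * (32 * 1024 * 1024 // 256)
blob = zlib.compress(asset, level=6)

# Time a single-threaded CPU decompress, the work a console's dedicated
# decompression hardware (or DirectStorage GPU decompression) would offload.
start = time.perf_counter()
restored = zlib.decompress(blob)
elapsed = time.perf_counter() - start

assert restored == asset
print(f"decompressed {len(asset) / 2**20:.0f} MiB in {elapsed * 1000:.1f} ms on one core")
```

At 60 fps you have 16.6 ms per frame total; streaming several assets like this alongside game logic is where a CPU-bound pipeline starts hitching.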
 

Gaiff

SBI’s Resident Gaslighter
the problem is texture decompression. Plague Tale is simply a compute problem, which is easy for PC GPUs to solve; Flight Simulator was designed for PC by a Microsoft studio, so it should scale well; the Dead Space remake doesn't contain any special assets or require any special texture decompression, it uses repeated assets. You can argue Returnal, but again, Returnal isn't a graphics hog, it's more of a CPU/compute hog with all those particles and simulations going on, and it was designed as a 60fps game, so it should scale well with enough optimization on PC... Jedi Survivor, though, I'm now convinced could be using a lot of unique textures, similar to The Last of Us, and requires a lot of CPU decompression on PC
Oh, shut the fuck up. Why is it taking 15 seconds to load muddy textures here on PS5?

 
Last edited:

SlimySnake

Flashless at the Golden Globes
Nick said that the PS5 version is locked at native 4K 30 fps and mostly hits 60 fps in the 1440p mode, with some drops in the open-world sections. However, those drops aren't as bad as the ones he had while traversing the open-world levels on PC.

The PC version also had a lot more crashes and bugs that he didn't get on PS5.

 

Arsic

Loves his juicy stink trail scent
Damn, I was ready to give EA $15 for a month of Pro membership to play this, but I don't think I'll buy a PS5 version for $60
 

kingyala

Banned
Lazy fucking developers. That’s what we’re up against. Don’t give me that horseshit about they’re focused on consoles either. If you’re releasing a $70 PC version it had better work.

Even the Dead Island 2 developers, who were like the third or fourth team on that train wreck, managed to do better than all of these so-called AAA teams.
Dead Island is a PS4 game, it can run on a 6GB GPU... you can blame developers until Jesus comes back, but the fact is 8GB of VRAM is officially dead, and the quicker you accept this fact, the easier everything will be... and people should learn to listen to actual computer engineers and developers and stop following YouTube influencer plaster knowledge...

Mark Cerny explained why SSD decompression matters... people didn't listen. A dozen developers explained why SSD decompression is a game changer, and people called it bluffing. Alex said an RTX 2070 Super is enough for the generation by comparing cross-gen games and called the console SSD decompression hot air... and people believed it, like a herd of blind sheep. And now the truth came out, suddenly all developers are called lazy!... maybe just stop following YouTube and listen to actual engineers next time.
 

kingyala

Banned
Imagine if they actually finished the game on time and didn't need a day one patch. Crazy idea, I know
impossible nowadays. 1: because of cross-gen, devs have to make sure a game works on everything from an Xbone to a 4090 PC. 2: memory is bigger, therefore games are bigger, which means more content and more asset variation; one asset has to have low-to-ultra textures and geometry to support the lowest-performing hardware up to the most performant hardware, plus different hardware configurations and APIs. Teams have to make sure a game works well on DirectX and Xbox machines, then again on PlayStation, then different PCs, then sometimes Nintendo... it's just a soup of problems... more power brings more possibility and more problems at the same time
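To put rough numbers on the "low to ultra textures" point, here's a back-of-the-envelope sketch of what a full mip chain costs per quality tier. It assumes uncompressed RGBA8, which shipping games don't actually use (block compression cuts this 4-8x), so treat the absolute figures as illustrative only:

```python
def mip_chain_bytes(top_res: int, bytes_per_texel: int = 4) -> int:
    """Total bytes for a square texture plus its full mip chain down to 1x1."""
    total, size = 0, top_res
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total

# Hypothetical quality tiers, one authored asset exported four ways.
for tier, res in [("low", 512), ("medium", 1024), ("high", 2048), ("ultra", 4096)]:
    print(f"{tier:>6}: {mip_chain_bytes(res) / 2**20:.1f} MiB per texture")
```

The chain converges to about 4/3 of the top mip's size, so every doubling of texture resolution roughly quadruples the memory cost, which is why one asset has to ship in several tiers at all.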
 

aries_71

Junior Member
I thought potential PC sales and market share were high. Do they really give a shit about them? Or maybe they think PC players are going to buy it no matter what?
 

SlimySnake

Flashless at the Golden Globes
On the flip side, TLOU got patched today. Running beautifully now; I've got it locked to 110fps with FSR Quality. VRAM usage slashed in half (just over 9GB), I'm running at high settings (don't see the point in Ultra), and GPU usage at 84% with the CPU no longer acting crazy.

Just goes to show I/O has fuck all to do with any of this. It's developers not putting in the work.
Yeah, I can confirm. Downloading and installing the patch was still a pain: 1 hour to download and install even though I have gigabit internet. Then the game compiled shaders and crashed. Verified files to redownload them, and this time it started the shader install at 0%. I swear I did this just two days ago with the last patch.

But once installed, the game does run beautifully now. I was getting 40 fps just a couple of days ago in Jackson; now I'm getting a locked 60 fps there with only 85% GPU utilization. VRAM is down to just 7.5 GB. I increased the new texture streaming rate setting to 1.5x and it jumped to 8.2 GB. Sometimes it jumps to 8.4 GB, but hardly the 9.2 GB with stutters down to 5 fps I was getting before.

There are still minor hitches with the texture streaming rate setting maxed out, but at normal, or 1.0, it was pretty much a smooth experience. Shocking that it took them a month to figure this out, but hey, what are you gonna do.
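Those numbers are consistent with a fixed streaming pool scaled linearly by the new multiplier. A hypothetical sketch; BASELINE_POOL and OVERHEAD are invented values chosen to reproduce the figures in the post, not anything from the game's actual config:

```python
def estimated_vram_gib(pool_gib: float, overhead_gib: float, rate: float) -> float:
    """Naive model: total VRAM = fixed non-texture overhead + streaming pool * rate."""
    return overhead_gib + pool_gib * rate

BASELINE_POOL = 1.4   # guessed texture streaming pool at rate 1.0, in GiB
OVERHEAD = 6.1        # guessed geometry/buffers/render targets, in GiB

for rate in (1.0, 1.5):
    print(f"rate {rate}: ~{estimated_vram_gib(BASELINE_POOL, OVERHEAD, rate):.1f} GiB")
```

If the model holds, it also explains why bumping the rate gives only marginal VRAM growth: the multiplier only scales the pool, not the fixed overhead.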
 

Gamezone

Gold Member
I'm pretty sure I watched many reviews that mentioned performance was pretty garbage on consoles as well, especially when not running in performance mode.
 

Alex11

Member
impossible nowadays. 1: because of cross-gen, devs have to make sure a game works on everything from an Xbone to a 4090 PC. 2: memory is bigger, therefore games are bigger, which means more content and more asset variation; one asset has to have low-to-ultra textures and geometry to support the lowest-performing hardware up to the most performant hardware, plus different hardware configurations and APIs. Teams have to make sure a game works well on DirectX and Xbox machines, then again on PlayStation, then different PCs, then sometimes Nintendo... it's just a soup of problems... more power brings more possibility and more problems at the same time
My God, good man, I respect you for not beating down on the devs, but I think excuses have a limit. I mean, it's not like they have to make sure the game also works on your washing machine. For sure there are a lot of variables to consider, different hardware and yada yada, but come on, enough excuses. When a 4090 struggles at 1440p and uses 22 effing gigs, I think that'll do.
 
Last edited:

CGNoire

Member
So is a lack of DirectStorage the issue here?
Like, is the CPU just being thrashed constantly?

When is PC getting hardware decompression as standard?

Oh, and visually this is clearly a PS4 game, with the PS4 version cancelled for optimization budget reasons.
 

SeraphJan

Member
Nick said that the PS5 version is locked at native 4K 30 fps and mostly hits 60 fps in the 1440p mode, with some drops in the open-world sections. However, those drops aren't as bad as the ones he had while traversing the open-world levels on PC.

The PC version also had a lot more crashes and bugs that he didn't get on PS5.


This is so disappointing. It means that unless they fix it soon enough, I'm forced to buy the console version again.
 
Last edited:

kingyala

Banned
My God, good man, I respect you for not beating down on the devs, but I think excuses have a limit. I mean, it's not like they have to make sure the game also works on your washing machine. For sure there are a lot of variables to consider, different hardware and yada yada, but come on, enough excuses. When a 4090 struggles at 1440p and uses 22 effing gigs, I think that'll do.
don't you think you're being harsh now... because you're assuming that somehow, in the entire history of the universe, in some sort of magical way, suddenly game developers have all become lazy!... I think your argument is the only lazy thing here. Try to think... and ask more important questions, like: why are games broken? Instead of just lazily believing and blaming developers for laziness...

PCs have always been complicated anyway, and secondly, back in the day you had the PlayStation 2 with 32 MB of RAM and the Xbox with 64 MB... happy days, you just create a small game with a small team and focus on two machines. This went on until now, where you have faster storage, faster processors, and 12GB+ of memory to deal with, plus all the content you have to make. Add to that optimizing for it to work on PS5, Series X, and the confused Series S, plus PC, where some people still have 6/8GB cards, which are probably most PC users and the first guys to come online and complain about how shitty a PC port is!...

and it doesn't stop there. Somehow gamers now demand 60fps as mandatory, and anything else is considered lazy... so developers have to deal with this whole pile of crap that isn't their fault. Rather, the problem is hardware manufacturers like Nvidia, who keep sucking your pockets with 8GB GPUs, an old PC architecture that needs a revamp, and customer demands that don't correspond to their hardware... it actually takes an incredible level of ignorance to think that somehow all developers are lazy and intentionally produce bad ports... this is extreme ignorance
 

Grechy34

Member
don't you think you're being harsh now... because you're assuming that somehow, in the entire history of the universe, in some sort of magical way, suddenly game developers have all become lazy!... I think your argument is the only lazy thing here. Try to think... and ask more important questions, like: why are games broken? Instead of just lazily believing and blaming developers for laziness...

PCs have always been complicated anyway, and secondly, back in the day you had the PlayStation 2 with 32 MB of RAM and the Xbox with 64 MB... happy days, you just create a small game with a small team and focus on two machines. This went on until now, where you have faster storage, faster processors, and 12GB+ of memory to deal with, plus all the content you have to make. Add to that optimizing for it to work on PS5, Series X, and the confused Series S, plus PC, where some people still have 6/8GB cards, which are probably most PC users and the first guys to come online and complain about how shitty a PC port is!...

and it doesn't stop there. Somehow gamers now demand 60fps as mandatory, and anything else is considered lazy... so developers have to deal with this whole pile of crap that isn't their fault. Rather, the problem is hardware manufacturers like Nvidia, who keep sucking your pockets with 8GB GPUs, an old PC architecture that needs a revamp, and customer demands that don't correspond to their hardware... it actually takes an incredible level of ignorance to think that somehow all developers are lazy and intentionally produce bad ports... this is extreme ignorance

Which developer do you work for?
 

Fabieter

Member
On the flip side, TLOU got patched today. Running beautifully now; I've got it locked to 110fps with FSR Quality. VRAM usage slashed in half (just over 9GB), I'm running at high settings (don't see the point in Ultra), and GPU usage at 84% with the CPU no longer acting crazy.

Just goes to show I/O has fuck all to do with any of this. It's developers not putting in the work.

No one said the I/O would make it impossible on PC, that's bullshit. PC has a lot more going for it, but it certainly makes optimization harder, which is why devs don't hit the release date with acceptable performance.
 

Zathalus

Member
don't you think you're being harsh now... because you're assuming that somehow, in the entire history of the universe, in some sort of magical way, suddenly game developers have all become lazy!... I think your argument is the only lazy thing here. Try to think... and ask more important questions, like: why are games broken? Instead of just lazily believing and blaming developers for laziness...

PCs have always been complicated anyway, and secondly, back in the day you had the PlayStation 2 with 32 MB of RAM and the Xbox with 64 MB... happy days, you just create a small game with a small team and focus on two machines. This went on until now, where you have faster storage, faster processors, and 12GB+ of memory to deal with, plus all the content you have to make. Add to that optimizing for it to work on PS5, Series X, and the confused Series S, plus PC, where some people still have 6/8GB cards, which are probably most PC users and the first guys to come online and complain about how shitty a PC port is!...

and it doesn't stop there. Somehow gamers now demand 60fps as mandatory, and anything else is considered lazy... so developers have to deal with this whole pile of crap that isn't their fault. Rather, the problem is hardware manufacturers like Nvidia, who keep sucking your pockets with 8GB GPUs, an old PC architecture that needs a revamp, and customer demands that don't correspond to their hardware... it actually takes an incredible level of ignorance to think that somehow all developers are lazy and intentionally produce bad ports... this is extreme ignorance
Of course it's the developers' (and/or publisher's) fault. We have numerous examples of other games that are either more graphically impressive or done with far smaller teams. Dead Space, RE4, Forspoken, Returnal, and A Plague Tale: Requiem are all recent games that run just fine on PC. Hogwarts Legacy, Callisto Protocol, and Gotham Knights had some stuttering, but that was dealt with via patches, and the performance was nowhere near as dire as this game. The only truly bad-performing games so far appear to be this and The Last of Us. The Last of Us that just got patched and is far better now, mind you.

Now, I'm not saying the developers are lazy specifically, but obviously this game was released without extensive optimization and testing. This isn't limited to just PC, apparently, as it appears the consoles have issues with performance, screen tearing, and delayed asset streaming as well. So it seems this game should have been delayed a few weeks on all platforms. My guess is that they didn't want to compete with Zelda, FF16, and Diablo 4.
 

rnlval

Member
Of course it's the developers' (and/or publisher's) fault. We have numerous examples of other games that are either more graphically impressive or done with far smaller teams. Dead Space, RE4, Forspoken, Returnal, and A Plague Tale: Requiem are all recent games that run just fine on PC. Hogwarts Legacy, Callisto Protocol, and Gotham Knights had some stuttering, but that was dealt with via patches, and the performance was nowhere near as dire as this game. The only truly bad-performing games so far appear to be this and The Last of Us. The Last of Us that just got patched and is far better now, mind you.

Now, I'm not saying the developers are lazy specifically, but obviously this game was released without extensive optimization and testing. This isn't limited to just PC, apparently, as it appears the consoles have issues with performance, screen tearing, and delayed asset streaming as well. So it seems this game should have been delayed a few weeks on all platforms. My guess is that they didn't want to compete with Zelda, FF16, and Diablo 4.
The GA104-based RTX A4000 with 16GB of VRAM runs Hogwarts Legacy and The Last of Us Part 1 just fine. It's gimped on the GA104-based RTX 3070 with 8GB of VRAM.



The RTX A4000 16GB has the same "fine wine" as the RX 6800 XT 16GB.
 
Last edited:

SeraphJan

Member
I've only got a 3080 10G with me right now... it's certainly more than that 10GB when you combine OS and game, cuz I bounce out when I tried DSR.
I'll do more testing later (I'm lying, I won't, but someone will).
If I had to hazard a guess, native 4K Ultra is probably in the region of 10GB process usage.
The game also has a new texture streaming option that seems to lower VRAM usage.
I'm sure one of these tech channels will do a deep dive with multiple GPUs to gauge just how playable the game is now.
Looking forward to your review :messenger_clapping:
 
Last edited:

rnlval

Member
Yeah, I can confirm. Downloading and installing the patch was still a pain: 1 hour to download and install even though I have gigabit internet. Then the game compiled shaders and crashed. Verified files to redownload them, and this time it started the shader install at 0%. I swear I did this just two days ago with the last patch.

But once installed, the game does run beautifully now. I was getting 40 fps just a couple of days ago in Jackson; now I'm getting a locked 60 fps there with only 85% GPU utilization. VRAM is down to just 7.5 GB. I increased the new texture streaming rate setting to 1.5x and it jumped to 8.2 GB. Sometimes it jumps to 8.4 GB, but hardly the 9.2 GB with stutters down to 5 fps I was getting before.

There are still minor hitches with the texture streaming rate setting maxed out, but at normal, or 1.0, it was pretty much a smooth experience. Shocking that it took them a month to figure this out, but hey, what are you gonna do.
It wouldn't be a problem if MS/AMD/NV/Intel had worked together to release PC DirectStorage 1.1 alongside the XSX's launch.
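For what it's worth, the DirectStorage model is essentially "queue many small reads and let the runtime complete them in bulk, with decompression on the GPU". A rough Python analogue of the batching half using a thread pool (purely conceptual; the real API is C++ and nothing here resembles its actual calls):

```python
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

def read_chunk(path: str, offset: int, length: int) -> bytes:
    """Read one slice of a file, like a single queued streaming request."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)

# Build a fake 1 MiB asset file, then issue a batch of chunk reads concurrently,
# the way a streaming system queues many small texture-tile requests at once.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(bytes(range(256)) * 4096)
    path = tmp.name

chunks = [(i * 65536, 65536) for i in range(16)]  # 16 x 64 KiB requests
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(lambda c: read_chunk(path, *c), chunks))

assert b"".join(results) == bytes(range(256)) * 4096
os.unlink(path)
print(f"completed {len(results)} batched reads")
```

The part no userland sketch can reproduce is the decompression offload: without it, every one of those chunks still lands on the CPU to be unpacked.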
 

hlm666

Member
impossible nowadays. 1: because of cross-gen, devs have to make sure a game works on everything from an Xbone to a 4090 PC. 2: memory is bigger, therefore games are bigger, which means more content and more asset variation; one asset has to have low-to-ultra textures and geometry to support the lowest-performing hardware up to the most performant hardware, plus different hardware configurations and APIs. Teams have to make sure a game works well on DirectX and Xbox machines, then again on PlayStation, then different PCs, then sometimes Nintendo... it's just a soup of problems... more power brings more possibility and more problems at the same time
And yet Dead Island 2 went through development hell, got passed around multiple developers, and still came out acceptable on all platforms, using Unreal Engine, and looks more current-gen than this on top.
 

GymWolf

Member
Dead Island is a PS4 game, it can run on a 6GB GPU... you can blame developers until Jesus comes back, but the fact is 8GB of VRAM is officially dead, and the quicker you accept this fact, the easier everything will be... and people should learn to listen to actual computer engineers and developers and stop following YouTube influencer plaster knowledge...

Mark Cerny explained why SSD decompression matters... people didn't listen. A dozen developers explained why SSD decompression is a game changer, and people called it bluffing. Alex said an RTX 2070 Super is enough for the generation by comparing cross-gen games and called the console SSD decompression hot air... and people believed it, like a herd of blind sheep. And now the truth came out, suddenly all developers are called lazy!... maybe just stop following YouTube and listen to actual engineers next time.
They both run on UE4, and DI2 doesn't really look much worse (probably the opposite), it has WAY more physics and stuff on screen than Jedi Survivor, and it runs like a dream.

You can bet your ass that I can at least partially blame the fucking developers, dude.

And DI2 has to run on all platforms, not only next-gen and PC, so it's not really "impossible nowadays" to have a good PC port...
 
Last edited:

Ivan

Member
maybe just stop following YouTube and listen to actual engineers next time.
They even laughed at Cerny's PS5 I/O presentation, like he's some fool who has no clue. They do it all the time with their cringeworthy thumbnails in the shittiest possible videos.

I remember Linus getting schooled by Tim Sweeney on that subject too.
 
Last edited: