
Star Wars Jedi Survivor PC Performance Discussion OT: May The Port Be With You

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I have a question for PC gamers. How fast is DDR5 3600 RAM in terms of GB/s? The PS5 can pull 5.5 GB/s from the SSD into its VRAM. The XSX is capped at 2.4 GB/s, but they used DDR3 RAM in conjunction with the ESRAM just last gen. Surely DDR4 and DDR5 RAM can pull data from Gen4 7 GB/s SSDs and then push it to the VRAM at a much faster rate than the PS5 I/O can.

Hogwarts was taking up to 25GB of my system RAM. Another 9GB of my VRAM. That's half the fucking game. Just how slow is this DDR5 that it can't do what the PS5 I/O is doing?
DDR5 3600 isn't a thing....DDR4 was already well past 20GB/s.
The DDR5 JEDEC minimum is 4800.

General PC I/O for games works like this: data is stored on the SSD, the CPU pulls what's needed into system RAM, then the CPU sends what the GPU needs on to VRAM.

Yes, that takes time and is inefficient; there are losses all along the way.

DirectStorage alleviates that by having the GPU do the work, so data travels from the SSD to the GPU and into VRAM, bypassing the CPU almost entirely.

With DirectStorage, PCs can easily hit 20GB/s, which is double the PS5's compressed 9GB/s.
Even a paltry Gen3 drive will still hit 8GB/s (probably why, with only 2.4GB/s raw, DirectStorage is helping the XSX keep up with the PS5 in terms of load times).

Devs just need to frikken use the free API they have been given.
It should also reduce the system RAM footprint, as data for the GPU will go directly from storage to VRAM and won't need to be staged in system RAM first.

[image: Nvidia DirectStorage slide showing the SSD-to-GPU data path]


^Ignore the NIC in that picture....Nvidia forgot to update the slides for local machines.


[image: peak storage throughput comparison]


This is just max throughput though....there are other things to take into consideration.
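To put rough numbers on the SSD → RAM → VRAM path versus the DirectStorage path described above, here's a quick back-of-the-envelope sketch in Python. Every figure in it is either a number floated in this thread (7GB/s Gen4 reads, ~9GB/s PS5 compressed, ~20GB/s DirectStorage) or invented purely for illustration (the CPU decompression and PCIe copy rates), so treat it as a sketch, not a benchmark:

```python
# Rough, illustrative comparison of the two data paths described above.
# Every figure is an assumption from this thread or invented for illustration.

ASSET_GB = 10.0          # hypothetical amount of asset data to load

# Classic path: SSD -> system RAM (CPU decompression) -> VRAM over PCIe
gen4_ssd_gb_s   = 7.0    # raw Gen4 NVMe read speed mentioned above
cpu_decomp_gb_s = 3.0    # assumed CPU decompression throughput
pcie_copy_gb_s  = 25.0   # assumed effective RAM -> VRAM copy rate

classic_s = (ASSET_GB / gen4_ssd_gb_s
             + ASSET_GB / cpu_decomp_gb_s
             + ASSET_GB / pcie_copy_gb_s)

# DirectStorage path: SSD -> VRAM with GPU decompression, CPU mostly bypassed
directstorage_gb_s = 20.0   # effective throughput quoted above
ps5_io_gb_s        = 9.0    # PS5 compressed best case quoted above

print(f"classic path        : {classic_s:.2f} s")
print(f"DirectStorage path  : {ASSET_GB / directstorage_gb_s:.2f} s")
print(f"PS5 I/O (best case) : {ASSET_GB / ps5_io_gb_s:.2f} s")
```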
 

SlimySnake

Flashless at the Golden Globes
3600 DDR4 is 57.6 GB/s. 6000 DDR5 would be 96 GB/s. No consumer SSD (even with compression) is close to that.
Well, I guess that's that. Even with this fancy Cerny I/O that can do 9 GB/s in best-case scenarios, DDR4 should be good enough for RAM-to-VRAM transfers.
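For anyone wondering where those numbers come from: peak dual-channel DDR bandwidth is just transfer rate × 8 bytes per transfer × 2 channels. A quick sketch using the two kits mentioned above:

```python
def dual_channel_bandwidth_gb_s(mt_per_s: int, channels: int = 2, bus_bytes: int = 8) -> float:
    """Peak theoretical bandwidth for 64-bit (8-byte) DDR channels, in GB/s."""
    return mt_per_s * bus_bytes * channels / 1000  # MT/s * bytes -> MB/s -> GB/s

print(dual_channel_bandwidth_gb_s(3600))  # DDR4-3600 -> 57.6
print(dual_channel_bandwidth_gb_s(6000))  # DDR5-6000 -> 96.0
```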
 

Guilty_AI

Member
Devs just need to frikken use the free API they have been given.
Too bad that ain't the issue. You saw assets taking 15 seconds to load even on magical cerny sauce.
The game has piss-poor R&D, that's all there is to this. And all of these shitty launches, for that matter.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Well, I guess that's that. Even with this fancy Cerny I/O that can do 9 GB/s in best-case scenarios, DDR4 should be good enough for RAM-to-VRAM transfers.
Data still needs to get into RAM from the SSD and something has to then move that data from RAM to VRAM.
Too bad that ain't the issue. You saw assets taking 15 seconds to load even on magical cerny sauce.
The game has piss-poor R&D, that's all there is to this. And a lot of these launches, for that matter.
Ohh yeah.
I'm not talking about this game, I was just answering SlimySnake.
This game's performance has nothing to do with I/O, it's just a piss-poor port.
A 5900X married to a 4090 struggling to get to 60 at 1440p.......and not even with RT?......clearly something is broken.
 
I've been gaming on PC since 2003. I get it. My PCs would always last until the next-gen consoles came and VRAM would become the bottleneck, forcing me to upgrade. My memory is hazy for my 2003 PC, but I distinctly remember the day my GTX 570 defaulted all settings to very low and refused to load textures in CoD Advanced Warfare. A GPU just as powerful as the PS4.

I completely understand the VRAM side of things. I argued alongside you in other threads just last year. What's going on recently is more than just VRAM related. The TLOU medium textures are not even PS3 quality and they take up 10GB? The PS3 had 256MB of VRAM.

PCs also have the added benefit of having system RAM. This has been the standard for the last three gens. I understand if the PS5 and XSX have this inherent advantage, but I have a system that has almost 2x faster VRAM bandwidth, 50% faster CPU clocks, DDR5 RAM, and an SSD that's 30% faster than the PS5's. That should be enough to compensate for whatever secret sauce Cerny and Jason Ronald put into the XSX and PS5. The fact that these issues are resolved with some patches shows that it's not hardware related, it's just time related. They are willing to ship out these games knowing full well they stink.

Any game with shader stutters is proof of that. You run Gotham Knights, Sackboy, or Elden Ring for the first time and it will stutter like crazy. Unplayable. There is no way they don't catch that. TLOU took 40 minutes to 2 hours to build shaders. ND said that they are investigating why that could be. BULLSHIT. They knew. This happens with every single CPU. AMD, Intel, 8 core, 16 core. 4.0 GHz or 5.0 GHz. Doesn't matter. And they pretended as if it was an isolated issue. GTFO.
Look it's really simple: they ship broken games because gamers buy them anyway.
Minimum viable product.
 

april6e

Member
Jesus, this almost kills the game for me. There is no way I'd want to play such a demanding game on console. And why in the hell did they sign an exclusive contract with AMD knowing that the majority of serious PC gamers use Nvidia? So we can't even get DLSS which would have been a giant help on the performance?

Guess I'll try it out when it hits the bargain bin on Black Friday
 
Last edited:

Buggy Loop

Member
Another AMD sponsored shitfest to add to the list

Jedi Survivor
TLOU
RE4
Callisto Protocol
Forspoken
Halo Infinite
RE Village

All had shit performance, or are even still plagued with memory leaks.

Is it that Nvidia spots the devs who have no idea what they're doing and doesn't sponsor them, or that all shit devs gravitate towards an AMD sponsorship?
 
Last edited:

THE DUCK

voted poster of the decade by bots
Not really a "port" is it? More like multiple simultaneous versions, some better than others.
 

SlimySnake

Flashless at the Golden Globes
Jesus, this almost kills the game for me. There is no way I'd want to play such a demanding game on console. And why in the hell did they sign an exclusive contract with AMD knowing that the majority of serious PC gamers use Nvidia? So we can't even get DLSS which would have been a giant help on the performance?

Guess I'll try it out when it hits the bargain bin on Black Friday
AMD must have paid more.

I think the lack of DLSS will hurt this game more than TLOU, Gotham Knights and Hogwarts. At least I was able to use DLSS to get the VRAM usage down while reducing the GPU overhead in those games. Can't do that in this game because FSR looks like shit, apparently.

This is just a case of one poor choice after another.

Someone modded DLSS into RE4, so I'm hoping we get a similar mod here. Though with the game performing so poorly at 1440p, I don't think the DLSS Quality setting, which uses 1440p as its internal resolution at 4K, is going to help much. Maybe DLSS Performance might get 4090 users a locked 60 fps. 3080 users might have to settle for 1440p DLSS Balanced.
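For reference, DLSS modes use fixed per-axis render scales (Quality ≈ 0.667×, Balanced ≈ 0.58×, Performance = 0.5×), which is where that 1440p internal resolution comes from. A quick sketch of what that works out to at a 4K output:

```python
# Standard DLSS per-axis render scales (Ultra Performance omitted)
DLSS_SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_SCALES:
    print(mode, internal_resolution(3840, 2160, mode))
# Quality     -> (2560, 1440)  the 1440p internal res mentioned above
# Balanced    -> (2227, 1253)
# Performance -> (1920, 1080)
```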
 

yamaci17

Member
AMD must have paid more.

I think the lack of DLSS will hurt this game more than TLOU, Gotham Knights and Hogwarts. At least I was able to use DLSS to get the VRAM usage down while reducing the GPU overhead in those games. Can't do that in this game because FSR looks like shit, apparently.

This is just a case of one poor choice after another.

Someone modded DLSS into RE4, so I'm hoping we get a similar mod here. Though with the game performing so poorly at 1440p, I don't think the DLSS Quality setting, which uses 1440p as its internal resolution at 4K, is going to help much. Maybe DLSS Performance might get 4090 users a locked 60 fps. 3080 users might have to settle for 1440p DLSS Balanced.
the game is primarily cpu bound, there is not much of a problem on the GPU side of things

it drops to 30-40s with a 5900x. no amount of dlss/fsr can help in this case

only frame gen can help... and it is not present xd
 
Last edited:

yamaci17

Member
No, AMD sponsored games are unequivocally bad all around, you just won't choke on 8GB, hurrah?



they just might. some amd folks even started to spin the logic of

"heyyy 1080p high settings is doable on 8 gb.... and u kind of get unplayable framerates with ultra anyways1!1"
 

zcaa0g

Banned
Devs need to learn to code or utilize a different engine. Frostbite based games look far better than this and run liquid smooth.
 
I am sure these PC issues will be sorted out in the coming few years. But it seems that PCs with ancient I/O, and developers unwilling to develop multiple PC optimization settings, are what's holding back PC ports.
 

Buggy Loop

Member
I am sure these PC issues will be sorted out in the coming few years. But it seems that PCs with ancient I/O, and developers unwilling to develop multiple PC optimization settings, are what's holding back PC ports.

1060 6GB 38 fps at 2560x1440, epic



4090 ~40 fps at 2560x1440, epic



It's the IO !!!




Meanwhile, the game's performance doesn't budge between 720p low settings and 1440p high settings, and GPU/CPU utilization can't even reach 40%. RT or no RT!






Fucking PC needs an IO 🤡
 
Last edited:
Sorry in advance for a lengthy response.

The trend that you all see, in terms of more and more games having stuttering issues and frame drops on the PC that you do not see on the consoles, comes down primarily to the following.

If you do a CGI movie, a single character model can have up to 100 textures. High-resolution textures are roughly 50-100 MB in size uncompressed. In other words, a single character model has 5-10 GB of texture data under optimal conditions. In games it is obviously less. However, in the past there has been a hard cap on how many textures a game can utilize and what resolution they can have. Since consoles had mechanical drives (as did most PCs), all textures were mainly loaded when a 'level' loaded. Nothing was loaded on the fly. On a PC this resulted in a texture limit of around 4GB (based on the median graphics card in use).
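Just to make the arithmetic behind that 5-10 GB figure explicit (the per-texture sizes are the ones quoted above, nothing measured):

```python
textures_per_character = 100        # upper bound quoted above
texture_size_mb = (50, 100)         # uncompressed size range quoted above

low_gb, high_gb = (textures_per_character * size / 1000 for size in texture_size_mb)
print(f"per-character texture data: {low_gb:.0f}-{high_gb:.0f} GB")  # -> 5-10 GB
```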

With the new generation of consoles and their SSDs, textures can be loaded on the fly, so most games have started to do that; as a consequence, the number of textures in use has increased dramatically, as has their resolution.

On a console this is very easy, since RAM and VRAM are the same and the GPU is allowed to call for data straight from the SSD without checks and balances. On a PC the GPU cannot access RAM or the SSD directly at all, since that would be a security risk. The GPU needs to go through the kernel (with the associated driver overhead) and ask for a file to be fetched from RAM or the SSD and loaded into VRAM before it can be accessed. This introduces a lot of latency and fairly dramatic frame drops if a texture is not where it needs to be to create a frame. DirectStorage tries to make this faster and smoother, but it is still a far cry from what both the XSX and the PS5 can achieve, due to the limitations of the PC architecture.
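A toy model of why that per-request round trip hurts when a game streams lots of small textures on the fly: the tile sizes and overhead figure below are invented purely for illustration, only the shape of the result matters.

```python
# Toy model: per-request overhead vs. raw transfer time when streaming small tiles.
# All figures are illustrative assumptions, not measurements of any real API.

requests = 512                 # small texture tiles needed for an area
tile_mb = 2.0                  # size of each tile
nvme_gb_s = 7.0                # raw Gen4 drive throughput
per_request_overhead_ms = 0.3  # assumed kernel/driver round-trip cost per request

transfer_ms = requests * tile_mb / 1000 / nvme_gb_s * 1000
overhead_ms = requests * per_request_overhead_ms

print(f"raw transfer time : {transfer_ms:.0f} ms")  # ~146 ms
print(f"request overhead  : {overhead_ms:.0f} ms")  # ~154 ms, comparable to the copy itself
# Batching and bypassing that per-request path is exactly what DirectStorage targets.
```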

This tweet still holds despite a lot of PC lovers fighting its implications (and I love my PC - have a custom loop 4090 etc):



Lol this is NOT true.

DirectStorage 1.2 literally blows the PS5's I/O Complex out of the water.
The whole point of DirectStorage isn't just to address speed but also latency.
It literally bypasses the kernel I/O path that John mentioned, and AMD & Nvidia have redone their drivers to leverage DS.
All the bottlenecks John mentioned have literally been eliminated.

Look at the difference between 1.1 and 1.2.
The PS5 / XS will never touch 28 GB/s, with ~7GB loaded in 0.2 seconds!

 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Devs need to learn to code or utilize a different engine. Frostbite based games look far better than this and run liquid smooth.
It's the same devs who made the first game, using the same engine.
Which runs great on a frikken 6th-gen Intel and a GTX 1070.


Something is amiss with the QA stage of PC ports right now.
It's like that final quality-check pass that used to happen has been abandoned by some devs, and they just push the game to Steam knowing they will patch it in the coming weeks.

The way some of these PC ports are launching, they would have been kicked off PSN/XBL....but cuz there's no certification on PC, they can release a broken game and have 7 patches in 7 days if need be.
 

01011001

Banned
1060 6GB 38 fps at 2560x1440, epic



4090 ~40 fps at 2560x1440, epic



It's the IO !!!



Meanwhile, the game's performance doesn't budge between 720p low settings and 1440p high settings, and GPU/CPU utilization can't even reach 40%



Fucking PC needs an IO 🤡



JEEESUS... how inept does a dev team need to be for your game to run at 46 fps on what is the most overkill PC ever?
 
Last edited:

Buggy Loop

Member
It's the same devs who made the first game, using the same engine.
Which runs great on a frikken 6th-gen Intel and a GTX 1070.


Something is amiss with the QA stage of PC ports right now.
It's like that final quality-check pass that used to happen has been abandoned by some devs, and they just push the game to Steam knowing they will patch it in the coming weeks.

The way some of these PC ports are launching, they would have been kicked off PSN/XBL....but cuz there's no certification on PC, they can release a broken game and have 7 patches in 7 days if need be.



We need a new IO for next gen to give us synchronized audio too 🤷‍♂️ PC so weak
 

01011001

Banned
It's like that final quality-check pass that used to happen has been abandoned by some devs, and they just push the game to Steam knowing they will patch it in the coming weeks.

I'm telling you, one of 2 things need to happen for this to ever have a chance to change.

option 1: every major launcher and console manufacturer needs to charge at least 5x to 10x as much money as consoles do now for patch verification.

or option 2: patches are not allowed from the day the gold master is delivered, to 5 to 6 weeks after launch.
meaning no day 1 patch, no day 0 patch, no patch 3 days after launch.
developers send in the Gold Master that gets pressed onto discs, and that's the state the game will be in for more than a month, hell I'd even extend it to 2 full months tbh.


option 1 would mean patches are expensive so publishers have an incentive to patch as little as possible.

while option 2 would mean a disastrous launch version would be stuck in the state it is in for 2 months, with the consumers knowing the game will not be patched "in the coming days", as many devs always say to get people to buy the game anyway in the hopes of a quick patch.
this would mean if a publisher pushes out a broken game, it would be a way bigger PR disaster than it currently is.
and also, reviewers can't do that whole "maybe it will be fixed with the day 1 patch" shit.

I'm for option 2 here,
not only because that would mean unfinished/unplayable disc releases would be a thing of the past, but also because should a broken game release, the developers would have a full 2 months to fix it, without the constant small and insignificant patches distracting them.
 
Last edited:
This is definitely disappointing. I don't remember the first game having big technical issues on launch... and it's not only a Denuvo issue (although the DRM probably contributes to the poor performance of the PC version).

From a perspective of someone who's not heavy into tech stuff I don't understand the enormous bump in the system requirements vs. the first game. It doesn't look that much better than Fallen Order.
I do, I was there day one and it had shader stutter, crashes, textures not loading, and a whole host of other issues. Google "Jedi Fallen Order PC performance issues" for me.

I expected this and a lot of what's being described reminds me of launch day with the first one.

I did the $15 sub for a month and will beat it then delete it on PC. All it's worth IMO.
 

analog_future

Resident Crybaby
I was talking about this in my previous posts. Yes, console owners don't get full ray tracing and the like. But they get an optimized game that works without technical issues.

I recently downgraded, trading my 4090 FE to a friend for a 4080 Strix plus $850 Canadian from his side, just because I am starting to feel like PC is becoming a waste of money.

But even then, I feel like downgrading again and just keeping the PC for old games till devs get their shit together. I am not giving them my money; they can fuck off.

The FE 4090 retails for $2,100 (MSRP); the FE 4080 retails for $1,700.

This means I got my 4080 Strix for $1,250 Canadian (and I still feel ripped off with these ports). Such a shame.

I've been on the fence with building a high-end gaming PC for several months now, and every time I feel like I might be ready to pull the trigger, a new game drops and reminds me why I haven't yet. Whether it be this, TLOU Part I, RE4, Dead Space, Hogwarts, etc...

Console versions have their fair share of issues here and there as well, but at least I spent $500 on my Xbox vs the $2,000 I've been contemplating investing into a gaming PC.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'm telling you, one of 2 things need to happen for this to ever have a chance to change.

option 1: every major launcher and console manufacturer needs to charge at least 5x to 10x as much money as consoles do now for patch verification.

or option 2: patches are not allowed from the day the gold master is delivered, to 5 to 6 weeks after launch.
meaning no day 1 patch, no day 0 patch, no patch 3 days after launch.
developers send in the Gold Master that gets pressed onto discs, and that's the state the game will be in for more than a month, hell I'd even extend it to 2 full months tbh.


option 1 would mean patches are expensive so publishers have an incentive to patch as little as possible.

while option 2 would mean a disastrous launch version would be stuck in the state it is in for 2 months, with the consumers knowing the game will not be patched "in the coming days", as many devs always say to get people to buy the game anyway in the hopes of a quick patch.
this would mean if a publisher pushes out a broken game, it would be a way bigger PR disaster than it currently is.
and also, reviewers can't do that whole "maybe it will be fixed with the day 1 patch" shit.

I'm for option 2 here,
not only because that would mean unfinished/unplayable disc releases would be a thing of the past, but also because should a broken game release, the developers would have a full 2 months to fix it, without the constant small and insignificant patches distracting them.
Mate, I'm basically at the point where day-one PC purchases are a thing of the past.
I'm one of the suckers who got Callisto Protocol at midnight.........I didn't leave the elevator till nearly 2 weeks later, and I stayed up for this shit!
Now I can wait.....my release date isn't the date devs tell me....it's the date the game is playable on my current system (12400 + 3080'10G).
Thank god I've still got a decent backlog I can get through, some of which became backlog because it wasn't playable at launch.

Smaller studios seem to be doing better PC ports than larger studios.
 
Is this even surprising anymore? I know porting probably isn’t exactly copy and paste, but this is getting beyond ridiculous now.
 

SmokedMeat

Gamer™
On the flip side TLoU got patched today. Running beautifully now, I’ve just got it locked to 110fps with FSR Quality. VRAM usage slashed in half (just over 9GB), I’m running at high settings (don’t see the point in Ultra), and GPU usage at 84% with CPU no longer acting crazy.

Just goes to show I/O has fuck all to do with any of this. It’s developers not putting in the work.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
On the flip side TLoU got patched today. Running beautifully now, I’ve just got it locked to 110fps with FSR Quality. VRAM usage slashed in half (just over 9GB), I’m running at high settings (don’t see the point in Ultra), and GPU usage at 84% with CPU no longer acting crazy.

Just goes to show I/O has fuck all to do with any of this. It’s developers not putting in the work.
I was about to post about this.

The update has all but fixed the game.....it should have launched like this.

Medium environment textures aren't straight ass anymore (I'm guessing that's why it's such a large update, they literally forgot to put medium textures in the launch game).
CPU utilization isn't crazy.
They fixed the OS-reserved VRAM usage.
VRAM usage in general is actually reasonable now.
1440p High ~8GB of VRAM usage.
1440p Ultra is possible on a 3080'10G
1080p Ultra is a squeeze but probably doable on a 3070 now.

The magic of I/O..........
 

SeraphJan

Member
I was about to post about this.

The update has all but fixed the game.....it should have launched like this.

Medium environment textures aren't straight ass anymore (I'm guessing that's why it's such a large update, they literally forgot to put medium textures in the launch game).
CPU utilization isn't crazy.
They fixed the OS-reserved VRAM usage.
VRAM usage in general is actually reasonable now.
1440p High ~8GB of VRAM usage.
1440p Ultra is possible on a 3080'10G
1080p Ultra is a squeeze but probably doable on a 3070 now.

The magic of I/O..........
Do you have any information on how much VRAM is required for 4K Ultra?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Do you have any information on how much VRAM is required for 4K Ultra?
I've only got a 3080'10G with me right now.....it's certainly more than that 10GB when you combine OS and game, cuz I bounced out when I tried DSR.
I'll do more testing later (I'm lying, I won't, but someone will).
If I had to hazard a guess, native 4K Ultra is probably in the region of 10GB of process usage.
The game also has a new texture streaming option that seems to lower VRAM usage.
I'm sure one of these tech channels will do a deep dive with multiple GPUs to gauge just how playable the game is now.
 

SmokSmog

Member
lol are you serious? There are several instances in this video where the 4090 drops below 40 fps. Hell 90% of this video is the 4090 running the game in the 40s at 1440p!!

Here is a timestamp that shows drops to 30 fps, going all the way down to 27 fps. This is a $1,600 4090. The 3080 Ti is roughly 40% slower, so it makes sense that it would see the regular dips to below 20 fps that Fextralife mentioned.



1:23 shows a drop to low 30s.
3:00 shows a drop to low 30s.
6:30 shows a drop to low 30s.

Again, this is 1440p. THE most powerful card in the world. Unable to run the game at anywhere close to 60 fps. Even TLOU wasn't that bad. Just admit you got it wrong and move on.

Lol, I see people here on this forum are completely clueless... It's not the 4090 that's dropping to 30FPS, it's the Ryzen 5900X. The 4090 is literally sleeping there, waiting for the CPU to finish its work on the game.
 