
Digital Foundry: Star Wars Jedi Survivor - DF Tech Review - PS5 vs Xbox Series X/S - Ambitious But Compromised

JimboJones

Member
At 8:21 you can easily see the awful ghosting on Cal's legs (and that's PS5 Quality Mode, artifacts get way worse in Performance Mode). It's funny when DF freezes frames on PC to showcase small artifacts you won't ever notice, yet calls these awful ghosting artifacts "good on high-resolution displays". If these artifacts are good for them, they should be calling DLSS 3 Frame Generation flawless (and not examining it frame-by-frame). This is the first game in which you can clearly see how much DF favors consoles, even though the console version of Jedi Survivor looks and runs worse than the PC version on a high-end system.
DF favors consoles over PC? That's a new one.
 

Mr.Phoenix

Member
Why is that? Software Lumen is basically software GI. Hardware Lumen is RTGI. RT is always going to be more taxing.
the weird part is that it doesn't use the RT hardware...
also it's not just any software GI; software lumen still has relatively decent RT reflections, for example.

I wonder why they aren't just adding hardware acceleration with a super low preset that is equivalent in quality to software lumen.
01011001 said it best here.

I would think that even on the most basic of levels, Lumen RTGI would use some of the dedicated RT hardware to actually make it faster.
 

SlimySnake

Flashless at the Golden Globes
01011001 said it best here.

I would think that even on the most basic of levels, Lumen RTGI would use some of the dedicated RT hardware to actually make it faster.
Hardware Accelerated Lumen does exactly that: it uses the new hardware RT cores in the RDNA2 CUs to accelerate RTGI, shadows, and reflections. But RT effects have their own cost, and that's why they are always going to be more costly on the GPU.

The VRAM hit for turning on RT is roughly 1-2GB, and the CPU also sees a major hit to performance. Hardware RT, Lumen or otherwise, is always going to be more costly.
 

Skifi28

Member
Watching the video, I'm baffled by some of the decisions. Using RT reflections as a fallback to SSR? Really? Forcing RTGI on consoles when the visual impact appears to be minimal? At least implement a toggle, they're butchering performance and resolution for barely any gains.
 

Spyxos

Gold Member
He's right about Fallen Order.
I remember it looking like an amazingly detailed world.
It looks pretty flat and outdated compared to Survivor, though.
When I played Jedi Survivor, I thought it looked pretty much the same as its predecessor. When I saw the comparisons, it was more than clear that the differences are very big.

Too bad I can't see the small details on my 1080p screen at Ultra settings. Everything is blurry and I can only guess how good the game actually looks. The second planet especially just looks like mud on my screen.
 
Last edited:

Mr.Phoenix

Member
Watching the video, I'm baffled by some of the decisions. Using RT reflections as a fallback to SSR? Really? Forcing RTGI on consoles when the visual impact appears to be minimal? At least implement a toggle, they're butchering performance and resolution for barely any gains.
I actually think using RT reflections as a fallback to very well-done SSR is pretty clever, since you would only need RT reflections for objects that are off-screen or occluded.
 

SlimySnake

Flashless at the Golden Globes
I actually think using RT reflections as a fallback to very well-done SSR is pretty clever, since you would only need RT reflections for objects that are off-screen or occluded.
The problem is that just enabling RT has a huge VRAM and CPU cost, since the game still needs to build the BVH structure even if the GPU isn't rendering the actual RT reflection. So while their solution smartly saves precious GPU cycles, the CPU and VRAM hit is still there. And since the game is CPU-bound, it just makes things worse, even when you are out in the open world with nothing reflecting.

NX Gamer explained this in the RE4 review, where the reflections were limited to puddles, but even in scenes without puddles or reflections the framerate was heavily impacted, because the CPU and VRAM had to keep track of everything anyway.
 

Skifi28

Member
I actually think using RT reflections as a fallback to very well-done SSR is pretty clever, since you would only need RT reflections for objects that are off-screen or occluded.
I'm certainly no expert, but to my understanding of how RT reflections work, you still shoulder most of the cost whether reflections are being displayed on-screen or not. If you have the hardware to spare by all means go crazy, but not on a console when you can't hit your framerate targets.
 

01011001

Banned
Hardware Accelerated Lumen does exactly that: it uses the new hardware RT cores in the RDNA2 CUs to accelerate RTGI, shadows, and reflections. But RT effects have their own cost, and that's why they are always going to be more costly on the GPU.

The VRAM hit for turning on RT is roughly 1-2GB, and the CPU also sees a major hit to performance. Hardware RT, Lumen or otherwise, is always going to be more costly.

the thing that makes software lumen perform better is the low quality tho.
it still sends out rays to calculate it, it's just doing that without using the dedicated RT hardware.

so the question is, why not use the low quality + the hardware acceleration?
in theory this should make it run even better
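
for reference, stock UE5 does expose that exact combination through cvars. purely a sketch of what I mean, assuming the standard Lumen cvars apply (a console build of Fortnite obviously doesn't let you touch these):

[SystemSettings]
; use Lumen as the dynamic GI method
r.DynamicGlobalIlluminationMethod=1
; trace on the RT hardware instead of the software SDF tracing
r.Lumen.HardwareRayTracing=1
; cheapest hardware lighting mode: shade ray hits from the surface cache
r.Lumen.HardwareRayTracing.LightingMode=0
; then dial the Lumen quality / scalability settings down to roughly software-lumen levels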
 

01011001

Banned
I actually think using RT reflections as a fallback to very well-done SSR is pretty clever.

nah, personally, I fucking hate that.

to me, RT reflections lose one of their biggest advantages over SSR when you use SSR on top of RT reflections.
and that advantage is temporal stability!

Screen Space effects like SSR suck because they break and show clear and distracting artifacts as soon as anything is occluded from the camera view.
and that visual inconsistency and temporal instability is what makes them bad.

RT reflections on the other hand, when done well, have really good temporal stability. they look the same no matter what angle you look at them from, no matter what is between the camera and the reflection.

I absolutely hate it in Cyberpunk, for example, that you cannot turn off the SSR when RT reflections are enabled... because it's so fucking distracting looking at a mirror-like building, only to see part of the reflection detail disappear due to some NPC walking in front of it, or due to your camera moving.

it, to me, defeats the whole purpose of having RT reflections in the first place.
 
Last edited:
This happens when you sign a marketing contract and the final product sucks… because this video looks like a bad Star Wars/PS5 ad…
 

Fbh

Member
Fair point, but what is sad is that this game will probably go on to sell 3-4 times more than Titanfall did... and therein lies the problem, and why devs never shy away from just going for a 30fps cap and focusing on glitz and glamour.

I hate to say it, but framerate has never stopped games like these from selling. The only games that really get affected by fps are fighting games, racing games and twitch shooters.

True I guess, but will it really sell more than Titanfall 2 because of the graphics? Or because it's a Star Wars game?
 

Mr.Phoenix

Member
True I guess, but will it really sell more than Titanfall 2 because of the graphics? Or because it's a Star Wars game?
A little of both?

One thing for certain though is that devs don't lose sleep over 30fps, because it doesn't affect sales.
 

Luigi Mario

Member
Just yesterday I heard that Jedi Survivor is the last game to be released under EA's exclusive Star Wars license. If this is true, it seems like a fitting end to an era of wasted potential.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Honestly, even beyond the terrible optimization, making graphics the primary focus of a game like this seems dumb.
The gameplay is primarily centered around timing-based melee combat and timing-based platforming, so you'd imagine the main focus would be to get it running smoothly and then work on making it look as good as possible within that performance target. I mean, fuck, it's what Respawn themselves did with Titanfall 2: they knew they had a fast-paced FPS, so they focused on 60fps, and the end result was a game that wasn't going to win any graphics awards but still looked decent and, more importantly, felt great to play because it was basically locked 60fps on all consoles.

Here they seem to have done a complete 180 on their design philosophy, and now we've got nicer graphics and ray tracing at the cost of having to choose between playing at like 1080p and 30fps with drops, or going all the way down below 720p and still not getting a locked 60fps.
It's a cinematic action adventure game. These are typically 30 fps. Everyone loves RDR2, Horizon, Ghost of Tsushima, Uncharted and, until recently, TLOU despite them being 30 fps. Uncharted actually had an insane 150 ms of input lag. I think locked 30 fps melee games with good frame pacing are perfectly playable. Hell, Bloodborne and Sekiro had awful frame pacing, yet they were hailed as masterpieces.

I think they shouldn't even have offered a 60 fps mode on consoles. If you have to go down to 600p, then it's clearly beyond the capabilities of the console. Cinematic action adventure games can't look like Source Engine games rooted in 2004-era graphics. I remember HL2 Episode 2 looking last gen as fuck next to some 2007-era PS360 games. It was dated back then. Titanfall looked dated as well. It worked because it was a competitive shooter, but the Source engine isn't going to be producing good-looking cinematic action adventure games.

The problem here isn't that they made graphics the primary focus, it's that they chose to make ray tracing the primary focus. As the video shows, the difference is negligible. Series S, with just 4 tflops and awful memory bottlenecks, is able to run the game at a similar 890p resolution to what the PS5 can drop to. So you have 6 extra PS5 tflops just wasting away rendering RTGI and RT reflections? Is that really worth it?

If anything, if they remove RT altogether, they can probably hit 1440p-1620p native, which means they should have enough headroom for a fairly decent 1080p performance mode that would have a higher pixel count than the PS5 Quality mode in its worst case scenario.
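
Rough pixel math, assuming 16:9 framebuffers: 890p works out to about 1582x890, or roughly 1.41 million pixels, while native 1080p is 1920x1080, about 2.07 million, so a true 1080p performance mode would push close to 50% more pixels than the Quality mode's worst case.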
 
Last edited:

BennyBlanco

aka IMurRIVAL69
17 FPS / 720p is wild


 

01011001

Banned
It's a cinematic action adventure game. These are typically 30 fps.

it's not, it's an actual action adventure ;) and just because they typically were last gen doesn't mean that's a good thing.

this game plays like SHIT at 30fps.
the input lag is fucking unbearable. at least 160ms if not 200ms.
 
Honestly, I think this is a glitch/bug or something. No reason for that difference.
Many recent UE4 games somehow show lower quality RT on XSX. This is a pattern, and it means there is probably a good reason. It's most likely caused by the Xbox RT API being less efficient than the one on PS5.
 
Last edited:

JackMcGunns

Member
I don't mind analyzing the game itself (that's equally important), but at the same time there's plenty of differences between the two consoles that they should have paid attention to and talked about.

Here are just 5 noticeable differences in only one of the frames.

They also didn't show frame-rate performance in similar scenes on both consoles (e.g., the water area). It's important because other analyses have shown us (side-by-side) that performance differences between the two consoles can be up to 18%.


It's a frame from a moving video, and it's almost impossible to get a perfectly equal perspective. Just moving the camera slightly off can have a dramatic effect on the image capture.

Take a look at this sign (this is Resolution mode): why does it look so much sharper on Series X? Taken from the same images you used...


We're on the verge of splitting hairs!!

 

winjer

Gold Member
the weird part is that it doesn't use the RT hardware...
also it's not just any software GI; software lumen still has relatively decent RT reflections, for example.

I wonder why they aren't just adding hardware acceleration with a super low preset that is equivalent in quality to software lumen.

The game is using hardware RT. I can confirm r.RayTracing=1
BTW, r.RayTracing.AsyncBuild is off. Setting this to 1 gives an 8-9% performance boost in some scenes.

As for software GI, it's disabled, both with RT off and on.
r.SSGI.Enable=0
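
If anyone wants to test the async BVH build themselves, the usual UE4 route is a [SystemSettings] override. A sketch only, assuming the game picks up standard Engine.ini overrides; if it doesn't, the same cvar can be set by editing a pak file or through UUU, as noted further down:

[SystemSettings]
; build the ray tracing acceleration structures on async compute instead of the graphics queue
r.RayTracing.AsyncBuild=1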
 

MikeM

Member
It's a cinematic action adventure game. These are typically 30 fps. Everyone loves RDR2, Horizon, Ghost of Tsushima, Uncharted and, until recently, TLOU despite them being 30 fps. Uncharted actually had an insane 150 ms of input lag. I think locked 30 fps melee games with good frame pacing are perfectly playable. Hell, Bloodborne and Sekiro had awful frame pacing, yet they were hailed as masterpieces.

I think they shouldn't even have offered a 60 fps mode on consoles. If you have to go down to 600p, then it's clearly beyond the capabilities of the console. Cinematic action adventure games can't look like Source Engine games rooted in 2004-era graphics. I remember HL2 Episode 2 looking last gen as fuck next to some 2007-era PS360 games. It was dated back then. Titanfall looked dated as well. It worked because it was a competitive shooter, but the Source engine isn't going to be producing good-looking cinematic action adventure games.

The problem here isn't that they made graphics the primary focus, it's that they chose to make ray tracing the primary focus. As the video shows, the difference is negligible. Series S, with just 4 tflops and awful memory bottlenecks, is able to run the game at a similar 890p resolution to what the PS5 can drop to. So you have 6 extra PS5 tflops just wasting away rendering RTGI and RT reflections? Is that really worth it?

If anything, if they remove RT altogether, they can probably hit 1440p-1620p native, which means they should have enough headroom for a fairly decent 1080p performance mode that would have a higher pixel count than the PS5 Quality mode in its worst case scenario.
Depends on who is playing.
RDR2? Remains in backlog. Won’t play until 60fps.
The games you mentioned all have 60fps minimum mode options now (yay!).
 

SlimySnake

Flashless at the Golden Globes
The game is using hardware RT. I can confirm r.RayTracing=1
BTW, r.RayTracing.AsyncBuild is off. Setting this to 1 gives an 8-9% performance boost in some scenes.

As for software GI, it's disabled, both with RT off and on.
r.SSGI.Enable=0
Where can I enable this? And what exactly does it do? Why wouldn't they enable it by default?
 

winjer

Gold Member
Where can I enable this? And what exactly does it do? Why wouldn't they enable it by default?

SSGI hits performance. In this game it seems to make little difference, as the pre-baked GI is very good.
But if you want to, you can edit a pak file and add this. Or use UUU.

[SystemSettings]
r.SSGI.Enable=1
r.SSGI.HalfRes=1
r.SSGI.Quality=3
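
(For context, and going from the stock UE4 cvars rather than anything game-specific: r.SSGI.HalfRes=1 runs the screen-space GI trace at half resolution to keep the cost down, and r.SSGI.Quality ranges from 1 to 4, with higher values tracing more samples per pixel.)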
 

mnkl13

Member
looking at this game reminds me of Rift Apart, but i think this one looks better, and Rift Apart was very impressive when it came out. so i finally got this feeling that we're moving forward, which i didn't have before for some reason.
 

01011001

Banned
The game is using hardware RT. I can confirm r.RayTracing=1
BTW, r.RayTracing.AsyncBuild is off. Setting this to 1 gives an 8-9% performance boost in some scenes.

As for software GI, it's disabled, both with RT off and on.
r.SSGI.Enable=0

I'm talking about Fortnite on console
 

SlimySnake

Flashless at the Golden Globes
The console patch is supposed to drop today. The PC patch supposedly tremendously improved performance so let's hope the same happens on consoles.
I don't know about tremendously. It's 13% and only in non-RT mode. And the fucking stutters are still there.

Though Jedha is around 20-25%.

Also, this patch added a new issue where the game crawls to 20 fps as the widescreen cutscenes slowly remove the black bars and transition to full-screen gameplay.

 

john2gr

Member
I don't know about tremendously. It's 13% and only in non-RT mode. And the fucking stutters are still there.

Though Jedha is around 20-25%.

Also, this patch added a new issue where the game crawls to 20 fps as the widescreen cutscenes slowly remove the black bars and transition to full-screen gameplay.



Don't expect Respawn to fix the traversal stutters. Traversal stutters are still in Fallen Order, and they are also present in the console versions of Jedi Survivor (they happen at the exact same spots). Even DF's frametime graphs show the stutters/spikes on consoles. It's how Respawn coded the game, so don't expect any major improvements. Also a hint: these traversal stutters are more noticeable the higher your framerate is. If you lock it at 30fps, you won't notice them (as the CPU has more time to handle them before the next frame is due).
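
The frame-time arithmetic makes his point: at 30fps each frame has a ~33.3ms budget versus ~16.7ms at 60fps, so a streaming hitch that adds, say, 10ms of CPU work (an illustrative figure, not a measurement) can still fit inside the 30fps budget while blowing well past the 60fps one.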

 
Last edited:

01011001

Banned
As per DF that was using software lumen.

Hardware Lumen is probably gonna be even more taxing on the fixed console specs, right?

it's using software lumen because it's lower quality.
the question is, why not use hardware acceleration + the lower quality of software lumen?

intuitively you'd think that would make it run even better
 

Gaiff

Member
I don't know about tremendously. It's 13% and only in non-RT mode. And the fucking stutters are still there.

Though Jedha is around 20-25%.

Also, this patch added a new issue where the game crawls to 20 fps as the widescreen cutscenes slowly remove the black bars and transition to full-screen gameplay.


Yeah, that's why I said supposedly. 13% is pretty substantial, but I've heard from people who experienced minimal stutters. Steve from HU, for instance, said it was mostly fine for him, and I have a friend on a 3080 whose performance went up by 50% and who only experiences traversal stutters here and there. Seems shader comp worked in his case, but this game is so wonky that I'm not really sure. Some have also reported no improvements.
 
Last edited:

winjer

Gold Member
A user on the Guru3D forums managed to improve his stutter issues by configuring his page file.
So I'll leave his post here, as it might help other people.


Well, I learned something valuable today and I thought I would share it, as "#stutterstruggle" seems to be a prevailing theme in PC gaming these days.

One of my biggest complaints of late was my traversal stuttering in Hogwarts Legacy. It wasn't super bad per se, but it was extremely annoying, to the point where it made me regret buying the game, as I am not really a fan of the Harry Potter Universe to begin with. The traversal stuttering was extremely repeatable regardless of how many times my character crossed through particular areas.

I also noticed that sometimes certain textures would be slow to load. This one was extremely perplexing, because how could that be possible on an RTX 4090 with gobs of VRAM? And my rig has a 13900KF and 32GB of fast DDR5 7600 as well. After several patches from the developer, the issue improved but was never fixed, so I put that down to the game's poor resource management due to bad programming and vowed to never buy another game from them again.

Star Wars Jedi Survivor had similar behavior in fact, but I put that one down to being "undercooked" out of the kitchen and figured that after a few patches all would be well.

If only things were so simple, but it turns out a large part of these issues was caused by a Windows configuration error, believe it or not. I have 4 NVMe SSD drives in my PC:

1) SK Hynix Platinum P41 2TB (this is my C:\)
2) SK Hynix Platinum P41 2TB (I use this drive for newer games only)
3) Samsung 980 Pro 2TB (Another game only drive)
4) Samsung 960 Pro 1TB (I use this drive for older games only)

So when I built my PC, one of the questions I had in my mind was whether I should put a Pagefile on each of those drives. I have known for a long time now how important a Pagefile can be to Windows, but I decided to use the option "Automatically manage paging file size for all drives."

That turned out to be a frigging mistake!
:mad:


Turns out that, for the best performance, ALL drives should have their own individual pagefile. And this makes sense, because why would a game being played on, say, my games-only Platinum P41 be required to access or store data on the pagefile on my C:\ drive? The latency penalty would be huge!

But more than that, since enabling Pagefiles on all of my SSDs, I have noticed a MASSIVE reduction in traversal stuttering. In Hogwarts Legacy, I have noticed at least a 90% improvement in that area, which is astonishing!
:eek:
Every single game I tried, whether Hogwarts Legacy, Dead Space Remake, Jedi Survivor etcetera had a huge improvement, as verified by the MSI Afterburner frametime graph.

How is this possible you may ask? It turns out that Windows needs the presence of a Pagefile to utilize system memory more efficiently. For example, before I put a pagefile on the SSD where Hogwarts Legacy was installed, I noticed some odd behavior in terms of how it was using my RAM and VRAM. As I said earlier, I put it down to the game's poor resource management algorithms, but it turns out that Windows was restricting memory usage due to no Pagefile on the drive.

After creating a Pagefile, Hogwarts Legacy now uses WAY more RAM, but less VRAM. It uses above 20GB of RAM now with the Pagefile enabled, but without the Pagefile it would use about 16GB max and then over the course of prolonged gaming, Windows would decrease RAM usage down to as low as 8 or 9GB which would worsen traversal stuttering. I understand now that this was because Windows was attempting to not run out of memory due to not having a Pagefile on that drive.

So I hope this enlightens the forum as to the importance of having a Pagefile so that Windows will use the system RAM to the utmost efficiency
;)


I wonder how many PC gamers are like me and have multiple drives in their rig, but only have the Pagefile on their C:\ drive and are wondering why their games stutter so badly?
:eek:
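
(For anyone wanting to try this: the setting described above is the classic Virtual Memory dialog, System Properties > Advanced > Performance Settings > Advanced > Virtual memory > Change, where unticking "Automatically manage paging file size for all drives" lets you assign a system-managed or custom-size pagefile to each drive individually.)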
 

SlimySnake

Flashless at the Golden Globes
Don't expect Respawn to fix the traversal stutters. Traversal stutters are still in Fallen Order, and they are also present in the console versions of Jedi Survivor (they happen at the exact same spots). Even DF's frametime graphs show the stutters/spikes on consoles.

I typically don't mind traversal stutters. RE4 had them; I know exactly which spots. But there is something really weird about the stutters here: I get them constantly while just walking around in the open world.

Maybe it's the way they have designed the game. There are no loading corridors like in RE4, TLOU and Destiny, so it's next gen in that respect, but the flip side is that it is probably constantly streaming in data and UE4 just can't fucking handle it.
 

Ev1L AuRoN

Member
Just give us a non-RT version with a locked 60fps and higher resolution and be done with it. The RT version can improve, but I'm not confident they will be able to iron out the issues. RDNA2 just isn't cut out for that level of RT features; the resolution is already low enough, and I can't see it getting better without sacrifices in IQ.
 

SlimySnake

Flashless at the Golden Globes
The frustrating thing about this game is that it is a great game under the performance issues.
It's not just great. It's phenomenal. The level design, the story beats, the fantastic NPC characters, the incredible open world exploration (finally rewarding), even the graphics are very impressive at times. It is probably not as good looking as the Matrix demo, but compared to other cross-gen games it looks really good.

I just hate having to adjust and readjust settings. Even in Gotham Knights, Hogwarts and RE4, I was just able to turn off RT and enjoy a smooth 4K 60 fps experience. I can't get that here even with RT off and FSR on at 4K 40 fps.
 

SlimySnake

Flashless at the Golden Globes
Just give us a non-RT version with a locked 60fps and higher resolution and be done with it. The RT version can improve, but I'm not confident they will be able to iron out the issues. RDNA2 just isn't cut out for that level of RT features; the resolution is already low enough, and I can't see it getting better without sacrifices in IQ.
The non-RT version has some very bad bugs that they will need to iron out. Cal's hair becomes orange whenever you move around, even in the dark; it looks fucking ridiculous. The reflections in large bodies of water literally display the cones instead of reflections if you have enemies on top of those water bodies. Indoor reflections are fine.

They said this game was designed with RTGI in mind, so it's possible they made the non-RT mode as a fallback for Series S and didn't bother fixing any of the glaring issues with standard rasterization. That's the version we are getting on PC even at max settings, and it's not good. Trust me.
 

yamaci17

Member
It's not just great. It's phenomenal. The level design, the story beats, the fantastic NPC characters, the incredible open world exploration (finally rewarding), even the graphics are very impressive at times. It is probably not as good looking as the Matrix demo, but compared to other cross-gen games it looks really good.

I just hate having to adjust and readjust settings. Even in Gotham Knights, Hogwarts and RE4, I was just able to turn off RT and enjoy a smooth 4K 60 fps experience. I can't get that here even with RT off and FSR on at 4K 40 fps.
im still surprised you have problems with the ue4 config frame limiter actually. for me it is super smooth in areas without the usual stutters. in areas with the usual stutters... well, they're the usual stutters

you didn't happen to combine it with any other frame limiter, did you? I plan on doing a video later on, but capturing smooth 40 fps with 30/60 Hz recording tools is a bit hard
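
for reference, the "ue4 config frame limiter" mentioned here is presumably the engine-side cvar cap. A minimal sketch, assuming the game reads the standard [SystemSettings] overrides (same format as winjer's blocks above):

[SystemSettings]
; hard engine-side frame cap at 40 fps
t.MaxFPS=40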
 