
Star Wars Jedi Survivor Performance Review - PS5 vs PC vs Xbox Series X|S (NXGamer)

They really should stop using FSR 2.1 for the performance mode in these console games. The artifacts are really ugly in motion because the base resolution is too low. This game also won't be much better on the Pro models: low max resolution, bad IQ, no VRR, and no unlocked option in the quality mode.
The image quality is awful in all modes.
I think FSR is causing a blurry ghosting effect with any motion as it struggles to keep up with reconstruction.
Performance drops below 20fps in places, on top of the stutters.
 

DJ12

Member
The PS5's CPU performs better than the one in the Xbox. It's probably more about the API than the hardware, but something similar has already been seen in many games.

Other than that it performs like shit on any hardware, lol.
It's probably about the extra hardware the PS5 has that handles tasks the Series X and S have to use the CPU for.

Obviously any API overhead isn't going to help either.
 

Bernardougf

Gold Member
Do you truly think adding any more SKUs would improve this mess? This is the root of the problem to begin with.
No no... sorry... it's another topic... It's just that we are already seeing some pretty bad downgrades in performance modes (the only one I use), so IMHO and as a personal wish... I want the Pro consoles out as soon as possible.

There is no fixing this mess, or any other mess in this regard, as long as people continue to pre-order and buy day-one betas. I've trained my mind to see the launch as a beta test and the real launch as six months later... going to buy cheap and play a better version... win-win.
 

sinnergy

Member
It's probably about the extra hardware the PS5 has that handles tasks the Series X and S have to use the CPU for.

Obviously any API overhead isn't going to help either.
It's more about the clock speed of the CPU and older engines, which favor faster clocks over parallel operations. The PS5's clock is faster; not much more to explain.
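A rough Amdahl's-law sketch shows why a mostly serial engine can favor a clock bump over extra cores. The 90% serial fraction and every number below are made up for illustration, not measurements of any real engine:

```python
# Illustrative only: Amdahl's-law estimate of how clock speed vs. core count
# affects frame time when most of the engine's per-frame work is serial.

def frame_time_ms(base_ms, serial_frac, cores, clock_scale):
    """Frame time after scaling clocks and spreading the parallel work over cores."""
    serial = base_ms * serial_frac
    parallel = base_ms * (1 - serial_frac)
    return (serial + parallel / cores) / clock_scale

base = 33.3  # ~30 fps single-core baseline
# Mostly-serial workload: a ~10% clock bump beats doubling the cores.
faster_clock = frame_time_ms(base, 0.9, cores=1, clock_scale=1.1)
more_cores = frame_time_ms(base, 0.9, cores=2, clock_scale=1.0)
print(f"{faster_clock:.1f} ms vs {more_cores:.1f} ms")
```

With a heavily serial frame the clock advantage wins; flip the serial fraction to 10% and doubling the cores wins instead.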
 

DJ12

Member
It's more about the clock speed of the CPU and older engines, which favor faster clocks over parallel operations. The PS5's clock is faster; not much more to explain.
Rofl. Sadly not.

For due diligence I mark your post 0 out of 10.
 

Lysandros

Member
There is no fixing this mess, or any other mess in this regard, as long as people continue to pre-order and buy day-one betas. I've trained my mind to see the launch as a beta test and the real launch as six months later... going to buy cheap and play a better version... win-win.
I agree, seems to be the way to go.
 

Puscifer

Gold Member
The PS5's CPU performs better than the one in the Xbox. It's probably more about the API than the hardware, but something similar has already been seen in many games.

Other than that it performs like shit on any hardware, lol.
Pretty sure a lot of it has to do with DirectX at this point. Vulkan is chewing through a lot of these problems on Linux, like with TLOU, and if it weren't for the fact that I have so much work to do on Windows, I'd go back.
 

skit_data

Member
Haven't watched any comparison videos in quite a while, but it feels like games generally performed better on PS5 at the start of the generation, shifted over to being better on Series X for about a year, and now we're back to games most often being better on PS5 again?
 

Loxus

Member
Guess who joined the party?



"it's ABoUt vrAM! thESE sHIt pOrts FiNALLy vAliDATe The DicK sucKING I dId FoR aMd 2 yEaRs aGO"

[Mocking SpongeBob GIF]

The issue is that 8GB of VRAM isn't enough for 1080p gaming anymore and, as you can see, Star Wars Jedi: Survivor uses 9GB at 1080p, and 10GB with RT enabled.

Star Wars Jedi: Survivor Benchmark Test & Performance Analysis Review
[VRAM usage chart]
 

Lysandros

Member
Haven't watched any comparison videos in quite a while, but it feels like games generally performed better on PS5 at the start of the generation, shifted over to being better on Series X for about a year, and now we're back to games most often being better on PS5 again?
I don't remember such one-sided back-to-back results favoring one platform (with small margins or not) until this period (from about the beginning of 2023). Even the very beginning of the generation wasn't this lopsided. After that, it was about evenly matched, with slight/academic differences and a few outliers. I think this is partly due to GDK updates and an improving environment for both platforms, among other things.
 

Gaiff

Member
The issue is that 8GB of VRAM isn't enough for 1080p gaming anymore and, as you can see, Star Wars Jedi: Survivor uses 9GB at 1080p, and 10GB with RT enabled.

Star Wars Jedi: Survivor Benchmark Test & Performance Analysis Review
[VRAM usage chart]
I swear you have no idea how to read. How about you check the performance metrics?

[performance chart, 2560x1440]

[RT performance chart, 2560x1440]


VRAM isn't an issue at 1440p. Don't just read the numbers and call it a day; contextualize. VRAM measurements vary wildly depending on which card you're using, and cards with more VRAM tend to report higher usage because of allocation.



Here is the game running at all resolutions and VRAM is never an issue. You attempted to pull the same bullshit in the other thread and got called out and proven wrong. Now here you are blatantly misrepresenting the truth again.
 
Don't see the point of these comparisons; these games all need a few patches to get to their true performance, so unless there is a follow-up video these are kind of useless, other than to show how poorly games run at launch nowadays.
 

Darsxx82

Member


A stable game at 30fps?? Only in the first section, maybe. After that the experience is different.


Then someone will say: "they are not reliable sources". OK. I will say that the source of the video is someone with a stated preference in favor of PS5. The cover image is meaningful vs. what you see in the video 😉.

But the important thing is that these results coincide with other sources (banned and not banned, pro-Xbox or pro-PS) and that, at least, is difficult to ignore if you are really looking for an accurate picture of the consoles which, in my personal opinion of course, IGN-NXG may not be offering.

And I repeat, these are not even the most demanding areas, which is why I hope DF can analyze much more than the first section or the start of the game.

Everyone is free to ignore the results seen there, but they keep coinciding with the impressions of those who are playing it. You even have people here attesting to it.
Post in thread 'Star Wars Jedi Survivor Performance Review - PS5 vs PC vs Xbox Series X|S (NXGamer)' https://www.neogaf.com/threads/star...box-series-x-s-nxgamer.1655999/post-267885619



Both consoles drop to ~30fps (surely the XSX more often) in performance mode, even in that first section.




I don't know who is going to be in charge of the DF analysis, but I hope they analyze beyond the first sections of the game, because it's the type of game that requires more detail.

Hopefully DF and VGTech can clarify just how extreme the console versions are in their experience. What resolution it runs at isn't clear either.

I am afraid that Respawn will focus on PC and may stop or ignore working on console optimization because the media will push the idea that everything is OK on consoles.
 

ToTTenTranz

Banned
Jedi Survivor looks gorgeous. The game's ray tracing is actually pretty decent, too.





The TechPowerUp comparison shows the usual performance brackets among the different tiers, with just a slight advantage for AMD cards. 8GB cards get punished, but that's just how the whole current generation of games is running. Anyone who recently bought an 8GB video card thinking they'd get to max out textures, shadows and geometry during the Series X/PS5-era games simply made a bad purchase.
Bringing down the framebuffer size by reducing the resolution isn't going to work miracles. It's the maxed-out textures and shadows, now designed for the >12GB of VRAM available on the 2020 consoles, that take up the space. At some point you can go all the way down to 720p and it won't make much of a difference.
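A back-of-the-envelope calculation makes the framebuffer-vs-textures point concrete. The target count and bytes-per-pixel below are assumed round numbers, not Respawn's actual render setup:

```python
# Rough estimate: render-target memory shrinks with resolution, but it is
# small next to a multi-gigabyte texture/geometry pool, so dropping the
# resolution frees comparatively little VRAM.

def render_targets_mb(width, height, targets=8, bytes_per_pixel=8):
    """Approximate G-buffer + frame buffer cost: N targets at an average pixel size."""
    return width * height * targets * bytes_per_pixel / (1024 ** 2)

for w, h in [(3840, 2160), (2560, 1440), (1280, 720)]:
    print(f"{w}x{h}: ~{render_targets_mb(w, h):.0f} MB of render targets")
```

Even going from 4K all the way down to 720p saves well under a gigabyte under these assumptions, against a reported 9-10GB total budget.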


The actual problem some people have with this game's RT implementation is that the RT-off mode also looks pretty good (very large textures, shadow maps and geometry, which is why it takes lots of VRAM), and they're used to Nvidia-sponsored titles where the original non-RT lighting system looks so bad that it makes the RT mode look substantially better by comparison. Like Metro Exodus and Cyberpunk, which Digital Foundry loves so much.



Troll tweet?

Yes, of course.
Though it's mostly trolling Alex Battaglia's blatant bias against any competition to Nvidia and/or Microsoft, which he's hardly capable of hiding nowadays.
If the game is showing slightly better performance on Radeon cards and/or slightly better performance on the PS5 over the Series X, the dude gets immensely triggered and turns into a keyboard warrior.
FSR2 is doing its job just fine post-day-0 patch. Battaglia himself reviewed FSR2 when it came out and claimed it was quite close to DLSS, but if a game fails to ship Nvidia's proprietary upscaler, the dude puts on his Nvidia fanboy cap and completely loses control. For Jedi Survivor he had the gall to claim FSR2 is horrible, directly contradicting his own previous statements.



You like to follow these troll/fanboy twitter accounts don't you lol.
I don't follow this account. The tweet appeared on my feed because I follow @kopite7kimi and @RetiredEngineer, who do follow it.
It's a funny tweet though.



I have to admit it's been pretty fun watching these threads (and resetera/digitalfoundry) with all the triggered man-baby meltdowns because AMD dared to make a marketing deal with EA/Respawn for this game, and Respawn dared to use an open source temporal reconstruction tech that would work on all PC GPUs and consoles.
[How Dare You Greta GIF]
 

Traxtech

Member
Has it been brought up how much water absolutely tanks the frame rate? It's easily down to 20fps when running through water.
 

Gaiff

Member
Yes, of course.
Though it's mostly trolling Alex Battaglia's blatant bias against any competition to Nvidia and/or Microsoft, which he's hardly capable of hiding nowadays.
If the game is showing slightly better performance on Radeon cards and/or slightly better performance on the PS5 over the Series X, the dude gets immensely triggered and turns into a keyboard warrior.
FSR2 is doing its job just fine post-day-0 patch. Battaglia himself reviewed FSR2 when it came out and claimed it was quite close to DLSS, but if a game fails to ship Nvidia's proprietary upscaler, the dude puts on his Nvidia fanboy cap and completely loses control. For Jedi Survivor he had the gall to claim FSR2 is horrible, directly contradicting his own previous statements.
This isn't the gotcha moment you think it is. Both DLSS and FSR have improved since then, and DLSS more so than FSR, widening the gap. Furthermore, not all implementations of FSR are created equal: it works great in some games and is almost as good as DLSS, while in others it completely falls apart. The problem, though, is that below Quality mode at 4K, FSR is shit. This game reconstructs from a 1080p base in its performance mode, which is way too low. Hardware Unboxed did a piece recently and concluded that FSR Performance at 4K is garbage because the internal res is just too low for it to work well. Tim even said he was surprised at how much better DLSS is.
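For reference, those internal resolutions fall straight out of AMD's published FSR 2 per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); the sketch below is just that arithmetic:

```python
# FSR 2 per-axis scale factors from AMD's documentation.
FSR2_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w, out_h, mode):
    """Resolution the game actually renders before FSR 2 reconstructs the output."""
    s = FSR2_SCALE[mode]
    return round(out_w / s), round(out_h / s)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```

So a 4K output in Performance mode renders at 1920x1080 before reconstruction, matching the 1080p base mentioned above.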
 

onQ123

Member
I don't remember such one-sided back-to-back results favoring one platform (with small margins or not) until this period (from about the beginning of 2023). Even the very beginning of the generation wasn't this lopsided. After that, it was about evenly matched, with slight/academic differences and a few outliers. I think this is partly due to GDK updates and an improving environment for both platforms, among other things.
As streaming comes into play

https://www.neogaf.com/threads/peop...the-ssd-is-going-to-be-its-life-line.1536804/
 

SlimySnake

Flashless at the Golden Globes
Haven't watched any comparison videos in quite a while, but it feels like games generally performed better on PS5 at the start of the generation, shifted over to being better on Series X for about a year, and now we're back to games most often being better on PS5 again?
Yes, but not this time. The game runs like shit on PS5 and XSX. As well as PC.
 

Zathalus

Member
Jedi Survivor looks gorgeous. The game's ray tracing is actually pretty decent, too.





The TechPowerUp comparison shows the usual performance brackets among the different tiers, with just a slight advantage for AMD cards. 8GB cards get punished, but that's just how the whole current generation of games is running. Anyone who recently bought an 8GB video card thinking they'd get to max out textures, shadows and geometry during the Series X/PS5-era games simply made a bad purchase.
Bringing down the framebuffer size by reducing the resolution isn't going to work miracles. It's the maxed-out textures and shadows, now designed for the >12GB of VRAM available on the 2020 consoles, that take up the space. At some point you can go all the way down to 720p and it won't make much of a difference.


The actual problem some people have with this game's RT implementation is that the RT-off mode also looks pretty good (very large textures, shadow maps and geometry, which is why it takes lots of VRAM), and they're used to Nvidia-sponsored titles where the original non-RT lighting system looks so bad that it makes the RT mode look substantially better by comparison. Like Metro Exodus and Cyberpunk, which Digital Foundry loves so much.





Yes, of course.
Though it's mostly trolling Alex Battaglia's blatant bias against any competition to Nvidia and/or Microsoft, which he's hardly capable of hiding nowadays.
If the game is showing slightly better performance on Radeon cards and/or slightly better performance on the PS5 over the Series X, the dude gets immensely triggered and turns into a keyboard warrior.
FSR2 is doing its job just fine post-day-0 patch. Battaglia himself reviewed FSR2 when it came out and claimed it was quite close to DLSS, but if a game fails to ship Nvidia's proprietary upscaler, the dude puts on his Nvidia fanboy cap and completely loses control. For Jedi Survivor he had the gall to claim FSR2 is horrible, directly contradicting his own previous statements.




I don't follow this account. The tweet appeared on my feed because I follow @kopite7kimi and @RetiredEngineer, who do follow it.
It's a funny tweet though.



I have to admit it's been pretty fun watching these threads (and resetera/digitalfoundry) with all the triggered man-baby meltdowns because AMD dared to make a marketing deal with EA/Respawn for this game, and Respawn dared to use an open source temporal reconstruction tech that would work on all PC GPUs and consoles.
[How Dare You Greta GIF]

Or they could just include both DLSS and FSR. It would require almost no work. Oh wait, that would make AMD cards look bad.
 

Mister Wolf

Member
I'd like to know what this and Callisto Protocol are doing with Unreal 4 that's causing these modern CPUs to bottleneck even with ray tracing turned off. It's not like any of the tech included in UE4 is new.
 

SomeGit

Member
Wait, is this the disc version without the patches? Why is there no Xbox version in the same context? I'll stick with NXGamer's analysis, I think; that was far more comprehensive.

Note: EA did not provide console code for Survivor until launch day. However, we acquired a physical copy of the PS5 version ahead of that. We'll be following up soon with a more detailed breakdown, along with everything you need to know about the Xbox Series console versions, PS5 comparisons etc.
 

Zathalus

Member
Wait, is this the disc version without the patches? Why is there no Xbox version in the same context? I'll stick with NXGamer's analysis, I think; that was far more comprehensive.
No, it's the current version. They don't have the Xbox version up yet as they had to buy it themselves. Anyway, sub-720p with drops into the 30s for the performance mode is a joke. Even the quality mode has drops into the teens.
 

adamsapple

Or is it just one of Phil's balls in my throat?
What the fuck is up with EA and their action games having ridiculously low pixel counts? Dead Space was also like this, but owing to that game's dark nature, it was harder to notice there.
 
I think the issue is that high-end GPUs are not being scaled properly by developers due to engine limitations and CPU bottlenecks. If you look at the performance of an RTX 3060 at 4K with DLSS or FSR on, you get a very similar or even better experience than on PS5 or Xbox Series X. The problem is the scalability of high-end GPUs, which are choked by CPU limitations plus driver and OS overhead; at lower resolutions even an RTX 3060 can get choked, even with a high-end CPU (Uncharted and Dead Space Remake at 1080p), but at 4K you get performance very similar to or better than the PS5. Therefore, this generation is all about CPU domination; the GPU segment has reached its breaking point, and the more high-end the launches, the more disappointment and scalability issues you will see.
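The CPU-bound argument above boils down to taking the minimum of two throughputs. A toy model (all numbers hypothetical) shows why cards of very different speeds converge once the CPU is the cap:

```python
# Toy model with illustrative numbers only: the displayed frame rate is capped
# by whichever of the CPU or GPU is slower. The CPU cap doesn't change with
# resolution, which is why a mid-range GPU can match faster cards once the
# game is CPU-bound.

CPU_CAP_FPS = 60  # assumed CPU-bound ceiling, identical at every resolution

def effective_fps(gpu_fps):
    """Displayed frame rate: the slower of the CPU and GPU paths wins."""
    return min(CPU_CAP_FPS, gpu_fps)

# Hypothetical GPU throughput at 1080p for two different cards:
print(effective_fps(140))  # fast card, CPU-bound -> 60
print(effective_fps(70))   # mid-range card, also CPU-bound -> 60
```

At 4K the hypothetical GPU numbers drop below 60, the GPU becomes the limiter again, and the cards separate as usual.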
 

DenchDeckard

Moderated wildly
I don't remember such a one sided back to back results favoring one platform (with small margins or not) until his period (from about the beginning of 2023). Even the very beginning of the generation wasn't this lopsided. After that, it was about evently matched, with slight/academic differences and few outliers. I think this is partly due to GDK updates/improving environment for both platforms among other things.
Bro, the game is fucked....you can't be serious about this?
 

Loxus

Member
I swear you have no idea how to read. How about you check the performance metrics?

[performance chart, 2560x1440]

[RT performance chart, 2560x1440]


VRAM isn't an issue at 1440p. Don't just read the numbers and call it a day; contextualize. VRAM measurements vary wildly depending on which card you're using, and cards with more VRAM tend to report higher usage because of allocation.



Here is the game running at all resolutions and VRAM is never an issue. You attempted to pull the same bullshit in the other thread and got called out and proven wrong. Now here you are blatantly misrepresenting the truth again.

Something is not right with that video.
It has the game running at 55fps at 4K with RT.

But here it's 24fps.
[benchmark screenshot]


Running out of VRAM affects more than just frame rate; it affects textures and can cause stutters.

What's funny is you went to the same source where I got the VRAM usage to try and prove me wrong.
[TechPowerUp VRAM usage chart]


Maybe you should take this up with Techpowerup and not me.
 