
Remnant 2 dev: We've designed the game with upscaling in mind (DLSS/FSR/XeSS)

Gaiff

SBI’s Resident Gaslighter
This is from Saturday.

We've heard from a few folks about the game's overall performance. We're definitely going to roll out performance updates after the game's launch. But for the sake of transparency, we designed the game with upscaling in mind (DLSS/FSR/XeSS). So, if you leave the Upscaling settings as they are (you can hit 'reset defaults' to get them back to normal), you should have the smoothest gameplay.

Here is the reddit post.

This is also from Gunfire Games so do not expect great performance improvements. This is the same incompetent team that produced Darksiders III and the game is a performance hog despite looking very meh. If I recall correctly, even an RTX 3080 has trouble maintaining 60fps at 4K/max settings. For what is basically a mid-tier PS4 game, that's incredibly disappointing. Furthermore, there are a bunch of major bugs that were reported in that game but never fixed. The icing on the cake is that the dev also promised to include DLSS in that game but never did. They never gave the community an update about it. Ironic, isn't it?

Anyway, not sure what kind of nonsense this is but it seems that upscaling, rather than being a nice bonus, will increasingly be used as a crutch for developers not to bother with making sure their games run properly.

That we need DLSS when using advanced rendering techniques such as ray tracing or path tracing is 100% understandable. That we need upscaling to play a basic-bitch-looking game with acceptable performance is not.
 

Red5

Member
Your game had better look like Crysis did on max settings at launch, natively, if you're going to go with that excuse.
 
I can only speak to personal experience, but on my rtx3070 system, if DLSS is available, I turn it on. I don't notice any change in visuals or latency, but it's a massive improvement to framerate.

We are seeing similar adoption of FSR2.0 on Console as a baseline standard.

With image reconstruction being as good as it is, and as available as it is to users.. I just don't see why you wouldn't use it. Like.. Even if the game ran at 1440p 60fps on my system natively.. it would still run better with DLSS.. So why would I not use it?

I keep seeing this popping up as a spicy issue. Like that thread about 4090 performance. Like, sure, in principle it is crazy that a 4090 can't render this game natively at 4K60, but who out there has a 4090 and isn't using DLSS? It's one of the best things about that card! The AI cores that allow for DLSS 3.0 integration are amazing, so why on earth would you spend so much money on that card and NOT use its key differentiating feature? You're literally paying for 4th-gen Tensor Cores that you aren't using if you turn it off. Sounds like a waste to me. Turn on DLSS. Use your fancy card the way it was designed.
 

Sakura

Member
I can only speak to personal experience, but on my rtx3070 system, if DLSS is available, I turn it on. I don't notice any change in visuals or latency, but it's a massive improvement to framerate.

We are seeing similar adoption of FSR2.0 on Console as a baseline standard.

With image reconstruction being as good as it is, and as available as it is to users.. I just don't see why you wouldn't use it. Like.. Even if the game ran at 1440p 60fps on my system natively.. it would still run better with DLSS.. So why would I not use it?

I keep seeing this popping up as a spicy issue. Like that thread about 4090 performance. Like, sure, in principle it is crazy that a 4090 can't render this game natively at 4K60, but who out there has a 4090 and isn't using DLSS? It's one of the best things about that card! The AI cores that allow for DLSS 3.0 integration are amazing, so why on earth would you spend so much money on that card and NOT use its key differentiating feature? You're literally paying for 4th-gen Tensor Cores that you aren't using if you turn it off. Sounds like a waste to me. Turn on DLSS. Use your fancy card the way it was designed.
If it ran well without DLSS though, wouldn't you get even more frames with DLSS on?
Requiring DLSS to get to 60 in the first place isn't really something we should be fine with unless the game is truly pushing the latest tech or something.
 

ZehDon

Gold Member
If it ran well without DLSS though, wouldn't you get even more frames with DLSS on?
Requiring DLSS to get to 60 in the first place isn't really something we should be fine with unless the game is truly pushing the latest tech or something.
To play devil's advocate, it is pushing Unreal Engine 5 and makes use of UE5's Nanite for its geometry. This may just be the reality of how heavy Unreal Engine 5 is, as we've only seen it in a handful of titles - even the initial PS5 tech demo ran at a reconstructed 1440p.
 

Arsic

Loves his juicy stink trail scent
Idk why people are mad. Using my 3080 I'm at 90 fps at 1440p on ultra settings with DLSS Quality. Game doesn't look blurry in the slightest; it's crisp and a real looker on PC.

Boo fucking hoo. Games need DLSS to have good fps. Beats shader compilation stutter and CPU-bound games that barely touch the GPU.

Game is great on PC, but some incels with a 4090 can't cope, so they'll berate the game as bad and poorly optimized. Same guy is playing Roblox in 4K, pissed he's only getting 239 fps instead of 240 on his 4090 Ti.

Touch grass.
 

phant0m

Member
I can only speak to personal experience, but on my rtx3070 system, if DLSS is available, I turn it on. I don't notice any change in visuals or latency, but it's a massive improvement to framerate.

We are seeing similar adoption of FSR2.0 on Console as a baseline standard.

With image reconstruction being as good as it is, and as available as it is to users.. I just don't see why you wouldn't use it. Like.. Even if the game ran at 1440p 60fps on my system natively.. it would still run better with DLSS.. So why would I not use it?

I keep seeing this popping up as a spicy issue. Like that thread about 4090 performance. Like, sure, in principle it is crazy that a 4090 can't render this game natively at 4K60, but who out there has a 4090 and isn't using DLSS? It's one of the best things about that card! The AI cores that allow for DLSS 3.0 integration are amazing, so why on earth would you spend so much money on that card and NOT use its key differentiating feature? You're literally paying for 4th-gen Tensor Cores that you aren't using if you turn it off. Sounds like a waste to me. Turn on DLSS. Use your fancy card the way it was designed.
Way more fun to just shit on developers, no?

Playing on my 3080 on Ultra @ 1440p myself. Game looks great from an IQ perspective and easily holds a locked 72fps (half of my 144Hz refresh).
 

Mr Moose

Member
Idk why people are mad. Using my 3080 I'm at 90 fps at 1440p on ultra settings with DLSS Quality. Game doesn't look blurry in the slightest; it's crisp and a real looker on PC.
1440p DLSS quality, isn't that 960p? Can you test it at native 1080p with the same settings?
 
I can only speak to personal experience, but on my rtx3070 system, if DLSS is available, I turn it on. I don't notice any change in visuals or latency, but it's a massive improvement to framerate.

We are seeing similar adoption of FSR2.0 on Console as a baseline standard.

With image reconstruction being as good as it is, and as available as it is to users.. I just don't see why you wouldn't use it. Like.. Even if the game ran at 1440p 60fps on my system natively.. it would still run better with DLSS.. So why would I not use it?

I keep seeing this popping up as a spicy issue. Like that thread about 4090 performance. Like, sure, in principle it is crazy that a 4090 can't render this game natively at 4K60, but who out there has a 4090 and isn't using DLSS? It's one of the best things about that card! The AI cores that allow for DLSS 3.0 integration are amazing, so why on earth would you spend so much money on that card and NOT use its key differentiating feature? You're literally paying for 4th-gen Tensor Cores that you aren't using if you turn it off. Sounds like a waste to me. Turn on DLSS. Use your fancy card the way it was designed.
Because ppl want a native 4k image and not a 720p image upscaled?
 

Arsic

Loves his juicy stink trail scent
1440p DLSS quality, isn't that 960p? Can you test it at native 1080p with the same settings?
1080p always looks fuzzy/blurry on my 4K monitor with DLSS off. Upscaling looks much sharper. *shrug*
 

Mr Moose

Member
1080p always looks fuzzy/blurry on my 4K monitor with DLSS off. Upscaling looks much sharper. *shrug*
Yeah, for sure, it's the same when I use my 4K TV alongside my 1080p monitor; 1080p looks like cack on the TV. I love DLSS/FSR 2+, just wondering about the difference in fps between native 1080p at ultra settings and 1440p DLSS Quality at ultra settings.
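
For reference on the numbers being thrown around here: an upscaler's internal render resolution is just the output resolution multiplied by a per-axis scale factor. A minimal sketch (Python) using the commonly documented DLSS preset ratios; these are the usual defaults and individual games can override them:

```python
# Approximate per-axis render-scale factors for the standard DLSS presets.
# FSR 2 uses essentially the same ratios. These are the usual defaults;
# a game can ship with custom values.
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~66.7%
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,  # ~33.3%
}

def internal_resolution(out_w, out_h, mode):
    """Approximate internal render resolution for a given output size and preset."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 1440p output with DLSS Quality renders internally at roughly 1707x960,
# i.e. the "960p" mentioned above.
print(internal_resolution(2560, 1440, "Quality"))      # (1707, 960)
# A 4K output on the Performance preset is a 1080p internal image.
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So the comparison being asked for is effectively native 1080p versus a ~960p internal render plus the upscaler's own overhead.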
 

Arsic

Loves his juicy stink trail scent
Yeah, for sure, it's the same when I use my 4K TV alongside my 1080p monitor; 1080p looks like cack on the TV. I love DLSS/FSR 2+, just wondering about the difference in fps between native 1080p at ultra settings and 1440p DLSS Quality at ultra settings.
Will check for you later tonight when I play.

Stay tuned!
 

Hoddi

Member
I don't see the problem with it as long as the game looks good. UE5 itself is arguably designed around upscaling via TSR.

The game also runs fine on my 2080Ti at 3440x1440. Can't tell much difference between medium and ultra settings anyway.
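
On the "UE5 is designed around TSR" point: in a stock UE5 project the upscaler and the internal render scale are plain console variables, so shipping with reconstruction on by default is essentially a config decision. A minimal Engine.ini sketch using generic UE5 cvars; nothing here is specific to Remnant 2, and a packaged game may ignore user overrides:

```ini
[SystemSettings]
; Anti-aliasing / upscaling method: 0=None, 1=FXAA, 2=TAA, 3=MSAA, 4=TSR
r.AntiAliasingMethod=4
; Per-axis internal render scale in percent; TSR reconstructs up to output resolution
; (58 is roughly a "Balanced"-style ratio)
r.ScreenPercentage=58
```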
 

Arsic

Loves his juicy stink trail scent
Entitled gamers, the thread. Go cry into your 4090s.

Game's great, possible GOTY material, and I for one thank these "lazy devs".
No cap. The quality of this game is mind-bending.

I should also share that if you update your graphics drivers you'll see considerable fps improvements. I can now play at 4K60 on a 3080 with DLSS Quality after the driver update.
 

ToTTenTranz

Banned
The game looks awesome and plays perfectly well on all hardware. Upscaling tech is pretty standard practice in all console and PC games except for indies and such.


In case anyone was wondering why there are 2 threads dedicated to this on top of the discussion in the game's review thread, the reason we have some very specific individuals shitting on the game on this forum is that the $950 RX 7900 XTX beats the $1800 RTX 4090 and the $600 RX 6950 XT gets the same performance as the $1200 RTX 4080.


[benchmark chart image: VL4BDf6.jpg]

Also, Gunfire Games is an 82-person studio dedicated to making games, so of course they're going to cut corners here and there. It's perfectly acceptable.

Can you imagine these same people making the same complaints about Cyberpunk's ultra-super-duper-moar-RTX mode not running on a 4090 at more than 20 FPS without DLSS2+3+4+5, despite being made by a dev with 1100 people?
Of course not. That's not what they were told to do.
 
How to say it's unoptimised without saying it's unoptimised.
The second Sony talked about checkerboarding, it was crystal clear to me that this was going to become part of the optimisation process in all games sooner rather than later. The PC crowd laughed at me for saying it back then, but here we are.

And I am not against it. Every dev only has a limited amount of time; smaller devs certainly have less, while Rockstar can probably afford a bunch of people fiddling with all sorts of code, including upscalers. Why should they try to write better code when the easier solution is to let upscalers do the same work for less? Ideally we would get both, but it isn't like Horizon or whatever isn't designed with upscalers in mind.
 

GymWolf

Member
Unfortunately it doesn't look much better than Crysis 3 from 10 years ago
This is the second time I've read this from you. The game is not path-traced Cyberpunk, but at 4K max details it looks great for a AA title, and sure as hell better than Crysis 3 in most things that matter. I'm sure every game has some graphical aspect that can be beaten by a 10-year-old game; Arkham Knight's water looks better than the water in 99% of games, but that doesn't mean Arkham Knight looks better overall than 99% of games.
 

LiquidMetal14

hide your water-based mammals
If the dev team is "small", you could throw out the excuse of them brute-forcing it and not optimizing as much. It may be the reality, but it's still disappointing knowing it could run better overall before even considering DLSS.

DLSS is great, but using it as a crutch to only optimize to a degree just reads poorly. Not earth-ending, but we talk about all aspects of the game, and this is worthy of discussion regardless of whether the game is good or not.
 

Spyxos

Gold Member
I'm curious to see how hardware-hungry UE5 games will be. But it doesn't look good that they need upscalers to be playable at all.
 

Zathalus

Member
The game looks awesome and plays perfectly well on all hardware. Upscaling tech is pretty standard practice in all console and PC games except for indies and such.


In case anyone was wondering why there are 2 threads dedicated to this on top of the discussion in the game's review thread, the reason we have some very specific individuals shitting on the game on this forum is that the $950 RX 7900 XTX beats the $1800 RTX 4090 and the $600 RX 6950 XT gets the same performance as the $1200 RTX 4080.


[benchmark chart image: VL4BDf6.jpg]

Also, Gunfire Games is an 82-person studio dedicated to making games, so of course they're going to cut corners here and there. It's perfectly acceptable.

Can you imagine these same people making the same complaints about Cyberpunk's ultra-super-duper-moar-RTX mode not running on a 4090 at more than 20 FPS without DLSS2+3+4+5, despite being made by a dev with 1100 people?
Of course not. That's not what they were told to do.
I'm sure you would say the exact same if AMD was underperforming in this game as well.

The 6950 XT matching the 4080 is not OK, and you can't blame the engine either, as other UE5 games don't have this problem.
 
Idk why people are mad. Using my 3080 I'm at 90 fps at 1440p on ultra settings with DLSS Quality. Game doesn't look blurry in the slightest; it's crisp and a real looker on PC.

Boo fucking hoo. Games need DLSS to have good fps. Beats shader compilation stutter and CPU-bound games that barely touch the GPU.

Game is great on PC, but some incels with a 4090 can't cope, so they'll berate the game as bad and poorly optimized. Same guy is playing Roblox in 4K, pissed he's only getting 239 fps instead of 240 on his 4090 Ti.

Touch grass.
90fps at Ultra settings on a 3080? That's interesting. At the Tower of Unseen and the next area I went to, it dropped below 80 quite a bit. It even drops below 60 in crowded areas (with settings on Ultra, Quality DLSS, at 1440p).

With High settings, shadows dropped to Medium, and DLSS on Balanced, I can get over 90, no problem. Still looks really good too.

The rest of my rig is more than up to the task as well. I've got a 12700K, 32GB of RAM, and it's installed on an S850x NVMe drive. So unless you've got something else going on, or a better GPU disguised as a 3080, I call bullshit on you reaching those frames at those settings in an actual gameplay scenario, not just staring at a wall in an empty room.
 

ToTTenTranz

Banned
I'm curious to see how hardware-hungry UE5 games will be. But it doesn't look good that they need upscalers to be playable at all.
UE5's Nanite is very compute-heavy. All the 30FPS demos on the current-gen consoles are running at 1080p internal with TSR reconstruction to 4K output.
That's the equivalent of DLSS / FSR Performance.





The 6950 XT matching the 4080 is not OK, and you can't blame the engine either, as other UE5 games don't have this problem.
Is there any other game running Nanite out there? Perhaps Fortnite UE5 in ultra mode (if there is such a thing), but I haven't seen any GPU comparisons.
 

ToTTenTranz

Banned
Eh. I really enjoyed Remnant and I'm having a great time with the sequel, but the visuals in both games look dated relative to when they were released.
Like I said: the game looks fine. Runs very well on pretty much all hardware when paired with different levels of temporal upscalers, and they support them all.


Is it a game that looks like Horizon Forbidden West? No. Is it a game that is as thoroughly optimized as Doom Eternal? No.
It's a game from a medium-small studio with a pretty big scope as it is, and still they managed to get excellent review scores.
 

ToTTenTranz

Banned
No, you said it "looks awesome", which I think is a bit of a reach.
Looks fine and awesome and good and great and cool. Awesome was about the game in general, fine was about image quality.

Playability and gameplay look awesome. The game's graphics look fine, and they look awesome considering the dev team's size, the scope, and the critical acclaim it got.
 

Zathalus

Member
UE5's Nanite is very compute-heavy. All the 30FPS demos on the current-gen consoles are running at 1080p internal with TSR reconstruction to 4K output.
That's the equivalent of DLSS / FSR Performance.






Is there any other game running Nanite out there? Perhaps Fortnite UE5 in ultra mode (if there is such a thing), but I haven't seen any GPU comparisons.
Fortnite is using Nanite and the 4080 slightly outperforms the 7900 XTX.
 

winjer

Gold Member
The heaviest setting in the game is Shadow Quality. Dropping this from Ultra to High gives a nice improvement in performance with a small loss in image quality.
The rest of the settings only seem to be worth a couple of FPS each.
 

winjer

Gold Member
Fortnite is using Nanite and the 4080 slightly outperforms the 7900 XTX.

The reason the 4080 outperforms the 7900 XTX in Fortnite is not Nanite; it's Lumen.

------------

BTW, disabling r.Nanite in Remnant 2 drops performance by half.
Probably the engine just falls back to loading all assets at their highest LOD, which crushes performance.
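
For anyone who wants to reproduce that test: r.Nanite is a stock UE5 console variable, so it can be flipped from the in-game console (if the build exposes one) or via an Engine.ini override. A sketch, assuming the game honours the standard [SystemSettings] section:

```ini
[SystemSettings]
; 1 = Nanite on (default); 0 = force the non-Nanite fallback meshes.
; As noted above, in Remnant 2 this reportedly halves performance, presumably
; because the fallback assets end up at their highest LOD.
r.Nanite=0
```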
 

Wildebeest

Member
The heaviest setting in the game is Shadow Quality. Dropping this from Ultra to High gives a nice improvement in performance with a small loss in image quality.
The rest of the settings only seem to be worth a couple of FPS each.
Is the frame rate even GPU limited for you, or is it CPU?
 

RoboFu

One of the green rats
Well, there it is: the beginning of something that was meant to push lower-spec hardware further just becoming the norm, so nothing was gained.
 
The game looks awesome and plays perfectly well on all hardware. Upscaling tech is pretty standard practice in all console and PC games except for indies and such.


In case anyone was wondering why there are 2 threads dedicated to this on top of the discussion in the game's review thread, the reason we have some very specific individuals shitting on the game on this forum is that the $950 RX 7900 XTX beats the $1800 RTX 4090 and the $600 RX 6950 XT gets the same performance as the $1200 RTX 4080.


[benchmark chart image: VL4BDf6.jpg]

Also, Gunfire Games is an 82-person studio dedicated to making games, so of course they're going to cut corners here and there. It's perfectly acceptable.

Can you imagine these same people making the same complaints about Cyberpunk's ultra-super-duper-moar-RTX mode not running on a 4090 at more than 20 FPS without DLSS2+3+4+5, despite being made by a dev with 1100 people?
Of course not. That's not what they were told to do.
Wow this graph is so bad...
 