
Digital Foundry: Brothers: A Tale of Two Sons Remake - PS5/XSX/S Tech Review - UE5 Nanite/Lumen Come at a Heavy Cost

SlimySnake

Flashless at the Golden Globes
I don't understand the push for making games on UE5. UE4 should be the sweet spot between graphics quality and the kind of performance current consoles (and cheaper PCs) can offer.
UE4 has some issues with open-world games. Good devs can overcome them, but there are a lot of games where streaming issues in particular become a bottleneck. Days Gone had a severe streaming bug that brought the game's framerate to a crawl every few hours or so. Hogwarts Legacy is very CPU-bound in towns, and Star Wars Jedi is even worse. The ray tracing in these games is very heavy on the CPU and VRAM, which is why Lumen and Nanite were needed.

Most UE5 Nanite-based games top out at 5-6 GB of VRAM, whereas UE4 games can go up to 10 GB if not higher with ray tracing. Hogwarts Legacy was using 25 GB of system RAM on top of maxing out the 10 GB of VRAM. Rebirth has these really low-res assets and textures that show how dated UE4 really is if you want to push a lot of different assets.

On consoles, devs should choose between:

- Lumen
- Nanite
- virtual shadow maps (VSM)

Using all three together is too taxing. Devs should evaluate which ones they actually need.

For example, in this game, with this camera angle, Nanite and VSM are IMO not needed; you won't see many benefits. But Lumen is essential without baked lighting. Alternatively, they could have used none of the three with good baked lighting, but that requires more manual work. (A rough sketch of what that per-feature choice looks like follows below.)
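Purely as an illustration, here's what keeping Lumen while dropping Nanite and VSM could look like in a UE5 project's DefaultEngine.ini. These are real UE5 console variables, but names and defaults shift between engine versions, so treat this as a sketch rather than a drop-in config:

```ini
; Illustrative DefaultEngine.ini sketch: keep Lumen, skip Nanite and
; virtual shadow maps. Verify cvar behavior against your UE5 version.
[/Script/Engine.RendererSettings]
r.DynamicGlobalIlluminationMethod=1   ; 1 = Lumen global illumination
r.ReflectionMethod=1                  ; Lumen reflections to match
r.Nanite=0                            ; disable Nanite, use authored LOD meshes
r.Shadow.Virtual.Enable=0             ; disable VSM, use cascaded shadow maps
```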
Nanite also saves time because artists no longer have to create different LODs for each object in the game. Even for top-down games like this, they would otherwise have to load in higher-fidelity assets as the camera pans over.
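For context, this is roughly the manual LOD pattern Nanite replaces. A generic, hypothetical sketch (not UE5 code): artists author several meshes per object, and the engine swaps between them by camera distance:

```cpp
// Hypothetical sketch of traditional distance-based LOD selection,
// the hand-authored pipeline that Nanite's automatic cluster
// streaming replaces. Not actual UE5 code.
#include <vector>

struct LodLevel {
    float maxDistance;   // use this mesh while the camera is closer than this
    int   triangleCount; // fidelity drops as distance grows
};

// Pick the first LOD whose distance threshold the camera is still inside.
int SelectLod(const std::vector<LodLevel>& lods, float distanceToCamera) {
    for (int i = 0; i < static_cast<int>(lods.size()); ++i) {
        if (distanceToCamera < lods[i].maxDistance)
            return i;
    }
    return static_cast<int>(lods.size()) - 1; // fall back to the coarsest LOD
}
```

Every one of those LOD meshes is something an artist had to author and tune; with Nanite the engine handles that granularity automatically from the full-detail asset.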

The Immortals of Aveum devs had a really good interview about just how much time they saved thanks to UE5 features.

The Lords of the Fallen devs actually baked in the Lumen lighting to save on performance. They got the same results but saved on having Lumen run in real time. A really good optimization technique that these devs probably should've used.

I would have thought UE5 was designed specifically with current gen in mind, and vice versa, that current gen was designed with UE5 in mind.

So why does it kind of kill consoles while also making top PC gear sweat? Shouldn't there be some lower, console-specific settings and workflow proposals, like with RT (or like soft shadows after Doom 3 and F.E.A.R. pushed that stuff, or the several AA techniques)? I understand that you can't just turn it off, since that would require an entirely different approach, but if the methods used are too taxing, there have to be some slider values to crank up the framerate and make it less costly in resolution. What's the point of some barely visible effects if the resolution is going back to the PS3 era?

Is this tech just more convenient for the dev, so that not using it and doing it with the old PS4 pipeline is out of the question? And can they really not change anything in UE5 to make it run better and find a better compromise? Is it either on, using UE5 fully, or off, which means they could just use UE4 for practically the same results?
Is it killing consoles? Epic said at the PS5 UE5 reveal that they were targeting 1440p 30 fps, which is basically what this game and other UE5 games are running at. The problem is that gamers now want 1440p 60 fps, which is simply not what this engine was designed for. It was sold as an engine for higher fidelity, not higher performance. Gamers loved it. They loved the Matrix demo. Now they want everything at 60 fps. Not gonna happen.
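Napkin math makes the gap obvious. Ignoring upscalers, CPU limits, and everything else, raw pixel throughput scales with resolution times frame rate, so 60 fps modes have to give back resolution somewhere:

```cpp
// Rough pixel-throughput comparison (illustrative napkin math only;
// ignores upscaling, CPU cost, and per-pixel shading differences).
#include <cstdio>

int main() {
    struct Mode { const char* name; int w, h, fps; };
    const Mode modes[] = {
        {"1440p30 (Epic's stated UE5 target)", 2560, 1440, 30},
        {"1440p60 (what gamers want)",         2560, 1440, 60},
        {"720p60  (what they often get)",      1280,  720, 60},
    };
    for (const Mode& m : modes) {
        double mpixPerSec = static_cast<double>(m.w) * m.h * m.fps / 1e6;
        std::printf("%-38s %7.1f Mpix/s\n", m.name, mpixPerSec);
    }
    // 1440p60 needs 2x the throughput of 1440p30; 720p60 needs about
    // half of it - which is roughly why 60 fps modes land at PS3-era
    // internal resolutions on the same GPU budget.
    return 0;
}
```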

And UE5 is not the only engine suffering from PS3-era resolutions in its 60 fps mode. FF16 ran at 720p internally to hit 60 fps. Avatar and Alan Wake 2 are on different engines and also hit 720p. Star Wars Jedi hit 648p before they took out ray tracing, and that was UE4. Skull and Bones runs at 720p. Guardians of the Galaxy couldn't run at 1080p 60 fps either and would've likely dropped to PS3-era resolutions if they wanted a locked 60 fps.

Hell, even Sony's own first-party developer Insomniac drops the resolution all the way down to 1080p while paring back some visuals. That's PS4-era resolution from a first-party dev in a game that barely even looks next gen.

You guys just have to stop looking at 60 fps modes and judging games and engines based on that. That is not the engine's fault. It is basically all you are going to get from these consoles, which put the CPU on the same die as the GPU and have it fight for resources. The GPU is also only around 10 TFLOPs, so it's not like it was going to set the world on fire anyway. You have to have reasonable expectations.
 