
Inside Unreal: In-depth look at PS5's Lumen in the Land of Nanite demo (only 6.14 GB of geometry) and deep dive into Nanite


assurdum

Banned
a GPU with fewer CUs starts at a disadvantage and needs a higher clock to stay in the game. The PS5 will in no way perform like a 3080 (and we can bet the account on that) despite having a higher fill rate... it seems people will never learn
You missed the point. Performing or outperforming in a specific scenario is not the same as saying it performs like a better GPU overall.
 
Last edited:
Yes, another type of astroturfing is starting... my god... now it's the clock thing they're hiding behind, to somehow see a PS5 advantage over GPUs with more teraflops.

Yeah, it's fascinating. Why would Epic think it would be good to make fun of PS5 performance by locking the framerate to 30?

And yet we have a 3080 doing 60+ fps. They have the facts in front of their noses and still come up with excuses, hoping someone will buy them.
 

MonarchJT

Banned
You missed the point. Performing better in a single scenario is not the same as saying it performs like a better GPU.
I understand very well, but every time I read the usual 4 or 5 users trying to push who knows what new theory (until it's debunked), or who knows what special sauce to raise the PS5 above the competition... No guys, it's time to accept it. The PS5 is a very good console. Cerny chose to spend a good part of the R&D budget betting on the speed of the SSD; it was his bet, and it's great to have almost zero loading times and for devs to never have to worry about I/O. But the PS5 is and will remain a 10 TF variable-clock console. It's becoming symptomatic to always read glimpses of "eh, but this is an advantage of the PS5" like you write; most likely the only advantage Sony has at the moment is not the console but its studios. The sooner you accept this, the sooner this type of thread, where one or more people get derided en masse, will end.
 
Yeah, all well and good: the clock has advantages IF... and I say IF... compared to a card with the same characteristics. Otherwise, as we well know (and above all as every manufacturer shows us), the parallelization of more CUs and everything else trumps those small advantages, which come at the expense of heat / dissipation / power consumption.

Nvidia 3090:
Base Clock 1395 MHz
Boost Clock 1695 MHz

Nvidia 3080:
Base Clock 1440 MHz
Boost Clock 1710 MHz

Nvidia 3070:
Base Clock 1500 MHz
Boost Clock 1725 MHz

AMD 6900 XT:
Base Clock 1825 MHz
Game Clock 2015 MHz
Boost Clock 2250 MHz

AMD 6700 XT:
Base Clock 2321 MHz
Game Clock 2424 MHz
Boost Clock 2581 MHz

By their own logic the 6900 XT should walk all over the RTX 3080, but that's not the case in the UE5 valley demo.

6900 XT @ 2.5 GHz

OiBEdKX.jpeg

3080 @ only 1.9 GHz

7kNaCZk.jpg



You have to look at all of a GPU's specs, coz all of them count in one way or another. Btw, the 6900 XT is at the very minimum twice as fast as the PS5 in ANY game.


edit: I didn't notice the 6900 XT is only at 91% load, but it still wouldn't be far ahead. Strange that it's DX12 yet the Nvidia driver gets better utilization than AMD's - what happened to the Nvidia driver issues in DX12?
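Quick napkin math, since clocks keep getting thrown around on their own: FP32 throughput is roughly 2 ops (FMA) × shader count × clock, so the shader count matters at least as much as the MHz. Treat this as a spec-sheet estimate, not a benchmark - Ampere's dual-FP32 counting is also part of why its TF number doesn't map 1:1 to fps.

```cpp
// Spec-sheet FP32 estimate only: TFLOPS = 2 (FMA) * shaders * clock_GHz / 1000.
// Real performance also depends on bandwidth, caches, drivers, occupancy, etc.
#include <cstdio>

constexpr double Tflops(double Shaders, double ClockGHz)
{
    return 2.0 * Shaders * ClockGHz / 1000.0;
}

int main()
{
    std::printf("RX 6900 XT: %.1f TF (5120 shaders @ 2.25 GHz)\n", Tflops(5120, 2.25));
    std::printf("RTX 3080  : %.1f TF (8704 shaders @ 1.71 GHz)\n", Tflops(8704, 1.71));
    std::printf("PS5       : %.1f TF (2304 shaders @ 2.23 GHz)\n", Tflops(2304, 2.23));
    return 0;
}
```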
 
Last edited:

MonarchJT

Banned
By their own logic the 6900 XT should walk all over the RTX 3080, but that's not the case in the UE5 valley demo.

6900 XT @ 2.5 GHz

OiBEdKX.jpeg

3080 @ only 1.9 GHz

7kNaCZk.jpg

You have to look at all of a GPU's specs, coz all of them count in one way or another. Btw, the 6900 XT is at the very minimum twice as fast as the PS5 in ANY game.
Higher CU count trumps every single MHz in modern engines with heavy parallelization.
 

PaintTinJr

Member
Still not sure how you can claim that PS5 hardware performs like a 3080 on pixel fill rate in UE5. You know bandwidth is still part of that processing.
True, but the "rising tide lifts all boats" comment by Cerny - along with the bandwidth- and latency-saving cache scrubbers - hints that it's a good setup. And I'm a bit fuzzy on it, but didn't some font package benchmark on PS5 compare fill rate/throughput with high-end cards six months ago?
 
This was a cool video, I learned a lot. TBH, I was more excited about Lumen and not having to build lighting, so I just jumped right into using it without even understanding how cool Nanite is.

He used a command to show stats for Nanite that's different from the one in the editor window. While explaining clusters, he shows that a typical scene has ~12 million Nanite tris. I was quickly able to import a Nanite model from Quixel Bridge, spool up a shitload of instances, and push the scene to 20-25 million Nanite tris, and it ran smoothly in PIE mode without any issues.

It's a trip to put on Nanite triangle or cluster mode and move the camera around and see it rearrange itself.
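If anyone wants to pull up the same overlays, these are the bits I used; the command and cvar names are from memory of the early-access build, so they may differ in your version:

```cpp
// Rough UE5 sketch: toggling the Nanite stats overlay and visualization from code.
// The exact command/cvar names are from memory and may vary between engine versions;
// typing them straight into the console (~) works the same way.
#include "Engine/Engine.h"
#include "Engine/World.h"

void ShowNaniteDebug(UWorld* World)
{
    if (!GEngine || !World)
    {
        return;
    }

    // On-screen Nanite statistics (clusters, triangles, instances):
    GEngine->Exec(World, TEXT("NaniteStats"));

    // Visualization modes, same as the editor's Show > Nanite Visualization menu:
    GEngine->Exec(World, TEXT("r.Nanite.Visualize Clusters")); // or Triangles, Overdraw, ...
}
```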

Even with full logic and sub-systems running that would be very good low polygon usage from Nanite.

What are your GPU specs, by chance? Speaking generally here, but 20-25 million Nanite polys (assuming they're 1:1 with traditional triangles) is gonna be a breeze on Series X and PS5 (7.3 billion prims/sec and 8.9 billion prims/sec, respectively). Even on Series S it theoretically wouldn't be a stressful figure (6.4 billion prims/sec).
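For context, those peak numbers are roughly the RDNA2 front end's 4 primitives per clock multiplied by the GPU clock, and a 25M-triangle frame at 60 fps sits nowhere near them. A quick sanity check (upper bound only, since Nanite's software rasterizer and memory traffic have their own costs):

```cpp
// Upper-bound sanity check: 25M Nanite triangles per frame vs the quoted PS5 peak rate.
// Peak is roughly GPU clock * 4 prims/clock (RDNA2 front end); Nanite's compute rasterizer
// is the more realistic limit, so this is only an order-of-magnitude check.
#include <cstdio>

int main()
{
    const double TrisPerFrame = 25.0e6;
    const double Fps          = 60.0;
    const double TrisPerSec   = TrisPerFrame * Fps;   // 1.5e9 triangles/s
    const double Ps5Peak      = 2.23e9 * 4.0;         // ~8.9e9 primitives/s

    std::printf("Needed: %.1f Gtri/s vs PS5 peak %.1f Gprim/s (%.0f%% of peak)\n",
                TrisPerSec / 1e9, Ps5Peak / 1e9, 100.0 * TrisPerSec / Ps5Peak);
    return 0;
}
```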
 

MonarchJT

Banned
True, but the "rising tide lifts all boats" comment by Cerny - along with the bandwidth- and latency-saving cache scrubbers - hints that it's a good setup. And I'm a bit fuzzy on it, but didn't some font package benchmark on PS5 compare fill rate/throughput with high-end cards six months ago?
Cerny isn't stupid, and his comment was about the clock compared to a hypothetical identical GPU with 36 CUs at a lower clock. At this point fans are seeing things that don't exist and creating false realities.
 
Last edited:

CrustyBritches

Gold Member
Even with full logic and sub-systems running that would be very good low polygon usage from Nanite.

What are your GPU specs, by chance? Speaking generally here, but 20-25 million Nanite polys (assuming they're 1:1 with traditional triangles) is gonna be a breeze on Series X and PS5 (7.3 billion prims/sec and 8.9 billion prims/sec, respectively). Even on Series S it theoretically wouldn't be a stressful figure (6.4 billion prims/sec).
I won't presume to speak on how this all works in relation to theoretical specs. I have a 2060 Super along with a Ryzen 1600, 32 GB DDR4-2400, and a 1 TB Gen3 x4 NVMe drive.

That post was in response to comments about the PS5 demo being more complex. There have been a lot of assumptions and misinformation about these demos.
 
True, but the "rising tide lifts all boats" comment by Cerny - along with the bandwidth- and latency-saving cache scrubbers - hints that it's a good setup. And I'm a bit fuzzy on it, but didn't some font package benchmark on PS5 compare fill rate/throughput with high-end cards six months ago?
PS5 can beat mid-to-high-end GPUs at fonts. Like in a text editor. No joke.



This was against a regular 2080 though. There are Supers, Tis, and all of the Ampere lineup as well.








 
Last edited:

Corndog

Banned
Look man, I'm trying to help you out. Instead of assuming or going by what you think, just hear me out.


When you export the demo, you can run it without the editor. You can even uninstall the editor, and still run the demo, as the editor will no longer be a prerequisite. And you don't have to upload it either. Just export and run.


Why not just try it out? That way you can have a real comparison, and can see for yourself that the exported demo uses way less resources than it does running the editor on top.
Maybe he already has.
 

PaintTinJr

Member
Ps5 can beat mid-high end GPU's with fonts. Like in a text editor. No joke.



This was against a regular 2080 though. There's super, TI's, and all of the Ampere lineup as well.

...
Going by the in-depth explanation of Nanite as a software rasterizer - and the font package is one of those too, but with more branched direct/indirect draw calls than Nanite, which works in a single call - Nanite performance on PS5 should be similar to or better than the font package's, AFAIK.
 

CamHostage

Member
Watch it again. Everything you said was wrong. Grass does indeed move under foot and the leaves sway in the wind.

The thing is, Horizon 2's foliage doesn't "crush" underfoot (which is weird - I would have thought they'd improve that in the second game given that so much of Horizon is about that lush vegetation, PS5 or not, although lying flat could be a problem for the foliage elements since I assume they're primarily 2D). There's kind of a light wiggle effect added to grasses when she passes over them, but it's not much, and Aloy still generally clips through leafy fronds; it's only the patches of tall grass that have full collision.

That said, even Horizon 1 had motion in all of its vegetation, so it's in there and it's in the second game. It looks like they messed up / didn't finish the ending of the Forbidden West gameplay demo (there are particles and flowing dust effects, but they didn't crank up the vegetation movement), and I'm not seeing in either the Horizon 2 trailer or anything from Horizon 1 something that could really be considered a "storm," with howling winds and turbulence in the trees (the closest I've found is her riding through the whipping grasses overlooking Valley of the Fallen in Horizon 2; even Frozen Wilds is generally a nice day to go exploring...). So I'm not sure if that will change; they certainly spent a lot of time having the cloud system simulate chaotic weather, so hopefully they have more motion planned for the trees to match the new storm systems.

 
Last edited:

GuinGuin

Banned
The thing is, Horizon 2's foliage doesn't "crush" underfoot (which is weird - I would have thought they'd improve that in the second game, PS5 or not, although lying flat could be a problem for the foliage elements since I assume they're primarily 2D). There's kind of a light wiggle effect added to grasses, but it's not much, and Aloy still generally clips through leafy fronds; it's only the patches of tall grass that have full collision.

That said, even Horizon 1 had motion in all of its vegetation. It looks like they messed up / didn't finish the ending of the Forbidden West gameplay demo (there are particles and flowing dust effects, but they didn't crank up the vegetation movement), and I'm not seeing in either the Horizon 2 trailer or anything from Horizon 1 something that could really be considered a "storm," with howling winds and turbulence in the trees (the closest I've found is her riding through the whipping grasses overlooking Valley of the Fallen in Horizon 2; even Frozen Wilds is generally a nice day to go exploring...). So I'm not sure if that will change; they certainly spent a lot of time having the cloud system simulate chaotic weather, so hopefully they have more motion planned for the trees to match the new storm systems.



You are correct that she doesn't crush the vegetation but she knocks it out of the way with her feet.
 

iHaunter

Member
This is a bold statement. I'd refrain from making blanket statements like this.

Sure, PC doesn't have dedicated HW decompressors and DirectStorage yet, and at the moment it's behind PS5. But that doesn't mean it will never catch up.

PCs already have access to faster SSDs than the PS5. This is one piece of the puzzle... With DirectStorage, MS is bringing a DirectCompute-based decompressor, i.e. software-based decompression that works on today's GPUs (via RTX IO on the NVIDIA side) by using the GPU's SMs for asset decompression, and there are plans to move towards a hardware implementation just like the consoles have. Think of it as moving from running ray tracing on shader cores to dedicated RT cores.
1QNpe1t.png

It won't be long before AMD and NVIDIA come out with dedicated hardware decompression units built into their GPUs, which might even surpass what's inside the consoles in terms of the decompression unit's max throughput. The PS5's Kraken decompression unit maxes out at 22 GB/s.
It's not a blanket statement. You would have to design it all to work under one I/O system. Slamming together components from random companies will not achieve that. It's unknown whether raw power can achieve it either.
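For reference, the PC path described in the quote above is a documented API rather than vapour. A minimal sketch of a DirectStorage read that asks the runtime to decompress for you looks roughly like this (struct and enum names recalled from the public SDK, the asset path is made up, and the GDeflate GPU-decompression format only exists in newer SDK revisions, so treat it as illustrative rather than gospel):

```cpp
// Minimal DirectStorage sketch (Windows, C++): read a compressed blob from disk straight
// into a D3D12 buffer, letting the runtime handle decompression. Error handling omitted;
// names are from memory of the public SDK and should be checked against dstorage.h.
#include <dstorage.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void LoadAsset(ID3D12Device* Device, ID3D12Resource* DestBuffer,
               UINT64 CompressedSize, UINT64 UncompressedSize,
               ID3D12Fence* Fence, UINT64 FenceValue)
{
    ComPtr<IDStorageFactory> Factory;
    DStorageGetFactory(IID_PPV_ARGS(&Factory));

    ComPtr<IDStorageFile> File;
    Factory->OpenFile(L"asset.bin", IID_PPV_ARGS(&File)); // hypothetical asset path

    DSTORAGE_QUEUE_DESC QueueDesc{};
    QueueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    QueueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    QueueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    QueueDesc.Device     = Device;

    ComPtr<IDStorageQueue> Queue;
    Factory->CreateQueue(&QueueDesc, IID_PPV_ARGS(&Queue));

    DSTORAGE_REQUEST Request{};
    Request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    Request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    Request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE; // decompressed on the GPU
    Request.Source.File.Source          = File.Get();
    Request.Source.File.Offset          = 0;
    Request.Source.File.Size            = static_cast<UINT32>(CompressedSize);
    Request.Destination.Buffer.Resource = DestBuffer;
    Request.Destination.Buffer.Offset   = 0;
    Request.Destination.Buffer.Size     = static_cast<UINT32>(UncompressedSize);
    Request.UncompressedSize            = static_cast<UINT32>(UncompressedSize);

    Queue->EnqueueRequest(&Request);
    Queue->EnqueueSignal(Fence, FenceValue); // fence signals once the data is in GPU memory
    Queue->Submit();
}
```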
 

Corndog

Banned
It will probably do better than that at Nanite - Lumen might be different, we'll see in the next video - because the info Brian and co. gave in the video really illustrates why the old multi-stage hardware pipeline is in line for less work in games, why even 36 CUs might really only be there for traditional graphics / cross-gen titles that don't exploit Nanite or something better - and why 128 ROPs would have been even better in the consoles.

With the PS5's low latency, cache scrubbers, and the to-the-metal nature of the console's graphics access - even if the XSX/XSS get it indirectly - we should see amazing Nanite results on all the consoles, and probably significant changes in PC graphics drivers to remove inefficiencies. The latency on the material-change pass should see big gains from to-the-metal access where API calls become a drain on performance again - going by the info I took from the video.
Are you in the gaming industry?
 
We can talk about the Unreal Engine all we want, but we all know Naughty Dog is gonna make us wonder just how the hell they pulled that sort of performance out of a PS5 when the time comes.
 

Lethal01

Member
I understand very well, but every time I read the usual 4 or 5 users trying to push who knows what new theory (until it's debunked)


It feels like for the last page or so people have been pointing out a specific part of the pipeline that the PS5 would "win" in, and you're extrapolating that into people saying the PS5 will run Unreal 5 games better than a 3080.
 

MonarchJT

Banned
It feels like for the last page or so people have been pointing out a specific part of the pipeline that the PS5 would "win" in, and you're extrapolating that into people saying the PS5 will run Unreal 5 games better than a 3080.
Would win compared to what?
 

ZywyPL

Banned
The 3080 was still designed around brute-forcing draw calls in the GPU driver - which lives in RAM - and that taxes the CPU on a PC heavily.

That's not true at all. Ampere (and Turing) GPUs have had mesh shading for years, and they recently received RTX I/O as well; it's just that none of it is being used at the moment, hence the GPUs are indeed brute-forcing the rendering for now and still outperforming the PS5 by quite a margin. So imagine what happens once that new tech starts being utilized, not to mention when DLSS steps in.
 

GuinGuin

Banned
We can talk about the Unreal Engine all we want, but we all know Naughty Dog is gonna make us wonder just how the hell they pulled that sort of performance out of a PS5 when the time comes.

Naughty Dog are masters of the craft, but I think Guerrilla and Santa Monica are their equal as far as graphics go. They all share information and technology, so it's not totally surprising.
 

Lethal01

Member
Would win compared to what?

I meant compared to a card with higher teraflops. A higher fill rate would indeed let it do some specific tasks faster.

Again, there are tons of steps in rendering a frame; I'm not saying that the PS5 doing a couple of those steps faster would mean anything close to better overall performance.
 

MonarchJT

Banned
I meant compared to a card with higher teraflops. A higher fill rate would indeed let it do some specific tasks faster.

Again, there are tons of steps in rendering a frame; I'm not saying that the PS5 doing a couple of those steps faster would mean anything close to better overall performance.
ok 👌 sure, it has some advantages... it is what it is. But exactly as you said, I don't think those advantages, which come at the expense of bigger disadvantages, give anyone better final performance.
 

GuinGuin

Banned
ok 👌 sure, it has some advantages... it is what it is. But exactly as you said, I don't think those advantages, which come at the expense of bigger disadvantages, give anyone better final performance.

It has come to a point where the time, skill, and money required to make the assets are what determine how good a game looks. It will be a long, long time before a PC game matches Rift Apart.
 

MonarchJT

Banned
It has come to a point where the time, skill, and money required to make the assets are what determine how good a game looks. It will be a long, long time before a PC game matches Rift Apart.
As I said before in another post... the only real advantage Sony has at the moment, apart from faster loading, is their studios. E3 is around the corner; we'll see if Microsoft changes the cards on the table.
 
Last edited:

GuinGuin

Banned
As I said before in another post... the only real advantage Sony has at the moment, apart from faster loading, is their studios. E3 is around the corner; we'll see if Microsoft changes the cards on the table.

None of the studios they bought are known for their graphical prowess except maybe Ninja Theory.
 

Brofist

Member
It's not a blanket statement. You would have to design it all to work under one I/O system. Slamming together components from random companies will not achieve that. It's unknown whether raw power can achieve it either.
Those random companies? You mean AMD and Nvidia? Right? What do they know about any of this?
 
Last edited:

assurdum

Banned
I understand very well, but every time I read the usual 4 or 5 users trying to push who knows what new theory (until it's debunked), or who knows what special sauce to raise the PS5 above the competition... No guys, it's time to accept it. The PS5 is a very good console. Cerny chose to spend a good part of the R&D budget betting on the speed of the SSD; it was his bet, and it's great to have almost zero loading times and for devs to never have to worry about I/O. But the PS5 is and will remain a 10 TF variable-clock console. It's becoming symptomatic to always read glimpses of "eh, but this is an advantage of the PS5" like you write; most likely the only advantage Sony has at the moment is not the console but its studios. The sooner you accept this, the sooner this type of thread, where one or more people get derided en masse, will end.
The PS5 isn't a variable 10 TF, though I'm not sure what that has to do with the pixel fill rate discussion or with pointing out such evidence, even if hearing about this relative advantage seems to hurt your feelings so much. Back to your argument: it's not like the Series X hardware is somehow more special just because it's 12 TF. The 12 TF number by itself means nothing. Sustained 12 TF performance isn't guaranteed just because the frequency is steady; it can underperform the "true" 12 TF number for thousands of reasons. But it seems your point, as always, is to downplay the PS5 hardware against the untouchable king of sustained 12 TF performance. Once you accept that the differences between the two pieces of hardware aren't as big as you hoped, the discussions will be better for it.
 
Last edited:

harmny

Banned
The thing is, Horizon 2's foliage doesn't "crush" underfoot (which is weird - I would have thought they'd improve that in the second game given that so much of Horizon is about that lush vegetation, PS5 or not, although lying flat could be a problem for the foliage elements since I assume they're primarily 2D). There's kind of a light wiggle effect added to grasses when she passes over them, but it's not much, and Aloy still generally clips through leafy fronds; it's only the patches of tall grass that have full collision.

That said, even Horizon 1 had motion in all of its vegetation, so it's in there and it's in the second game. It looks like they messed up / didn't finish the ending of the Forbidden West gameplay demo (there are particles and flowing dust effects, but they didn't crank up the vegetation movement), and I'm not seeing in either the Horizon 2 trailer or anything from Horizon 1 something that could really be considered a "storm," with howling winds and turbulence in the trees (the closest I've found is her riding through the whipping grasses overlooking Valley of the Fallen in Horizon 2; even Frozen Wilds is generally a nice day to go exploring...). So I'm not sure if that will change; they certainly spent a lot of time having the cloud system simulate chaotic weather, so hopefully they have more motion planned for the trees to match the new storm systems.



now this is a good take. thanks
 

iHaunter

Member
Those random companies? You mean AMD and Nvidia? Right? What do they know about any of this?
You don't get it. I mean they would have to make the CPU, Motherboard, RAM, GPU, everything for it to work.

Can they? Sure, will they? Doubt it.
 

Corndog

Banned
The PS5 isn't a variable 10 TF, though I'm not sure what that has to do with the pixel fill rate discussion or with pointing out such evidence, even if hearing about this relative advantage seems to hurt your feelings so much. Back to your argument: it's not like the Series X hardware is somehow more special just because it's 12 TF. The 12 TF number by itself means nothing. Sustained 12 TF performance isn't guaranteed just because the frequency is steady; it can underperform the "true" 12 TF number for thousands of reasons. But it seems your point, as always, is to downplay the PS5 hardware against the untouchable king of sustained 12 TF performance. Once you accept that the differences between the two pieces of hardware aren't as big as you hoped, the discussions will be better for it.
No matter how many times you repeat it, the GPU does throttle. When and how often, I don't know. Tell me: if it is constant, then why not run it at a set speed like the Xbox?
 

assurdum

Banned
No matter how many times you repeat it, the GPU does throttle. When and how often, I don't know. Tell me: if it is constant, then why not run it at a set speed like the Xbox?
No matter how many times developers explain this, or games show it, some people will still persist with this urban legend. A couple of percentage points in frequency is not a couple of percentage points in TF. Many developers have already repeated, to the point of exhaustion, that the PS5 is effectively a 10 TF machine and doesn't fall below that, thanks precisely to the variable setup. You wouldn't see multiplatform games perform so closely if the PS5 dropped below it. I don't even understand how, after so many multiplatform comparisons, we are still stuck on this asinine assumption, as if there were any evidence for it.
 
Last edited:

Nowcry

Member
No matter how many times you repeat it, the GPU does throttle. When and how often, I don't know. Tell me: if it is constant, then why not run it at a set speed like the Xbox?
I don't want to get into it, but no PC GPU has had constant clocks for a long time - it would be a waste of power and logic. The same happens with processors: if you go into your motherboard's settings and look at the advanced overclocking section, you'll find the AVX offset option, whose function is to lower the clocks when AVX instructions are running, and I don't see anyone complaining about that on PC.

On the other hand, we can admit that the PS5 has variable frequency similar to what all PCs have, but we must also admit that the SX's asymmetric 14/12 CUs per shader array, together with the Xbox's split memory pool, is one of the biggest botches ever made, to the point where developers prefer developing on Switch over SX.

Precisely for this reason, the PS5 ultimately gets much better performance than the SX, because the I/O controller improves performance by keeping the CUs better utilized, and also because of the improvements that come from not having the memory split into pools.

As for UE5 on PC, it's true that the project can be run in editor mode, but nothing assures me that the same triangle precision is being used, or that the rasterizer in editor mode isn't taking simplifications; we don't know how the project would currently run on PS5, or how it would run on a PC with 8K textures. UE5 is very scalable, but running the project doesn't have the same fidelity as the shipped demo: it's heavier on the CPU and on multitasking, but graphically many simplifications are taken, since its main priority is to run fast.

Neither the first demo nor the second has demonstrated high fly-through speed, which is where the I/O controller stands out, and it's precisely those mechanics and the teleportation, along with using less memory as a pool, that really make the difference.

You are comparing a year-old PS5 demo with something current that isn't running in project mode and possibly has more simplifications than just the textures - for example the traversal speed, or a millisecond-level analysis to look for pop-in. You're seeing how well UE5 scales, since that's what it was designed for, but we can't see how it currently runs on PS5, where its performance could be much higher; nor have we seen things actually moving on PC, and most importantly we haven't seen any area with fast traversal speed.

There are still too many unknowns to clear up before we start writing off the PS5 hardware.
 
Last edited:

Corndog

Banned
No matter how many times developers explain this, or games show it, some people will still persist with this urban legend. A couple of percentage points in frequency is not a couple of percentage points in TF. Many developers have already repeated, to the point of exhaustion, that the PS5 is effectively a 10 TF machine and doesn't fall below that, thanks precisely to the variable setup. You wouldn't see multiplatform games perform so closely if the PS5 dropped below it. I don't even understand how, after so many multiplatform comparisons, we are still stuck on this asinine assumption, as if there were any evidence for it.
A couple percent of frequency is a couple percent of max clock - it's 1:1. And like I said, I don't know which titles will throttle what. But why include it if you're not going to use it?
 

PaintTinJr

Member
That's not true at all. Ampere (and Turing) GPUs have had mesh shading for years, and they recently received RTX I/O as well; it's just that none of it is being used at the moment, hence the GPUs are indeed brute-forcing the rendering for now and still outperforming the PS5 by quite a margin. So imagine what happens once that new tech starts being utilized, not to mention when DLSS steps in.
That's certainly a good point about the ~10% of scene data where Nanite isn't needed, or for real-time procedural geometry - which Nanite doesn't support - but meshlets are still going to need fill rate for a Z-buffer pass that Nanite doesn't need (because of its SDF rendering), and they're going to cost you orders of magnitude more performance, because without Nanite you can't use Lumen's SDF lighting and get that cheap rendering benefit instead of traditional lighting. And you will still incur all the geometric rendering costs on the GPU per meshlet instance you render - unlike Nanite. So you're still going to want Nanite/Lumen for the other 90% of your scene data, AFAIK.

It sounds like you haven't watched the Inside Nanite Twitch video, because over the 3 hours they do explain the reasons Nanite was a desirable solution, and why the traditional pipeline is reaching practical limits that keep it from scaling rendering to Nanite's level. Meshlets also add extra complexity to your solution, so their use has to be worth the effort/work they add, too, AFAIK. Nanite is as simple as flagging a UE5 static mesh as Nanite-enabled.
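To give an idea of how little that flag involves on the content side, this is roughly what it amounts to if you do it from editor code instead of the details panel (property and function names recalled from early UE5 builds, so treat it as a sketch; in practice you just tick "Enable Nanite" on the static mesh or at import time):

```cpp
// Editor-only sketch: programmatically flagging a static mesh as Nanite-enabled.
// Names recalled from early UE5 source and may have shifted between versions.
#if WITH_EDITOR
#include "Engine/StaticMesh.h"

void EnableNanite(UStaticMesh* Mesh)
{
    if (!Mesh)
    {
        return;
    }

    Mesh->NaniteSettings.bEnabled = true; // the same flag the Static Mesh editor toggles
    Mesh->Build(/*bSilent=*/true);        // rebuild so the Nanite cluster data is generated
    Mesh->MarkPackageDirty();
}
#endif
```

The expensive part (cluster building and the LOD hierarchy) happens offline in that build step; nothing extra is needed at runtime.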
 
Last edited:

Corndog

Banned
I don't want to get into it, but no PC GPU has had constant clocks for a long time - it would be a waste of power and logic. The same happens with processors: if you go into your motherboard's settings and look at the advanced overclocking section, you'll find the AVX offset option, whose function is to lower the clocks when AVX instructions are running, and I don't see anyone complaining about that on PC.

On the other hand, we can admit that the PS5 has variable frequency similar to what all PCs have, but we must also admit that the SX's asymmetric 14/12 CUs per shader array, together with the Xbox's split memory pool, is one of the biggest botches ever made, to the point where developers prefer developing on Switch over SX.

Precisely for this reason, the PS5 ultimately gets much better performance than the SX, because the I/O controller improves performance by keeping the CUs better utilized, and also because of the improvements that come from not having the memory split into pools.

As for UE5 on PC, it's true that the project can be run in editor mode, but nothing assures me that the same triangle precision is being used, or that the rasterizer in editor mode isn't taking simplifications; we don't know how the project would currently run on PS5, or how it would run on a PC with 8K textures. UE5 is very scalable, but running the project doesn't have the same fidelity as the shipped demo: it's heavier on the CPU and on multitasking, but graphically many simplifications are taken, since its main priority is to run fast.

Neither the first demo nor the second has demonstrated high fly-through speed, which is where the I/O controller stands out, and it's precisely those mechanics and the teleportation, along with using less memory as a pool, that really make the difference.

You are comparing a year-old PS5 demo with something current that isn't running in project mode and possibly has more simplifications than just the textures - for example the traversal speed, or a millisecond-level analysis to look for pop-in. You're seeing how well UE5 scales, since that's what it was designed for, but we can't see how it currently runs on PS5, where its performance could be much higher; nor have we seen things actually moving on PC, and most importantly we haven't seen any area with fast traversal speed.

There are still too many unknowns to clear up before we start writing off the PS5 hardware.
I have an RTX 3070. I am well aware you can vary clocks - that's because a PC isn't a closed box. One RTX card might even perform better than another. That's not what you want in the console space; people would be angry. PS5 performance is repeatable across all consoles, but it is indeed variable. In some titles either the CPU or the GPU will not be maxed out. That may not be a problem if you are never CPU-constrained - you just always drop the CPU clocks. It may be a problem if both are constrained at once.
 

MonarchJT

Banned
That's certainly a good point about the ~10% of scene data where Nanite isn't needed, or for real-time procedural geometry - which Nanite doesn't support - but meshlets are still going to need fill rate for a Z-buffer pass that Nanite doesn't need (because of its SDF rendering), and they're going to cost you orders of magnitude more performance, because without Nanite you can't use Lumen's SDF lighting and get that cheap rendering benefit instead of traditional lighting. And you will still incur all the geometric rendering costs on the GPU per meshlet instance you render - unlike Nanite. So you're still going to want Nanite/Lumen for the other 90% of your scene data, AFAIK.

It sounds like you haven't watched the Inside Nanite Twitch video, because over the 3 hours they do explain the reasons Nanite was a desirable solution, and why the traditional pipeline is reaching practical limits that keep it from scaling rendering to Nanite's level. Meshlets also add extra complexity to your solution, so their use has to be worth the effort/work they add, too, AFAIK. Nanite is as simple as flagging a UE5 static mesh as Nanite-enabled.
Not all games will use UE5... and btw, most of the console devs who are good with it or use it regularly are Microsoft first parties; on the contrary, very few of Sony's studios will use UE5... very, very few. Or does anyone here think Naughty Dog, Guerrilla, Santa Monica and Insomniac will abandon their engines? lol
 
Last edited:

Nowcry

Member
I have an RTX 3070. I am well aware you can vary clocks - that's because a PC isn't a closed box. One RTX card might even perform better than another. That's not what you want in the console space; people would be angry. PS5 performance is repeatable across all consoles, but it is indeed variable. In some titles either the CPU or the GPU will not be maxed out. That may not be a problem if you are never CPU-constrained - you just always drop the CPU clocks. It may be a problem if both are constrained at once.
As a general rule, the PS5 will keep both the CPU and the GPU at maximum clocks as long as the code doesn't include too many AVX instructions, which is the same as what happens on PC.

It's a very common mistake to think that clocks alone drive power consumption. That's true up to a point, but another variable is missing: the instructions themselves. Not all instructions consume the same power. In this case you can use all the power of the GPU and the CPU until you start issuing AVX instructions; if you do, you have to make sure they run for short periods of time, or interleave them with other low-power CPU work.

Heavy AVX instructions aren't very popular and have a bad reputation in games for data processing; as a general rule you don't need many instructions of this type, though you always have to use some. The lucky part is that they tend to be heavy during loading screens, when the GPU simply doesn't need the processing power.

That's why it's hard to actually end up in those situations.
 
Last edited:

PaintTinJr

Member
Not all games will use UE5... and btw, most of the console devs who are good with it or use it regularly are Microsoft first parties; on the contrary, very few of Sony's studios will use UE5... very, very few. Or does anyone here think Naughty Dog, Guerrilla, Santa Monica and Insomniac will abandon their engines? lol
Do you really think all those games aren't powered by UE technology - along with Sony's own platform-specific software? Pretty sure in the making-of on the original UC PS3 disc, the video shows it is UE. There were also full-page Ninja Theory vacancy adverts in Develop magazine back in the day - for Heavenly Sword - and they were all Unreal Engine jobs.

Look at the type of talks we've seen from Naughty Dog on PS4, like the GDC(?) one for UC4. If they aren't using UE, then the talks probably wouldn't work for the audience IMHO, because the audience couldn't easily integrate those techniques into their own UE projects on PS4.
All three console platform holders will almost certainly have special arrangements for UE and Unity where they don't need to show the "powered by UE" logo.

I'd also be shocked if Konami haven't been using UE for Metal Gear all these years, too. Decima looks like UE, or at least it certainly does in Death Stranding.

Just think about it for a second: why would Sony want to showcase UE5 on PS5 if they themselves aren't using that technology for their own most visually stunning games?

I wouldn't get too hung up on what engine you think people are using, because even if a first-party developer had their own bespoke engine, they could still integrate the features of UE or Unity that they want, for a fee, IMHO.
 

Lethal01

Member
Do you really think all those games aren't powered by UE technology - along with Sony's own platform-specific software? Pretty sure in the making-of on the original UC PS3 disc, the video shows it is UE. There were also full-page Ninja Theory vacancy adverts in Develop magazine back in the day - for Heavenly Sword - and they were all Unreal Engine jobs.

Look at the type of talks we've seen from Naughty Dog on PS4, like the GDC(?) one for UC4. If they aren't using UE, then the talks probably wouldn't work for the audience IMHO, because the audience couldn't easily integrate those techniques into their own UE projects on PS4.
All three console platform holders will almost certainly have special arrangements for UE and Unity where they don't need to show the "powered by UE" logo.

I'd also be shocked if Konami haven't been using UE for Metal Gear all these years, too. Decima looks like UE, or at least it certainly does in Death Stranding.

Just think about it for a second: why would Sony want to showcase UE5 on PS5 if they themselves aren't using that technology for their own most visually stunning games?

I wouldn't get too hung up on what engine you think people are using, because even if a first-party developer had their own bespoke engine, they could still integrate the features of UE or Unity that they want, for a fee, IMHO.

No, all these devs aren't secretly just using a reskinned Unreal Engine.
 