
[NXGamer] The Matrix Awakens: Tech-Demo Analysis - PS5 | SX | SS

ethomaz

Banned
Really depends. Does Rockstar have a Nanite equivalent? You can get ray traced lighting and reflections in any engine, but Nanite is a beast and I highly doubt teams like Rockstar, Naughty Dog, and Ubisoft Montreal can implement something like this in their engines.

I think it's better they all ditch their engines and go with UE5. All these assets are available to every UE5 user. Imagine how quickly they can make cities out of this. This massive city took 60-100 artists just over a year to build.
They won’t ditch their engine.
They will create tech specifically for them in their own engine.

Nanite after all is a generic feature created in a generic engine… specialized engines just do what the developer wants better because, well, they are tailored specifically to their needs.

Big studios rely on their engines for the best results… something you can't reach with a generic engine like UE5, where you need to work within the limitations of UE5.
 
Last edited:

SSfox

Member
This is just a demo for the wow effect, and for Epic to promote their engine to devs. I'm more excited about real games that are using it, like Wukong or Hellblade. Of course those games aren't out, but we know those are real games, not just demos, and it's just the beginning. I would bet that 2023-and-beyond AAA games from devs like ND or Rockstar will destroy this demo graphics-wise. Or also FF7 Part 2 (which actually may use UE5, now that I think about it).

At the end of the day it is just a test demo. UE5 games when they come will be optimised.

This is not something to war over. Unfortunately some won't see it that way.
I usually don't agree with you, but this.


I think I said this in another thread, but I don't think other engines can do this. Nanite is in a league of its own and you will not get unlimited detail like this in other engines unless they create their own mesh or primitive shader based engines. Even then, Epic has resources and engineers that even Rockstar and Naughty Dog don't have access to. They are working with Hollywood CG houses on asset creation. Their CTO is literally the guy who invented bullet time in the first Matrix. They literally went out and bought the Megascans company, Quixel.

Rockstar and ND might have great lighting, great character models, hair and ray traced reflections, but Nanite is the game changer here. Ubisoft's Snowdrop engine looks stunning but it still looks like a game, whereas the Matrix demo, at least in the chase scene, looked real.

Oh yeah, I absolutely agree UE5 is very impressive, and that's been the case since that first demo with the flying girl. This new demo shows more in other areas, mostly urban environments, and it's actually playable by the public, unlike the first (which was playable but just not made available to the public like this one). IDK though if it surpasses the new-gen engines of ND or Rockstar, for example, which we actually haven't seen yet. Rockstar has been secretly all-in working on GTA6 for at least 3 years for sure, probably with a minimum staff of 2000 people or something like that. The result is probably gonna be insane and will put this Matrix demo, made by a few people (probably in less than a few weeks), in the trash. And probably not just them; a lot of devs will surpass it actually (most AAA devs). You can go back and watch demos of UE shown by Epic in the past; they usually always get surpassed by the middle of the new console generation.
 
Last edited:

Hendrick's

If only my penis was as big as my GamerScore!
No they're not. Frame pacing issues are frame pacing issues. You could have a 60fps game with frame pacing issues and a game that barely maintains 30fps without any frame pacing issues. Which is performing better?

Frame pacing is just the timing of frame delivery. You get a quick 16ms frame, then a longer 33ms-or-more one. Some people are more sensitive to it than others.
Frame time is absolutely related to performance. Not just framerate.
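To put rough numbers on the framerate-vs-pacing distinction, here's a minimal sketch (the frame times are invented purely for illustration, nothing here is measured from the demo):

```python
# Made-up frame times in milliseconds: average framerate and frame pacing
# are different things, and you can feel the difference.
even_30fps  = [33.3] * 8                     # steady 30fps, every frame 33.3ms
uneven_fast = [16.7, 16.7, 50.0, 16.7] * 2   # mostly quick frames with periodic spikes

def stats(frametimes):
    avg_fps = 1000 / (sum(frametimes) / len(frametimes))
    worst = max(frametimes)
    # frame-to-frame swing is what reads as judder/stutter
    jitter = max(abs(a - b) for a, b in zip(frametimes, frametimes[1:]))
    return round(avg_fps, 1), worst, round(jitter, 1)

print(stats(even_30fps))   # (30.0, 33.3, 0.0)  -> lower fps, perfectly paced
print(stats(uneven_fast))  # (40.0, 50.0, 33.3) -> higher average fps, visible judder
```

Same point as above: the second trace wins on average framerate, but most people would call the first one the smoother experience.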
 
Last edited:
Maybe... We'll see what they say. The framerate looks clearly better on PS5 in free roam sections though.

Or is this confirmation bias too (see link I gave above for the source video)?


Think you could post a comparison pic where they are both at least on the same block? Thanks.
 

DaGwaphics

Member
The Series S is an abomination. I've been saying this for some time.



It was clear to anyone thinking about this situation logically and without bias. That stupid machine is going to be a fucking albatross.

So you get a demo of this new engine that is perfectly playable on the XSS and this is what makes the machine an albatross? Okay.

It isn't high res, no. But this demo still looks very nice on the XSS and a 1080p display.
 

Panajev2001a

GAF's Pleasant Genius
So you don't know.
We know the frequency of the chip (~300 MHz lower) and thanks to the Hot Chips presentation on the XSX we know the number of ROPs matches what AMD mentions for RDNA2 chips (half the ROPs of the RDNA1 architecture but double the throughput, generally speaking).

uFZ6SiA.jpg
qBHgLJp.jpg


We have now seen the die shots of both SoCs (being generous, it is 2 shader engines and 2 shader arrays each… each shader engine has its own ROPs and rasteriser units, and the geometry engine is shared between shader engines… a single shader engine with more DCUs would mean even fewer resources):

XSX
p5qadpq.jpg


XSS
yUn1D7F.jpg


XSS looks like an XSX with only 1 Shader Engine and 2 Shader Arrays.

RDNA2 reference:
eM943Io.png


See all the units outside of the Dual Compute Units? They run at about 300 MHz less on XSS than XSX. If you want to label the XSS die you can see exactly whether those shared resources are lower in number on XSS than XSX (and it still does not change the frequency)… but saying they are the same seems generous, as with only 1 Shader Engine you have half the rasterisation and primitive setup units and half the ROPs.
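To put rough numbers on the fixed-function side (a back-of-the-envelope sketch: the clocks are the public figures, while the ROP counts are the commonly cited ones, so treat them as assumptions rather than confirmed specs):

```python
# Back-end throughput scales with (units) x (clock), not with TFLOPS.
xsx_rops, xsx_clock_ghz = 64, 1.825   # commonly cited ROP count, public clock
xss_rops, xss_clock_ghz = 32, 1.565

def pixel_fillrate(rops, clock_ghz):
    # Gpixels/s, assuming 1 pixel per ROP per clock (a simplification)
    return rops * clock_ghz

xsx_fill = pixel_fillrate(xsx_rops, xsx_clock_ghz)   # ~116.8 Gpix/s
xss_fill = pixel_fillrate(xss_rops, xss_clock_ghz)   # ~50.1 Gpix/s
print(round(xss_fill / xsx_fill, 2))                 # ~0.43: half the units AND the lower clock
```

So on this kind of metric the XSS lands at well under half of the XSX, which is the point about the shared units, separate from the TFLOPS number.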

You forgot to mention your split memory claim, where have you seen that the 2GB OS reserve can be used for games?
I am not saying it cannot be used, but the bandwidth is really low on XSS for the non-GPU-optimised portion (which is why I said GPU accessing the non-GPU-optimised RAM region, but you do read others' posts before replying, right?)… so… it affects performance if you do, which is why developers will try not to have to do that.
 

Riky

$MSFT
We know the frequency of the chip (~300 MHz lower) and thanks to the Hot Chips presentation on the XSX we know the number of ROPs matches what AMD mentions for RDNA2 chips (half the ROPs of the RDNA1 architecture but double the throughput, generally speaking).

uFZ6SiA.jpg
qBHgLJp.jpg


We have now seen the die shots of both SoCs (being generous, it is 2 shader engines and 2 shader arrays each… each shader engine has its own ROPs and rasteriser units, and the geometry engine is shared between shader engines… a single shader engine with more DCUs would mean even fewer resources):

XSX
p5qadpq.jpg


XSS
yUn1D7F.jpg


XSS looks like an XSX with only 1 Shader Engine and 2 Shader Arrays.

RDNA2 reference:
eM943Io.png


See all the units outside of the Dual Compute Units? They run at about 300 MHz less on XSS than XSX. If you want to label the XSS die you can see exactly whether those shared resources are lower in number on XSS than XSX (and it still does not change the frequency)… but saying they are the same seems generous, as with only 1 Shader Engine you have half the rasterisation and primitive setup units and half the ROPs.


I am not saying it cannot be used, but the bandwidth is really low on XSS for the non-GPU-optimised portion (which is why I said GPU accessing the non-GPU-optimised RAM region, but you do read others' posts before replying, right?)… so… it affects performance if you do, which is why developers will try not to have to do that.

"(and somewhat the impact of bandwidth of the non GPU optimised memory pool if you need to feed the GPU from it)."

You still haven't answered this despite me asking you twice, you just post something else.

There are 2GB reserved out of the 10GB for the OS; this runs at the lower bandwidth.
Once again, where have you seen that this can be used for games and so would affect performance?
 

DaGwaphics

Member
Thank you.

The stats for what was used in this demo are insane... like every other UE5 demo before....

For this one, for example, almost 10 million unique or duplicated assets were used. A plaque in the demo says 7 million instanced assets. Plaques have messages about what was done with the demo.


Each time a UE5 demo is shown, it's like nothing matters but resolution, frame rates, etc., without factoring in what's actually being done in the demo.

When I first tried it, I said the cars at a glance looked like they were from a movie. I later said I may have been exaggerating.

Then ppl started posting screenshots... that looked like a damn movie, lol.

Overall, I'm just glad it was available for both the PS5 and Series consoles at the same time. That this is even running on the consoles should not be overlooked, regardless of resolution.
True. Kudos to Epic for delivering a demo like this directly to players that we can interact with right on our consoles.

Much easier than trying to use the UE to create an exe file and all that. I didn't even have the RAM on my PC to run the engine properly the last time; I had to find someone who had compiled a low-spec version and made it available.
 

buenoblue

Member
Ok I take it back. Just replayed the chase sequence and it's phenomenal. Probably the best graphics I've seen. Resolution and framerate don't matter if the end result looks so good. I need a full Matrix game like this now!!!
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
They won’t ditch their engine.
They will create tech specifically for them in their own engine.

Nanite after all is a generic feature created in a generic engine… specialized engines just do what the developer wants better because, well, they are tailored specifically to their needs.

Big studios rely on their engines for the best results… something you can't reach with a generic engine like UE5, where you need to work within the limitations of UE5.
I do agree that, generally, studios that build their own engines have a better handle on their tech.

I still think middleware will always have a place in game development.
And if something comes up that just does the job better, you might find yourself changing engines while injecting your tools, techniques and technologies into it.

Remember that once you've licensed Unreal Engine you have source code access.
You can basically make it your own engine; with enough modifications, calling it "generic" Unreal Engine becomes an insult.

Just look at Housemarque and Hazelight: while they are Unreal Engine games, their branches of Unreal are quite different from what other developers are using due to all the plugins and edits done to the code.

So if a studio, even a big one, sees that Nanite is just that far ahead, they may switch to Unreal Engine while still plugging in the best features of their own engine.

Right now Nanite's handling of polygons is basically magic; I'm sure other engines will develop their own virtual shadow maps and polygon handling tech.
But it would be foolish not to take shortcuts where you can.
 

Lethal01

Member
this Matrix demo, made by a few people (probably in less than a few weeks), in the trash.

This was made over a year, I think with about a hundred people.

Also, Unreal is generic if you let it be generic; the latest Mortal Kombat runs on a super customized Unreal 3.
The Final Fantasy remake had its shader and lighting system totally redone, which is likely to give them a reason to just continue customizing their own version of the engine rather than switching to Unreal 5 (they could, though).
 
Last edited:

Redneckerz

Those long posts don't cover that red neck boy
I am so underwhelmed by all these comments expressing disappointment at the 1080p res and then doing clown-level trolling about how consoles were promised 4K.

This is an unoptimized, cutting-edge tech demo taking advantage of console hardware in Year 2.

I should be surprised, but I'm not. Always the same kind of people voicing these tired opinions.
 
Some parts of what a GPU does (for example fillrate) are highly dependent on clock speed, which is why the PS5 GPU is objectively better than the XSX at certain things even though it has lower floating point performance. The same applies here. The XSS has 1/3 of the floating point performance of the XSX, but it's also worse in other ways because of the lower clock speed.
Everything a GPU does is highly dependent on clock speed.

What Sony gained by upping the clock speed on a GPU that has fewer compute units than the Series X, but a similar number of other components... they closed some of the compute power gap and gained advantages in other performance metrics (fillrate, memory latency, what not).

If the PS5 had its GPU clocked at the same speed as the one in the X, it would be at a disadvantage (or at best equal) in all these other aspects of rendering games, on top of the lower compute performance.
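Rough numbers for that trade-off (a sketch using the public clocks and CU counts; the 64-ROP figure for both consoles is the commonly cited one, so treat it as an assumption):

```python
def tflops(cus, clock_ghz):
    # 64 ALUs per CU x 2 ops (FMA) = 128 FLOP per CU per clock
    return cus * 128 * clock_ghz / 1000

def pixel_fillrate(rops, clock_ghz):
    # Gpixels/s, assuming 1 pixel per ROP per clock
    return rops * clock_ghz

ps5_cus, ps5_clock, ps5_rops = 36, 2.23, 64
xsx_cus, xsx_clock, xsx_rops = 52, 1.825, 64

print(round(tflops(ps5_cus, ps5_clock), 2), round(tflops(xsx_cus, xsx_clock), 2))
# ~10.28 vs ~12.15 TFLOPS -> XSX ahead in raw compute
print(round(pixel_fillrate(ps5_rops, ps5_clock), 1), round(pixel_fillrate(xsx_rops, xsx_clock), 1))
# ~142.7 vs ~116.8 Gpix/s -> PS5 ahead in pixel fillrate, purely from the clock
```

Same compute formula, same fillrate formula; the clock is doing the work in the second comparison.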
 
Can't we go a single thread without people constantly being concerned about Series S performance? This demo clearly shows it is not going to impact your experience on the XSX or PS5 so get over it and move on.
But let's not act like 720p is fine in demanding games (as we have seen a few times now).
 

ethomaz

Banned
I do agree that, generally, studios that build their own engines have a better handle on their tech.

I still think middleware will always have a place in game development.
And if something comes up that just does the job better, you might find yourself changing engines while injecting your tools, techniques and technologies into it.

Remember that once you've licensed Unreal Engine you have source code access.
You can basically make it your own engine; with enough modifications, calling it "generic" Unreal Engine becomes an insult.

Just look at Housemarque and Hazelight: while they are Unreal Engine games, their branches of Unreal are quite different from what other developers are using due to all the plugins and edits done to the code.

So if a studio, even a big one, sees that Nanite is just that far ahead, they may switch to Unreal Engine while still plugging in the best features of their own engine.

Right now Nanite's handling of polygons is basically magic; I'm sure other engines will develop their own virtual shadow maps and polygon handling tech.
But it would be foolish not to take shortcuts where you can.
Middleware engines have a big place in the game industry… that is why Unreal Engine is so successful.

It allows those with lower budgets or limited design needs to cut development time (aka costs) to bring their games to the public.

But that is not what every studio needs… that is why I tried to use the term "big studios".

Don't you think that studios like Rockstar, Naughty Dog, Guerrilla Games, 343i, Capcom, or several other big ones tested and tried Unreal Engine? So why do they still choose to work in an in-house engine?

Well, because Unreal Engine could not fit their needs due to limitations tied to the generic nature of the engine… some can choose to change what Unreal Engine can do, but even though that allows a more specific implementation, it is still tied to the core of the engine (it is not possible to change it at the core, or it is too hard/costly to do so).

So in-house engines have a goal that generic engines like Unreal will never fulfill… to give birth to the specific vision of the developer… in-house engines are tailored to what the developer needs, and the end result will be, in most cases if not all, way better than what you can do with Unreal Engine.

There is a downside too… in-house engines are made to fit specific needs, so they are not engines that can be used for every project, and they are far more limited outside the strict needs they were created for.

That said, I believe Rockstar will never use anything other than RAGE for their open world games (GTA, RDR, or possible future titles)… the same can be said for Naughty Dog with their TPS games.

Edit - Just to add, not all in-house engines can achieve their planned goals either… a classic case was Square Enix's engine, which had several issues and in the end didn't fulfill the goal they wanted for the Final Fantasy franchise, to the point that they lost a lot of money developing it only to throw it in the trash after a single use… they started to use far more 3rd-party engines in their games after that (Unreal Engine, for example).
 
Last edited:

elliot5

Member
Can't we go a single thread without people constantly being concerned about Series S performance? This demo clearly shows it is not going to impact your experience on the XSX or PS5 so get over it and move on.
It's easy to be cynical about it, but one way to look at it is: if memory optimizations are needed for Series S, those good practices trickle up to the bigger machines and everyone benefits.
 

Panajev2001a

GAF's Pleasant Genius
"(and somewhat the impact of bandwidth of the non GPU optimised memory pool if you need to feed the GPU from it)."

You still haven't answered this despite me asking you twice, you just post something else.

There are 2GB reserved out of the 10GB for the OS; this runs at the lower bandwidth.
Once again, where have you seen that this can be used for games and so would affect performance?

On XSX the non-GPU-optimised memory pool can be accessed by both GPU and CPU (https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs), so why would the XSS be different… happy to hear your proof (it is a bit rich when you accuse others of avoiding answering while staying on topic :)… I take it you accept the discussion on shared resources and lower clocks at least).
"Memory performance is asymmetrical - it's not something we could have done with the PC," explains Andrew Goossen. "10 gigabytes of physical memory [runs at] 560GB/s. We call this GPU optimal memory. Six gigabytes [runs at] 336GB/s. We call this standard memory. GPU optimal and standard offer identical performance for CPU, audio and file IO. The only hardware component that sees a difference is the GPU."
 
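As a very rough illustration of why developers would avoid feeding the GPU from the slower pool (a simplified time-weighted model built on those public XSX figures; it ignores CPU contention and real access patterns, so it's a sketch, not a claim about actual behaviour):

```python
# XSX public figures: 10GB @ 560GB/s ("GPU optimal") and 6GB @ 336GB/s ("standard").
FAST, SLOW = 560.0, 336.0   # GB/s

def blended_bandwidth(slow_fraction):
    # Each byte fetched from the slow pool takes 560/336 ~ 1.67x longer,
    # so the blend is a harmonic (time-weighted) mean, not a simple average.
    return 1.0 / ((1.0 - slow_fraction) / FAST + slow_fraction / SLOW)

for f in (0.0, 0.1, 0.25, 0.5):
    print(f, round(blended_bandwidth(f), 1))
# 0.0 -> 560.0, 0.1 -> 525.0, 0.25 -> 480.0, 0.5 -> 420.0 GB/s
```

Whether Series S games can or ever do touch the slower region at all is exactly what's being argued in this thread; the sketch only shows why you'd want to avoid it if you could.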
Last edited:

Riky

$MSFT
On XSX the non-GPU-optimised memory pool can be accessed by both GPU and CPU (https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs), so why would the XSS be different… happy to hear your proof (it is a bit rich when you accuse others of avoiding answering while staying on topic :)).

Because that's not the XSS; the XSX has extra RAM that isn't reserved for the OS but is not optimised. The XSS does not, and that's why it would be different.
You have once again created a fake concern that somehow on XSS developers would be using the reserved RAM at the lower bandwidth to feed the GPU, when there is no proof this is possible.
 

ethomaz

Banned
Because that's not the XSS; the XSX has extra RAM that isn't reserved for the OS but is not optimised. The XSS does not, and that's why it would be different.
You have once again created a fake concern that somehow on XSS developers would be using the reserved RAM at the lower bandwidth to feed the GPU, when there is no proof this is possible.
So you are saying PS5 has more memory available to games than Series X? Because on PS5 you have around 13GB of memory available to games.

If you are not touching the slow part of the memory in Series X, then you are limited to 10GB.

On Series S things are better because the slow part is basically for the OS… it probably has a bit less than 8GB available to games, all of it in the full-speed part.

The split-bandwidth setup is more of a concern on Series X than on Series S… the Series S concerns are other things.
 
Last edited:

RoadHazard

Gold Member
Everything a GPU does is highly dependent on clock speed.

What Sony gained by upping the clock speed on a GPU that has fewer compute units than the Series X, but a similar number of other components... they closed some of the compute power gap and gained advantages in other performance metrics (fillrate, memory latency, what not).

If the PS5 had its GPU clocked at the same speed as the one in the X, it would be at a disadvantage (or at best equal) in all these other aspects of rendering games, on top of the lower compute performance.

Yes, of course everything is affected by clock speed; I was just trying to explain why you can't just look at TF numbers. If you had two GPUs with the same raw compute performance, but one was clocked higher with fewer compute units and the other was clocked lower with more compute units, the former would perform better overall.
 

Panajev2001a

GAF's Pleasant Genius
Because that's not the XSS; the XSX has extra RAM that isn't reserved for the OS but is not optimised. The XSS does not, and that's why it would be different.
You have once again created a fake concern that somehow on XSS developers would be using the reserved RAM at the lower bandwidth to feed the GPU, when there is no proof this is possible.

Fake concern, ok so I guess you have proof of the OS partition being the same size… but sure great point you are driving at here ;).

Lovely how you are ignoring everything else… again, but selective reading plus accusations seem to be the recipe once again.
 

Riky

$MSFT
Fake concern, ok so I guess you have proof of the OS partition being the same size… but sure great point you are driving at here ;).

Lovely how you are ignoring everything else… again, but selective reading plus accusations seem to be the recipe once again.

Since they run the same OS you would suspect they would be a similar size; do you have any proof they are not?
I'm not ignoring anything, I want you to substantiate your claim that developers can access the reserved lower-bandwidth RAM, and you haven't managed to do that, unsurprisingly.
 

Zathalus

Member
So you are saying PS5 has more memory available to games than Series X? Because on PS5 you have around 13GB of memory available to games.

If you are not touching the slow part of the memory in Series X then you are limited to 10GB.
No, the XSX has 16GB total with 13.5GB available for games. 10GB fast, 3.5GB slow and 2.5GB slow reserved by the OS.

The XSS has 10GB total with 8GB available for games. 8GB fast for games and the 2GB slow reserved for the OS.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Middleware engines have a big place in the game industry… that is why Unreal Engine is so successful.

It allows those with lower budgets or limited design needs to cut development time (aka costs) to bring their games to the public.

But that is not what every studio needs… that is why I tried to use the term "big studios".

Don't you think that studios like Rockstar, Naughty Dog, Guerrilla Games, 343i, Capcom, or several other big ones tested and tried Unreal Engine? So why do they still choose to work in an in-house engine?

Well, because Unreal Engine could not fit their needs due to limitations tied to the generic nature of the engine… some can choose to change what Unreal Engine can do, but even though that allows a more specific implementation, it is still tied to the core of the engine (it is not possible to change it at the core, or it is too hard/costly to do so).

So in-house engines have a goal that generic engines like Unreal will never fulfill… to give birth to the specific vision of the developer… in-house engines are tailored to what the developer needs, and the end result will be, in most cases if not all, way better than what you can do with Unreal Engine.

There is a downside too… in-house engines are made to fit specific needs, so they are not engines that can be used for every project, and they are far more limited outside the strict needs they were created for.

That said, I believe Rockstar will never use anything other than RAGE for their open world games (GTA, RDR, or possible future titles)… the same can be said for Naughty Dog with their TPS games.

Edit - Just to add, not all in-house engines can achieve their planned goals either… a classic case was Square Enix's engine, which had several issues and in the end didn't fulfill the goal they wanted for the Final Fantasy franchise, to the point that they lost a lot of money developing it only to throw it in the trash after a single use… they started to use far more 3rd-party engines in their games after that (Unreal Engine, for example).

The "main" selling point of developing an engine in house is being able to iterate on it as needed.
The actual engineers of said engine are in house so if you need to do XYZ with your next game you can ask the engineers to implement that feature.

As I said, I don't disagree.
I'm sure for any number of studios, big and small, at some point reworking your engine might be more trouble than simply getting a "generic" engine and working in what you need.

Do I think ND or Rockstar will shift engines.... probably not.
But remember even RAGE is actually a reworking and advancement of the Angel Game Engine to suit what Rockstar have seemingly shifted to.

It's not inconceivable that a big developer would look at Unreal Engine 5 and think they could probably work their systems into this engine and get where they need to be sooner rather than later.
The amount of research and engineering Epic put into Nanite is insane... I'm sure other engines will catch up soon, but some developers might just think the workflow and performance advantages that Nanite brings are too sweet to skip, and we'll see them move over to the engine while still carrying over a bunch of their own engine's systems.
 
There are 2GB reserved out of the 10GB for the OS; this runs at the lower bandwidth.
Once again, where have you seen that this can be used for games and so would affect performance?
I don't think that using it directly is the issue. From my understanding, people who point to the split-speed memory pool as a problem for the Series consoles do so on the assumption that whenever the CPU accesses it while the game runs, it reduces the speed of all memory to the slower of the two pools for a time.

I can't tell if this is valid or not.
 

SlimySnake

Flashless at the Golden Globes
Can't we go a single thread without people constantly being concerned about Series S performance? This demo clearly shows it is not going to impact your experience on the XSX or PS5 so get over it and move on.
lol We can't talk about performance in a performance thread now?

What's next? Should we ban all discussion of the PS5 in games where it performs worse than the XSX?
 

Panajev2001a

GAF's Pleasant Genius
Since they run the same OS you would suspect they would be a similar size; do you have any proof they are not?
I'm not ignoring anything, I want you to substantiate your claim that developers can access the reserved lower-bandwidth RAM, and you haven't managed to do that, unsurprisingly.
Don’t you say they save resources by targeting a 1080p UI? Same thing in terms of real-time video recording and other buffers scaling down? You are avoiding anything as you are attaching to the only thing you think you want / can argue, but fair… keep at it :).

(we know they are not the same as the OS partition is 2.5 GB on XSX…)
 

JackMcGunns

Member
Great vid. So Series S has more drawbacks than just resolution.

You’re ignoring the fact that it looks amazing, massive levels above anything that could be achieved even on the refresh consoles of last gen, a true next gen machine.

But…

Even more important and the point all along…

Series S did not affect PS5/XSX results as is the bottom line when arguing on whether it should exist or not. Don’t like it? don’t buy it, but it won’t affect you, so why the constant bitching?

Is it an ego thing? can’t admit you were wrong about Series S? hilarious 😂
 

Kataploom

Gold Member
$300 device doing things last generation consoles can't. It is damn impressive and I can't find any sources saying it would only have resolution drawbacks.


Name one hardware feature the XSS lacks that the XSX has. People swear they are making a revelation by pointing out the XSS runs games at lowered settings. This seems to be a perfect case of scaling, especially since you can't run this game on last gen consoles at all. It's up to developers to use the features they are offered.
XSS performs better than XSX and PS5, and shit seems basically the same except for resolution and maybe something that is not obvious to everyone. Now the narrative has changed from "will hold the generation back LMAO" to "LOD is slightly lower than XSX LMAO". They don't want to "win" an argument, they're just trolling because they can't accept the machine is more than good enough for the entire gen at 1080p.

There was a user saying "I thought it could do everything XSX does but at 1080p"... as if XSX wasn't already running this thing at 1080p. This is just trolling. XSS actually seems to be better at 1080p than the other two at 4K, since resource streaming generates fewer constraints; it was basically the same settings (literally the same textures) with just a lower resolution, and performance is still better than on PS5/XSX.

But what can we expect from people who say "1080p in 2021 🤪" as if their houses were the whole world.

I'm waiting for this to come to PC btw; I want to know if it's gonna work on 10-series GPUs, or how well it will run. I'm getting an S anyway at this point, I don't think I can change my GPU anytime soon, and 4K TVs are not worth it right now if I want all the good features, unless I have tons of money to burn on a TV bigger than the one I want.
 

Riky

$MSFT
Don’t you say they save resources by targeting a 1080p UI? Same thing in terms of real-time video recording and other buffers scaling down? You are avoiding anything as you are attaching to the only thing you think you want / can argue, but fair… keep at it :).

(we know they are not the same as the OS partition is 2.5 GB on XSX…)

You're the one who was making the claim, obviously with no evidence; like I said, it's a fake concern that doesn't exist.
The 2GB of slow RAM on Series S is reserved and can't be used by developers for the GPU, which is what you said and I quoted. Everything else you've said is irrelevant.
 

Riky

$MSFT
I don't think that using it directly is the issue. From my understanding, people who point to the split-speed memory pool as a problem for the Series consoles do so on the assumption that whenever the CPU accesses it while the game runs, it reduces the speed of all memory to the slower of the two pools for a time.

I can't tell if this is valid or not.

I haven't seen or read anything that states this and don't see why it would be the case. On Series S the 2GB is totally reserved for the OS, so I don't see how it would be true.
 

rofif

Can’t Git Gud
You’re ignoring the fact that it looks amazing, massive levels above anything that could be achieved even on the refresh consoles of last gen, a true next gen machine.

But…

Even more important and the point all along…

Series S did not affect PS5/XSX results as is the bottom line when arguing on whether it should exist or not. Don’t like it? don’t buy it, but it won’t affect you, so why the constant bitching?

Is it an ego thing? can’t admit you were wrong about Series S? hilarious 😂
You got all that from my post? Wow.
I wasn't thinking about any of that lol. I am just saying IT IS toned down a bit. Not what I think about it. It looks good, obviously.
 

azertydu91

Hard to Kill
So do you guys think this will lead to a Matrix game? Because their previous tech demo didn't, but this one would be great to play.

Yes, I am still salty about the Samaritan not becoming a game.
 
So you get a demo of this new engine that is perfectly playable on the XSS and this is what makes the machine an albatross? Okay.

It isn't high res, no. But this demo still looks very nice on the XSS and a 1080p display.

Hi! It's not just the resolution though. Actually, I would agree that the resolution is the least of the problems!

Instead it's the asset reduction and the less populated map, in terms of dynamic entities populating the streets in this case. These are things which could start to impact game design, level design, creativity and how the gameplay evolves. You can't lose a police tail in a crowd when there are 2 people on the whole street.

So, these are the things I'm referring to. Anyways, have a nice day!
 

DaGwaphics

Member
Hi! It's not just the resolution though. Actually, I would agree that the resolution is the least of the problems!

Instead it's the asset reduction and the less populated map, in terms of dynamic entities populating the streets in this case. These are things which could start to impact game design, level design, creativity and how the gameplay evolves. You can't lose a police tail in a crowd when there are 2 people on the whole street.

So, these are the things I'm referring to. Anyways, have a nice day!

There's still a lot of traffic, both car and pedestrian, in the XSS demo, so much so that I couldn't keep a car pristine if my life depended on it. :messenger_tears_of_joy:

Certainly, I guess some scenario could come along in the future, who knows. Since the systems have the same CPU, if a gameplay mechanic required huge crowds I think they would just lower the quality of the models to make that happen as needed. Dead Rising had large hordes on 360, so did L4D; they will find a way.
 

Hoddi

Member
There's still a lot of traffic, both car and pedestrian, in the XSS demo, so much so that I couldn't keep a car pristine if my life depended on it. :messenger_tears_of_joy:

Certainly, I guess some scenario could come along in the future, who knows. Since the systems have the same CPU, if a gameplay mechanic required huge crowds I think they would just lower the quality of the models to make that happen as needed. Dead Rising had large hordes on 360, so did L4D; they will find a way.
It's easy enough to compare them. There's not exactly a massive difference even though it's there.

XSS

PS5
 

Lysandros

Member
There are not just shader ALUs (hence the TFLOPS count), but you ignored this argument comparing XSX and PS5, so I kind of thought you would say the same here… geometry engine, triangle rasteriser, ROPs, etc… there is a whole bunch of logic that is shared between XSX and XSS (the same at best; not sure if you have the same number of ROPs and other units, for example)… so the lower clock speed brings the performance of those parts down further.
Indeed, and XSS has only half the number of rasterisers and prim units to begin with, being a 1-SE design. Geometry throughput is 2x lower even before we add the frequency difference to the picture.
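Putting rough numbers on the geometry point (illustrative only: 1 primitive per shader engine per clock is an assumption here, and the clocks are the public figures):

```python
xsx_setup = 2 * 1.825   # 2 shader engines @ 1.825 GHz -> ~3.65 Gtris/s
xss_setup = 1 * 1.565   # 1 shader engine  @ 1.565 GHz -> ~1.57 Gtris/s
print(round(xsx_setup / xss_setup, 2))   # ~2.33x: the 2x unit count plus the clock gap
```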
 

JackMcGunns

Member
You got all that from my post? Wow.
I wasn't thinking about any of that lol. I am just saying IT IS toned down a bit. Not what I think about it. It looks good, obviously.

It was a general response towards Series S haters, no offense.

Overall, I guess this demo quashes all the fears that UE5 needed the super fast SSD of the PS5 in order to achieve exactly what this demo is doing. Amazing display, but the speed of the SX/S SSDs did just fine to pull it off. Series S scaled down detail as expected, but it's an outstanding job nonetheless.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Enjoy your last gen looking games then?

I want photo real worlds and am fine with HD with reconstruction.

Edit: I'm fine with 30 or 40 FPS for some games.

Indeed.

An oddly high number of users here are disappointed just because the demo is running at 'only' 1440p, or lower than 1080p in Series S's case.

I think we've seen enough recent examples, and this demo itself, to understand that targeting native pixels is a fool's errand and a waste of resources when reconstruction techniques are getting so good. UE5's built-in super resolution is excellent.

And of course this is just a demo; not every UE5 game will need/use this kind of density or photo-realism. Getting 60 FPS will not be an issue if a developer wants it.
 

Kagey K

Banned
Like I said, I saw this on another board and I don't know where it's from. If it's fake or whatever, it's not my fault.

fT1lDYs.jpg


J2oLhbQ.jpg


s57C7Su.jpg
You can literally watch this happen in Fortnite.

If you unlock the monster truck glider, it's all really good until about 200 ft, then it blurs in and out as it tries to figure out what you are looking at.
 