
[NXGamer] The Matrix Awakens: Tech-Demo Analysis - PS5 | SX | SS

I bet Insomniac can pull off this same graphical fidelity at 60 fps, thanks to the fact that they don't do a day/night cycle. So a lot of power would be saved by baked lighting instead of the dynamic lighting used here.

I'm expecting Spider-Man 2 to be this good looking graphically at 60fps.
 

SlimySnake

Flashless at the Golden Globes
I bet Insomniac can pull off this same graphical fidelity at 60 fps, thanks to the fact that they don't do a day/night cycle. So a lot of power would be saved by baked lighting instead of the dynamic lighting used here.

I'm expecting Spider-Man 2 to be this good looking graphically at 60fps.
The Sun is static in this game. The only time it changes is when you change it in debug mode.
 

SlimySnake

Flashless at the Golden Globes
It will be like 720p reconstructed to 1080p and everyone will try and insult the Series S again, "lol 720p in 2022 lololol", when the PS5 and Series X will be reconstructing from a little over 1080p… but you know… that's fine.
Series S has far more concessions than just rendering at 720p. They are listed in the OP of this very thread.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Series S has far more concessions than just rendering at 720p. They are listed in the OP of this very thread.

I think y'all are putting so much focus on what's missing from the Series S version and not enough focus on what the $299 machine with clearly less specs than the PS5/SX is doing.

Not everyone wants or needs the $500 console; a lot of people are more than happy with a lower-end console that guarantees them all of next gen's games.

We're focusing on the wrong thing here, and it's just weird to see users get angry/upset when outlets like DF or NX praise the Series S for getting so close to the stronger consoles with its weaker specs, instead of outright bashing it.
 
I bet Insomniac can pull off this same graphical fidelity at 60 fps, thanks to the fact that they don't do a day/night cycle. So a lot of power would be saved by baked lighting instead of the dynamic lighting used here.

I'm expecting Spider-Man 2 to be this good looking graphically at 60fps.

I would temper expectations a little. Spider-Man 2 is going to look amazing, but as detailed as this tech demo? I don't know about that. It would be insane if it does though. Can you imagine swinging through the city with that much detail?
 

DenchDeckard

Moderated wildly
Series S has far more concessions than just rendering at 720p. They are listed in the OP of this very thread.

Which is understandable. I don't go on the Internet and rip the piss out of people with a 1660 super gpu when I'm rocking a 3080.

The reality is they get a great experience at 1080p with cut back settings.

Still a great gpu
 

SlimySnake

Flashless at the Golden Globes
Which is understandable. I don't go on the Internet and rip the piss out of people with a 1660 super gpu when I'm rocking a 3080.
Is criticizing the hardware the same as bashing hardware owners?

When the reviewers trashed the 6600 XT, were they shitting on the eventual owners of the 6600 XT? The Series S now sells more than the XSX. Hell, there is a good chance it even outsold the PS5 in November, and if the trend continues it may well be the console where the majority of gamers get their next-gen experience. Surely, we should be allowed to at least discuss its performance shortcomings.







We're focusing on the wrong thing here, and it's just weird to see users get angry/upset when outlets like DF or NX praise the Series S for getting so close to the stronger consoles with its weaker specs, instead of outright bashing it.
Criticizing the performance of a console is now bashing it?

I don't understand this sentiment at all. This is the second time I'm addressing this in this thread. We are in a performance thread and the topic of the thread is being discussed. If people have become so attached to a console that any criticism of it is taken as a personal insult, then we all need to take a step back and reevaluate just how the fuck we got here.

I think y'all are putting so much focus on what's missing from the Series S version and not enough focus on what the $299 machine with clearly less specs than the PS5/SX is doing.


In the Guardians thread last month, we all spent pretty much the entire thread wondering just why the fuck the PS5 and XSX can't do 1080p 60 fps or 1440p 60 fps, instead of fawning over the native 4K 30 fps version. That's just what these performance threads are for. We discuss why games are not performing up to par or why the consoles are struggling to do things. When I was making the assertion that the PS5 might be memory bandwidth limited, leading to a poor 60 fps implementation, no one got their panties in a twist and asked that I look on the bright side.

  • Series S: 720p-756p (along with reduced world detail, reduced pedestrian/car count, less geometry and more aggressive LOD)

This is the topic of the thread. I replied to Dench pointing this out. If we are no longer allowed to discuss it, then let's ban all performance threads, because the moment a PS5 game underperforms relative to the XSS or XSX, PS fans are going to demand the same: thou shalt not criticize thy neighbor's console of choice.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Is criticizing the hardware the same as bashing hardware owners?

It's not easy to determine it here, lol.

There's more than enough of the "Where the Series S crew at ?" etc etc kind of replies here.

Criticizing the performance of a console is now bashing it?

I don't understand this sentiment at all. This is the second time I'm addressing this in this thread. We are in a performance thread and the topic of the thread is being discussed. If people have become so attached to a console that any criticism of it is taken as a personal insult, then we all need to take a step back and reevaluate just how the fuck we got here.

My reply wasn't to you specifically but in general, I've also posted something similar more than once.

There's more negativity towards the Series S for not matching the stronger consoles than there is academic discussion.

There's a line between pointing out performance deficiencies and using the fact that an obviously weaker console runs something not as well as the stronger ones as some kind of zinger against people who praise what the Series S can do given its specs and lower MSRP.
 

Iced Arcade

Member
I bet Insomniac can pull off this same graphical fidelity at 60 fps, thanks to the fact that they don't do a day/night cycle. So a lot of power would be saved by baked lighting instead of the dynamic lighting used here.

I'm expecting Spider-Man 2 to be this good looking graphically at 60fps.
can't wait to see the new boat people
 

SlimySnake

Flashless at the Golden Globes
No static lighting though, since you can move the sun around at will.
Yes, but that is only when the lighting is recalculated. Otherwise, the lighting is static while you are driving around, just like in Spider-Man, which does not switch between different times of day.

The point I am trying to make is that the lighting in Spider-Man 2 won't be any less demanding than the lighting in the Matrix demo just because they went with baked lighting. They are definitely not going to get a 2x performance gain by switching to baked GI. Games like Horizon, AC Unity, Uncharted and TLOU all use the same baked GI solution. Horizon switches between 6 different bakes throughout its 24-hour day/night cycle. The cost associated with realtime GI comes only when the sun decides to move. If the sun is static, there is no recalculation of the lighting model.
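To make the baked-GI point concrete, here is a toy sketch (purely illustrative; the names and numbers are made up, not any engine's real API) of what "switching between 6 bakes" means: with a fixed sun, the per-frame lighting work is just sampling whichever precomputed bake matches the current time of day, rather than re-solving GI at runtime.

Code:
# Toy illustration of time-of-day baked GI, as described above for Horizon.
# Everything here is hypothetical -- not Guerrilla's or Epic's actual code.

BAKE_TIMES = [0, 4, 8, 12, 16, 20]  # six precomputed lighting bakes across a 24h cycle

def active_bake(time_of_day: float) -> int:
    """With a static (or rarely stepping) sun, each frame just picks or blends
    the nearest precomputed bake -- no GI is re-solved at runtime."""
    return min(BAKE_TIMES, key=lambda t: abs(t - time_of_day))

print(active_bake(13.5))   # -> 12: midday bake, same lookup every frame while the sun is fixed
print(active_bake(19.0))   # -> 20: the evening bake only kicks in when the cycle advances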
 

DaGwaphics

Member
Yes, but that is only when the lighting is recalculated. Otherwise, the lighting is static while you are driving around, just like in Spider-Man, which does not switch between different times of day.

The point I am trying to make is that the lighting in Spider-Man 2 won't be any less demanding than the lighting in the Matrix demo just because they went with baked lighting. They are definitely not going to get a 2x performance gain by switching to baked GI. Games like Horizon, AC Unity, Uncharted and TLOU all use the same baked GI solution. Horizon switches between 6 different bakes throughout its 24-hour day/night cycle. The cost associated with realtime GI comes only when the sun decides to move. If the sun is static, there is no recalculation of the lighting model.

I see what you are saying now.
 

VFXVeteran

Banned
There is no doubt that these are the best graphics we have ever seen. If what you want is resolution instead of graphics, go and play Pong at 8K.



Game companies have hired talent from other industries for decades: animation, cinema, comics, novels, music and so on. Games have also used movie IPs since forever to get extra attention and sales (see E.T. the video game). There is nothing new about the movie and gaming industries working together on this project. Obviously they got references from the movies to replicate them, got the actors for the voice/motion capture, and I assume they agreed/negotiated/licensed the story/dialogue shown here, as happens with any licensed product.

This demo has been made by game developers: Epic (with support from other studios like The Coalition), and it uses Epic's tech (UE5 and all its features), which is what they are showcasing here.
UE5 was developed and being refined by people from both industries. My point is that there is no delineation between a graphics engineer in gaming and a graphics engineer in film. They both have skillsets that can be used interchangeably. Getting more graphics engineers from film will be a positive contributing factor to the tech evolution. Back in the day, game developers would never use film software engineers for their game development for fear that they didn't know enough to be successful.
 

ZywyPL

Banned
The PS5 is practically never flat on the frametime graph?

Yeah im not touching this one.

Just want to come in here and say TSR gets the job done.
Looking at the internal resolution when TSR is in play is pointless.
In motion, in game, there's next to no chance any console warriors would be able to tell the internal res is 1080p.
And you guys are doing yourselves a disservice thinking too much about how the internal res is 1080p with TSR bumping it to 1440p.

Pretty much all Unreal Engine 5 titles will use TSR to reach target res.
Games will be 1080p or 1440p with TSR bumping to 1440p or 1800p (4K if you're lucky).
It's pretty much free; UE4 used TAAU at a similar cost.
The final image is nigh indistinguishable from native for the vast, vast, vast majority of gamers.
And fuck it... we will get better visuals overall vs higher internal resolutions.

Native res is dead.

TSR gets rid of all the shimmering and jaggies, that's for sure, but c'mon, you can easily tell the final image is way below 4K. It's like watching a 1080p Blu-ray movie instead of a UHD one - sure, there are no image artifacts in either, but only the UHD one gives you that sharpness and all the tiny details.

Upscaling is obviously the direction games are taking, and the tech gets better and better each year, but a software solution used from as low as 1080p can only go so far. The engine obviously needs a lot of work, seeing how even with such upscaling the demo runs in the 20s most of the time.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
TSR gets rid of all the shimmering and jaggies, that's for sure, but c'mon, you can easily tell the final image is way below 4K. It's like watching a 1080p Blu-ray movie instead of a UHD one - sure, there are no image artifacts in either, but only the UHD one gives you that sharpness and all the tiny details.

Upscaling is obviously the direction games are taking, and the tech gets better and better each year, but a software solution used from as low as 1080p can only go so far. The engine obviously needs a lot of work, seeing how even with such upscaling the demo runs in the 20s most of the time.
I'm not saying 1080p will look like 4K (especially to nerds).
If you read my post, I stated that if TSR bumps 1080p to 1440p and you ask someone whether the image (in motion) is 1440p or 1080p, the majority of people wouldn't be able to tell it's 1080p vs a native 1440p.
Assuming the game's internal res is 1440p - 1800p and it bumps up to 4K, you would again find that most people won't be able to tell the game is at a lower resolution.

To summarize:

TSR 1080p -> 1440p vs native 1440p (nobody would really be able to tell which is which)
TSR 1440p -> 1800p vs native 1800p (again, hard to tell)
TSR 1800p -> 2160p vs native 2160p (most people wouldn't be able to tell)

Even today, for the vast majority of my IRL friends, who are basically a good representation of gaming at large (most gamers aren't on forums or even know what Digital Foundry is), 1440p might as well be 4K... even before upscaling the image.
The Matrix demo for them was as good as native 4K, even if at best it was reaching a TSR-bumped 1620p, and they've seen native 4K before.
When you are trying to impress the majority, NOT DF thread lurkers, getting a TSR bump close enough to 4K is more than enough.
Which is why this demo still impressed even with its "low" resolution.
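For anyone who wants rough numbers on those bumps, here is a quick back-of-the-envelope pixel-count comparison (Python, purely illustrative; TSR's actual cost and reconstruction quality obviously aren't captured by pixel ratios):

Code:
# Rough pixel-count math behind the TSR targets listed above.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "2160p": (3840, 2160),
}

def pixels(name):
    w, h = resolutions[name]
    return w * h

for internal, output in [("1080p", "1440p"), ("1440p", "1800p"), ("1800p", "2160p")]:
    ratio = pixels(output) / pixels(internal)
    print(f"TSR {internal} -> {output}: reconstructing ~{ratio:.2f}x the rendered pixels")

# A 1080p internal image vs native 2160p would be a 4.0x gap, which is why a
# TSR output in the 1440p-1620p range can still read as "close enough to 4K" on a TV.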
 
It's not easy to determine it here, lol.

There's more than enough of the "Where the Series S crew at ?" etc etc kind of replies here.



My reply wasn't to you specifically but in general, I've also posted something similar more than once.

There's more negativity towards the Series S for not matching the stronger consoles than there is academic discussion.

There's a line between pointing out performance deficiencies and using the fact that an obviously weaker console runs something not as well as the stronger ones as some kind of zinger against people who praise what the Series S can do given its specs and lower MSRP.
There is a bit of a glass-half-empty vs. glass-half-full argument here. For all the talk of how much 'weaker' the XSS is, it's quite remarkable that a $300 console is pushing these graphics. The top-end last-generation consoles can't run this demo AT ALL.

Seeing the difference between the versions also pushes back against the narrative that the lowest-specced device holds the higher-specced devices back, which was never true on PC. It's impressive to see this demo running on all the systems, the XSS even more so given how it is perceived by some. Next year's Stalker game will be an interesting test case for Unreal 5 in a complete game.
 

FrankWza

Member
There is a bit of a glass-half-empty vs. glass-half-full argument here. For all the talk of how much 'weaker' the XSS is, it's quite remarkable that a $300 console is pushing these graphics. The top-end last-generation consoles can't run this demo AT ALL.

Seeing the difference between the versions also pushes back against the narrative that the lowest-specced device holds the higher-specced devices back, which was never true on PC. It's impressive to see this demo running on all the systems, the XSS even more so given how it is perceived by some. Next year's Stalker game will be an interesting test case for Unreal 5 in a complete game.
Seinfeld Subway GIF by HULU
 

avin

Member
My take on this demo is it's reassuring how well the S runs it, and how modest the concessions were. Down the road, eventually, that won't be the case, and I'll pick up something else. But I imagine it'll be fine for at least a couple years, which is all I need it to do. It works out to $150 a year. Or, about two games.

avin
 

Tchu-Espresso

likes mayo on everthing and can't dance
definitely not happening. at 30? yea. def not 60. this demo is incredibly detailed
Insomniac are able to basically display the same fidelity at 60 albeit at a lower resolution (although this is hardly a big deal at a normal viewing distance).

I’ve tried to justify playing Ratchet and Clank at 40fps fidelity over performance RT but I see no reason to.
 

DenchDeckard

Moderated wildly
Is criticizing the hardware the same as bashing hardware owners?

When the reviewers trashed the 6600 XT, were they shitting on the eventual owners of the 6600 XT? The Series S now sells more than the XSX. Hell, there is a good chance it even outsold the PS5 in November, and if the trend continues it may well be the console where the majority of gamers get their next-gen experience. Surely, we should be allowed to at least discuss its performance shortcomings.








Criticizing the performance of a console is now bashing it?

I don't understand this sentiment at all. This is the second time I'm addressing this in this thread. We are in a performance thread and the topic of the thread is being discussed. If people have become so attached to a console that any criticism of it is taken as a personal insult, then we all need to take a step back and reevaluate just how the fuck we got here.




In the Guardians thread last month, we all spent pretty much the entire thread wondering just why the fuck the PS5 and XSX can't do 1080p 60 fps or 1440p 60 fps, instead of fawning over the native 4K 30 fps version. That's just what these performance threads are for. We discuss why games are not performing up to par or why the consoles are struggling to do things. When I was making the assertion that the PS5 might be memory bandwidth limited, leading to a poor 60 fps implementation, no one got their panties in a twist and asked that I look on the bright side.



This is the topic of the thread. I replied to Dench pointing this out. If we are no longer allowed to discuss it, then let's ban all performance threads, because the moment a PS5 game underperforms relative to the XSS or XSX, PS fans are going to demand the same: thou shalt not criticize thy neighbor's console of choice.


The 6600 XT is a shit card for its price. It's not terribly designed or anything.

The Series S is a brilliantly designed small system that is 250 quid and includes the same CPU, less memory, and a decent GPU and memory/SSD configuration for the price.

It doesn't deserve all the hate it gets, if you look at price vs performance IMO.
 

RoadHazard

Gold Member
Insomniac are able to basically display the same fidelity at 60 albeit at a lower resolution (although this is hardly a big deal at a normal viewing distance).

I’ve tried to justify playing Ratchet and Clank at 40fps fidelity over performance RT but I see no reason to.

The resolution difference is rather obvious in Ratchet IMO. The fidelity mode makes the cutscenes come DAMN close to CG movie quality. In performance mode stuff like hair rendering takes a very noticeable hit, with more pixel shimmering etc that kinda breaks that CG look.

But in gameplay it's far better at 60 of course
 

yewles1

Member
The resolution difference is rather obvious in Ratchet IMO. The fidelity mode makes the cutscenes come DAMN close to CG movie quality. In performance mode stuff like hair rendering takes a very noticeable hit, with more pixel shimmering etc that kinda breaks that CG look.

But in gameplay it's far better at 60 of course
Fidelity matches CG quality from the same period of time as the Matrix trilogy, such as Pixar's Monsters Inc.
 

SlimySnake

Flashless at the Golden Globes
The 6600 XT is a shit card for its price. It's not terribly designed or anything.

The Series S is a brilliantly designed small system that is 250 quid and includes the same CPU, less memory, and a decent GPU and memory/SSD configuration for the price.

It doesn't deserve all the hate it gets, if you look at price vs performance IMO.
When we start looking at price to performance, the Series S does not fare well, since it offers 1/4 of the performance at more than half the price. Pretty much every game that runs at native 4K on the XSX runs at 1080p on the Series S with lower quality settings. That's 1/4 of the resolution. Guardians, Halo Infinite, Forza Motorsport. Even on those rare occasions when we get 1440p vs 4K versions, the settings are dialed down.

If we are looking at the price-to-performance ratio, a console offering 1/4 of the performance should cost around 1/4 of the price, right? That would put the XSS at about $125 when compared against what the XSX is offering, and $100 when compared against the PS5.
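A minimal sketch of that arithmetic (the $499/$399 launch prices are real; treating the 4x resolution gap as a stand-in for overall performance is this argument's premise, not a measured benchmark):

Code:
# Back-of-the-envelope "price per performance" math from the argument above.
XSX_PRICE = 499
PS5_DE_PRICE = 399          # the disc-less PS5 Digital Edition gives the $100 figure
PERFORMANCE_RATIO = (1920 * 1080) / (3840 * 2160)  # = 1/4

print(f"Implied XSS price vs XSX:    ${XSX_PRICE * PERFORMANCE_RATIO:.0f}")    # ~$125
print(f"Implied XSS price vs PS5 DE: ${PS5_DE_PRICE * PERFORMANCE_RATIO:.0f}") # ~$100
print("Actual XSS price: $299")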
 

RoadHazard

Gold Member
When we start looking at price to performance, the Series S does not fare well, since it offers 1/4 of the performance at more than half the price. Pretty much every game that runs at native 4K on the XSX runs at 1080p on the Series S with lower quality settings. That's 1/4 of the resolution. Guardians, Halo Infinite, Forza Motorsport. Even on those rare occasions when we get 1440p vs 4K versions, the settings are dialed down.

If we are looking at the price-to-performance ratio, a console offering 1/4 of the performance should cost around 1/4 of the price, right? That would put the XSS at about $125 when compared against what the XSX is offering, and $100 when compared against the PS5.

If you're talking TF it's 1/3, not 1/4 (4 vs 12). But then it's also worse in other ways, while still having the same CPU, SSD, etc. So it's not as simple as saying "1/3 the GPU performance means it should be 1/3 the price". I do think it's a far worse value than the PS5 DE though.
 

SlimySnake

Flashless at the Golden Globes
If you're talking TF it's 1/3, not 1/4 (4 vs 12). But then it's also worse in other ways, while still having the same CPU, SSD, etc. So it's not as simple as saying "1/3 the GPU performance means it should be 1/3 the price". I do think it's a far worse value than the PS5 DE though.
That's another topic of discussion. The TFLOPS difference suggests it should give 1/3 of the performance, and yet how many times have we seen the XSS version top out at 1080p while the XSX version is native 4K? That's 1/4 of the performance, not 1/3. It's pretty much standard at this point. So if we go by price per performance, it's not up to par. If we go by TFLOPS per dollar, it looks even worse.

And again, this is before we even bring in the PS5, let alone the PS5 DE.
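For reference, the two ratios being argued over, side by side (4 and 12.15 TF are the paper GPU specs; the pixel ratio assumes 1080p vs native 4K):

Code:
# Paper TFLOPS ratio vs the typical shipped resolution ratio.
tf_ratio = 4.0 / 12.15                       # Series S vs Series X compute, on paper
pixel_ratio = (1920 * 1080) / (3840 * 2160)  # 1080p vs native 4K

print(f"TFLOPS ratio:     {tf_ratio:.2f} (~1/3)")
print(f"Resolution ratio: {pixel_ratio:.2f} (1/4)")
# The point above: shipped games tend to land nearer the 1/4 figure than the 1/3 one,
# usually with reduced settings on top.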
 

RoadHazard

Gold Member
That's another topic of discussion. The TFLOPS difference suggests it should give 1/3 of the performance, and yet how many times have we seen the XSS version top out at 1080p while the XSX version is native 4K? That's 1/4 of the performance, not 1/3. It's pretty much standard at this point. So if we go by price per performance, it's not up to par. If we go by TFLOPS per dollar, it looks even worse.

And again, this is before we even bring in the PS5, let alone the PS5 DE.

I think it also has a slower GPU clock? Which would make the fillrate worse etc.
 

SlimySnake

Flashless at the Golden Globes
I think it also has a slower GPU clock? Which would make the fillrate worse etc.
Yeah, it tops out at 1.55 GHz. It also has only 228 GB/s of memory bandwidth compared to the XSX's 560 GB/s. To me, the fact that it isn't even scaling down 1:1 when you reduce the resolution by 4x means there is something in the console that's holding back its true potential. Otherwise, there would be no need to adjust graphics settings and a simple resolution cut would do.

So right off the bat, the price-to-performance argument gets derailed.
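A quick look at how the raw specs scale versus that 4x resolution cut (using the figures quoted in this exchange; official numbers differ slightly):

Code:
# How the Series S's raw specs scale relative to Series X, vs the 4x pixel cut.
xss = {"tflops": 4.0, "bandwidth_gbs": 228, "pixels": 1920 * 1080}
xsx = {"tflops": 12.15, "bandwidth_gbs": 560, "pixels": 3840 * 2160}

for key in ("tflops", "bandwidth_gbs", "pixels"):
    print(f"{key:>13}: Series S has {xss[key] / xsx[key]:.0%} of Series X")

# Pixels come out at 25%, while compute (~33%) and bandwidth (~41%) don't drop as far,
# so in theory a straight 4x resolution cut should leave headroom -- which is why the
# extra settings reductions on top are the interesting part of the argument.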
 

DaGwaphics

Member
@SlimySnake, welcome to PC building 101. There is a certain base cost for a complete system. $300 for an 8-core Ryzen and an NVMe drive, plus the memory and GPU, is a good deal.

If you search around YT there are people playing 3rd party games on a Switch connected to their TV. Thus, there are users who don't care about playing at max settings as long as the games are feature complete and they are getting the full gameplay/story experience.

You are not one of those people, which is completely fine. People should be able to discuss anything they want, it is a discussion board after all. Expect people with opposing views to discuss right back with you though.
 
@SlimySnake, welcome to PC building 101. There is a certain base cost for a complete system. $300 for an 8-core Ryzen and an NVMe drive, plus the memory and GPU, is a good deal.

If you search around YT there are people playing 3rd party games on a Switch connected to their TV. Thus, there are users who don't care about playing at max settings as long as the games are feature complete and they are getting the full gameplay/story experience.

You are not one of those people, which is completely fine. People should be able to discuss anything they want, it is a discussion board after all. Expect people with opposing views to discuss right back with you though.
I'm not aware of a more powerful system that costs less or even the same. It's pretty obvious the XSS offers excellent performance for the price. Plenty of actual customers seem pretty satisfied.
 

DaGwaphics

Member
I'm not aware of a more powerful system that costs less or even the same. It's pretty obvious the XSS offers excellent performance for the price. Plenty of actual customers seem pretty satisfied.

Agreed. I've been nothing but pleased with mine. Seems like most of the people buying it are as well, at least what you see on Twitter and YT. I think people buying it know what it is they are getting.
 

PaintTinJr

Member
I think y'all are putting so much focus on what's missing from the Series S version and not enough focus on what the $299 machine with clearly less specs than the PS5/SX is doing.

Not everyone wants or needs the $500 console; a lot of people are more than happy with a lower-end console that guarantees them all of next gen's games.

We're focusing on the wrong thing here, and it's just weird to see users get angry/upset when outlets like DF or NX praise the Series S for getting so close to the stronger consoles with its weaker specs, instead of outright bashing it.
My take, from trying this demo on the PS5 and comparing it to the UE5 early access demo I've got on PC - and keeping in mind the visuals and the features The Coalition omitted across their Series X and S prototyping showcase, plus the fact that the UE5 early access demo wasn't done on Series S - is that the Series S has now had two outings alongside the X where neither demo stressed Nanite or used heavy kit-bashing with Nanite, as seen in this demo by the abundance of very large single primitives in triangle debug mode. That suggests the Series S is holding feature use back because it favours the hardware geometry pipelines, which are 2/3rds less efficient than Nanite (according to Epic). And oddly, even if this demo is using mesh shading in place of Nanite for those large-triangle parts - where the X should have a performance advantage, because it has more CUs - the X isn't performing as well as or better than the PS5 judging by DF's videos, which makes one wonder: why not? Is it cross-development with the Series S holding it back, or an actual real-world performance advantage of the PS5?

Visually the scenery looks good, but it looks closer to the cross-gen Spidey remaster/Miles Morales for visuals than to either the UE5 early access or the original 'Lumen in the Land of Nanite' demo IMHO.

It is a great start, but outside of the drone camera or the real-time rendered cinematics, the traditional geometry rendering breaks the next-gen illusion quite quickly IMO, like Miles Morales does, and I don't think the Series S is going to help third parties get above that illusion-breaking look on any of the systems. It costs most of what the true next-gen systems cost, but has less performance, and that performance doesn't scale down linearly with resolution. IMHO, having a mediocre-specced version of a next-gen machine on the market isn't at all good for pushing visuals forward in the games industry either.
 

Lethal01

Member
And cinematic sub 30fps
Is this how future games will run and look? Smh



I fucking hope that future games will look this good.

FUCK 60fps and 4K, give me graphics that are actually better, or stop going for anything similar to photorealism with your game.

Actually, scratch that, everyone should give up on photorealism either way, since you will never reach the heights of Guilty Gear Strive or Into the Spider-Verse like that.
 

Lethal01

Member
even with such upscaling the demo runs in the 20s most of the time.
I think this is untrue; it's very variable, but it's at 30fps more often than not.
Also, you can get it to a locked 30 by turning down the traffic percentage, which by default is clearly EXTREME overkill compared to current games.
 

yurinka

Member
UE5 was developed and being refined by people from both industries. My point is that there is no delineation between a graphics engineer in gaming and a graphics engineer in film. They both have skillsets that can be used interchangeably. Getting more graphics engineers from film will be a positive contributing factor to the tech evolution. Back in the day, game developers would never use film software engineers for their game development for fear that they didn't know enough to be successful.
UE5 is developed by Epic, like the previous UE versions. They make games and game engines, not movies, even if UE is now starting to be used for movies and commercials.

A graphics engineer in gaming has a pretty similar skillset to one from movies, but the work is pretty different, because running stuff in real time on a single PC or console - and even more so on pretty humble hardware - has nothing to do with a movie scene pre-rendered on a cloud farm.

If you're a VFX artist you should know that, because VFX in games and movies are pretty different: real time - especially during gameplay - is way more limited.
 

adamsapple

Or is it just one of Phil's balls in my throat?
And cinematic sub 30fps
Is this how future games will run and look? Smh
Come on, let's not be daft.

Locking the FPS to 24 during all the character scenes is an artistic choice. The 'game' runs at a pretty good 30 FPS during all the hectic cinematic scenarios and, as pointed out above, if you reduce traffic density you get a pretty well locked frame rate in the open-world segment too. The default setting is *WAY* more overpopulated than any game we have on the market right now. Over time this engine will get more improvements via optimizations as well. Games look and perform worse in a console's launch window than they do in later years in almost every console generation.

So much FUD, I swear ...
 

yewles1

Member
UE5 is developed by Epic, like the previous UE versions. They make games and game engines, not movies, even if UE is now starting to be used for movies and commercials.

A graphics engineer in gaming has a pretty similar skillset to one from movies, but the work is pretty different, because running stuff in real time on a single PC or console - and even more so on pretty humble hardware - has nothing to do with a movie scene pre-rendered on a cloud farm.

If you're a VFX artist you should know that, because VFX in games and movies are pretty different: real time - especially during gameplay - is way more limited.
Exactly. Finally reaching the point where a console can match, in real time, the fidelity of CG from as recently as 16-17 years ago is a tremendous feat.
 

Rykan

Member
I fucking hope that future games will look this good.

FUCK 60fps and 4K, give me graphics that are actually better, or stop going for anything similar to photorealism with your game.

Actually, scratch that, everyone should give up on photorealism either way, since you will never reach the heights of Guilty Gear Strive or Into the Spider-Verse like that.
Perhaps you should go watch a movie instead. Then you don't have to worry about things like "Framerate" or "Playability" and you can just gawk at pretty graphics.
 
In the Guardians thread last month, we all spent pretty much the entire thread wondering just why the fuck the PS5 and XSX can't do 1080p 60 fps or 1440p 60 fps, instead of fawning over the native 4K 30 fps version. That's just what these performance threads are for. We discuss why games are not performing up to par or why the consoles are struggling to do things. When I was making the assertion that the PS5 might be memory bandwidth limited, leading to a poor 60 fps implementation, no one got their panties in a twist and asked that I look on the bright side.
These people are praising Halo Infinite for its looks.


Honestly, I can't believe that in the same month we have a demo like this, where you can hardly tell that what is on screen is real-time graphics, only to have people complaining about performance "issues" in it... and we have people lauding Halo Infinite's visuals as if they were somehow "OK" for a AAA game released in this time period.

I would have preferred something in a forest, with intense weather effects (this is the kind of thing I am not sure how Nanite would handle; we have only seen it used on rocks and concrete)... but I am still very impressed by this demo.
 

Lethal01

Member
Perhaps you should go watch a movie instead. Then you don't have to worry about things like "Framerate" or "Playability" and you can just gawk at pretty graphics.
Perhaps you should go eat a dick.

Astral Chain is more fun and feels way better to play at 30fps than any game this year at 60fps.
Also, I got a PC that I can play any game at 60 on, I don't need consoles holding devs back by trying to get 4k60 out of them.
 

Lethal01

Member
This is the kind of thing I am not sure how Nanite would handle; we have only seen it used on rocks and concrete...

It would not handle it at all; they have been very clear that you can use the standard rendering pipeline for these things instead of Nanite, and it won't be any worse than it's been before. You can look up tons of UE4 videos of forests.
 

Rykan

Member
Perhaps you should go eat a dick.

Astral Chain is more fun and feels way better to play at 30fps than any game this year at 60fps.
Also, I got a PC that I can play any game at 60 on, I don't need consoles holding devs back by trying to get 4k60 out of them.
Yea you know what would make the game even better? Playing it at 60 FPS instead.

Also love the irony here.
"FuCk 60 FpS. Also Go EaT a DiCk I hAs A pC thAt CaN rUn 60 FpS". Fucking assclown.
 

Lethal01

Member
Yea you know what would make the game even better?

Better graphics? I agree.
The game already feels better to play than most 60fps games. Better to improve the visuals than waste power on framerate.

30 should be the default target on consoles unless you're making a first-person shooter. If people demand 60 they should go to PC rather than making devs gimp their games.
 

Rykan

Member
Better graphics? I agree.
The game already feels better to play than most 60fps games. Better to improve the visuals than waste power on framerate.
Better graphics are not a bigger improvement than better input delay and the improved clarity you get from needing less motion blur. Every comparable action game in the genre (DMC, Bayonetta, MG:R, Nier:A) that runs at 60 fps feels better to play than Astral Chain.
 

Lethal01

Member
Better graphics are not a bigger improvement than better input delay and the improved clarity you get from needing less motion blur

They are. The input delay and clarity at 30fps are more than enough to play at the top level in these games.
 
definitely not happening. at 30? yea. def not 60. this demo is incredibly detailed
But I believe what's really tanking the performance is Lumen, not Nanite. So it's not the detail that's hard on performance, but the lighting. If Insomniac do baked lighting then they can save a lot of performance.

Insomniac can make a world just as detailed, if not more so. The benefit of Nanite is that they don't have to create LODs. Insomniac will just have to do it the hard way.
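To make "doing it the hard way" concrete, here is a toy sketch of the traditional approach that Nanite automates: hand-authored LOD meshes swapped by distance (all names and thresholds are invented for illustration, not Insomniac's or Epic's code):

Code:
# Toy sketch of traditional, hand-authored LODs -- the authoring work Nanite removes.
from dataclasses import dataclass

@dataclass
class LOD:
    mesh_name: str
    triangle_count: int
    max_distance: float  # use this LOD while the camera is closer than this (meters)

building_lods = [
    LOD("building_LOD0", 2_000_000, 50.0),          # full-detail source mesh
    LOD("building_LOD1",   250_000, 200.0),         # artist-made or decimated
    LOD("building_LOD2",    20_000, 800.0),
    LOD("building_LOD3",     1_000, float("inf")),  # far-away imposter
]

def pick_lod(distance: float) -> LOD:
    """Classic discrete LOD selection. Nanite instead streams and refines clusters
    continuously from one full-detail mesh, so no LOD chain has to be authored."""
    for lod in building_lods:
        if distance < lod.max_distance:
            return lod
    return building_lods[-1]

print(pick_lod(35.0).mesh_name)    # building_LOD0
print(pick_lod(500.0).mesh_name)   # building_LOD2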
 
can't wait to see the new boat people
That's PS4. If Spider-Man 2 is PS5-only, then we can expect this kind of graphical leap.

Even without virtualized geometry (Nanite), Insomniac can stream extreme detail in and out because of the fast SSD. What Nanite does is save on development work, because they don't have to author LODs anymore.
 