
Digital Foundry: Heard that Xbox Series S Is A "Pain" For Developers Due To Memory Issues

Status
Not open for further replies.

Allandor

Member
Even Metro. They don't care if they hit 512p. They shipped the game in that state without giving two shits about the user experience. This will not hold back the Series X? Several games have already skipped 60 fps or ray-tracing modes. Guardians is literally missing 90% of its foliage despite the 1/4 resolution downgrade.

So what is the problem here?
Yes, there are cutbacks, but the game runs well enough to be played, and that is the whole point of the Series S. The image still looks close enough to recognize it as the same game.
If you don't accept that games get cut back until they run, don't buy a Series S. Most people just don't care; they just want to play the newest stuff.


But that's what I'm saying, Slimy. Sizable effort went into downporting Matrix to the S.

Not many devs will go through that kind of time & effort; instead they'll build for the S and scale up with essentially slider bumps - this is the holdback. By virtue of time, money & business, the target platform becomes the lowest common denominator. The worst aspect of this gen.
This won't hold back anything, as the "sliders" will be in the game whether you want them or not - even for PS5-only games - because developers may have to adjust quality when the game is almost ready to ship to make it playable. That's how game development works: you must always have an option to scale something, else something might not work out as you expected and you can't react to the situation. Sure, having less memory will "force" some developers to cut back more than strictly needed (because they just want it running well enough, and that's OK), but it won't hold back anything.
Yes, a solution like Metro falling back to 512p with RT enabled is not really nice looking, but the game runs well enough to be played.
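The "sliders" being described can be sketched in a few lines. Everything here is hypothetical (the preset values, the 16.6 ms target, the assumed 15% saving per resolution step); the point is only that scaling down is a tuning pass over shared content, not a separate build.

```python
# Hypothetical per-platform presets: the same game content, scaled by
# "sliders" rather than rebuilt per console (all values invented).
PRESETS = {
    "series_x": {"resolution": (3840, 2160), "texture_pool_mb": 6144, "rt": True},
    "ps5":      {"resolution": (3840, 2160), "texture_pool_mb": 6144, "rt": True},
    "series_s": {"resolution": (2560, 1440), "texture_pool_mb": 2560, "rt": False},
}

def fit_to_budget(preset, measured_ms, target_ms=16.6):
    """Late-in-development tuning: while the frame misses target, walk the
    resolution slider down. The 15% saving per step is an assumed, purely
    illustrative GPU-bound scaling factor."""
    p = dict(preset)
    while measured_ms > target_ms:
        w, h = p["resolution"]
        p["resolution"] = (int(w * 0.9), int(h * 0.9))
        measured_ms *= 0.85
    return p

tuned = fit_to_budget(PRESETS["series_s"], measured_ms=22.0)
print(tuned["resolution"])
```

In this toy run a frame that misses 60 fps gets its render resolution walked down a couple of notches; the same loop, pointed at a bigger preset, is how the "sliders" end up shipping on every platform.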
 
Last edited:

azertydu91

Hard to Kill
So what is the problem here?
Yes, there are cutbacks, but the game runs well enough to be played, and that is the whole point of the Series S. The image still looks close enough to recognize it as the same game.
If you don't accept that games get cut back until they run, don't buy a Series S. Most people just don't care; they just want to play the newest stuff.
I understand and I agree but where else would they talk about the Series S shortcomings except in a thread where devs complain about Series S shortcomings?
 

Md Ray

Member
Why focus on ray tracing and not the 120fps mode which WAS in the game? For a fast-paced action game, I'll let you decide which was more important. Seems more like goalpost shifting again. 'Little Beast' redeemed.

Devs will always make the final call about graphical effects, like how ray-traced reflections were omitted from Cyberpunk 2077 even on PS5 and XSX. Strangely, no one seemed to shed a tear.
sam winchester yawn GIF


Why don't you take the fight to the id Software guy, man? Why is it so hard for you to accept and move on from the fact that the Series S has memory limitations (a pain for devs) in ways the Series X doesn't? The only person moving goalposts here is YOU.
 
Last edited:
sam winchester yawn GIF


Why don't you take the fight to the id Software guy, man? Why is it so hard for you to accept and move on from the fact that the Series S has memory limitations (a pain for devs) in ways the Series X doesn't? The only person moving goalposts here is YOU.
The reason why this thread is 26 pages long is that some people argue, without evidence, that Series S is going to hold back the generation. Nobody is arguing about Series S having less memory than Series X or PS5.
 

Three

Member
The reason why this thread is 26 pages long is that some people argue, without evidence, that Series S is going to hold back the generation. Nobody is arguing about Series S having less memory than Series X or PS5.
That's not even what the thread is about but what evidence can they provide you that you're willing to accept?
 

arvfab

Banned
The reason why this thread is 26 pages long is that some people argue, without evidence, that Series S is going to hold back the generation. Nobody is arguing about Series S having less memory than Series X or PS5.

Well, no evidence can be provided for the other side either, because nobody but the devs will know which compromises had to be made to get a game running on Series S.
 
There has been evidence from the id software dev
Nope, Doom Eternal runs on last gen consoles too.
Game dev is about time and cost. If you think that doesn't have an influence then it's not worth discussing it.
By that logic, any version beside the version you're playing is holding the game back, because time and money went into porting those other versions. Makes no sense.
 

arvfab

Banned
By that logic, any version beside the version you're playing is holding the game back, because time and money went into porting those other versions. Makes no sense.

Well, exactly this makes sense, though. Why do you think people are against PC releases of PlayStation games?

MS PR people in this thread keep telling us how SFS etc. aren't being used yet. Maybe the reason is that devs need to accommodate hardware which doesn't support SFS, so they don't bother implementing it.
 
Exactly, and that is the reason why people usually see exclusives as the games with better quality.
Maybe, but I don't really agree with your premise. Sony has released two of their four best looking PS4 games on PC, and there was zero console exclusive secret sauce involved in the development of those games, by the looks of it.
 

Shmunter

Member
So what is the problem here?
Yes, there are cutbacks but the game runs well enough it can be played and that is the reason for Series S. Still the image looks "samey" enough to recognize that it is the same game.
If you don't accept that games get cut back until they run, don't buy a Series S. Most people just don't care, they just want to play the newest stuff.



This won't hold back anything, as the "sliders" will be in the game if you want it or not. Even for PS5 only games. As the developers might have to adjust the quality when the game is almost ready to ship to make the game playable. This is how game development works, you must always have an option to scale something else something might just not work out as you have expected and you can't react to the situation. Sure that there is less memory will "force" some developers to cut back more than really needed (because they just want it to get running well enough and that is ok) but it won't hold back anything.
Yes, solutions like in Metro with RT to go fall back to 512p is not really nice looking, but the game runs well enough to be played.
Kmart suit in whatever size vs Tailor made suit just for you. Mmm, French kissing a chef!
 

arvfab

Banned
Maybe, but I don't really agree with your premise. Sony has released two of their four best looking PS4 games on PC, and there was zero console exclusive secret sauce involved in the development of those games, by the looks of it.

They were also ported a lot later, and we don't know how much "pain" it was to port the engine to PC. If an engine is tailored to specific hardware, it is surely better at squeezing as much out of that hardware as possible.
 

Shmunter

Member
They were also ported a lot later, and we don't know how much "pain" it was to port the engine to PC. If an engine is tailored to specific hardware, it is surely better at squeezing as much out of that hardware as possible.
There was pain. Streaming issues were documented at some point. Also, we're talking about last-gen games designed around an abacus of a CPU.

I'd like to see them running on a PC with whatever equivalent compute power DF used at the beginning of the gen.
 
Last edited:
There was pain. Streaming issues were documented at some point. Also, we're talking about last-gen games designed around an abacus of a CPU.

I'd like to see them running on a PC with whatever equivalent compute power DF used at the beginning of the gen.
Yeah, that would be an interesting benchmark. I saw one with a Radeon 7850, and the game ran slightly worse than on a PS4, but this was before the optimization patches.
 

yamaci17

Member
Well, exactly this makes sense, though. Why do you think people are against PC releases of PlayStation games?

MS PR people in this thread keep telling us how SFS etc. aren't being used yet. Maybe the reason is that devs need to accommodate hardware which doesn't support SFS, so they don't bother implementing it.

the newest Nvidia GPUs that do not support SFS are the GTX 1000 and 1660 series (2016-2018). since 2018, all nvidia gpus can run sampler feedback (rtx 2000s, 3000s). rdna 2 gpus are also capable of SFS. rdna1/rdna2/gcn representation in PC gaming is very small and weak, so they're irrelevant to this discussion

in 2-3 years, it is possible that some games designed around SFS will ignore the gtx 1000 series. they will be 8 years old at that point, so I don't think anybody would complain. 8 years is literally a console cycle's worth of time.

pascal and older architectures will absolutely be abandoned. true next-gen games should require 12.2 at some point, but i'd say we are 2-3 years shy of that yet. the problem is that nvidia cannot pump out reliable, cheap midrange options for players. we talk all the time about how there are tons of 1060 computers, but there was a reason for that: it was dirt cheap, performed amazingly for its price, and all gtx 700-900 users jumped ship to 1060s. in 2017, you could hardly find anyone still rocking a gtx 700 card. by 2018, even maxwell gpus had been abandoned by pc gamers.

the 3060 could've been a worthy successor, but it is being sold at scalper prices. understandably, people still cling to their trusty 1060s and 1070s, because they still perform all right in most new titles.

if the 60-70 lineup can get rid of scalper pricing, gamers will flock to those GPUs and all of a sudden pascal representation will be a minority, just like kepler's was. and at that point, devs can start ignoring dx12.1 in the pc space
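For reference, the feature-level cutoff being described can be sketched as a simple filter. The levels below are approximate, from public DirectX documentation (Kepler tops out at 11_0, Pascal and the GTX 16 series at 12_1, while RTX 2000/3000 and RDNA 2 support 12_2, i.e. DirectX 12 Ultimate, which includes sampler feedback); the game requirement is hypothetical.

```python
# Approximate maximum Direct3D feature level per GPU architecture
# (from public DirectX documentation; illustrative, not exhaustive).
FEATURE_LEVEL = {
    "kepler (GTX 600/700)": (11, 0),
    "maxwell (GTX 900)": (12, 1),
    "pascal (GTX 1000)": (12, 1),
    "turing (GTX 1600)": (12, 1),
    "turing (RTX 2000)": (12, 2),
    "ampere (RTX 3000)": (12, 2),
    "rdna 1 (RX 5000)": (12, 1),
    "rdna 2 (RX 6000)": (12, 2),
}

def meets_min_spec(min_level):
    """Architectures that survive a raised feature-level floor."""
    return sorted(arch for arch, level in FEATURE_LEVEL.items() if level >= min_level)

# A hypothetical game requiring 12_2 (DX12 Ultimate: sampler feedback,
# mesh shaders, DXR 1.1, VRS tier 2) drops Pascal and everything older:
print(meets_min_spec((12, 2)))
```

Raising the floor from 12_1 to 12_2 is exactly the "ignore dx12.1" move described above: it keeps Turing RTX and newer plus RDNA 2, and drops every GTX card.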
 
Last edited:

arvfab

Banned
the newest Nvidia GPUs that do not support SFS are the GTX 1000 and 1660 series (2016-2018). since 2018, all nvidia gpus can run sampler feedback (rtx 2000s, 3000s). rdna 2 gpus are also capable of SFS. rdna1/rdna2/gcn representation in PC gaming is very small and weak

in 2-3 years, it is possible that some games designed around SFS will ignore the gtx 1000 series. they will be 8 years old at that point, so I don't think anybody would complain. 8 years is literally a console cycle's worth of time.

pascal and older architectures will absolutely be abandoned. true next-gen games should require 12.2 at some point, but i'd say we are 2-3 years shy of that yet. the problem is that nvidia cannot pump out reliable, cheap midrange options for players. we talk all the time about how there are tons of 1060 computers, but there was a reason for that: it was dirt cheap, performed amazingly for its price, and all gtx 700-900 users jumped ship to 1060s. in 2017, you could hardly find anyone still rocking a gtx 700 card. by 2018, even maxwell gpus had been abandoned by pc gamers.

the 3060 could've been a worthy successor, but it is being sold at scalper prices. understandably, people still cling to their trusty 1060s and 1070s, because they still perform all right in most new titles.

if the 60-70 series can get rid of scalper pricing, gamers will flock to those GPUs and all of a sudden pascal representation will be a minority, just like kepler's was. and at that point, devs can start ignoring dx12.1 in the pc space

So basically you say that in a few years the min specs on PC might rise? Yet people in this thread wanted to make me believe that greedy devs would want to support ancient hardware until the end of time...
 
So basically you say that in a few years the min specs on PC might rise? Yet people in this thread wanted to make me believe that greedy devs would want to support ancient hardware until the end of time...

The RTX 3050 mobile, Steam Deck, RX 6500 XT etc. are all recent releases and are selling a shitload. All are below Series S performance.

PC min specs will stay behind the Series S for a long time.
 

yamaci17

Member
So basically you say that in a few years the min specs on PC might rise? Yet people in this thread wanted to make me believe that greedy devs would want to support ancient hardware until the end of time...

if those people believe that SFS/VRS and the like will be an integral part of developing games further down the line, then yes, the min spec will have to change. notice how some ports have already kept Kepler off their support list because the architecture doesn't support dx 12.0. we're talking about serious implications here: supposedly "SFS" can act as a multiplier on the amount of VRAM budget you have (i don't fully believe it will work that efficiently in actual scenarios, btw. i will believe it when i see it). so non-SFS GPUs would simply not have enough memory for games completely designed around the so-called SFS, and therefore they would be left out.

supposedly, SFS can increase the effective VRAM budget by 2x or 3x or something like that (again, i don't believe that will apply to actual gaming scenarios). if a dev builds a game entirely around that multiplier they have been promised, their game would simply not scale back, for reasons I won't discuss further.

but what if SFS is used in a way that only provides a funny 100-500 mb memory reduction? in that case it will be a meme :messenger_grinning: i'm sure certain people will now post the so-called "demos", but demos are demos. they exist to show a technology at peak success. VRS was providing huge gains in some "demos". it ended up boosting a measly 5-15%. so much fuss for so little gain.
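The skepticism above is easy to frame as arithmetic: streaming only the sampled tiles saves memory in proportion to how much of each texture's mip chain actually stays cold, so the headline "multiplier" is just the inverse of the resident fraction. A toy model (all numbers illustrative, not from any real title or from Microsoft's figures):

```python
def texture_memory_mb(num_textures, resident_fraction=1.0):
    """Memory for a set of 4K textures at 1 byte/texel (BC7-like), with the
    full mip chain costing ~4/3 of the top mip. `resident_fraction` is the
    share of tiles actually kept in memory; 1.0 models classic full loading.
    All numbers are illustrative."""
    top_mip_mb = 4096 * 4096 / (1024 * 1024)  # 16 MB per texture's top mip
    full_chain_mb = top_mip_mb * 4 / 3
    return num_textures * full_chain_mb * resident_fraction

full = texture_memory_mb(300)                         # everything resident
sfs = texture_memory_mb(300, resident_fraction=0.4)   # only sampled tiles
print(round(full), round(sfs), round(full / sfs, 2))
```

A 40% residency gives a ~2.5x "effective memory" multiplier, which is where figures of that shape come from; whether real scenes ever sample that little of their texture set is exactly what the post questions, and a conservative residency collapses the saving toward the "funny 100-500 mb" case.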
 
Last edited:

yamaci17

Member
here's a relevant example of how "tech demos" can amplify the effect a feature has:



see? VRS here adds a whopping 70% total performance uplift to this demo "specifically" tuned to show off VRS's power. did it apply to actual gaming scenarios? no. will it? i don't think so. you can design demos and tech samples in a way that lets VRS work at peak efficiency. but in real scenarios, even with tier 2 VRS, it provides a 10-15% perf boost in gears tactics and the like.

so SFS demos where it reduces memory usage by 2-3x are simply no indication that it will work exactly like that in actual gaming scenarios.

btw it is also documented that most engines already use similar tactics to reduce their memory usage.
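The demo-versus-game gap described above falls out of Amdahl-style arithmetic: VRS only cuts pixel-shading cost, and only over the screen area it covers. A sketch, assuming coarse 2×2 shading costs a quarter of full rate (both scenario inputs below are invented to match the post's figures, not measured):

```python
def vrs_speedup(frac_pixel_shading, coverage, coarse_cost=0.25):
    """Frame-level speedup from VRS under a simple Amdahl-style model:
    only `frac_pixel_shading` of frame time is pixel-shading work, only
    `coverage` of the screen runs at the coarse rate, and 2x2 coarse
    shading costs `coarse_cost` of full-rate shading."""
    saved = frac_pixel_shading * coverage * (1 - coarse_cost)
    return 1 / (1 - saved)

# A tech demo built so that almost everything is pixel-shading cost,
# coarsely shaded over most of the screen:
demo = vrs_speedup(frac_pixel_shading=0.75, coverage=0.75)
# A real game, where shading is a fraction of the frame and VRS is applied
# conservatively to avoid visible artifacts:
game = vrs_speedup(frac_pixel_shading=0.35, coverage=0.4)

print(f"demo: +{(demo - 1) * 100:.0f}%, game: +{(game - 1) * 100:.0f}%")
```

With demo-like inputs the model lands near the +70% headline; with game-like inputs it lands in the 10-15% band reported for Gears Tactics, without any change to the feature itself.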
 
Last edited:

Three

Member
Nope, Doom Eternal runs on last gen consoles too.

By that logic, any version beside the version you're playing is holding the game back, because time and money went into porting those other versions. Makes no sense.
What has Doom Eternal got to do with this? I'm talking about the id engine dev directly telling you this regarding the Series S:

Also "it always scaled on PC" is nonsense. Every AAA game in the past decade or so has their assets made once so they run on min spec. Increasing sample counts a bit here and there for high settings isn't what you could truly have done with more power. Min spec matters.
It's direct evidence from a dev.
By that logic, any version beside the version you're playing is holding the game back, because time and money went into porting those other versions. Makes no sense.
Not sure what you're trying to say here, but I suspect you're suggesting that time and money spent on other platforms is time and money not spent on your single platform. That's not what I'm saying. I'm saying any dev worth their salt aims for as close an experience as possible within the scope of the project (all their target platforms) and takes all those platforms' capabilities into account when creating their game, to minimise the time and money spent doing it.

Let me give you an example where you might be able to take off your Series S blinkers. During the PS3 gen, the simple fact that its memory was split (it wasn't even less) gave devs a lot of trouble porting their games to it (Bethesda especially). Simply saying "so what, scale the graphics down" didn't work so well, because not everything scales and not everything in memory is VRAM-related. So what most devs ended up doing to avoid the added time and cost of porting to PS3 was make the game for PS3, with its memory constraints, and then port to 360.

It's a similar situation here but much less memory. If the memory constraints are giving devs problems with things that don't scale so easily that is costing them time and money. The solution to this is making a game for whatever your min spec is then increasing resolution/sample counts for the better machines just as the id software dev says.

I've said in this thread that I believe this is also why raytracing has been dropped by some first party MS devs. It didn't scale in their engines on the Series S and having an entirely different workflow for Series X and Series S would certainly not be worth the effort.
Any new games will take into account whatever the Series S is capable of and make development across all the platforms as easy as possible.
Dropping raytracing in those games due to Series S is speculation that MS will never ever confirm but the id software dev quote is not speculation at all. It's evidence clearly saying what you don't want to accept. Min spec matters and influences what you do.
 
Last edited:

arvfab

Banned
here's a relevant example of how "tech demos" can amplify the effect a feature has:



see? VRS here adds a whopping 70% total performance uplift to this demo "specifically" tuned to show off VRS's power. did it apply to actual gaming scenarios? no. will it? i don't think so. you can design demos and tech samples in a way that lets VRS work at peak efficiency. but in real scenarios, even with tier 2 VRS, it provides a 10-15% perf boost in gears tactics and the like.

so SFS demos where it reduces memory usage by 2-3x are simply no indication that it will work exactly like that in actual gaming scenarios.

btw it is also documented that most engines already use similar tactics to reduce their memory usage.


Please stop, you are crushing a lot of dreams right now
 
What has Doom Eternal got to do with this? I'm talking about the id engine dev directly telling you this regarding the Series S:


It's direct evidence from a dev.
Multiplatform game assets have always been made to scale with a wide range of PCs. That's why Doom Eternal, despite being built for the PS4 as a "minimum spec", runs on the Switch.
Let me give you an example where you might be able to turn off your Series S blinkers. During the PS3 gen the simple fact that the memory was split (wasn't even less) gave devs a lot of trouble porting their games to it (Bethesda especially). Simply saying 'so what scale graphics down' didn't work so well because not everything scales and not everything in memory is VRAM related. So what most devs kept saying to avoid the added time and cost of porting to PS3 is make the game for PS3 with its memory constraints then port to 360.
Wasn't Cell the main reason the PS3 struggled? This (and last) gen, all consoles are essentially the same.
It's a similar situation here but much less memory. If the memory constraints are giving devs problems with things that don't scale so easily that is costing them time and money. The solution to this is making a game for whatever your min spec is then increasing resolution/sample counts for the better machines just as the id software dev says.
That's not what has happened so far. Otherwise id Software would've just skipped the raytracing implementation altogether.
I've said in this thread that I believe this is also why raytracing has been dropped by some first party MS devs. It didn't scale in their engines on the Series S and having an entirely different workflow for Series X and Series S would certainly not be worth the effort.
I remember the discussion, I still think you're wrong.
Any new games will take into account whatever the Series S is capable of and make development across all the platforms as easy as possible.
Dropping raytracing in those games due to Series S is speculation that MS will never ever confirm but the id software dev quote is not speculation at all. It's evidence clearly saying what you don't want to accept. Min spec matters and influences what you do.
We'll see if id software will skip implementing raytracing in their next game just because the XSS can't handle it. I really doubt it.
 

People are still going on about SFS? :messenger_grinning_smiling:

Just going by the lack of interest in it, and of developer talk about the feature, you would imagine it's already been superseded.

The Coalition would have been all over that for the Matrix teaser, instead it was described as a 'gargantuan effort' 🤷‍♂️
Like I said, you guys are using this quote wrong and just have no idea how context works. Pretty sure that's why threads like this run to 30 pages.
 

DaGwaphics

Member
The RTX 3050 mobile, Steam Deck, RX 6500 XT etc. are all recent releases and are selling a shitload. All are below Series S performance.

PC min specs will stay behind the Series S for a long time.

Even if the XSS did fall to the very back of the line in performance as time went on, would that be so different from normal? The X1 was likely the baseline for quite some time, and plenty of impressive games got released in that time.

All of it seems like much ado about nothing; if you want to move up in performance, get an XSX. Games are just as fun on the XSS so far, IMO.
 
Even if the XSS did fall to the very back of the line in performance as time went on, would that be so different from normal? The X1 was likely the baseline for quite some time, and plenty of impressive games got released in that time.

All of it seems like much ado about nothing; if you want to move up in performance, get an XSX. Games are just as fun on the XSS so far, IMO.

True, I am playing the Xbox One S versions of some games on my Series S. They still look pretty (Dishonored 2, Deus Ex MD, etc.).
 

FrankWza

Member
Like I said, you guys are using this quote wrong and just have no idea how context works. Pretty sure that's why threads like this run to 30 pages.
Why doesn't a quote ever mean what it says when it comes to the Series S?
We have the systems engineer saying it will deliver the same experience as Series X at a lower resolution. We have press and interviews saying the same thing. And we have a first-hand account from developers who shared development of the Matrix demo, saying the effort to get the S version going necessitated an extra studio to get it running at a lower, scaled-back resolution with scaled-back features. Are all of these people being misquoted? Why are they all open to interpretation when they seem to be pretty clear in the point they are making?
 
The quote means exactly what it says. You just don't understand how context works, like I said. You can't just take the last 5 words of a paragraph and ignore the rest to make your own quote. You need the context of the entire paragraph. English is a trash language, and lots of words have multiple meanings depending on context. For example, did you know that "effort" can, and in this case does, mean result? So once again, you are using this quote wrong.
 

FrankWza

Member
The quote means exactly what it says. You just don't understand how context works, like I said. You can't just take the last 5 words of a paragraph and ignore the rest to make your own quote. You need the context of the entire paragraph. English is a trash language, and lots of words have multiple meanings depending on context. For example, did you know that "effort" can, and in this case does, mean result? So once again, you are using this quote wrong.
This makes zero sense. The effort IS the context. The "pain" in the title of this thread IS the context. Written by the same guy. Not his opinion, but what he's been told and is relaying.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
no, my 3070 performs just like his at 1440p, and i proved it just a couple of posts back. i don't know why you keep bringing this up.

a) the locations are different
b) the game has a dynamic performance profile that ranges between 80 and 130 frames depending on the GPU load. the videos you've posted have tons of light-GPU-load sections, which i couldn't care less about. the video i posted is a small, specific sample from a very GPU-bound section that has nothing to do with the intro mission.

you're no different from someone comparing the series s's higher resolution to the ps5's lowest resolution and claiming they perform similarly.
I was just trying to help. I saw your GPU sit around 60-80 fps at 1080p, and other benchmarks showed the same performance at 1440p, so I figured I'd let you know. Didn't mean anything else by it; just an FYI or PSA of sorts. Didn't mean to make you feel harassed.
 

Three

Member
Multiplatform game assets have always been made to scale with a wide range of PCs.
it's not worth discussing, to be frank. you don't accept the dev's comment, and you bring up scaling on PC, which he directly addresses already.

That's why Doom Eternal, despite being built for the PS4 as a "minimum spec", runs on the Switch
You're bringing up Doom Eternal again. For Sony platforms PS4 is the min spec, but not for the game: for one, an Xbox One version exists, and the Switch version was within the project's scope all along, planned to release alongside PS4/XB1. The Switch version was just delayed more than the others, but Doom Eternal had all the consoles in mind from the beginning. Switch was the min spec for the game and they had planned a release there. There was a cost in time and money, and they paid it: another studio and a 9-month delay later, they had that version. Not all devs will do that. Whether it was a good call I don't know, because I don't know Doom's Switch sales, but they got bought out later that year.

Games on Xbox, though, are different, since unlike the Switch these things are not optional. Devs can't stagger the Series X and Series S releases, and they can't ignore the Series S the way a lot of big game devs do the Switch. So if they run into trouble with it, of course they will complain about the pain it's giving them, and it will influence XSX development, because it's tied to it by policy. It's easier and cheaper to make sure you don't have these issues to begin with: make a game the Series S would have no problem running, then increase sample counts here and there for high settings. That's the way the id dev is telling people it usually works. They take min spec into account, and it matters.

Wasn't Cell the main reason PS3 was struggling? This (and last) gen, all console are essentially the same.
Partly Cell complexity, but one of the worst offenders of the PS3-gen memory situation was this game and engine:


Now some devs are saying there is a memory constraint on a console they need to support that has far greater differences in specs (memory bandwidth and amount especially), and people are brushing it off as lazy devs who need to call MS.

Why do that instead of accepting that the low specs make some things difficult or impossible to support, and that this may influence development on XSX? That doesn't mean a lot of devs don't love the Series S. For one, the next-gen console install base would be a lot lower without it.
 
Last edited:

yamaci17

Member
I was just trying to help. I saw your GPU sit around 60-80 fps at 1080p, and other benchmarks showed the same performance at 1440p, so I figured I'd let you know. Didn't mean anything else by it; just an FYI or PSA of sorts. Didn't mean to make you feel harassed.
no, i'm highly knowledgeable about this stuff and i know what i'm doing. i just don't like people making wrong assumptions or drawing wrong conclusions from the data i present, that's all, and i feel obliged to explain where necessary

see the video below. it compares native 1440p performance to 4k + dlss balanced (4k + dlss balanced has an input resolution of 1252p: 2160 × 0.58 ≈ 1252)



can you see that the "1252p" input renders 59 frames while native 1440p renders 74 frames? does that seem logical to you? probably not. but it is what it is. that is the DLSS tax: DLSS is itself taxing for the GPU... see the chernobylite section as well: the "1252p" section renders 88 frames and native 1440p renders 109 fps. 4k + dlss balanced, despite having an input resolution of 1252p, is actually 25% harder to render than ACTUAL native 1440p.

4k + dlss performance mode is almost as taxing as native 1440p, or maybe more. or better yet, here is actual data:

the sooner you understand the dlss tax, the better we can discuss this without you saying "you were getting 60-80 fps at 1080p", because, as i've said again and again, it's not actual 1080p. at actual 1080p, i easily get north of 100 frames consistently, which you cannot get at 1440p.

actual native 1080p: 90 frames [screenshot]


native 1440p gets you 75 frames [screenshot]


and the kicker: 4k + dlss performance renders... 66 frames [screenshot]



do you see any kind of CPU influence in this test? or would you still say "your gpu is getting 66 fps at 1080p"?

can you hopefully understand that 4k+dlss performance is often more taxing on GPU than native 1440p?

(before anyone asks why the 3070 only gets 90 frames at 1080p in god of war: it's ultra+ settings. at the original ps4 settings, the gpu gets north of 140 frames)
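The "DLSS tax" argument above reduces to pixel arithmetic plus a fixed cost for the upscale pass. The mode scale factors are DLSS's published ones (Quality 0.667, Balanced 0.58, Performance 0.5); the per-pixel shading cost and the upscale-pass cost in this toy model are assumptions for illustration only.

```python
# Published DLSS render-scale factors per mode.
DLSS_SCALE = {"quality": 0.667, "balanced": 0.58, "performance": 0.5}

def internal_res(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return int(out_w * s), int(out_h * s)

def frame_ms(width, height, upscale_ms=0.0, ns_per_pixel=2.0):
    """Toy GPU-bound cost model: shading time proportional to pixels rendered,
    plus a fixed cost for the upscale pass. Both constants are assumptions."""
    return width * height * ns_per_pixel / 1e6 + upscale_ms

w, h = internal_res(3840, 2160, "balanced")   # 4K + DLSS Balanced -> ~1252p input
native_1440p = frame_ms(2560, 1440)
dlss_4k_bal = frame_ms(w, h, upscale_ms=2.5)  # assumed cost of the DLSS pass

print((w, h), round(native_1440p, 1), round(dlss_4k_bal, 1))
```

In this model the 4K Balanced frame (≈1252p internal) comes out more expensive than native 1440p once the upscale pass is charged, which is the "tax" in the benchmarks above; the real pass cost varies by GPU, output resolution, and DLSS version.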
 
Last edited:
Partly Cell complexity but an example of one of the worst offenders of the memory situation from PS3 gen was this game and engine:

Interesting, apparently the engine had difficulties dealing with the split memory.
Now some devs are saying there is a memory constraint on a console they need to support that has differences far greater in terms of specs (memory bandwidth and amount especially) and people are brushing it off as lazy devs who need to call MS.
Yeah, but it's a different situation, since RAM (and especially VRAM) usage is quite scalable.
 

Bojji

Member
But then by that logic, every 3rd-party game is being held back just by being multiplatform. So there's no basis for discussion anyway.

Yes, the lowest-performing system is the base: the game has to work on it and has to have all the gameplay features. Right now, for most games, that console is the X1, but soon the Series S will take over that role.

Down-ports (like for the Switch) are a completely different story, but usually games are made for the lowest common denominator and then features are added on the better systems.
 
sam winchester yawn GIF


Why don't you take the fight to the id Software guy, man? Why is it so hard for you to accept and move on from the fact that the Series S has memory limitations (a pain for devs) in ways the Series X doesn't? The only person moving goalposts here is YOU.
There For You Sympathy GIF


Dude, you never answered my question to name the complaining developers who actually used SFS to even attempt to mitigate the memory problems they were complaining about. The id devs focused on what actually matters for their game on XSS, and that was the 120fps mode you apparently didn't even know about.

It's always a new complaint when it comes to the XSS: first it's resolution, and when that's OK it's 120/60 fps, and when that's OK it's raytracing or some other nonsense. Raytracing is super important on XSS when the PS5 or XSX don't always have it? More goalpost shifting. It's clear it's just console-war trash talk with you people, because you aren't even customers.

Why can't YOU accept that perhaps the system, or Xbox in general, just isn't for you, and spend your time actually focusing on a system you like? No one is forcing you to buy an XSS, and you can't name even ONE title on the system that affects your enjoyment of the PS5 or Switch. Let me know when you can provide that list of SFS-using devs.

Yeah you have been banned for console warring several times.

Can’t relate.
It's never console warring when trash talking the budget console.
 
This makes zero sense. The effort IS the context. The “pain” in the title of this thread IS the context. Written by the same guy: not his opinion, but what he's been told and is relaying.

"Platform comparisons? In the wake of our initial head-to-head screenshots, isolated scenarios were highlighted in social media to show geometric detail favouring PlayStation 5 over Xbox Series X. Revisiting The Matrix Awakens, there appears to be a certain level of dynamism here: some scenes saw Xbox Series X resolving more far-off detail, something that may even change on a run-by-run nature. It's Xbox Series S where there's a real story here - just how did the junior Xbox with just four teraflops of compute power pull this off?

First of all, Epic enlisted the aid of The Coalition - a studio that seems capable of achieving results from Unreal Engine quite unlike any other developer. Various optimisations were delivered that improved performance, many of which were more general in nature, meaning that yes, a Microsoft first-party studio would have helped in improving the PlayStation 5 version too. Multi-core and bloom optimisations were noted as specific enhancements from The Coalition, but this team has experience in getting great results from Series S too, so don't be surprised if they helped in what is a gargantuan effort.

Series S obviously runs at a lower resolution (533p to 648p in the scenarios we've checked), using Epic's impressive Temporal Super Resolution technique to resolve a 1080p output. Due to how motion blur resolution scales on consoles, this effect fares relatively poorly here, often presenting like video compression macroblocks. Additionally, due to a sub-720p native resolution, the ray count on RT effects is also reined in, producing very different reflective effects, for example. Objects within reflections also appear to be using a pared back detail level, while geometric detail and texture quality is also reduced. Particle effects and lighting can also be subject to some cuts compared to the Series X and PS5 versions. What we're looking at seems to be the result of a lot of fine-tuned optimisation work but the overall effect is still impressive bearing in mind the power of the hardware. Lumen and Nanite are taxing even on the top-end consoles, but now we know that Series S can handle it - and also, what the trades may be in making that happen."
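For scale, the resolution figures quoted above work out to roughly a 2.8x to 4.1x pixel-count gap between the internal render and the 1080p output TSR has to reconstruct. A quick sketch of the arithmetic (assuming standard 16:9 frame dimensions, which is my assumption, not something DF states):

```python
# Rough pixel-count arithmetic for the Series S figures quoted above,
# assuming 16:9 frames (widths derived from the quoted heights).

def pixels(height, aspect=16 / 9):
    """Total pixels for a 16:9 frame of the given height."""
    return int(round(height * aspect)) * height

low = pixels(533)       # lowest internal resolution DF measured
high = pixels(648)      # highest internal resolution DF measured
output = pixels(1080)   # TSR's reconstruction target

print(f"TSR upscale factor: {output / high:.1f}x to {output / low:.1f}x")
```

Even in the best case, TSR is reconstructing nearly three output pixels for every one rendered.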

This entire thing is very clearly positive for the XSS overall. You don't ask how it pulled it off if it wasn't a huge result or a gargantuan effort (these two things mean the same thing)
 

FrankWza

Member
This entire thing is very clearly positive for the XSS overall. You don't ask how it pulled it off if it wasn't a huge result or a gargantuan effort (these two things mean the same thing)
Overly positive?
Failing to live up to the prelaunch expectations set by the system's developer, and needing an additional studio to come in to get a demo to run, after a gargantuan effort and scaled-back features, graphics and resolution, on what is to be a widely used engine going forward. Having developers report that it's a pain to work on. I think you're missing the point. Let me guess: when you watch Titanic, you're happy that the people in the half-full lifeboats have room to stretch their legs?

Edit: the huge result and the gargantuan effort do not mean the same thing. It was “pulled off” because they brought in another studio just to help.
 
Last edited:

dcmk7

Banned
I have never seen such a concentrated effort to shoot down an article.

The system is clearly hindering some developers' progress out there. It's not the first time we have heard this; it has come up surprisingly often over the last few years, even from within MS first-party studios.

Personally, I would rather The Coalition had cracked on with developing a top-class game than put so much effort into getting a demo to run adequately, but that's just me.
 

Three

Member
Yeah, but it's a different situation, since RAM (and especially VRAM) is quite scalable.
VRAM on a console is the same pool of memory as system RAM now, so it's far more versatile. But if you use one, you're using the other.

You must understand that if a dev says memory constraints are a pain, the easy 'quite scalable' options that are deemed acceptable are not the issue; if they were, it wouldn't be a pain. The issue with that game came from how much memory would be used depending on what the player has done in the game world, not the size of framebuffers, textures, shaders, etc. I don't think it was a VRAM issue but a main memory constraint.

A full quote on how they were doing it:

“It’s an engine-level issue with how the save game data is stored off as bit flag differences compared to the placed instances in the main .esm + DLC .esms. As the game modifies any placed instance of an object, those changes are stored off into what is essentially another .esm. When you load the save game, you’re loading all of those differences into resident memory,” he revealed.

“It’s not like someone wrote a function and put a decimal point in the wrong place or declared something as a float when it should have been an int. We’re talking about how the engine fundamentally saves off and references data at run time.

Restructuring how that works would require a large time commitment. Obsidian also only had that engine for a total of 18 months prior to F:NV being released, which is a relatively short time to understand all of the details of how the technology works.


“Individual bits of data are tiny, but there are thousands upon thousands of objects in F:NV, each one containing numerous data fields that could potentially be changed in your save game. Over time, it adds up,”

It's main memory, it's not scalable unless you alter gameplay (the number of objects), and changing how it works would have required a huge time commitment.
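The engine-level pattern described in those quotes, where the base data ships read-only and every gameplay change is stored as a field-level diff that must be fully resident once a save is loaded, can be sketched like this (a minimal illustration with hypothetical names, not Gamebryo's actual data structures):

```python
# Hypothetical sketch of a diff-based save system like the one described:
# the world ships as immutable "placed instances" (the .esm data), and the
# save is a set of field-level overrides that must ALL sit in memory once
# the save is loaded, so memory grows with play time, not scene size.

base_world = {
    # instance_id -> field values shipped in the main .esm (made-up data)
    1001: {"position": (0, 0, 0), "owner": "none", "count": 10},
    1002: {"position": (5, 2, 0), "owner": "none", "count": 3},
}

save_diffs = {}  # instance_id -> {field: changed value}; grows over a playthrough

def modify_instance(instance_id, field, value):
    """Record a change as a diff instead of mutating the base data."""
    save_diffs.setdefault(instance_id, {})[field] = value

def resolve_instance(instance_id):
    """Runtime lookup: base data overlaid with any saved diffs."""
    merged = dict(base_world[instance_id])
    merged.update(save_diffs.get(instance_id, {}))
    return merged

modify_instance(1001, "count", 7)      # e.g. the player took 3 items
modify_instance(1001, "owner", "player")

print(resolve_instance(1001))  # {'position': (0, 0, 0), 'owner': 'player', 'count': 7}
```

With thousands of instances and many fields each, `save_diffs` is exactly the kind of unbounded, gameplay-dependent main-memory cost that can't be scaled down with a quality slider.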
 
Last edited:

yamaci17

Member
@yamaci17 didn't realize DLSS was that heavy. If FSR 2.0 is similarly heavy, I guess that won't amount to much on console either.
let's see

native 1440p gets 81 frames

[benchmark screenshot: native 1440p, 81fps]


4k dlss performance and 4k fsr performance... yep 79 and 81 frames


[benchmark screenshot: 4K DLSS Performance and 4K FSR Performance, 79 and 81fps]



also notice how 4k + dlss quality (1440p input) gets 61 and 64 frames, but native 1440p gets 81 frames. this data actually backs up my claim quite nicely. thank you for asking!

finally, we can see actual native 1080p gets us 115 frames. a far cry from 4k dlss performance where you get 79 frames, right?

[benchmark screenshot: native 1080p, 115fps]



it is... heavy. but it looks better than native 1440p, at least in my experience, so that's a thing. it will surely look better than your usual temporal upscaler, and that's a huge plus if you ask me. but i still think Insomniac's temporal injection can be on par with these two techs
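Putting those fps figures into frame-time terms makes the upscaler's cost easier to see. A quick calculation on the numbers above (attributing the whole gap between 4K DLSS Performance and native 1080p to DLSS is an approximation, since the 4K output also adds other per-pixel work):

```python
# Frame-time view of the benchmark numbers quoted above.
# DLSS Performance at 4K renders internally at 1080p, so comparing it
# against native 1080p roughly isolates the upscaling overhead.

def frame_time_ms(fps):
    return 1000.0 / fps

native_1080p = frame_time_ms(115)   # ~8.7 ms
dlss_perf_4k = frame_time_ms(79)    # ~12.7 ms, same 1080p internal res
native_1440p = frame_time_ms(81)    # ~12.3 ms

overhead = dlss_perf_4k - native_1080p
print(f"approx DLSS overhead: {overhead:.1f} ms per frame")
print(f"4K DLSS Performance vs native 1440p: {dlss_perf_4k - native_1440p:.1f} ms")
```

Several milliseconds per frame is a big slice of a 16.7 ms (60fps) budget, which is why a similarly heavy FSR 2.0 would matter on console.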
 
Last edited: