
Xbox June GDK: More dev memory for Series S and more.

Riky

$MSFT
Sounds like a strawman to me. There are no absolutes in game development. For every dev claiming to have issues there are devs who claim there are no problems at all. The 'lies from the usual suspects' are from the people who, coincidentally, don't have the platform, exaggerating this into a universal problem with no solution.

Riky was rightly questioning if the features of the platform were being used. Since the evidence points to no, it is completely reasonable to question the veracity of the claims that the XSS has some sort of fundamental problem. Thankfully those who are most 'concerned' have little to worry about because they don't own the system anyway.

Exactly, why is it a problem to want all the hardware features to be used regularly? Since there are very good Series S versions of some games, it's obviously possible. This news of extra RAM helps; how much we don't know, but more is better.
It will be interesting to see if Turn 10 talk about whether they are using deeper day-one integration of Tier 2 VRS, or SFS, or both to get real-time RT into Forza on Series S, something DF said would be a step up for consoles, and on a console that some people have constantly questioned.
It will show that it is capable after all, and I don't see the problem with trying to find out how that was achieved.
 

Three

Member
Sounds like a strawman to me. There are no absolutes in game development. For every dev claiming to have issues there are devs who claim there are no problems at all. The 'lies from the usual suspects' are from the people who, coincidentally, don't have the platform, exaggerating this into a universal problem with no solution.

Riky was rightly questioning if the features of the platform were being used. Since the evidence points to no, it is completely reasonable to question the veracity of the claims that the XSS has some sort of fundamental problem. Thankfully those who are most 'concerned' have little to worry about because they don't own the system anyway.
Strawman my ass. That's what your mate Riky was doing, talking in absolutes. Saying that universally a problem does not exist when some devs spoke out about the memory situation.

Riky was quoting Andrew saying that with a lower resolution you wouldn't run into memory problems and that 8GB is enough. If this were the case you wouldn't have had raytracing missing in games either and devs complaining about it. That is/was blatantly false though as he's here now singing praises when the 'major problem' is overcome.
 

01011001

Banned
Wondering if they felt the need to free up resources so they can have parity in first party games like Forza Motorsport coming next year.

Could be.
I really wonder if it was their first party studios or complaints from third party devs that led to this change.

Kinda reminds me of how Sony overclocked the PSP with a firmware update because Ready at Dawn needed the extra hardware power to run Chains of Olympus at a relatively steady 30fps.
 
Last edited:

Orbital2060

Member
I'm curious, have they said what developers they are referring to in this context?

It would be helpful to know more exactly what kind of game these developers are making.
 

clampzyn

Member
I'm curious, have they said what developers they are referring to in this context?

It would be helpful to know more exactly what kind of game these developers are making.
Probably a game made to cater to Series X hardware, where dropping resolution, etc. for the Series S version isn't enough.
 

PaintTinJr

Member
...

Memory allocation on Xbox Series S consoles has been optimized.
“Hundreds of additional megabytes of memory are now available to Xbox Series S developers. This gives developers more control over memory, which can improve graphic performance in memory-constrained conditions.

...
Am I the only person reading that sentence and thinking it should really have the acronym VRR in there - which would help very few people - or the sentence rewritten to be about higher dynamic resolution, less aggressive VRS, or not dropping rendering features?

In all non-VRR situations, games target 30 or 60fps, as required to pass Xbox certification, and other than the odd percentile dip in analyses from NXGamer and the like, games on all consoles typically have stable 30 or 60fps frame-rates. So the frame-rate in games should already be locked, with no performance gain to be had - without VRR - because games use dynamic resolution and drop features to hit those frame-rates and match the fixed display refreshes.

AFAIK, either Xbox is saying that this helps games that are missing those locked 30fps and 60fps targets in areas we don't see analysed - which technically wouldn't be more performance, just what was already supposed to be there - or it is for VRR situations, or they haven't thought this sentence out very well.
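To put rough, purely illustrative numbers on the fixed-refresh point (generic v-sync arithmetic, nothing Xbox-specific): without VRR, a finished frame is only shown on the next fixed refresh, so effective frame times snap to multiples of the refresh interval.

60Hz refresh interval = 1000ms / 60 ≈ 16.7ms
frame rendered in 16.0ms → presented at 16.7ms (60fps held)
frame rendered in 17.5ms → held until 33.3ms (an effective 30fps for that frame)

So on a fixed-refresh display, extra headroom only shows up as fewer missed deadlines, never as frame-rates above the cap.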
 
Last edited:
Strawman my ass. That's what your mate Riky was doing, talking in absolutes. Saying that universally a problem does not exist when some devs spoke out about the memory situation.

Riky was quoting Andrew saying that with a lower resolution you wouldn't run into memory problems and that 8GB is enough. If this were the case you wouldn't have had raytracing missing in games either and devs complaining about it. That is/was blatantly false though as he's here now singing praises when the 'major problem' is overcome.
The absolute, to me, seems to be that no dev who complained actually used the features available to address the issue they were complaining about. Absolutely none of the 'usual suspects' blowing this up into a major issue acknowledged that it was a small number of people complaining, and that there were also devs who came out saying the opposite. Very telling indeed.

The raytracing argument also falls a bit flat, because there are plenty of PC titles that lack the feature when ported to PS5 and XSX. It comes down to whether developers want to put in the effort and whether they think it's worth it. With the XSS, because it's a lower-performance, budget console, they simply choose to take the path of least resistance, just like when they choose not to use SFS.

At the end of the day, like all consoles, as devs get more experience and MS enhances the XDK, performance for all systems will improve. It was always hyperbolic nonsense to claim the XSS had 'major problems' and was fundamentally flawed.
 

MarkMe2525

Gold Member
Am I the only person reading that sentence and thinking it should really have the acronym VRR in there - which would help very few people - or the sentence rewritten to be about higher dynamic resolution, less aggressive VRS, or not dropping rendering features?

In all non-VRR situations, games target 30 or 60fps, as required to pass Xbox certification, and other than the odd percentile dip in analyses from NXGamer and the like, games on all consoles typically have stable 30 or 60fps frame-rates. So the frame-rate in games should already be locked, with no performance gain to be had - without VRR - because games use dynamic resolution and drop features to hit those frame-rates and match the fixed display refreshes.

AFAIK, either Xbox is saying that this helps games that are missing those locked 30fps and 60fps targets in areas we don't see analysed - which technically wouldn't be more performance, just what was already supposed to be there - or it is for VRR situations, or they haven't thought this sentence out very well.
I'm not understanding what you are saying if I'm reading this correctly. Are you stating that if a game on Series S doesn't currently hit performance targets and this helps with reaching those said targets, that would not be indicative of more performance?

If I am following you correctly, it sounds more like you are just changing your definition of what "improve graphic performance" means rather than providing any useful insight.
 
Last edited:
Wait, so getting more memory and other improvements a bad thing now?
If MS is constantly improving their GDK so that devs can get the most out of the hardware (either the Series S or X), how is this a bad thing?

It's something all platform owners should be doing.
 

Three

Member
The id Software guy always gets pulled up as an example of devs complaining, and it is clear that the dude never even saw a Series S before making those tweets. The issue got entirely overblown, purely for console war reasons.
An id engine dev had never seen a Series S before those tweets? Yeah right. And yet, as he said, Doom Eternal was missing ray tracing only on the Series S. Low RAM for BVH, as he said.

The absolute, to me, seems to be that no dev who complained actually used the features available to address the issue they were complaining about. Absolutely none of the 'usual suspects' blowing this up into a major issue acknowledged that it was a small number of people complaining, and that there were also devs who came out saying the opposite. Very telling indeed.

The raytracing argument also falls a bit flat, because there are plenty of PC titles that lack the feature when ported to PS5 and XSX. It comes down to whether developers want to put in the effort and whether they think it's worth it. With the XSS, because it's a lower-performance, budget console, they simply choose to take the path of least resistance, just like when they choose not to use SFS.

At the end of the day, like all consoles, as devs get more experience and MS enhances the XDK, performance for all systems will improve. It was always hyperbolic nonsense to claim the XSS had 'major problems' and was fundamentally flawed.

Here we go again. It's all a matter of effort. Doesn't mean the lower spec hardware can't give devs some trouble. The raytracing on PC but not on XSX or PS5 is again effort. If some dev doesn't try and get their RTX raytraced game on to XSX or PS5 nobody sane would pretend that XSX and PS5 are just as easily capable of raytracing as RTX cards and say the devs are lazy or something equally asinine though. The Series S memory is smaller and low bandwidth. The XSX and PS5 GPUs are not as capable as RTX cards at raytracing. It's fine to accept some facts.
 
Last edited:

PaintTinJr

Member
I'm not understanding what you are saying if I'm reading this correctly. Are you stating that if a game on Series S doesn't currently hit performance targets and this helps with reaching those said targets, that would not be indicative of more performance?
..
IMHO if the statement is true - without a VRR addendum -, it is an acknowledgment by Xbox that they are letting games wrongfully pass their certification that aren't meeting the targeted performance of 30fps or 60fps well enough - that this improvement makes a meaningful difference - and are in essence selling goods that should be held back until the performance is stable.

It is like saying you can get more speed out of a car with a speed limiter enabled at 30 or 60mph. The only way that can be true is if the car isn't at those speeds to begin with. Personally I think they just haven't considered the statement well enough to realise what they were actually saying, rather than what they meant, as more memory will likely improve the graphics, just not the performance, outside of VRR.
 

Hoddi

Member
IMHO if the statement is true - without a VRR addendum -, it is an acknowledgment by Xbox that they are letting games wrongfully pass their certification that aren't meeting the targeted performance of 30fps or 60fps well enough - that this improvement makes a meaningful difference - and are in essence selling goods that should be held back until the performance is stable.

It is like saying you can get more speed out of a car with a speed limiter enabled at 30 or 60mph. The only way that can be true is if the car isn't at those speeds to begin with. Personally I think they just haven't considered the statement well enough to realise what they were actually saying, rather than what they meant, as more memory will likely improve the graphics, just not the performance, outside of VRR.
'Graphics performance' doesn't just mean framerate. If you can push more graphics data at the same framerate then that is 'improved graphics performance'.
 
An id engine dev had never seen a Series S before those tweets? Yeah right. And yet, as he said, Doom Eternal was missing ray tracing only on the Series S. Low RAM for BVH, as he said.



Here we go again. It's all a matter of effort. Doesn't mean the lower spec hardware can't give devs some trouble. The raytracing on PC but not on XSX or PS5 is again effort. If some dev doesn't try and get their RTX raytraced game on to XSX or PS5 nobody sane would pretend that XSX and PS5 are just as easily capable of raytracing as RTX cards and say the devs are lazy or something equally asinine though. The Series S memory is smaller and low bandwidth. The XSX and PS5 GPUs are not as capable as RTX cards at raytracing. It's fine to accept some facts.
Yes like the fact that lots of XSS commentary here wasn't done sincerely but to score platform war points.

No one said there wasn't a challenge in developing games on the XSS or ANY platform for that matter. Game development is hard. The question is are the complaining developers using all the tools available to them to address their concerns. No is the only answer I've seen.
 

Riky

$MSFT
Wait, so getting more memory and other improvements a bad thing now?
If MS is constantly improving their GDK so that devs can get the most out of the hardware (either the Series S or X), how is this a bad thing?

It's something all platform owners should be doing.

Some of these improvements might translate to Series X in time too; if they could get near 14GB of usable memory for devs, that could produce more performance as well.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
Wow @Ezekiel_ is obsessed with this thread. Thanks for increasing engagement numbers in the Xbox threads. What a fan.
While I find this thread interesting and really don't have enough knowledge to join this debate, I also find it really tiresome to see his lame laughing emote on anything that seems positive towards Xbox.

I wish blocking a person also removed their stupid fanboy reactions, but one can only hope that it will happen some day.
 

MarkMe2525

Gold Member
IMHO if the statement is true - without a VRR addendum -, it is an acknowledgment by Xbox that they are letting games wrongfully pass their certification that aren't meeting the targeted performance of 30fps or 60fps well enough - that this improvement makes a meaningful difference - and are in essence selling goods that should be held back until the performance is stable.

It is like saying you can get more speed out of a car with a speed limiter enabled at 30 or 60mph. The only way that can be true is if the car isn't at those speeds to begin with. Personally I think they just haven't considered the statement well enough to realise what they were actually saying, rather than what they meant, as more memory will likely improve the graphics, just not the performance, outside of VRR.
I'm sorry but I'm seeing nothing but a strawman here. Could you explain these certification requirements you are referring to?

The statement is also pretty straightforward and doesn't need much more consideration. As quoted "This gives developers more control over memory, which can improve graphic performance in memory-constrained conditions." There is no subtext here, no need to add any asterisk. It means what it says.
 

Three

Member
Exactly. He thought memory is split on Series S, but as we all know, it isn't split for games. So we can assume he had no experience with working on the console.
He said nothing about split memory, he said low-bandwidth memory. The split memory is still there too, even if they freed some of it from the OS.
 

PaintTinJr

Member
I'm sorry but I'm seeing nothing but a strawman here. Could you explain these certification requirements you are referring to?

The statement is also pretty straightforward and doesn't need much more consideration. As quoted "This gives developers more control over memory, which can improve graphic performance in memory-constrained conditions." There is no subtext here, no need to add any asterisk. It means what it says.
You are entitled to that opinion, but I don't agree with it, and it looks like technical info modified for marketing purposes IMO - because they didn't give us the exact amount of memory saved, as 'hundreds' sounds better than, say, 101MB or 199MB.

The saved memory in reality helps improve the graphical fidelity of the fx used, or the final resolution, in 99 out of 100 situations, but because the XsS is aimed at an audience that is flexible on losing next-gen fidelity features and insensitive to resolution, wording it that way - without the word "performance", which implies more frame-rate - doesn't provide the buzzwords for those who will project the information incorrectly, IMO. I don't think that is accidental in this situation, because of the lack of transparency about the exact amount of memory saved.
 
Last edited:

Riky

$MSFT
I'm sorry but I'm seeing nothing but a strawman here. Could you explain these certification requirements you are referring to?

The statement is also pretty straightforward and doesn't need much more consideration. As quoted "This gives developers more control over memory, which can improve graphic performance in memory-constrained conditions." There is no subtext here, no need to add any asterisk. It means what it says.

You can't make it simple enough for some people, like with Turn 10 when they said Ray Tracing "On Track" they had to clarify that it meant during gameplay as some people simply refused to believe them.
 

Topher

Gold Member
Not by name, but the new features which the API exposes were discussed, which is what we are talking about. Go to the 28 to 31 minute mark and actually listen to what they are saying.

No, we are talking about updates to the GDK that help make the XSS more efficient. Cerny did not discuss developer kits or their internal APIs at all. If you are referring to the hardware features he discussed, well, of course, but that isn't what we are talking about here. The only thing of note in the section of the video you referenced was the talk about shaders, which is not specific to any API or DK. That is a generic term, not some pitch about how their API has improved or does some unusually great thing. We are not talking about the same thing at all.
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
on SX it could actually improve performance even more because of the 2-tiered RAM setup, with more free fast RAM it would be more performant

True, true. The more of the faster RAM they can free up, the better for the games.

We know they allocate 13.5GB for games. I have to imagine they already try to squeeze as much of the faster RAM for games as possible, with the slower RAM handling things like the dash and DVR.


IMHO if the statement is true - without a VRR addendum -, it is an acknowledgment by Xbox that they are letting games wrongfully pass their certification that aren't meeting the targeted performance of 30fps or 60fps well enough - that this improvement makes a meaningful difference - and are in essence selling goods that should be held back until the performance is stable.

It is like saying you can get more speed out of a car with a speed limiter enabled at 30 or 60mph. The only way that can be true is if the car isn't at those speeds to begin with. Personally I think they just haven't considered the statement well enough to realise what they were actually saying, rather than what they meant, as more memory will likely improve the graphics, just not the performance, outside of VRR.

billy-madison-dumber.gif
 
Last edited:

Three

Member
Yes like the fact that lots of XSS commentary here wasn't done sincerely but to score platform war points.

No one said there wasn't a challenge in developing games on the XSS or ANY platform for that matter. Game development is hard. The question is are the complaining developers using all the tools available to them to address their concerns. No is the only answer I've seen.
The only people I see that I wouldn't consider sincere are those who try to oversell it with "it will give Ps5 a run for its money with RDNA 2" and the like. People who belong in marketing. The people who say "The ram situation is giving some devs trouble" are sincere. The people saying that's a lie are not. You're one of them though. The most ardent defender of all things XSS on here.
 

adamsapple

Or is it just one of Phil's balls in my throat?
The only people I see that I wouldn't consider sincere are those who try to oversell it with "it will give Ps5 a run for its money with RDNA 2" and the like. People who belong in marketing. The people who say "The ram situation is giving some devs trouble" are sincere. The people saying that's a lie are not. You're one of them though. The most ardent defender of all things XSS on here.


Xbox Series S RAM Not an Issue, But GPU Performance Presents Challenges for Future Titles – 4A Games

Bold of you to call one of the only developers to have made a full RTGI-based 60 FPS game on the current generation liars and insincere.
 
Last edited:
The only people I see that I wouldn't consider sincere are those who try to oversell it with "it will give Ps5 a run for its money with RDNA 2" and the like. People who belong in marketing. The people who say "The ram situation is giving some devs trouble" are sincere. The people saying that's a lie are not. You're one of them though. The most ardent defender of all things XSS on here.
Liars claim the XSS was holding back the generation or that it has power similar to the Switch. I also noticed there has been no evidence provided that devs are fully utilizing the systems' features and still had problems. I defend an honest discussion. You should try it.
 

MarkMe2525

Gold Member
You are entitled to that opinion, but I don't agree with it, and it looks like technical info modified for marketing purposes IMO - because they didn't give us the exact amount of memory saved, as 'hundreds' sounds better than, say, 101MB or 199MB.

The saved memory in reality helps improve the graphical fidelity of the fx used, or the final resolution, in 99 out of 100 situations, but because the XsS is aimed at an audience that is flexible on losing next-gen fidelity features and insensitive to resolution, wording it that way - without the word "performance", which implies more frame-rate - doesn't provide the buzzwords for those who will project the information incorrectly, IMO. I don't think that is accidental in this situation, because of the lack of transparency about the exact amount of memory saved.

Dude..... What are you even saying? There is no opinion to even have.
"Technical info modified for marketing"?
You do realize this presentation is for game developers and not consumers, right? It's posted on the Microsoft Game Dev YouTube channel. Are you aware that marketing generally is directed at the MARKET?
The saved memory in reality helps improve the graphical fidelity of the fx used, or the final resolution, in 99 out of 100 situations, but because the XsS is aimed at an audience that is flexible on losing next-gen fidelity features and insensitive to resolution, wording it that way - without the word "performance", which implies more frame-rate - doesn't provide the buzzwords for those who will project the information incorrectly, IMO."
I'll post the statement again.
"This gives developers more control over memory, which can improve graphic performance in memory-constrained conditions."

It says it will free up memory and help performance in some cases. That's literally it. It's pretty clear. What is this 99 out of 100 devs will do this vs that? You're literally just making that up.

Since this simple statement seems to be confusing you, I'll take the time to give you an easy-to-understand scenario. Say a game is memory-constrained by 104MB during a heavy action scene and drops frames as a result. This update allows devs to free up additional resources, say 115MB. As a result, no more frame drops during said scene. THAT'S IT. No opinion necessary. No subtext. NO "buzzwords" needed.
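If it helps, here's a minimal sketch of that scenario as a toy budget check. Every number and name in it is invented for illustration; it isn't real GDK code or real Series S figures:

```cpp
#include <cstdio>
#include <cstdint>

// Illustrative only: a toy memory budget for a hypothetical Series S title.
// All numbers are made up for the example; they are not real GDK figures.
constexpr uint64_t MiB = 1024ull * 1024ull;

int main() {
    uint64_t titleBudget    = 7700 * MiB; // hypothetical memory available to the game
    uint64_t residentSet    = 7650 * MiB; // hypothetical baseline allocations
    uint64_t sceneSpike     =  104 * MiB; // the "heavy action scene" spike from the post
    uint64_t reclaimedByGdk =  115 * MiB; // the extra memory the update is said to free up

    uint64_t needed = residentSet + sceneSpike;
    std::printf("Before update: over budget by %llu MiB -> evict/stream, risk of a stall\n",
                (needed > titleBudget) ? (unsigned long long)((needed - titleBudget) / MiB)
                                       : 0ull);

    // After the update the title budget grows, so the same spike now fits.
    titleBudget += reclaimedByGdk;
    std::printf("After update: headroom of %llu MiB -> the spike fits, no stall\n",
                (unsigned long long)((titleBudget - needed) / MiB));
    return 0;
}
```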
 
Last edited:

PaintTinJr

Member
Dude..... What are you even saying? There is no opinion to even have. You do realize this presentation is for game developers and not consumers, right? It's posted on the Microsoft Game Dev YouTube channel. Are you aware that marketing generally is directed at the MARKET?
I'm not suggesting it is only marketing, but just like every word of the Road to PS5 would have been checked against marketing objectives, and the wording of this will have been checked in the same way too - and IMO tweaked to be on message in the way previously described.
....

Since this simple statement seems to be confusing you, I'll take the time to give you an easy to understand scenario. Say a game is memory constrained by 104MB during a heavy action scene and drops frames as a result. This update allows Dev to free up additional resources, say 115MB. As a result, no more frame drops during said scene. THAT'S IT. No opinion necessary. No subtext. NO "buzzwords" needed.
Why would that scenario be realistic - when the GDDR6 is the base of the memory pyramid below GPU caches?

A shortfall of such amounts - compared to the ~6GB(?) for the GPU on the XsS overall - isn't likely to be impacting performance negatively, because anything critical to performance will already be resident. It will just mean missing mipmap levels and missing higher LoDs on less important things, which would actually reduce the GPU workload and increase rendering performance.
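For a sense of scale on the mip point (standard texture math, not tied to any platform): a full mip chain only adds about a third on top of the base level, and dropping just the top level cuts a texture's footprint to roughly a quarter of the full chain.

full chain ≈ S × (1 + 1/4 + 1/16 + ...) ≈ (4/3) × S
without the top level: new base = S/4, so chain ≈ (4/3) × (S/4) = S/3
e.g. a 4096×4096 BC7 texture: base ≈ 16MB, full chain ≈ 21.3MB, chain without the top mip ≈ 5.3MB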
 
Last edited:

MarkMe2525

Gold Member
Why would that scenario be realistic - when the GDDR6 is the base of the memory pyramid below GPU caches?

A shortfall of such amounts - compared to the ~6GB(?) for the GPU on the XsS overall - isn't likely to be impacting performance negatively, because anything critical to performance will already be resident. It will just mean missing mipmap levels and missing higher LoDs on less important things, which would actually reduce the GPU workload and increase rendering performance.
Unrealistic? Ray tracing memory usage scales with scene complexity. Acceleration structures, denoising and other temporal effects are all stored in RAM and change scene to scene. If this data, along with other game data, exceeds the memory pool (as in my scenario), it causes the GPU to miss its 33.3ms window (for 30fps) while waiting on the new data to come in. Right there, what happens? Either the game drops resolution or, in the case it's already at its lower resolution bounds, it drops frames.
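To put a rough number on why a spill like that hurts (back-of-the-envelope only, using the publicly quoted 2.4 GB/s raw SSD figure for the Series consoles; in practice streaming is spread across many frames, but the scale of the mismatch is the point):

104MB / 2400MB/s ≈ 43ms to fetch, versus a 33.3ms budget for an entire 30fps frame

Which is exactly why the working set has to fit in RAM in the first place.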

How can you make the claim that having more RAM available during times of "memory-constrained conditions" would not improve performance?

As mentioned before over and over, MS statement to game devs (not consumers) is "This gives developers more control over memory, which can improve graphic performance in memory-constrained conditions."
 
Last edited:

MarkMe2525

Gold Member
Great update. From the XNC podcast, Colteastwood claims that the Series S will compete with and even surpass the PS5 when it comes to CPU-heavy games, given that the Series S has a faster processor, Fidelity Resolution, and full RDNA 2 across all feature sets.

Any truth to that?

The game in question would have to be tailor-made for such a scenario. So no.

To elaborate, if someone developed a game that ran at 480p max with minimal post-processing effects at 120Hz, you might see a scenario where it's CPU-bound, which would result in them having similar performance.
 
Last edited:
Maybe take it up with DF, he seemed pretty sure of his figure, probably got it from developers under NDA so isn't going to say who, just like he doesn't say who was talking about Xbox Series S memory, people accept that though.

Again it just doesn't make sense unless he's comparing the PS5's OS to the Series S's OS implementation. MS uses virtualized hypervisor layers for their OS which naturally will incur more resources, especially when the OS is in 4K like on Series X.

So he either needs to provide some sources or expect people to question some of his claims. It is what it is.

The API is irrelevant, there is a hardware difference in the Series consoles, confirmed. SFS is touted as a fundamental part of the VA, therefore hardware adjustments. Sony probably feel they don't need it due to raw throughput instead. It wasn't mentioned in the Road To PS5 unlike primitive Shaders.

The hardware adjustments for SFS are slight adjustments and the biggest of those being the mip-blending logic customization on the Series GPU. Sony's approach is different partly for the reason you just listed, however their SSD I/O management will always require some sort of software alongside the hardware to make use of it.

If they have any aspects on the software side that do things similar to SFS, it wouldn't be called SFS.

Not true. 360 os was much smaller than ps3. Also ps5 screen is 4K and series is lower res which probably saves a lot of memory.

Yes, compared to Series S. Series X OS UI was updated to 4K earlier this year IIRC.

An SSD is never going to be able to be a substitute for actual RAM. Just look at the bandwidth difference. GPU RAM bandwidth is in the hundreds of gigabytes per second while the SSD is single digits. Don't believe that guy's bullcrap.

We are talking about background OS tasks. They are not bandwidth-intensive, they do not need a lot of bandwidth to operate. How fast do you think that 1 GB of DDR3 in PS4 Pro was compared to the GDDR5? Yet it still served as a pool for background OS tasks.

Peak compressed bandwidth for PS5's SSD I/O is 22 GB/s, which is higher than a single DDR3-2133 module (peak bandwidth 17 GB/s...btw they had already produced reliable compressed bandwidths for their SSD I/O of 17 GB/s over a year ago, thanks to algorithm improvements). It can absolutely mimic the functionality of PS4 Pro's extra 1 GB of DDR3 RAM via some portion of the SSD space and the I/O subsystem's bandwidth capability.
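Quick sanity check on the DDR3 figure (plain DDR arithmetic, nothing console-specific):

DDR3-2133, 64-bit module: 2133 MT/s × 8 bytes/transfer ≈ 17.1 GB/s peak

And for the SSD side, Sony's published figures are 5.5 GB/s raw, roughly 8-9 GB/s typical compressed, and ~22 GB/s as the best case on highly compressible data.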

NXGamer wasn't speaking BS, he actually understands how this stuff works xD
 

BeardGawd

Banned
Again it just doesn't make sense unless he's comparing the PS5's OS to the Series S's OS implementation. MS uses virtualized hypervisor layers for their OS which naturally will incur more resources, especially when the OS is in 4K like on Series X.

So he either needs to provide some sources or expect people to question some of his claims. It is what it is.



The hardware adjustments for SFS are slight adjustments and the biggest of those being the mip-blending logic customization on the Series GPU. Sony's approach is different partly for the reason you just listed, however their SSD I/O management will always require some sort of software alongside the hardware to make use of it.

If they have any aspects on the software side that do things similar to SFS, it wouldn't be called SFS.



Yes, compared to Series S. Series X OS UI was updated to 4K earlier this year IIRC.



We are talking about background OS tasks. They are not bandwidth-intensive, they do not need a lot of bandwidth to operate. How fast do you think that 1 GB of DDR3 in PS4 Pro was compared to the GDDR5? Yet it still served as a pool for background OS tasks.

Peak compressed bandwidth for PS5's SSD I/O is 22 GB/s, which is higher than a single DDR3-2133 module (peak bandwidth 17 GB/s...btw they had already produced reliable compressed bandwidths for their SSD I/O of 17 GB/s over a year ago, thanks to algorithm improvements). It can absolutely mimic the functionality of PS4 Pro's extra 1 GB of DDR3 RAM via some portion of the SSD space and the I/O subsystem's bandwidth capability.

NXGamer wasn't speaking BS, he actually understands how this stuff works xD
Didn't the PS4 OS take up 3.5 GB of RAM at 1080p? So the PS5 having more features plus 4K at the same 3.5 GB is a nice improvement and is reasonable.
 

Three

Member
Xbox Series S RAM Not an Issue, But GPU Performance Presents Challenges for Future Titles – 4A Games

Bold of you to call one of the only developers to have made a full RTGI based 60 FPS game on the current generation as liars and insincere.
And this is exactly what I'm talking about when I say there are a fair few of the obvious gang who are not being sincere. You being another obvious one. When did I call 4A liars? This is what your mate Riky said:
... developers are happy with the GPU but memory amount and bandwidth have been the major problem.

Now apply it to your 4A quote. Is it contradictory regarding GPU and memory? Yes. Now learn the definition of 'some devs'. Devs have different needs for different engines, but it's OK to accept that low specs make it difficult for some devs, instead of just pretending the specs are made by god himself and that if devs are struggling they just aren't doing it right, or don't have access to a Series S, or whatever. What's even worse is flip-flopping on this idea day by day depending on what your beloved company has done most recently.
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
And this is exactly what I'm talking about when I say there are a fair few of the obvious gang who are not being sincere. You being another obvious one. When did I call 4A liars? This is what your mate Riky said:

This is what you said:

The people who say "The ram situation is giving some devs trouble" are sincere. The people saying that's a lie are not.

You're directly calling anyone who disagrees with Series S having RAM issues a liar and/or insincere. All I did was point to probably one of the most technically proficient multiplatform developers, who outright states Series S RAM is not an issue. Make of that what you will.

DarkMage619 already answered your question in the most satisfactory manner earlier in the thread: different developers may have different experiences and there is no absolute. For any developer that you can find who will say they're having problems, there are also developers who have said they aren't.

Also, stop getting so hung up over what Riky may or may not have said months ago, frankly it's very very weird.
 
Last edited:

Hoddi

Member
Peak compressed bandwidth for PS5's SSD I/O is 22 GB/s, which is higher than a single DDR3-2133 module (peak bandwidth 17 GB/s...btw they had already produced reliable compressed bandwidths for their SSD I/O of 17 GB/s over a year ago, thanks to algorithm improvements). It can absolutely mimic the functionality of PS4 Pro's extra 1 GB of DDR3 RAM via some portion of the SSD space and the I/O subsystem's bandwidth capability.

NXGamer wasn't speaking BS, he actually understands how this stuff works xD
I don't think he does, if that's his take. You're basically describing using the SSD as writeable RAM while encoding 4K60 video. Leaving aside that the SSD could never keep up, you'd be looking at multiple terabytes of writes in a single hour.

Then again, it could be used strictly to read data. The SSD can certainly be used to quickly unload and reload data as necessary. This very thread is about Xbox doing exactly that.
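Rough numbers behind the "terabytes per hour" point, assuming the working buffer would hold uncompressed 4K frames (the worst case; the encoded output itself is tiny by comparison):

3840 × 2160 pixels × 4 bytes × 60fps ≈ 2.0 GB/s
2.0 GB/s × 3600s ≈ 7.2TB written per hour
versus an encoded 4K60 stream at ~50 Mbps ≈ 22.5GB per hour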
 

Three

Member
This is what you said:



You're directly calling anyone who disagrees with Series S having RAM issues a liar and/or insincere. All I did was point to probably one of the most technically proficient multiplatform developers, who outright states Series S RAM is not an issue. Make of that what you will.

DarkMage619 already answered your question in the most satisfactory manner earlier in the thread: different developers may have different experiences and there is no absolute. For any developer that you can find who will say they're having problems, there are also developers who have said they aren't.

Also, stop getting so hung up over what Riky may or may not have said months ago, frankly it's very very weird.
Don't be dense, that's not what I'm doing. Let me show you what you quoted from me that supposedly shows I'm calling people liars for disagreeing:

The people who say "The ram situation is giving some devs trouble" are sincere. The people saying that's a lie are not

Some devs having trouble with the RAM situation is a fact. A fact some don't like to accept, so they call it a lie, or say the dev had no access to a Series S so they're ignorant, or some other nonsense, when the fact is that, yes, the small, low-bandwidth RAM situation is giving some devs trouble. It's you who is again dealing in absolutes and pretending I am. I'm not the one calling anyone liars.

The Riky quote is from yesterday and is what started this conversation, so drop the "you're being weird" act over me pointing out that your 4A quote contradicts Riky's quote, not mine, if you're dealing in absolutes the way you are and don't understand the definition of 'some devs'.
 
Last edited:
Nah....you are buying into marketing way too much here, dude. Sony doesn't market their APIs like Microsoft does. Sony doesn't sell their APIs like Microsoft does. So you are never going to see videos explaining how Sony is improving their console. That doesn't mean it isn't happening. XSX and PS5 are evenly matched even if Xbox has the edge. I think both are trying to make improvements and that's a good thing.
Correct, this is all marketing...

This is their way of addressing the main concerns people have about the Series S "being an underpowered next-gen console". This is Microsoft basically saying "see, the Series S is more powerful and we will make it better over time, so buy it".

I am honestly sick of the smoke and mirrors, acquisition talks, pointless comparisons, etc. Let’s get to these next gen games already.
 

adamsapple

Or is it just one of Phil's balls in my throat?
The Riky quote is from yesterday and is what started this conversation, so drop the "you're being weird" act over me pointing out that your 4A quote contradicts Riky's quote, not mine, if you're dealing in absolutes the way you are and don't understand the definition of 'some devs'.

Your literal first post in this topic is invoking a 3 month old post to stoke fire and get reactions, please don't play coy.

You were the one who was saying the memory was fine and lazy devs should be using SFS and RDNA 2.
It will definitely help with BVH raytracing.

What happened to all this talk you were doing here though?:

https://www.neogaf.com/threads/digi...-memory-issues.1636514/page-16#post-266168185


You've posted little to nothing about the actual topic at hand and mostly as some kind of vendetta and gotcha against Riky. Ease up on picking on individual members relentlessly, it's not a good look.
 
Last edited:

Three

Member
Your literal first post in this topic is invoking a 3 month old post to stoke fire and get reactions, please don't play coy.




You've posted little to nothing about the actual topic at hand and mostly as some kind of vendetta and gotcha against Riky.
Not playing coy at all, that's you regarding what's being discussed. I pointed out how quick he was to flip from calling others liars for pointing out the low-RAM situation to now, just yesterday, saying a 'major problem' has been addressed and the GPU is great. That was the point.

If you don't want to discuss that then why are you quoting and replying to me by pretending I'm the one who called anyone a liar? Follow the conversation and don't gaslight people.
 
Last edited:

Riky

$MSFT
Again it just doesn't make sense unless he's comparing the PS5's OS to the Series S's OS implementation. MS uses virtualized hypervisor layers for their OS which naturally will incur more resources, especially when the OS is in 4K like on Series X.

So he either needs to provide some sources or expect people to question some of his claims. It is what it is.



The hardware adjustments for SFS are slight adjustments and the biggest of those being the mip-blending logic customization on the Series GPU. Sony's approach is different partly for the reason you just listed, however their SSD I/O management will always require some sort of software alongside the hardware to make use of it.

If they have any aspects on the software side that do things similar to SFS, it wouldn't be called SFS.

You expect DF to hang developers who are under NDA out to dry? Naive or what.
DF didn't do that over the Tier 2 VRS hardware claim either; people said similar things about that - "it's just a DX12U term" - but it wasn't, and they were proven to be correct, so they have a track record of being right.

Just like hardware-assisted VRS and Mesh Shaders, if it wasn't mentioned in the Road to PS5 then there is a reason for that. SFS goes back to Turing cards; it isn't an MS term and anybody can use it, as it's found on non-MS hardware and non-MS APIs.

It is what it is.
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
Riky thicc_girls_are_teh_best


The 13.5 GB RAM being available has been known since 2020 when DF first got their hands on the Series X specs from MS.



In terms of how the memory is allocated, games get a total of 13.5GB in total, which encompasses all 10GB of GPU optimal memory and 3.5GB of standard memory. This leaves 2.5GB of GDDR6 memory from the slower pool for the operating system and the front-end shell. From Microsoft's perspective, it is still a unified memory system, even if performance can vary.


This actually dovetails quite neatly into Microsoft’s declaration that developers will have 13.5GB of that GDDR6 RAM pool available to them, while the remaining 2.5GB sits in the background, dealing with the shell, UI and other non-obvious tasks.



How this all splits out for the developers is fascinating. Of the total 16 GB memory capacity, Microsoft’s solution essentially reserves the vast majority for the actual running of games. A total of 13.5 GB – 10 GB of GPU optimal memory and 3.5 GB of standard memory – is dedicated to games themselves, while the remaining 2.5 GB from the slower pool is used for the operating system and background functions, which means games get more potential bandwidth, the results of which, in theory, should be more than tangible.
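Summing the quoted figures (the bandwidth numbers are MS's published Series X specs): 16GB total = 10GB GPU-optimal (at 560 GB/s) + 3.5GB standard (at 336 GB/s) for games, i.e. 13.5GB for titles, with the remaining 2.5GB of the slower pool reserved for the OS and shell.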
 
Last edited:
He said nothing about split memory, he said low-bandwidth memory. The split memory is still there too, even if they freed some of it from the OS.
No need to spread misinformation.

"The much lower amount of memory and the split memory banks with drastically slower speeds will be a major issue."

He specifically complains about the split memory, which, as we know, doesn't apply to games.
 