
Xbox June GDK: More dev memory for Series S and more.

PaintTinJr

Member

Will having more ram available positively impact performance, specifically fps, in a situation where the software is memory constrained?

In the XsS for next-gen game experiences, going by the resolution and cutbacks of the UE5 Matrix demo, the answer will be no. The game is already running at sub-HD-ready resolutions at times, IIRC, to hit the 30fps target, so the demo is clearly GPU processing constrained, not CPU constrained - where memory could help fps if the demo were CPU/memory bound.

The statement suggests the XsS is getting better for its bottlenecks, which it is, sort of, but the GPU is the bottleneck they'd need to alter to improve fps "graphic performance", IMO.
 

MarkMe2525

Gold Member
They think it was because of improvements in development tools/drivers. They directly say that the system being updated over time is what resulted in the game's performance being improved.
I understand that others are using the statement to make some ridiculous claims. I am not in that camp.

To be honest, the convo got into the weeds, but if you go back a page or two, some bizarre claims were made by painttinjr.
 

MarkMe2525

Gold Member
In the XsS for next-gen game experiences, going by the resolution and cutbacks of the UE5 Matrix demo, the answer will be no. The game is already running at sub-HD-ready resolutions at times, IIRC, to hit the 30fps target, so the demo is clearly GPU processing constrained, not CPU constrained - where memory could help fps if the demo were CPU/memory bound.

The statement suggests the XsS is getting better for its bottlenecks, which it is, sort of, but the GPU is the bottleneck they'd need to alter to improve fps "graphic performance", IMO.
Even though you didn't answer my question and cherry-picked a situation instead, I'll let it go, as it's really unimportant and the conversation has already been derailed enough.

I have noticed, though, that for the 3rd time you have failed to answer my question about the "certification requirements MS would be admitting to not enforcing", but we both know that was a trick question, as neither you nor I know what those are. I'm sure you would not post that claim if you could go back.

I'll admit I was wrong in that I was hammering you about statements you may not have clearly thought out. It was unnecessary, as I should have just disagreed and moved on.
 
It's already confirmed for Forza Motorsport.
Nobody will blame or demand raytracing on a Series S game, so it's actually low on most developers' lists of priorities.

Forza Motorsport is the exception rather than the rule and it's also first party.
 

Riky

$MSFT
Nobody will blame or demand raytracing on a Series S game, so it's actually low on most developers' lists of priorities.

Forza Motorsport is the exception rather than the rule and it's also first party.

It shows it's possible though; DF describe it in the Weekly as the best implementation of RT on console so far, and Series S is running it.
Since it doesn't affect gameplay, I'm personally not bothered about it appearing in Series S versions, but it's nice to know the hardware can do it.
 

DeepEnigma

Gold Member
It shows it's possible though
I think we have to wait for the results before we know "how possible" it will be. I'm looking to be pleasantly surprised if it's competent, and not 640p or so bad they should not have included it. Software-wise, I am definitely eager to see the fruits of their labor.
 
So I guess those rumors, and DF's reporting on devs complaining about the Series S memory issues, were true after all. I was told on this forum that it was not an issue.

Too little, too late. Hundreds of extra MB is nothing when you are still bound by memory bandwidth and an extremely low tflops count for a next-gen console. They should've always targeted a 6 tflops console to match the X1X tflops count and given it the same memory allocation as the X1X, with an SSD and CPU upgrade. Too many cost-cutting measures just to hit a $299 price point, with no regard for how watered down the experience would be for their consumers. Now they are scrambling and hoping to find RAM, but it won't be enough.
Consoles are always going to be bound by something. The Switch fails at all levels (CPU, GPU, memory capacity, memory speed, IO speeds), which is why it struggles the way it does. The PS4 had an awful CPU, so bad that even the Switch could run PS4 games well enough with its CPU. The PS5 and Series have two big weaknesses: the lack of dedicated RT hardware on the level of Nvidia GPUs, and very weak memory gains.

They top out at 16GB when the One X had 12GB and the PS4/Xbox One had 8GB (due to the price of RAM at the time being super cheap), while the 360 had 512MB and the PS3 had 256MB. It's a very anemic increase (due to the price of RAM at the time being very expensive), which is why SSD tech is being pushed to cope with it, but as we all know, SSDs are incredibly slow compared to RAM, so they can't replace RAM in most scenarios. These two aspects will haunt current-gen consoles through their lifetime and will be significantly improved should we receive enhanced versions.
 

PaintTinJr

Member
Even though you didn't answer my question and cherry-picked a situation instead, I'll let it go, as it's really unimportant and the conversation has already been derailed enough.
I didn't answer the question directly because I was merely mirroring the lack of sincerity you had shown by moving the goalposts when your own scenarios weren't going to result in more fps.

My orthogonal answer at least represents the real situation of most XsS AA-AAA games being made with an engine that has already highlighted how bottom-heavy the XsS is, and how it struggles to be the XsX just at a lower resolution, as was claimed by the same engineering team at launch, IIRC.

Freeing additional RAM to give a mostly GPU (performance) bound console an even bigger bottom-heavy imbalance isn't going to transform that typical and expected situation, is it?

Anyway, to actually answer your question this time - as I always intended - I would say that in most of those edge situations it probably won't; specifically in regards to the two words "graphic performance", which were in the sentence you kept quoting. Given that you asked it as a rhetorical question, I suspect this wasn't the answer you expected.

Being VRAM constrained suggests you are also going to be overburdening your GPU caches, with the cached data being replaced too frequently from the fully filled VRAM you are using. So the additional VRAM will just be alleviating the VRAM bandwidth updates which are the reason for needing to free up more VRAM - in this fictional scenario - and may even increase GPU cache misses and mildly hamper fps too, because the ratio of VRAM data wanting to be used to GPU cache has actually worsened, AFAIK.

I don't see a scenario where needing more VRAM to house more data - because the VRAM is being filled and thrashed with updates - doesn't just move the bottleneck to the GPU caches. Meaning you would need bigger - or more - GPU caches, or need to lower the rendering complexity and data in VRAM to let the GPU caches help output more frames per second.
I have noticed, though, that for the 3rd time you have failed to answer my question about the "certification requirements MS would be admitting to not enforcing", but we both know that was a trick question, as neither you nor I know what those are. I'm sure you would not post that claim if you could go back.

I'll admit I was wrong in that I was hammering you about statements you may not have clearly thought out. It was unnecessary, as I should have just disagreed and moved on.
Yet again, it just came across as disingenuous posting, and I wasn't bothered by letting you insincerely ask over and over; especially as I've never signed an NDA for such info.

The info is baked into the release documents of the consoles anyway - about the resolutions and frequencies they support - and we can thank outlets like NXGamer and DF for holding publishers' feet to the fire for the last 15 years, so that even ignoring TRCs we have de facto requirements like stable 30/60, no tearing - unless sellable to DF - stable frame-pacing, and TV-spec upscaled or native resolutions.

It is all obvious stuff that no one on GAF would probably be interested in seeing in your "secret" document covered by an NDA, because it is a "no shit, Sherlock" type situation. It is hardly mind-blowing that they have to make games that update at compatible rates (30/60), don't have artefacts that cause eye strain or epilepsy like tearing, and use resolutions for the TVs that are in all the homes in the world, is it?
 

MarkMe2525

Gold Member
Being VRAM constrained suggests you are also going to be overburdening your GPU caches, with the cached data being replaced too frequently from the fully filled VRAM you are using. So the additional VRAM will just be alleviating the VRAM bandwidth updates which are the reason for needing to free up more VRAM - in this fictional scenario - and may even increase GPU cache misses and mildly hamper fps too, because the ratio of VRAM data wanting to be used to GPU cache has actually worsened, AFAIK.
Got you: more available memory during memory-constrained situations = same or worse performance. Makes sense.

The info is baked into the release documents of the consoles anyway - about the resolutions and frequencies they support - and we can thank outlets like NXGamer and DF for holding publishers' feet to the fire for the last 15 years, so that even ignoring TRCs we have de facto requirements like stable 30/60, no tearing - unless sellable to DF - stable frame-pacing, and TV-spec upscaled or native resolutions.
OK, so earlier in the thread you said that by stabilizing frame rates that were dropping below 30 or 60fps, MS would be admitting to not enforcing their "certification requirements". These "certification requirements" were actually just the listed supported resolutions and display frequencies that MS mentioned in advertisements and press releases. I'm glad that's cleared up.

It is all obvious stuff that no one on GAF would probably be interested in seeing in your "secret" document covered by an NDA, because it is a "no shit, Sherlock" type situation. It is hardly mind-blowing that they have to make games that update at compatible rates (30/60), don't have artefacts that cause eye strain or epilepsy like tearing, and use resolutions for the TVs that are in all the homes in the world, is it?
I'm not trying to be dismissive, but what are you even going on about? NDAs? Secret documents? Also, I don't think anyone on the entirety of NeoGAF ever questioned whether games should work on modern displays. Why is this even being brought into the discussion? Because it definitely is a "no shit, Sherlock" type of situation.

How about this. I agree to disagree with pretty much all of this.
 

adamsapple

Or is it just one of Phil's balls in my throat?
I think we have to wait for the results before we know "how possible" it will be. I'm looking to be pleasantly surprised if it's competent, and not 640p or so bad they should not have included it. Software-wise, I am definitely eager to see the fruits of their labor.

I mean, if Metro EE has to drop to 1080p or thereabouts when stressed to get RTGI + 60 FPS on SX and PS5, we should expect a similar level of DRS on Series S too if they want all the same features, right?
 

PaintTinJr

Member
Got you: more available memory during memory-constrained situations = same or worse performance. Makes sense.
My scenario wasn't vague; it was a genuine attempt to replicate the hypothetical and put a face on the scenario their sentence describes in the GPU - because they are implying it is GPU memory bound with "graphic performance", and even your two earlier examples of ray tracing and denoising would logically be getting done on the GPU, so I assumed it was established this memory limitation is VRAM.
More VRAM is more memory, but unlike CPU RAM, which is part of a memory hierarchy that impacts the entire system's performance, adding more VRAM only impacts the flow of data in and out of VRAM and the GPU caches, which get filled from the VRAM.

Are you saying the exact scenario I described wouldn't be GPU cache limited? And therefore have a different take on it?

If you are now saying that you believe the sentence is more about being system RAM bound - either doing ray tracing/denoising on the CPU or, more logically, a task not directly related to "graphic performance" - and the extra memory is just to improve "program performance", then I'm not arguing against that, just the wrong use of the term "graphic" in "graphic performance".
 

PaintTinJr

Member
I mean, if Metro EE has to drop to 1080p or thereabouts when stressed to get RTGI + 60 FPS on SX and PS5, we should expect a similar level of DRS on Series S too if they want all the same features, right?
Most definitely, if consoles were still targeting a large audience of SD TV owners with S-Video, SCART and component video. But the drop in resolutions you are describing, along with the fx still being compromised, outputting on TVs with 1080p or better native panels, just illustrates the conundrum of the XsS as a next-gen system.

Previous-gen faked graphic fx at 1080p will likely provide a superior image with far less noise, because the XsS isn't doing better pixels in a clear-cut way that makes them worthwhile. Which then raises the obvious question: which consumers does the XsS's existence actually solve a problem for?
 

Three

Member
Will having more ram available positively impact performance, specifically fps, in a situation where the software is memory constrained?
No, not really, unless you have a broken game where it ends up HDD thrashing like Skyrim on PS3. More memory will not help with fps; it can help with res if the framebuffer size is being held back by memory size, though I suspect most games are GPU limited and not memory limited there.
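A quick back-of-envelope sketch of the framebuffer point above: the swap chain's memory cost scales with resolution, not framerate. Illustrative Python only; real engines also allocate depth/stencil targets, G-buffers and padding, so actual budgets are larger.

```python
# Rough framebuffer footprint at a few target resolutions.
# Illustrative numbers only: assumes 4 bytes per pixel (RGBA8)
# and a double-buffered swap chain, nothing else.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffer_count=2):
    """Approximate memory for the swap chain alone, in MiB."""
    return width * height * bytes_per_pixel * buffer_count / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.1f} MiB")
```

So dropping from 4K to 1080p frees a fixed chunk of memory per target, which is why capacity pressure tends to show up as lower resolution rather than higher framerate.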
 
Series S and NeoGAF users triggered by its mere existence. Name a better duo.

In any other place this news would be positive, yet here we are.

 

MarkMe2525

Gold Member
My scenario wasn't vague; it was a genuine attempt to replicate the hypothetical and put a face on the scenario their sentence describes in the GPU - because they are implying it is GPU memory bound with "graphic performance", and even your two earlier examples of ray tracing and denoising would logically be getting done on the GPU, so I assumed it was established this memory limitation is VRAM.
More VRAM is more memory, but unlike CPU RAM, which is part of a memory hierarchy that impacts the entire system's performance, adding more VRAM only impacts the flow of data in and out of VRAM and the GPU caches, which get filled from the VRAM.

Are you saying the exact scenario I described wouldn't be GPU cache limited? And therefore have a different take on it?

If you are now saying that you believe the sentence is more about being system RAM bound - either doing ray tracing/denoising on the CPU or, more logically, a task not directly related to "graphic performance" - and the extra memory is just to improve "program performance", then I'm not arguing against that, just the wrong use of the term "graphic" in "graphic performance".

No, not really, unless you have a broken game where it ends up HDD thrashing like Skyrim on PS3. More memory will not help with fps; it can help with res if the framebuffer size is being held back by memory size, though I suspect most games are GPU limited and not memory limited there.
[attached benchmark charts comparing average fps across different RAM capacities]


I don't know, man. Comparing the dual-channel setups, I see an increase in avg fps with increased capacity. Obviously these are pretty large increases, but a few hundred MB of additional capacity may help stabilize frame rates. MS's statement was pretty simple: these changes "CAN" increase graphical performance in memory-constrained situations. MS isn't promising the moon here; they are saying it can help. I don't understand the pushback.
 

Three

Member
[attached benchmark charts comparing average fps across different RAM capacities]


I don't know, man. Comparing the dual-channel setups, I see an increase in avg fps with increased capacity. Obviously these are pretty large increases, but a few hundred MB of additional capacity may help stabilize frame rates. MS's statement was pretty simple: these changes "CAN" increase graphical performance in memory-constrained situations. MS isn't promising the moon here; they are saying it can help. I don't understand the pushback.
Yeah, those are large system RAM capacity differences and really low frametimes you're dealing with. You might be IO bound for brief moments, with the OS dealing with increased I/O activity and background processes, enough to sway the average (is it a rolling average or a total, though?), but notice the instantaneous frametimes are pretty much the same for the scenes. At 16ms or 33ms typical frametimes, these I/O spikes would have less of an effect on averages too. You wouldn't typically get noticeable framerate improvements with more RAM in the same configuration. 'Graphics performance' here refers more to resolution and better sampling, I would say. With the 120fps mode it will possibly make things more stable and lower averages very slightly (we aren't talking about 8GB here, even), but not much difference to your instantaneous frametime.
 

Three

Member
Consoles are always going to be bound by something. The Switch fails at all levels (CPU, GPU, memory capacity, memory speed, IO speeds), which is why it struggles the way it does. The PS4 had an awful CPU, so bad that even the Switch could run PS4 games well enough with its CPU. The PS5 and Series have two big weaknesses: the lack of dedicated RT hardware on the level of Nvidia GPUs, and very weak memory gains.

They top out at 16GB when the One X had 12GB and the PS4/Xbox One had 8GB (due to the price of RAM at the time being super cheap), while the 360 had 512MB and the PS3 had 256MB. It's a very anemic increase (due to the price of RAM at the time being very expensive), which is why SSD tech is being pushed to cope with it, but as we all know, SSDs are incredibly slow compared to RAM, so they can't replace RAM in most scenarios. These two aspects will haunt current-gen consoles through their lifetime and will be significantly improved should we receive enhanced versions.
Both the PS3 and 360 had 512MB. The 360 had unified memory, though, while the PS3 had a 256+256 split.
 

MarkMe2525

Gold Member
Yeah, those are large system RAM capacity differences and really low frametimes you're dealing with. You might be IO bound for brief moments, with the OS dealing with increased I/O activity and background processes, enough to sway the average (is it a rolling average or a total, though?), but notice the instantaneous frametimes are pretty much the same for the scenes. At 16ms or 33ms typical frametimes, these I/O spikes would have less of an effect on averages too. You wouldn't typically get noticeable framerate improvements with more RAM in the same configuration. 'Graphics performance' here refers more to resolution and better sampling, I would say. With the 120fps mode it will possibly make things more stable and lower averages very slightly (we aren't talking about 8GB here), but not much difference to your instantaneous frametime.
I can get behind what you're saying, but an fps avg is derived from a collection of instantaneous frame times. I'm not refuting you, just highlighting that one cannot say having more available memory won't help performance in absolute terms. Might I add that MS, in their statements, were also not speaking in absolutes; they clearly said it "can" help.

That's why I took issue with PaintTinJr's claim that the statement MS made to developers was in fact incorrect, along with some other claims that if a game fails to lock to 30fps or 60fps well enough (whatever that means), then the game is in violation of MS "certification requirements" (an absurd statement).

IMHO if the statement is true - without a VRR addendum - it is an acknowledgment by Xbox that they are letting games wrongfully pass their certification that aren't meeting the targeted performance of 30fps or 60fps well enough
 

PaintTinJr

Member
I can get behind what you're saying, but an fps avg is derived from a collection of instantaneous frame times. I'm not refuting you, just highlighting that one cannot say having more available memory won't help performance in absolute terms. Might I add that MS, in their statements, were also not speaking in absolutes; they clearly said it "can" help.

That's why I took issue with PaintTinJr's claim that the statement MS made to developers was in fact incorrect, along with some other claims that if a game fails to lock to 30fps or 60fps well enough (whatever that means), then the game is in violation of MS "certification requirements" (an absurd statement).
Just for the purpose of clarity:

Are you saying that Xbox doesn't have a certification program?
Or are you saying they do have a certification program, but there is no means to fail certification by submitting an Xbox One/XsS Cyberpunk day-one experience?

Or are you saying that there is a program, and games do fail on performance grounds, but Xbox would never push back against a publisher wanting to go gold - and block a release - regardless of the technical shortfalls in tearing or judder, which may or may not cause epilepsy and the other conditions the PlayStation bootup warning has mentioned since the end of the PS3 generation?
 

MarkMe2525

Gold Member
Just for the purpose of clarity:

Are you saying that Xbox doesn't have a certification program?
Or are you saying they do have a certification program, but there is no means to fail certification by submitting an Xbox One/XsS Cyberpunk day-one experience?

Or are you saying that there is a program, and games do fail on performance grounds, but Xbox would never push back against a publisher wanting to go gold - and block a release - regardless of the technical shortfalls in tearing or judder, which may or may not cause epilepsy and the other conditions the PlayStation bootup warning has mentioned since the end of the PS3 generation?
I'm not making any claim about their certification requirements; you did. I pointed out the absurdity of this statement right here.
Under all non-VRR situations the games target 30 and 60fps, as is required to pass Xbox certification - and other than the odd percentile dip in analyses shown by NXGamer, etc., typically games on all consoles have stable 30 or 60fps frame-rates, so the frame-rate in games should be locked already with no performance gain to be had - without VRR - because games use dynamic resolution and drop features to hit those frame-rates to match the fixed display refreshes.
This right here is the strawman. You misrepresent what it means to have a target framerate. Both you and I know that a "target framerate" is the maximum number of frames per second that a game should be sending to whatever display device; this does not imply some sort of minimum frequency requirement. If a game fails to hit said target framerate, then it is typically a less pleasing experience, but by no means does that disqualify it from release on the Xbox platform. This is fact per Microsoft Store policy 10.4.1. All that is required per 10.4.1 is that products must be "compatible with the software, hardware and screen resolution requirements specified by the product".
Example: Elden Ring's performance mode targets 60fps but finds itself, in most situations, failing to hit that target. There is absolutely a "performance gain to be had", and the fact that this game stays mostly below 60fps does not in any way imply that Xbox are "letting games wrongfully pass their certification", as you claim above and below. To be clear, before my statement gets misrepresented, I am not implying that this GDK update will fix Elden Ring's issues.

IMHO if the statement is true - without a VRR addendum - it is an acknowledgment by Xbox that they are letting games wrongfully pass their certification that aren't meeting the targeted performance of 30fps or 60fps well enough - given that this improvement makes a meaningful difference - and are in essence selling goods that should be held back until the performance is stable.
 
Riky thicc_girls_are_teh_best


The 13.5 GB RAM being available has been known since 2020 when DF first got their hands on the Series X specs from MS.

Already knew this; that's not the point of contention, tho. The point of contention is Rich claiming the PS5 OS uses 3.5 GB of the 16 GB of RAM, which doesn't really make any sense, considering most accounts place it at 2 GB, and it has further caches for the SSD I/O, as well as logic for handling decompression and file I/O routines, which would not require as many CPU tasks as on the Series systems, nor a scratchpad of space in the OS's RAM footprint for caching data to/from the SSD (AFAIK, the Series systems' SSDs use a DRAM-less flash memory controller).

Even accounting for the OS UI being 4K native, and even accounting for planned future devices like PSVR, it just doesn't make much sense for the PS5's OS to reserve 3.5 GB of the 16 GB, a whole 1 GB more than Series X's, considering it's not even using hypervisors the way Xbox has since the XBO. So I'm guessing Richard's comparison was WRT PS5's OS compared to Series S's, which would make a bit more sense (though even that is kind of dubious; Series S reserves 2 GB of the 10 GB for the OS, so his quote would still peg the PS5's OS RAM footprint as larger than Series X's when it doesn't seem like there's much reason for it to be).
 
I thought the split memory didn't apply to the XSS. The ratio is different, and I believe the full fast memory is usable for games on the XSS, whereas on the XSX both memory pools are used for games.

It does, but only in that the OS reserves 2 GB and the game logic occupies the other 8 GB. Though as Three references, all of this is virtual addressing as far as the applications are concerned; they don't literally page physical RAM locations, and a table map is used to point to a physical memory address via a virtual reference in the map called by the pointer.

So it's actually a bit tricky to know where the OS for Series S sits in the physical RAM space, though one could assume it maps to a single module, as pulling any OS data from 2 GB of space mapped to a single contiguous physical RAM module would give more than enough bandwidth for that OS data. With Series X it is more complicated; everything's still handled via virtual addressing (that goes for literally every x86/x86-64/Android/iOS etc. computer system in existence today; console-wise, direct physical address pointing probably hasn't been a thing since the end of the 7th gen at large, except in cases where a game needed to use some hand-written assembly, and AFAIK only some of Sony's games have really done that in the past few years), but the OS is made to reside within a virtual address space that points to physical addresses within the lower-bound 1 GB of the six 2 GB modules.

They do this because, if the GPU actually needs the full RAM bandwidth, it won't be blocked from it by OS data being physically written to the 1 GB modules via the virtual address pointer (the same reason why, for non-graphics data, the virtual address pointer points to physical addresses within the lower bound of the 2 GB modules: to prevent bandwidth-starving GPU-critical operations).
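For anyone following along, the virtual-to-physical mapping being described can be sketched in a heavily simplified way. This is a hypothetical toy model; real page tables are multi-level, hardware-walked structures, but the frame choice is exactly the lever that lets an OS steer data toward particular physical modules.

```python
# Minimal sketch: applications see contiguous virtual pages, while a
# page table decides which physical frame (on whichever module) backs each.

PAGE_SIZE = 4096

class PageTable:
    def __init__(self):
        self.mapping = {}          # virtual page number -> physical frame

    def map(self, vpn, frame):
        self.mapping[vpn] = frame

    def translate(self, vaddr):
        vpn, offset = divmod(vaddr, PAGE_SIZE)
        frame = self.mapping[vpn]  # a missing entry (KeyError) == page fault
        return frame * PAGE_SIZE + offset

pt = PageTable()
pt.map(0, 42)                      # virtual page 0 -> physical frame 42
print(hex(pt.translate(0x123)))    # frame 42 * 4096 + offset 0x123
```

The point mirrored from the post above: the application only ever sees virtual page 0; whether frame 42 lives on a 1 GB or 2 GB module is invisible to it.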

The Series S has 10 GB in total. I think 8 GB usable was a given, so maybe they have up to 8.5 GB now on Series S, with unused features disabled if the game doesn't need them. The real question is: if they reduced the OS footprint, maybe they also reduced the OS for Series X, freeing more memory up for that machine as well.

If they did, then it won't mean much for GPU-bound operations. The upper-bound 1 GB physical addresses of the six 2 GB modules are already reserved for graphics data, as are the 1 GB modules, so that the GPU can have maximum bandwidth when it needs it. Freeing up some of the virtual address space reserved for the OS (all virtual addresses MUST point to a physical address when actively mapped, btw; you can't just "make up" a virtual memory address that points to nothing) doesn't mean the GPU now has 512 MB of extra addressable space, unless something else is having its addressable space reduced.

It also means that, since this only happens when background processes deemed not vital for OS functionality can be shut off, whatever additional capacity is afforded will effectively be variable and random. So game applications can't really be coded in a way that reliably counts on that additional 512 MB or whatever of RAM to play with; they can maybe code a few things with less-critical functions in mind if the space is available, which the OS can then handle in the background, I suppose. But it's not going to be anywhere near the same as if the system had a hard-set 14 GB of RAM afforded for games, as an example (the same applies to Series S).
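A sketch of what "opportunistic" use of that variable headroom might look like in practice. Everything here is hypothetical: the budget figure, the helper, and the asset names are made up, and real GDK memory APIs are not public.

```python
# Hypothetical sketch: a game cannot budget for the variable extra RAM,
# but it could treat whatever is spare *right now* as an evictable bonus.
# BASE_BUDGET_MB and the asset list are invented for illustration.

BASE_BUDGET_MB = 8 * 1024   # the guaranteed allocation the game is designed for

def plan_allocations(free_mb, optional_assets):
    """Load optional, evictable extras only if spare memory exists right now."""
    spare = free_mb - BASE_BUDGET_MB
    loaded = []
    for name, size_mb in optional_assets:
        if size_mb <= spare:
            loaded.append(name)
            spare -= size_mb
    return loaded

# With ~300 MB of spare headroom, only the first optional asset fits:
print(plan_allocations(8 * 1024 + 300,
                       [("hi-res shadow maps", 200), ("extra mip level", 200)]))
```

Which is the post's point in code form: the extras have to be droppable, because next boot the headroom might simply not be there.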

I think, correct me if I'm wrong, that the OS RAM allocation isn't always 2.5 GB on both Series consoles; it's just that the OS can pull a maximum of 2.5 GB. Thus if a game on Series X needs, let's say, 13.8 GB, Microsoft "can" allow the extra 0.3 GB to come from the OS allocation if it's really needed. But I think there will be a certain limit on how much allocation can be borrowed from the OS, so that it wouldn't crash when running a game that needs the extra memory, to prevent performance hiccups.

The OS may not always use 2.5 GB, but games absolutely cannot be programmed as if to expect something higher than 13.5 GB being afforded to them by default. Whatever extra RAM is provided will be random and variable, and that will impact what type of benefits it can provide for game applications.

The reason is that this doesn't work the way you guys are thinking. The extra RAM is provided in the case where the OS has background services and utilities that no longer need to run and can be shut down, effectively freeing up more of the virtual addressable space (since now there is more physical address space available to be pointed to from the virtual address space).

Split RAM is fine. There is nothing wrong with split RAM except the added complexity for devs, and it's a cost-cutting measure for platform holders. The alternatives are a more expensive system or a system with less RAM, so why not split RAM? That's a good thing.

I don't see why, when a veteran engine dev says the low RAM was giving them trouble, and they ship Doom Eternal with ray tracing missing on Series S because of it, they should be dismissed as ignorant for also mentioning that it's split RAM, when it is. 🤷‍♂️


I see you edited to ask this. What do you mean by virtual memory? As in SSD space used as RAM space? Yeah, I'm sure SSD space is reserved by the OS and used as a pagefile to move data for less demanding, not-always-needed processes.
I mean in terms of memory allocation and virtual address space, though. This, basically:

https://docs.microsoft.com/en-us/windows-hardware/drivers/gettingstarted/virtual-address-spaces

People are oversimplifying memory management down to physical memory, trying to suggest this chunk of physical memory is for this game or component (GPU) and that chunk of physical memory is for that, when that's not really how it is. The OS doesn't need a 2GB chunk of physical memory like that, and it is dynamic even for the game, with a lower limit.

If you're trying to suggest VAS doesn't apply to Xbox because of 'coding to the metal' or something weird, you can look at the other listed improvement, which makes it clear:

Really appreciate the clarification on "virtual addressability" in this context, because I think some people are not aware of what that actually means and, yeah, might be thinking of virtual RAM paging or something like that.

Which is absolutely not the case with the way MS are describing it in the document you linked.

Actually helped me catch a slight mis-reference of my own in some of the other replies: you don't need physical memory backing every part of a virtual address space for it to exist, but virtual pages that spill outside the physical footprint have to be mapped into physical memory on demand, evicting whichever virtual pages currently point at that physical memory (with the evicted data written out to a page file on disk/storage).

It's basically a continuation of the bank-switching techniques older cartridge console games often used, especially on systems like the NES, IIRC.
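The bank-switching analogy, as a toy sketch (the `ToyMapper` here is a made-up minimal mapper, not any real NES mapper): a fixed CPU address window shows one selectable slice of a larger ROM, the same "small fixed window onto larger storage" idea that paging generalizes.

```python
# Toy cartridge bank switching: the CPU sees a fixed 16 KB window, and a
# mapper register chooses which ROM bank is visible through it.
BANK_SIZE = 16 * 1024
WINDOW_BASE = 0x8000  # fixed CPU address of the switchable window

class ToyMapper:
    def __init__(self, rom):
        assert len(rom) % BANK_SIZE == 0
        self.rom = rom
        self.bank = 0  # currently selected bank

    def select(self, bank):
        self.bank = bank

    def read(self, cpu_addr):
        offset = cpu_addr - WINDOW_BASE
        return self.rom[self.bank * BANK_SIZE + offset]

# 4 banks of ROM, each byte filled with its own bank number
rom = bytes(b for b in range(4) for _ in range(BANK_SIZE))
m = ToyMapper(rom)
print(m.read(0x8000))  # 0: bank 0 is visible at the window
m.select(2)
print(m.read(0x8000))  # 2: same CPU address, different backing storage
```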
I've never seen anyone confirm how much RAM the PS5's OS reserves.

Well, Richard is saying it uses 1 GB more than the Series X, at least that's what was mentioned somewhere on the first page, and that's my point of contention.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Already knew this; that's not the point of contention, though. The point of contention is Rich claiming the PS5 OS uses 3.5 GB of the 16 GB of RAM, which doesn't really make sense considering most accounts place it at 2 GB.

Can you please share where these accounts are that place it at 2GB ?

Cause I'm also finding sources dated 2020 that have it at 12.5GB.

PS5
  • 256-bit memory interface
  • 448 GB/s bandwidth
  • 3.5 GB for OS
  • 12.5 GB for developers


And the only mentions I can find of a 2GB figure are all labeled as rumors and lead back to GAF threads.

I would take what Richard says over unconfirmed, rumored leaks, personally.
 
Last edited:
Both the PS3 and 360 had 512MB. The 360 had unified memory, though, whereas the PS3 had a 256MB+256MB split.
I'm well aware, but it's not the same. If your PC has a 3090 with 24GB of VRAM and a single 2GB stick of system RAM, it does not have 26GB of RAM. Video memory is not system memory; you're still going to hit all the limitations of having only 2GB of RAM no matter how much VRAM you have. The 360 didn't suffer from this because it had a single unified pool (plus, technically, the eDRAM on top of that).
 
95% of the game art is still low res, and the icons in the UI are also low res; only live text and basic UI elements are 4K.
Game art is low res for the same reason the PS4 Pro had sub-4K game art for games that came out before its launch, and for some that came after: in my experience, if devs don't update it, it's not going to change on its own. As for the rest, last I heard the actual OS UI is 4K in the same way your Windows UI is 4K when you switch the resolution to 4K.
 

Three

Member
I'm well aware, but it's not the same. If your PC has a 3090 with 24GB of VRAM and a single 2GB stick of system RAM, it does not have 26GB of RAM. Video memory is not system memory; you're still going to hit all the limitations of having only 2GB of RAM no matter how much VRAM you have. The 360 didn't suffer from this because it had a single unified pool (plus, technically, the eDRAM on top of that).
If that were the case, why did you state the 360 as having 512MB of RAM, considering some (most) of it would undoubtedly be used as VRAM? As far as I know the VRAM could be used as system RAM too with some hacks, but at drastically reduced bandwidth and higher latency.
 
Last edited:

Three

Member
Can you please share where these accounts are that place it at 2GB ?

Cause I'm also finding sources dated 2020 that have it at 12.5GB.




And the only mentions I can find about 2GB memory usage are all labeled as rumors and lead back to GAF theads.

I would take what Richard says above unconfirmed rumored leaks, personally.
Frankly, I wouldn't trust that site you posted at all. Surprised Bernd is so quick to call out id devs as illegitimate yet accepts and thumbs-up some unknown site that says this:

"[SSD] Expansion module replaces interface one."

nah.gif


Because that was just a silly assumption based on the PS4's HDD. But this unknown site knows things, right?
 
Last edited:

Riky

$MSFT
I've never seen anyone confirm how much RAM the PS5's OS reserves.

Exactly. I doubt DF would just pick a figure out of the air; they know how much flak it would cause. If he says 12.5GB is available, it's fair to assume a developer, maybe more than one, has confirmed it, just like all the other stats that have proven correct. Just not wanting it to be true isn't going to change that.
 

PaintTinJr

Member
I'm well aware, but it's not the same. If your PC has a 3090 with 24GB of VRAM and a single 2GB stick of system RAM, it does not have 26GB of RAM. Video memory is not system memory; you're still going to hit all the limitations of having only 2GB of RAM no matter how much VRAM you have. The 360 didn't suffer from this because it had a single unified pool (plus, technically, the eDRAM on top of that).
The PC split (DDR4 + GDDR6X) versus PS3 split (XDR + GDDR3) versus Xbox 360 unified (GDDR3) comparison isn't quite so simple, because the PC graphics card lives on the southbridge and its GDDR6X is memory-mapped through the DDR4 on the northbridge, whereas in the other two examples the memory sits on the northbridge itself. That's why the PS3 was able to use its GDDR3 in place of XDR, just with different access characteristics, which I believe were eventually abstracted away (if developers wanted) by the SPURS library, IIRC.

In the PC example, the GPU driver's mapping actually costs you system RAM, so you always want enough RAM to map the VRAM, such that any latency from the swapfile being used as part of that mapping stays completely hidden from the GPU driver. I'm not even sure modern Windows could install a full Nvidia driver and work with just 2GB of RAM and 24GB of VRAM.
 

PaintTinJr

Member
...

Well Richard is saying it uses 1 GB more than Series X, at least that's what was mentioned somewhere on the first page, which is what I have a point of contention with.
Even though I consider Richard to be, I think the new term is, a hype-man, I still expect that number to be true, because of the wear and tear the non-replaceable system SSD in the PS5 would take if they weren't buffering the 4K HDR gameplay recording to GDDR6. What I suspect is different is that the PS5 offers devs more than a straight 16GB minus 3.5GB: a 2GB region gets nuked and reloaded every time the OS menu button is pressed, because the high-priority modes of the IO complex allow largely static OS or game data to be evicted, with reloading guaranteed fast enough to avoid visible OS or game lag.
 
So how much more memory did the Series S get?
But anyway, it's just slow memory, as the Series S only has 8GB of fast memory.

I still don't understand why Xbox didn't go with 20GB at 320-bit for the Series X and 12GB at 192-bit for the Series S.
The additional cost would have been marginal; the difference in ease of use for developers would be tremendous.
 

Zathalus

Member

elliot5

Member
So how much more memory did the Series S get?
But anyway, it's just slow memory, as the Series S only has 8GB of fast memory.

I still don't understand why Xbox didn't go with 20GB at 320-bit for the Series X and 12GB at 192-bit for the Series S.
The additional cost would have been marginal; the difference in ease of use for developers would be tremendous.
"Marginal," as if these specs weren't locked in during a RAM price spike; multiply the cost increase by 100M units and it becomes a big cost adjustment. Sure, MS could afford it, but they still have to budget things out.

And the amount is flexible: you get more memory by disabling system processes not in use, so it sounds like it can be anywhere from about 100 MB to 300 MB.
 

MarkMe2525

Gold Member
Frankly I wouldn't trust that site you posted more than anything. Surprised Bernd is so quick to call out id devs as illegitimate but accept and thumbs up some unknow site who says this

"[SSD] Expansion module replaces interface one."

nah.gif


Because that was just the silly assumption based on the PS4 hdd. But this unknown site knows things, right?
That site also lists Series X at $600 and claims it has a loud fan.
 
"Marginal," as if these specs weren't locked in during a RAM price spike; multiply the cost increase by 100M units and it becomes a big cost adjustment. Sure, MS could afford it, but they still have to budget things out.

And the amount is flexible: you get more memory by disabling system processes not in use, so it sounds like it can be anywhere from about 100 MB to 300 MB.

Sometimes you need to take risks.
It paid off greatly for the PS4.

RAM prices are dropping sharply again, but now you have 15m consoles already out there and can't increase it.
I also don't think a revision with more RAM makes sense.

At the end of the day, it would've been one more RAM die for the Series S and 4x 2GB dies instead of 4x 1GB dies for the Series X.
That wouldn't even be half a billion dollars over the lifetime of the console.

Remember when they claimed to have spent $500m just on R&D for the new Xbox One controller?
I get every cost-saving measure, just not this one. It's stupid. Thank god it's not a decision that loses you a whole console generation and $5bn to $50bn in revenue.
But it could've been worse. Just look at the Xbox One.
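As a back-of-envelope sanity check on the "half a billion over the lifetime" figure above: every number below is an assumption picked for illustration (neither a real GDDR6 contract price nor real sales totals), but under plausible values the claim holds.

```python
# Rough BOM delta for the hypothetical RAM upgrade discussed above.
# All inputs are assumed illustrative values, not sourced figures.
PRICE_PER_GB = 3.0             # assumed dollars per GB of GDDR6
X_UNITS, S_UNITS = 30e6, 20e6  # assumed lifetime sales split

x_cost = (20 - 16) * PRICE_PER_GB * X_UNITS  # Series X: 16 GB -> 20 GB
s_cost = (12 - 10) * PRICE_PER_GB * S_UNITS  # Series S: 10 GB -> 12 GB
print(f"total extra BOM: ${(x_cost + s_cost) / 1e6:.0f}M")  # -> total extra BOM: $480M
```

With these assumptions the total lands just under the half-billion mark; a higher per-GB price or larger install base would push it past it, which is presumably the budgeting argument on the other side.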
 
Agreed. That's when Pamela looked her best, until Hollywood ran her around the block a few times.
Nah, I think she ran around the block a few times herself, as well as with Tommy Lee's boat and countless others, no doubt. Bottom line is we all get old; some just age differently.

As for the XSS, is this the so-called "secret sauce" Xbone fans clamored for last gen but never got?
 
Last edited:

MarkMe2525

Gold Member
Nah, I think she ran around the block a few times herself, as well as with Tommy Lee's boat and countless others, no doubt. Bottom line is we all get old; some just age differently.

As for the XSS, is this the so-called "secret sauce" Xbone fans clamored for last gen but never got?
Some of my favorite threads to lurk in were the secret sauce threads. While I kinda secretly hope "secret sauce" discussion comes back, I don't think memory allocation optimizations have enough legs to kick that off.
 

Dolodolo

Member
What in this thread is worthy of six pages of discussion?
These couple of hundred megabytes won't really change anything.
The Xbox Series S will remain a backwards anchor that drags the two powerful consoles down.

It is a pity for the developers of multiplatform projects.

From the consumer's point of view, though, it has certainly been very successful.
 
Last edited:

clampzyn

Member
The Xbox Series S will remain a backwards anchor that drags the two powerful consoles down.

Disagree. IMO developers will target Series X hardware first in development, then scale down to Series S hardware at a playable performance level. That's what we're seeing right now; the Series S is not going to pull the premium consoles down this gen.
 
Last edited:

MarkMe2525

Gold Member
What in this thread is worthy of as many as six pages of discussion?
The usual culprits: people overstating the significance of this update, people understating it, and discussion of how the RAM works in the XSS.

Then there are statements like this:
"The Xbox Series S will remain a backwards anchor that drags the two powerful consoles down."
That sort of thing generally just causes a bunch of back-and-forth shit-throwing.
 
Last edited:

TLZ

Banned
Excellent. It’s been a fantastic device for me. I’m using it near daily for eFootball and have Halo and Apex installed as well as all of the old Rare games.
How's eFootball these days? I haven't touched it since "release" and I'm waiting on the single-player modes. Are they out yet?
 