
Xbox June GDK: More dev memory for Series S and more.

If this were the case, then why did you state the 360 as having 512MB of RAM, considering some (most) of it would undoubtedly be used as VRAM? As far as I know the VRAM could be used as system RAM too with some hacks, but at drastically reduced bandwidth and higher latency.
It is as you say, and that is the advantage of having unified memory. The PS3, like a PC, was limited to only 256MB of RAM for system memory, i.e. the kind of stuff you see in your task manager on PC. It also had 256MB of video memory, but that's just GPU memory; it can't be used as system memory under normal circumstances. This meant the 360 gave devs great flexibility over how much RAM their games could use, with 360 games in general having a significant advantage in total RAM available to games over the PS3. In practice, PS3 games were RAM-starved compared to 360 games, and any task that relied on memory capacity was crippled on the PS3.
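For illustration only, here's a minimal sketch of the split described above (pool sizes as stated in the post; OS reservations ignored):

```python
# Rough comparison of the memory pools described above (OS reservations ignored).
PS3_POOLS = {"XDR (system)": 256, "GDDR3 (video only)": 256}  # MB, fixed split
X360_POOL = {"GDDR3 (unified)": 512}                          # MB, one shared pool

def cpu_side_ceiling(pools, unified):
    """RAM a game could use for CPU-side data under this (simplified) model."""
    return sum(pools.values()) if unified else pools["XDR (system)"]

print("PS3 CPU-side ceiling:", cpu_side_ceiling(PS3_POOLS, unified=False), "MB")
print("360 CPU-side ceiling:", cpu_side_ceiling(X360_POOL, unified=True), "MB")
```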
 

sinnergy

Member
It is as you say, and that is the advantage of having unified memory. The PS3, like a PC, was limited to only 256MB of RAM for system memory, i.e. the kind of stuff you see in your task manager on PC. It also had 256MB of video memory, but that's just GPU memory; it can't be used as system memory under normal circumstances. This meant the 360 gave devs great flexibility over how much RAM their games could use, with 360 games in general having a significant advantage in total RAM available to games over the PS3. In practice, PS3 games were RAM-starved compared to 360 games, and any task that relied on memory capacity was crippled on the PS3.
Yet it had great-looking games... so if devs want to, they can do wonders.
 

oldergamer

Member
That the Series X has a full GB of RAM over the PS5 is surprising.
I dunno if it is. MS is always good about reducing the OS footprint every generation. Sony tends to stick with what was allocated at gen start.

Like someone else in the thread stated, this was similar on the PS3 / 360 hardware.
 

PaintTinJr

Member
It is as you say, and that is the advantage of having unified memory. The PS3, like a PC, was limited to only 256MB of RAM for system memory, i.e. the kind of stuff you see in your task manager on PC. It also had 256MB of video memory, but that's just GPU memory; it can't be used as system memory under normal circumstances. This meant the 360 gave devs great flexibility over how much RAM their games could use, with 360 games in general having a significant advantage in total RAM available to games over the PS3. In practice, PS3 games were RAM-starved compared to 360 games, and any task that relied on memory capacity was crippled on the PS3.
The bit in bold is completely false, and anyone that used PS3 Linux would confirm that, because the available system memory was larger than 256MB (412MB IIRC) even though GPU 2D and 3D hardware acceleration was disabled for the Linux Mesa3D driver - running on the Cell BE's PPE if I remember correctly - because the hypervisor blocked the RSX from providing anything beyond a video framebuffer.

The memory in the PS3 had unified access, because even the GDDR3 was accessed by the Cell BE via the Element Interconnect Bus (EIB), even though the RSX had higher-performance direct access IIRC.

The real reason it probably seemed like the PS3 had less memory was that GDDR3 access was slower than XDR. With the PS3 already at a PPE disadvantage to the 360 Xenon's 3 cores / 6 threads, it would make sense for performance reasons to copy data back to XDR memory before CPU work was done, then copy it back to the GDDR3, which would cost memory capacity for the staging buffer in XDR, and northbridge bandwidth for the two data copies over the EIB.
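As a rough sketch of that cost (the working-set size and frame rate below are made-up placeholders, not measured figures):

```python
# Back-of-the-envelope cost of staging GDDR3-resident data through XDR for CPU
# work and copying it back, as described above. All figures are placeholders.
BUFFER_MB = 32    # hypothetical working set shuttled each frame
FPS       = 30
COPIES    = 2     # GDDR3 -> XDR, then XDR -> GDDR3

extra_xdr_footprint_mb = BUFFER_MB                 # staging buffer lives in XDR
extra_eib_traffic_mb_s = BUFFER_MB * COPIES * FPS  # added bus traffic per second

print(f"Extra XDR footprint: {extra_xdr_footprint_mb} MB")
print(f"Extra EIB traffic:   {extra_eib_traffic_mb_s} MB/s")
```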
 

THE DUCK

voted poster of the decade by bots
It's so strange, whenever my son borrows the Series S for his 1080p monitor he doesn't ever complain about resolution or ray tracing. Kids these days are blind, I tell you.

Nice to see some headroom gained here. I mean, when I play my S from 10 feet away on a 49" TV I'm always commenting on how garbage it looks. I mean, if it were 4K, even though my eyes can't resolve higher than 1080p past 6 feet, it would look way better! Inferior crap, I keep saying. So crazy MS made it with so many compromises, not thinking about customers at all.
 
Sometimes you need to take risks.
Paid off greatly for PS4.

RAM prices are dropping sharply again. But now you have 15 million consoles already out there and can't increase it.
I also don't think a revision with more RAM makes sense.

At the end of the day, it would've been 1 more RAM die for the Series S and 4×2GB dies instead of 4×1GB dies for Series X.
That won't even be half a billion $ over the lifetime of the console.
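As a quick sensitivity sketch of that claim (the per-GB prices and unit volumes below are placeholders for illustration, not actual contract figures):

```python
# Lifetime BOM impact of the extra GDDR6, under placeholder assumptions only.
def lifetime_cost(extra_gb_per_console, units_millions, usd_per_gb):
    return extra_gb_per_console * units_millions * 1e6 * usd_per_gb

# Hypothetical: +4GB per Series X (4x2GB instead of 4x1GB), +2GB per Series S.
for usd_per_gb in (1.0, 2.0, 3.0):
    total = lifetime_cost(4, 40, usd_per_gb) + lifetime_cost(2, 30, usd_per_gb)
    print(f"${usd_per_gb:.0f}/GB -> ${total / 1e6:,.0f}M over the assumed lifetime volumes")
```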

Remember when they claimed to have spent $500m on the new Xbox One controller just for R&D?
I get every cost-saving measure, just not that one. It's stupid. Thank god it's not a decision that makes you lose a whole console generation and costs you $5bn to $50bn in revenue.
But it could've been worse. Just like the Xbox One.
MS had this "genius" idea of releasing a $300 and a $500 consoles to sandwich Sony and they were willing to make any compromise to reach that price point. It doesn't seem like anything that they set out to do worked as planed, they clearly expected the PS5 to be weaker than it ended up being.

Sony's PS5 DE solution proved a lot more successful and would've worked even better if there hadn't been an unprecedented component crisis.
 

THE DUCK

voted poster of the decade by bots
MS had this "genius" idea of releasing a $300 and a $500 consoles to sandwich Sony and they were willing to make any compromise to reach that price point. It doesn't seem like anything that they set out to do worked as planed, they clearly expected the PS5 to be weaker than it ended up being.

Sony PS5 DE solution proved a lot more successful and would've worked even better if there wasn't an unprecedent component crisis.

It worked out just fine; had there been no shortage, they would have just dropped the price and it would have sold fine. It was really meant to be $249 or even $199 at some point.
 


Memory allocation on Xbox Series S consoles has been optimized.
“Hundreds of additional megabytes of memory are now available to Xbox Series S developers. This gives developers more control over memory, which can improve graphic performance in memory-constrained conditions.”

Improved performance for graphics allocations

‘Titles can now take better advantage of recent memory enhancements’. “We’ve addressed an issue where graphics virtual addresses were being allocated considerably slower than non-graphics virtual addresses…”

Improved PC game development experiences
Auto-synchronized cloud saves and sign-in
A simplified user model
Startup screen that is displayed before the title is rendered
Game update checks that are performed for packaged builds
Capability for debugging packages and testing a title in a retail environment
Platform now enforces DLC age restrictions
‘If you’re attempting to mount age-restricted DLC, the new API prompts age-restricted accounts for parental consent by providing a notification.’

New in-game API to manage DLC storage
‘Developers can now manage how storage affects a player within their game. They no longer have to go to their storage settings to free up space’

This is one of the more consumer-facing updates, as previously you often had to go out to the console UI/settings itself to change which DLC was installed per title if you wanted to remove it.

HDR Support in Xbox Manager Remote Control view
‘View accurate HDR content when you connect to remote consoles’

This is beneficial for smartphone and PC users on HDR-capable displays, as previously the signal could look off because it was tone-mapped from HDR to SDR.

Are they able to do the same on the Series X?
 


I don't know what the devs did on this game to achieve dynamic 1440p/120fps on Series S and native 4K/120 with RT reflections on Series X. But I heard Xbox had involvement in developing this game; VRS tier 2 might be active here, but DF and ElAnalistaDeBits were unable to spot it because the implementation was really good. Take my opinion with a grain of salt though, because it's all speculation.

This also shows they could have pushed the ray tracing further on the Series X. While maybe not maxed out, we could probably have gotten another RT effect added, because that's a lot of headroom.
 

adamsapple

Or is it just one of Phil's balls in my throat?
I bet they can, but I assume it's not a priority there as the console already has basically enough memory as it is.

And it reportedly already has a 1GB advantage over its nearest competition, so they're not wanting for resources there, unlike the Series S.
 

01011001

Banned
Don't ray tracing and some graphical settings hammer memory? I also thought scene complexity hammers memory.

True, but given that most games are multiplat and the Series X already has an advantage in memory speed and capacity over the PS5, I bet most devs wouldn't really take advantage of a few extra megabytes when they also target PS5 and usually just go for parity anyway.
 
True, but given that most games are multiplat and the Series X already has an advantage in memory speed and capacity over the PS5, I bet most devs wouldn't really take advantage of a few extra megabytes when they also target PS5 and usually just go for parity anyway.
I meant getting the Series X ever so slightly closer to PC; it would be a shame to know the hardware is being arbitrarily limited like that.
 
DF talk about this in their new weekly video and seem to think it will help with Series S ray tracing, although they don't know how much.
They seem to say developers are happy with the GPU, but memory amount and bandwidth have been the major problem.
They also talk about the bigger consoles' memory and say that the Series X has 13.5GB usable vs 12.5GB on PS5; I think this is the first confirmation we've had about the PS5 situation.
They go on to say they hope Series X will get extra RT effects and graphical upgrades rather than just higher DRS values on future titles.
They really should get the Series X to an even 14GB, and I think it would be in a perfect spot.
 
It worked out just fine; had there been no shortage, they would have just dropped the price and it would have sold fine. It was really meant to be $249 or even $199 at some point.
Where are you getting that from? I never heard about it, and modern consoles seem to hardly drop in price. I love how "it was meant to be a $199 console" is actually being used as an excuse; it kinda makes my point for me.



I don't know what the devs did on this game to achieve dynamic 1440p/120fps on Series S and native 4K/120 with RT reflections on Series X. But I heard Xbox had involvement in developing this game; VRS tier 2 might be active here, but DF and ElAnalistaDeBits were unable to spot it because the implementation was really good. Take my opinion with a grain of salt though, because it's all speculation.

Isn't that the game that was developed by one guy?
 
Yeah, and for something completely useless.

The Guide menu on the Xbox 360 was 480p, and NO ONE cared... it was 480p in order to be as snappy and low-profile as possible.

And that's what a good UI should be first and foremost:
easy and fast to navigate + light on the hardware.

The optics are a secondary concern.
Wait, it was 480p? I thought it was my TV this entire time, and I never paid attention to it.
 
The id Software guy always gets pulled up as an example of devs complaining, and it is clear that the dude never even saw a Series S before making those tweets. The issue got entirely overblown, purely for console war reasons.
Despite some biases I've felt you have had, I must say you make a lot of well-versed, in-depth posts that keep people like me informed, and it's massively appreciated. It's stuff like this that motivated me to make an account on this site.
 

THE DUCK

voted poster of the decade by bots
Where are you getting that from? I never heard about it, and modern consoles seem to hardly drop in price. I love how "it was meant to be a $199 console" is actually being used as an excuse; it kinda makes my point for me.


Isn't that the game that was developed by one guy?

It was designed from day one to cut costs and deliver a low-cost gaming machine at a mass-market price that still had some next-gen benefits. If you look back at the PS4 and Xbox One, the vast majority of sales came when the consoles hit the magic price point of $199. The whole idea was to offer that years earlier and still have a flagship product too.

This generation has been decidedly different: between COVID, component shortages and inflation, things have diverged from the original plan.
 
Consoles are always going to be bound by something. The Switch fails at all levels (CPU, GPU, memory capacity, memory speed, IO speeds), which is why it struggles the way it does. The PS4 had an awful CPU, so bad that even the Switch could run PS4 games well enough with its own CPU. The PS5 and Series have two big weaknesses: the lack of dedicated RT hardware on the level of Nvidia GPUs, and very weak memory gains.

They top out at 16GB when the One X had 12GB and the PS4/Xbox One had 8GB (back when RAM was super cheap), while the 360 had 512MB and the PS3 had a 256MB+256MB split. That's a very anemic increase - roughly 2x over the base last-gen machines versus the 16x jump from 512MB to 8GB - driven by how expensive RAM was when these consoles were designed, which is why SSD tech is being pushed to cope with it. But as we all know, SSDs are incredibly slow compared to RAM, so they can't replace RAM in most scenarios. These two aspects will haunt current-gen consoles through their lifetime and would be significantly improved should we receive enhanced versions.
The biggest flaws with the consoles are by far the Zen 2 CPU, when Zen 3 was available, and the memory bandwidth.
 

Hoddi

Member
The biggest flaws with the consoles are by far the Zen 2 CPU, when Zen 3 was available, and the memory bandwidth.
Zen 3 was released too late, but what makes you think bandwidth will be an issue?

The PS4 had ~20GB/s of CPU bandwidth and the PS5 likely has ~2.5x that, which is way more than it realistically needs. Even the most high-end game running at 100fps on a PC only needs around 25-30GB/s of CPU bandwidth to do so (that game being AC Odyssey).
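Rough per-frame arithmetic on those figures (bandwidths and frame rates as stated above):

```python
# How much data the CPU can touch per frame at a given bandwidth and frame rate.
def mb_per_frame(bandwidth_gb_s, fps):
    return bandwidth_gb_s * 1024 / fps  # GB/s -> MB per frame (approx.)

print(f"PS4-class (~20 GB/s) at 30 fps:    {mb_per_frame(20, 30):.0f} MB/frame")
print(f"AC Odyssey (~30 GB/s) at 100 fps:  {mb_per_frame(30, 100):.0f} MB/frame")
print(f"PS5 estimate (~50 GB/s) at 60 fps: {mb_per_frame(50, 60):.0f} MB/frame")
```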
 
Zen 3 was released too late, but what makes you think bandwidth will be an issue?

The PS4 had ~20GB/s of CPU bandwidth and the PS5 likely has ~2.5x that, which is way more than it realistically needs. Even the most high-end game running at 100fps on a PC only needs around 25-30GB/s of CPU bandwidth to do so (that game being AC Odyssey).
How was Zen 3 too late but RDNA 2 wasn't?
 

Hoddi

Member
How was Zen 3 too late but RDNA 2 wasn't?
That's a good question. I don't really have an answer to that.

But if they could have included Zen 3 then I'm pretty sure they would have done so; they didn't stick with Zen 2 just for laughs. Maybe RDNA 2 was ready earlier, because they otherwise wouldn't have had that either.
 

adamsapple

Or is it just one of Phil's balls in my throat?
There’s no way they didn’t know it was in development

Right, knowing it's in development is one thing, but the consoles finalize specs a long time before release. I don't know if AMD had the ability, or would even have wanted, to share something so early in the prototype phase with consoles releasing before they (AMD) launched it officially themselves. That's just my speculation.
 

clampzyn

Member
Where are you getting that from? I never heard about it, and modern consoles seem to hardly drop in price. I love how "it was meant to be a $199 console" is actually being used as an excuse; it kinda makes my point for me.


Isn't that the game that was developed by one guy?
The first game was developed by one guy; this one was developed by a small studio with the help of an Xbox studio, I think.
 

clampzyn

Member
This also shows they could have pushed the ray tracing further on the Series X. While maybe not maxed out, we could probably have gotten another RT effect added, because that's a lot of headroom.

Yeah, if you look at Spider-Man Remastered on PS5, performance RT mode is locked at 60fps; if you turn on VRR you'll get an average of 85-100+fps. These consoles have a lot of headroom, which people don't seem to understand, on games that have specific target framerates/resolutions.
 
That's a good question. I don't really have an answer to that.

But if they could have included Zen 3 then I'm pretty sure they would have done so; they didn't stick with Zen 2 just for laughs. Maybe RDNA 2 was ready earlier, because they otherwise wouldn't have had that either.
I genuinely don't think we would need Pro models if we had Zen 3; that's how significant this is.
 
Right, knowing it's in development is one thing, but the consoles finalize specs a long time before release. I don't know if AMD had the ability, or would even have wanted, to share something so early in the prototype phase with consoles releasing before they (AMD) launched it officially themselves. That's just my speculation.
That's fair, but man, is it not a missed opportunity. I hope for the Pro models (which I expect in 2024-2025) they don't make the same fatal mistake; Zen 5 should have just released by then, so I hope they go lower-end Zen 5 instead of mid-tier Zen 4, which would be an equivalent situation to what we have now.
 

adamsapple

Or is it just one of Phil's balls in my throat?
That's fair, but man, is it not a missed opportunity. I hope for the Pro models (which I expect in 2024-2025) they don't make the same fatal mistake; Zen 5 should have just released by then, so I hope they go lower-end Zen 5 instead of mid-tier Zen 4, which would be an equivalent situation to what we have now.

I agree that the consoles just barely missing Zen 3 is a bit of a downer.

But still, this generation of consoles has a MUCH more favorable CPU at time of release compared to the PS4/XBO gen; the Jaguar cores were already a low-power core design before those consoles came out.
 
I agree that the consoles just barely missing Zen 3 is a bit of a downer.

But still, this generation of consoles has a MUCH more favorable CPU at time of release compared to the PS4/XBO gen; the Jaguar cores were already a low-power core design before those consoles came out.
TBF, I don't like bringing up the PS4 and One because of how crap they were on that front.
 
Yet it had great-looking games... so if devs want to, they can do wonders.
So did the Genesis; that doesn't mean the SNES didn't have hardware advantages over it that manifested as improvements in its games. Every console is like this, except maybe the early 8-bit consoles like the Atari 2600.
 
The biggest flaws with the consoles are by far the Zen 2 CPU, when Zen 3 was available, and the memory bandwidth.
Nah, Zen 2 is decent. Zen 3 is better, but it launched around the time the consoles did. I'm sure if MS and Sony could have had Zen 3 they would have, but given the timing, budget and the difference between the two, I don't think it was a big deal. I'd sooner wish that these consoles had more RAM than have Zen 3 in them.
 

clampzyn

Member
I agree, Zen 2 is decent until we see these consoles struggle to run next/current-gen-only games. The UE5 tech demo isn't really the deciding factor on whether these consoles will run at lower resolution/fps; we'll just have to wait and see.
 

reksveks

Member
A little bit more information in the latest DF weekly video.

There is a base amount of memory saved at the OS level, then devs can optionally turn off certain features (what those features are, I have no clue) to save more memory.
 
Nah, Zen 2 is decent. Zen 3 is better, but it launched around the time the consoles did. I'm sure if MS and Sony could have had Zen 3 they would have, but given the timing, budget and the difference between the two, I don't think it was a big deal. I'd sooner wish that these consoles had more RAM than have Zen 3 in them.
I disagree with you; we see how CPU-intensive some of these games are, like Spider-Man.
 

DaGwaphics

Member
I disagree with you; we see how CPU-intensive some of these games are, like Spider-Man.

Spider-Man can't be that CPU-intensive on console. It runs on the last-gen CPUs, plus even low-end CPUs can run it fine on PC, RT and all (so long as the GPU is good enough).

Only current-gen exclusive software can really push these CPUs.
 

Hoddi

Member
Spider-Man can't be that CPU-intensive on console. It runs on the last-gen CPUs, plus even low-end CPUs can run it fine on PC, RT and all (so long as the GPU is good enough).

Only current-gen exclusive software can really push these CPUs.
Yes and no. Spider-Man without RT isn't too CPU-intensive, but adding RT increases both the CPU load and the CPU bandwidth load. As it happens, maxing out RT only adds a fairly moderate load on the CPU in Spider-Man (~50%), but the added CPU bandwidth it needs is twofold.

CPU bandwidth is one of those things that tends to be forgotten about. Most of the time it doesn't matter a whole lot, but then we have cases like Spider-Man where it can matter even more than the physical CPU itself.

Edit: Here’s an example that shows it pretty plainly. Both of these scenarios are running at 60fps but one of them has a significantly higher load on the CPU and bandwidth subsystems. And I’ll stress that neither of them has anything to do with the GPU.
 
I agree, Zen 2 is decent until we see these consoles struggle to run next/current-gen-only games. The UE5 tech demo isn't really the deciding factor on whether these consoles will run at lower resolution/fps; we'll just have to wait and see.
They are not going to struggle to run anything because of the CPU; games will keep being made targeting consoles like always. There's hardly any AAA developer targeting PC first like in the old days.
 

Hoddi

Member
They are not going to struggle to run anything because of the CPU; games will keep being made targeting consoles like always. There's hardly any AAA developer targeting PC first like in the old days.
Zen 2 isn’t just decent but more than decent. I’ve no idea where this nonsense started but these CPUs were perfectly competitive when these consoles launched. And they still are.
 

DaGwaphics

Member
Yes and no. Spider-Man without RT isn't too CPU-intensive, but adding RT increases both the CPU load and the CPU bandwidth load. As it happens, maxing out RT only adds a fairly moderate load on the CPU in Spider-Man (~50%), but the added CPU bandwidth it needs is twofold.

CPU bandwidth is one of those things that tends to be forgotten about. Most of the time it doesn't matter a whole lot, but then we have cases like Spider-Man where it can matter even more than the physical CPU itself.

Edit: Here’s an example that shows it pretty plainly. Both of these scenarios are running at 60fps but one of them has a significantly higher load on the CPU and bandwidth subsystems. And I’ll stress that neither of them has anything to do with the GPU.

RT hits the CPU a bit harder, but an i5 8400 can still power through with RT settings at maximum (with the right GPU), far from the most demanding thing we've seen.
 

01011001

Banned
Zen 2 isn’t just decent but more than decent. I’ve no idea where this nonsense started but these CPUs were perfectly competitive when these consoles launched. And they still are.

Zen 2 is not that great tho, and the Zen 2 CPUs in the consoles are laptop-grade ones.

No game should ever struggle to hit 60fps on a Zen 2 CPU tho, so for that they'll be fine.
 
Zen 2 isn’t just decent but more than decent. I’ve no idea where this nonsense started but these CPUs were perfectly competitive when these consoles launched. And they still are.
Especially since, to this day, most games are made to run on ancient Jaguar CPUs; it was a massive leap in performance.

Not to mention that these consoles have dedicated hardware that takes some of the load off their CPU.
 

Hoddi

Member
RT hits the CPU a bit harder, but an i5 8400 can still power through with RT settings at maximum (with the right GPU), far from the most demanding thing we've seen.
Look at the link I added in my edit. Running the game without RT averages ~3.8 fully maxed-out cores on my 9900K, while pushing RT to the maximum increases that to ~5.8 cores at the exact same 60fps. But the added CPU bandwidth increase is significantly higher than that.
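As a rough conversion of those numbers (core counts as stated above; frame time implied by 60fps):

```python
# Convert "fully maxed-out cores" at 60 fps into total CPU time spent per frame.
FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

for label, cores in (("no RT", 3.8), ("max RT", 5.8)):
    print(f"{label}: ~{cores * FRAME_MS:.0f} ms of CPU work per 16.7 ms frame "
          f"({cores} cores' worth of load)")
```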
 