
Dusk Golem reiterates that the Xbox will be more powerful than the PS5 (Admitted to starting console wars, demodded)

THEAP99

Banned
Cerny is an overrated idiot. Not sure why he gets so much praise.

The PS4 is loud as fuck, the PS4 Pro is weaker than the Xbox One X, and now the PS5 is apparently significantly weaker than the Xbox Series X because he decided to put all his marbles into the SSD. And for what?

Good luck showcasing the advantage of the SSD over the advantage of resolution, because SSD advantages are much harder to demonstrate to the average gamer. And if the PS5 is more expensive, then it's over and they've fucked up everything they built up this past gen & more.

They're banking on timed or exclusive moneyhatted content to try and sway consumers with content, while Microsoft will try and sway them with power.

Sony believes content > power will work
 
I don't understand why people still think the Series X is NOT stronger than the PS5. It's going to be the 900p vs 1080p situation all over again.

If you want the stronger console and/or gamepass, buy Series X. If you want the Playstation exclusives and ecosystem, buy the PS5. If you can't make up your mind, buy both.

And when it comes to RE8, it doesn't matter where the fuck you play it, since if you're not doing it in VR, you're doing it wrong anyway.

I don't see people claiming the PS5 is stronger. What we do see is continued FUD about the XSX being some vastly superior console, which is blatantly false propaganda.

Yes, the XSX GPU has more compute units, but it's clocked lower than the PS5's GPU. At max theoretical performance, yes, you get roughly 18% better throughput on paper.
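For reference, here's where that theoretical gap comes from, using the publicly stated CU counts and clocks and the PS5's peak (variable) frequency - just the paper math, nothing about real-world utilization:

Code:
# Theoretical FP32 throughput: CUs x 64 ALUs x 2 FLOPs/cycle x clock.
# PS5 figure uses its peak variable clock (2.23 GHz); XSX clock is fixed (1.825 GHz).
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

xsx = tflops(52, 1.825)   # ~12.15 TF
ps5 = tflops(36, 2.23)    # ~10.28 TF
print(f"XSX {xsx:.2f} TF vs PS5 {ps5:.2f} TF -> {(xsx / ps5 - 1):.0%} on paper")
# -> XSX 12.15 TF vs PS5 10.28 TF -> 18% on paper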

Console gamers have to be honest and realize that most 3rd-party games, like Madden, are going to look and run almost identically.

How many times do gamers need to be disappointed by the hyped up FUD coming from Xbox "insiders"??

This is why most of the community is shitting on Microsoft and their fake boasting.
 

onQ123

Member
Cerny is an overrated idiot. Not sure why he gets so much praise.

The PS4 is loud as fuck, the PS4 Pro is weaker than the Xbox One X, and now the PS5 is apparently significantly weaker than the Xbox Series X because he decided to put all his marbles into the SSD. And for what?

Good luck showcasing the advantage of the SSD over the advantage of resolution, because SSD advantages are much harder to demonstrate to the average gamer. And if the PS5 is more expensive, then it's over and they've fucked up everything they built up this past gen & more.

They're banking on timed or exclusive moneyhatted content to try and sway consumers with content, while Microsoft will try and sway them with power.

Sony believes content > power will work


LOL we will see which console has the better games
 

Dabaus

Banned
Right now, as we speak, Boomstick Gaming, which is an alternate-universe Xbox truther (for lack of a better term), is hosting a podcast on YouTube this very second titled: "PS5 More expensive & less powerful says trusted industry insider." Incredible how this dam of bad PS5 news breaks the day after Halo gets delayed. More than mere coincidence, I might add.
 

PresetError

Neophyte
The PS5 does have a lot more power than the One X, but it won't be rendering One X games at 4K; it will be rendering "NEXT GEN" games at 4K, which is much harder to do.

Performance and resolution are two very different things. Next-gen games can be more complex, but 4K is 4K no matter what image is on screen.

For some games you'll need to choose between framerates and resolution, as usual.
 
On what size TV?
I have a 55" OLED and I have a hard time telling the difference between 1800p and 2160p when playing from the couch. Dunno how much difference there is on bigger panels, but in motion I don't think it's anything significant unless you zoom in 250% and compare side by side. Stable frame rates are much more important than an 18% increase in resolution.
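For reference, the raw pixel arithmetic for those two resolutions (a quick sketch assuming standard 16:9 framebuffers; whether the gap is visible from a couch is exactly the debate):

Code:
# Pixel counts for 16:9 framebuffers at 1800p and 2160p.
px_1800p = 3200 * 1800   # 5.76 million pixels
px_2160p = 3840 * 2160   # 8.29 million pixels
print(f"2160p renders {px_2160p / px_1800p:.2f}x the pixels of 1800p")
# -> 1.44x, i.e. 20% more pixels per axis, 44% more in total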
 

Mister Wolf

Gold Member
I have a 55" OLED and I have a hard time telling the difference between 1800p and 2160p when playing from the couch. Dunno how much difference there is on bigger panels, but in motion I don't think it's anything significant unless you zoom in 250% and compare side by side. Stable frame rates are much more important than an 18% increase in resolution.

Can only speak for myself, but when optimizing my settings on PC I can definitely see a difference when I go from native 4K and then use the internal resolution scaler to drop it to 70% of 4K. That was true in both RE2 and RDR2.
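Worth noting how much "70%" actually cuts, which depends on whether the slider scales each axis or the total pixel count - I don't recall which definition those two games use, so here's a quick sketch of both cases:

Code:
# What "70% of 4K" means depends on the scaler's definition.
native = 3840 * 2160                          # ~8.29 MP
per_axis = int(3840 * 0.7) * int(2160 * 0.7)  # 2688 x 1512, if 70% is applied per axis
per_total = native * 0.7                      # if 70% is applied to the total pixel count
print(f"per-axis: {per_axis / native:.0%} of native pixels, per-total: {per_total / native:.0%}")
# -> per-axis: 49% of native pixels, per-total: 70%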
 

Thirty7ven

Banned
Can only speak for myself, but when optimizing my settings on PC I can definitely see a difference when I go from native 4K and then use the internal resolution scaler to drop it to 70% of 4K. That was true in both RE2 and RDR2.

On a desktop it's much easier to notice. It's much easier to notice different graphical settings too. You're sitting so close to the monitor.
 
Can only speak for myself, but when optimizing my settings on PC I can definitely see a difference when I go from native 4K and then use the internal resolution scaler to drop it to 70% of 4K. That was true in both RE2 and RDR2.

That's perfectly fine. I'm not as sensitive to resolution as I am to frame rate, but I do wonder how much the average person who only plays CoD or FIFA will notice anyway. They just don't care.
 
On a desktop it's much easier to notice. It's much easier to notice different graphical settings too. You're sitting so close to the monitor.

Correct. That's exactly why I was emphasizing the phrase 'playing from the couch' so much.
And on PC NVIDIA is doing God's work with DLSS. It's frighteningly good.
 

Deto

Banned
Jesus. No, percentage is just one metric that doesn't tell the whole story. Xbox's TF advantage is at least 2 TF. Nothing will change that. On top of that there are around 40% more compute units, which are critical when it comes to ray tracing.

Digital Foundry vs. the Xbox One architects

"There's a lot of misinformation out there and a lot of people who don't get it. We're actually extremely proud of our design."
Article by Richard Leadbetter, Technology Editor, Digital Foundry
Updated on 24 September 2013




Microsoft says that game performance doesn't scale with the number of compute units you have.

"Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing, but we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did."



But now clock doesn't matter, only the number of CUs.






Sony has come to terms with this, and that's why they're shoveling money at 3rd parties to differentiate PS5 versions in some other way, because the graphics comparisons will be lost.

I've seen this same narrative on another site. When did you all agree on this narrative?


Discord is at it again I see.

Halo hit them really bad huh.


I read the narrative of "Sony paid for third-party content, and that proves the PS5 is inferior" on a website in Portugal.

Too much of a coincidence not to be orchestrated.
 
You are correct: technically speaking, no modern GPU runs at fixed clocks all the time. I was thinking only about demanding gaming scenarios where the GPU needs to run at max clocks for an extended period of time. The XSX will offer a sustained level of performance in such scenarios for sure.

That's the idea. It still comes down to how good the cooling system is, but based on the One X I'm assuming they can handle it.

It would be ironic if the Series X ended up the louder of the two systems, though, even if only slightly. But again, it just comes down to how good the cooling is, and MS nailed that already with the One X. Sony's the one who needs to prove they have a very solid cooling system this time that'll keep things quiet; I'm fairly confident they will, though.

Dat PS5 is a big boy xD.

Digital Foundry vs. the Xbox One architects

"There's a lot of misinformation out there and a lot of people who don't get it. We're actually extremely proud of our design."
Article by Richard Leadbetter, Technology Editor, Digital Foundry
Updated on 24 September 2013




Microsoft says that game performance doesn't scale with the number of compute units you have.

"Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing, but we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did."



But now clock doesn't matter, only the number of CUs.

Both matter.

Also keep in mind CU saturation on 1st-gen GCN was...kinda terrible. That's part of the reason Sony made the customizations they did in the PS4's GPU, and probably why MS upped the clocks on the XBO. They were both valid choices for the respective designs, but RDNA1's (let alone RDNA2's) CU utilization absolutely shits all over GCN's.

I honestly was not sure when I read your post, since you seemed to add customisations on top of the TFLOP delta between the two machines, which implies that the PS5 has none.

Happy to get the clarification though :)

As to the actual customisations in the PS5, you list several that I would consider sophisticated BS right now - just as you do.

I expect a significantly different GE (geometry engine), since the GE that comes with the stock RDNA2 design cannot do culling/prioritisation in the way Cerny described in his talk. I believe this comes with both pros and cons. For developers utilising it, it will give significant advantages across the entire rendering pipeline; however, I have question marks over how a standard PC-centric engine will handle it (i.e. it might be a disadvantage in multi-platform titles, with some serious eye candy in first-party titles). How well Sony has developed the API will determine how it fares in multi-platform titles.

Secondly, a lot points towards customisations on the cache/memory side of things. Early information hinted at some unusual arrangement of memory chips on the board. This is one of the things that intrigues me the most, but also one of the things I am most uncertain about. The person with dev-kit access was rather specific, though.

Thirdly, there is RT. Sony has been very tight-lipped here and has hinted that they did not go with the standard AMD approach. Some even interpreted that as the PS5 not having RT at all. Now that we know it has RT, the question is what the silicon looks like. I assume the original information is correct and it is not the standard AMD approach. That raises the question, though: what is it, if not that? The early information might of course be wrong, and they may be sitting on the bog-standard RDNA2 RT set-up.

And then comes the API.

Hopefully we will get to know more soon.


Yeah, the GE customizations Sony have made on their end...that's already been pretty much confirmed. A while ago I figured from the Matt engineer guy's statements that Sony may have made some of those GE customizations to move some aspects of VRS further forward in the graphics pipeline. That seems to be the extent of it, but again I say "VRS" loosely, as that's MS's implementation of the technique. In Sony's case it would be something more akin to foveated rendering, and those customizations might have been done for next-generation PSVR while also benefiting non-VR games.

As far as the cache and memory stuff is concerned, I don't think there's anything wild there we don't already know. They're using GDDR6. I know they have a patent for stacked RAM, but that could've been for a hypothetical system design using HBM, which we know the PS5 is not using, and while there are some prototype stacked GDDR6 designs around, there's nothing in a commercial sense. If Sony took such an approach it was likely for cooling purposes, because if it gave a massive performance advantage I think they'd have mentioned it in Road to PS5, even if they didn't spend a great deal of time on the GPU in that presentation. There's also the SRAM cache on the I/O block and the cache scrubbers, but we already knew about that stuff by March. There isn't much of high probability left to reveal in this area that would be a big surprise or anything outside the conventional, except maybe some increase to GPU or CPU cache sizes, I guess 🤷‍♂️

Ray tracing, I feel, is kind of similar. Where did Sony hint that their approach isn't AMD's standard one? I must've missed it. Truth is, we don't even know exactly how AMD's RT works! We just know it's based somewhat on the CUs, which is where some of the figures come from. The way MS have described some of their DXR RT also seems to indicate they might've made some alterations beyond whatever the standard is, but to what extent is up for debate, as usual.

Sony should be doing a teardown either later this month or in early September, going by rumors, so yeah, I hope we get a (further) deep dive into the architecture around then, including exactly how the variable frequency works in real-world gameplay scenarios, etc. And of course there's still the Hot Chips presentation for MS on the 17th, so not long to go on that end 👍

Well, first: integrated graphics.... lol, kidding. But as for that, we know that in the past they let the frequency stay more or less set and let the power fluctuate. We also know, for the most part, that you can in fact change frequency multiple times within a scene. I personally don't think PC is the right comparison, and it's also worth noting that since this is relatively new and different, it's all theory until it's in front of us. But based on what we know, they have indeed said the power value is set at a fixed amount and doesn't vary, and they went with a varying frequency based on load (i.e. what the scene requires).

So I admit it's certainly theoretical until we have hands-on experience, but I don't think it's improbable. As I said, I see it as them letting devs use what their engine needs while getting as close as possible to that theoretical number everyone loves throwing around. It's just new, and all we really can do is theorize. It's just, well, different.

I do apologize if I've left some stuff out; today is a busy one and I'm just trying to chime in in my free time, or perhaps I just misunderstood something you asked.

Fair points, and yeah tell me about IG xD. Upgrades are imminent once I settle on a build worth pursuing.

There's one very critical thing about Sony's approach I don't think has been answered yet: just how sophisticated is the monitoring logic, and how does it figure out the power load of game code in order to determine which components get what amount of power? Is it fully reactive, waiting for the code to get crunched by the processors, drawing the power, and then using some kind of flag or exception to know the power budget is being exceeded?

Dunno; it seems like they'd need some kind of custom microcontroller and current sensors constantly measuring power draw, with some fixed logic to regulate the PSU and automate the amount of current sent through the system, or something like that :S. Like I said, this ain't an area I'm well-versed in, so I'm spitballing.

VRS isn't an MS-branded term. Nvidia used that crap first

A few Nvidia GPUs also use ExecuteIndirect, but that is an MS-derived technology.

Nvidia GPUs were first to market and able to leverage the tech by building the required hardware for the feature, but that doesn't mean Nvidia created the feature set.
 

Deto

Banned
LOLOLOLOL

Digital Foundry vs. the Xbox One architects






"Everybody knows from the internet that going to 14 CUs should have given us almost 17 per cent more performance," he says, "but in terms of actual measured games - what actually, ultimately counts - is that it was a better engineering decision to raise the clock. There are various bottlenecks you have in the pipeline that can cause you not to get the performance you want if your design is out of balance."

MS: increasing TF by ~17% via more CUs was not as good as increasing the clock by 6.6%.
 

Jason28

Has a tiny dick and smaller e-peen
This Dusk Golem guy is full of shit. First of all, "fake 4K" already sounds like console-warrior fuel. Second, what can he know about the price? Nothing. He is just spreading FUD because he wants to be at the center of the news, even if his posts are lies.
 

Deto

Banned
A quick history recap:



[Attached images: fW47peQ.png, Q6VHcZH.png, QTI3Y9I.png]
 

pawel86ck

Banned
That's the idea. It still comes down to how good the cooling system is, but based on the One X I'm assuming they can handle it.

It would be ironic if the Series X ended up the louder of the two systems, though, even if only slightly. But again, it just comes down to how good the cooling is, and MS nailed that already with the One X. Sony's the one who needs to prove they have a very solid cooling system this time that'll keep things quiet; I'm fairly confident they will, though.

Dat PS5 is a big boy xD.
It looks like the Series X's noise level is similar to the Xbox One X's.
 

Seph-

Member
Fair points, and yeah tell me about IG xD. Upgrades are imminent once I settle on a build worth pursuing.

There's one very critical thing about Sony's approach I don't think has been answered yet: just how sophisticated is the monitoring logic, and how does it figure out the power load of game code in order to determine which components get what amount of power? Is it fully reactive, waiting for the code to get crunched by the processors, drawing the power, and then using some kind of flag or exception to know the power budget is being exceeded?

Dunno; it seems like they'd need some kind of custom microcontroller and current sensors constantly measuring power draw, with some fixed logic to regulate the PSU and automate the amount of current sent through the system, or something like that :S. Like I said, this ain't an area I'm well-versed in, so I'm spitballing.

Ha, totally understandable. I'm in the same boat; time for some PC upgrades, the ol' 1080 isn't what it used to be. You're also not wrong. Frankly, I feel like we may never actually get that answer; it'll likely just be lumped in with the "SmartShift" comments, which I also feel is most definitely custom and not just the off-the-shelf AMD version.

As for power, we really only have Cerny's comments to go off of: that there is enough power to feed both at max, which again likely ties into SmartShift. This will have to be one of those things we either assume on our own or, hopefully once NDAs are up, get more info about. Unless a patent is floating around somewhere. As for how this pertains to the current thread, again, I simply don't see this magic difference that apparently exists. Either they just had an older dev kit (the info is months old) or there is an issue in their code.
 

Deto

Banned
How old is this?

"my dad says"

The Windows Central FUD is from April 2020.

Jeff's account is an old screenshot, to show that he has been saying "Sony is shit, the PlayStation is junk" on the internet for years... along with the ridiculous level of "my dad works at Sony and told me they're going bankrupt".


It's a very old joke in Brazil, from before the internet, when there was always that one lying friend at school saying:

"my friend's cousin has an uncle who works at Sony in Japan, and he said they're going to launch a 512-bit console that will destroy the N64, which only has 64 bits"
 

Md Ray

Member
Cerny is an overrated idiot. Not sure why he gets so much praise.

The PS4 is loud as fuck, the PS4 Pro is weaker than the Xbox One X, and now the PS5 is apparently significantly weaker than the Xbox Series X because he decided to put all his marbles into the SSD. And for what?

Good luck showcasing the advantage of the SSD over the advantage of resolution, because SSD advantages are much harder to demonstrate to the average gamer. And if the PS5 is more expensive, then it's over and they've fucked up everything they built up this past gen & more.

They're banking on timed or exclusive moneyhatted content to try and sway consumers with content, while Microsoft will try and sway them with power.

Sony believes content > power will work
He didn't design the PS4's cooling solution; that's a different department. The Pro came out a full year before the One X. And the PS5 isn't significantly weaker than the SX. The SX and PS5 are literally the closest two competing machines have ever been to each other.
 

Elog

Member
Ray tracing, I feel, is kind of similar. Where did Sony hint that their approach isn't AMD's standard one? I must've missed it. Truth is, we don't even know exactly how AMD's RT works! We just know it's based somewhat on the CUs, which is where some of the figures come from. The way MS have described some of their DXR RT also seems to indicate they might've made some alterations beyond whatever the standard is, but to what extent is up for debate, as usual.

This whole story originated from the AMD leaks last year. Arden and Sparkman had specific codes for the RT block as well as RT test results in the output. Oberon did not.

Some people took this as 'PS5 does not have RT' - that is where that rumour came from.

Someone with DevKit access at the time claimed instead that the RT solution in the PS5 was more powerful than what was seen in the leaked specs regarding Arden and Sparkman.

Then Sony confirmed that they have an RT solution, but to this day there is no leaked AMD document with RT connected to the PS5 chip, which is weird. Who knows - it might be a complete nothingburger - but it is odd that none of the AMD leaks showed an RT solution for the PS5 at the same time as the devkits had it. Confusing.
 

Md Ray

Member
LOLOLOLOL

Digital Foundry vs. the Xbox One architects






"Everybody knows from the internet that going to 14 CUs should have given us almost 17 per cent more performance," he says, "but in terms of actual measured games - what actually, ultimately counts - is that it was a better engineering decision to raise the clock. There are various bottlenecks you have in the pipeline that can cause you not to get the performance you want if your design is out of balance."

MS: increasing TF by ~17% via more CUs was not as good as increasing the clock by 6.6%.
Interesting find. Thanks for sharing. I had forgotten about this.
 

THEAP99

Banned
He didn't design the PS4's cooling solution; that's a different department. The Pro came out a full year before the One X. And the PS5 isn't significantly weaker than the SX. The SX and PS5 are literally the closest two competing machines have ever been to each other.
Then why are we hearing about so many big discrepancies?
 
Cerny has said a 2GHz clock was already too much for the PS5 GPU with a fixed-clock strategy, so why do you think the PS5 GPU can sustain an even higher clock (2.2GHz) now? Since you know everything, please explain to me how the technology works 😀.
From the horse’s mouth:

"The time constant, which is to say the amount of time that the CPU and GPU take to achieve a frequency that matches their activity, is critical to developers," adds Cerny. "It's quite short, if the game is doing power-intensive processing for a few frames, then it gets throttled. There isn't a lag where extra performance is available for several seconds or several minutes and then the system gets throttled; that isn't the world that developers want to live in - we make sure that the PS5 is very responsive to power consumed. In addition to that the developers have feedback on exactly how much power is being used by the CPU and GPU."

”So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."

Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.

But what if developers aren't going to optimise specifically to PlayStation 5's power ceiling? I wondered whether there were 'worst case scenario' frequencies that developers could work around - an equivalent to the base clocks PC components have. "Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing," Mark Cerny counters. "I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle. That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully."


Cerny also discusses how developers can get even more performance by optimizing around power consumption:

Mark Cerny sees a time where developers will begin to optimise their game engines in a different way - to achieve optimal performance for the given power level. "Power plays a role when optimising. If you optimise and keep the power the same you see all of the benefit of the optimisation. If you optimise and increase the power then you're giving a bit of the performance back. What's most interesting here is optimisation for power consumption, if you can modify your code so that it has the same absolute performance but reduced power then that is a win. "

In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two," adds Cerny. "If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
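As an aside, the "drop frequency 10 per cent, save around 27 per cent power" figure Cerny gives lines up with the common first-order rule of thumb that dynamic power scales roughly with the cube of frequency (power ~ frequency x voltage squared, with voltage tracking frequency). A quick sketch of that generic approximation - not Sony's actual measured power curve:

Code:
# Rule-of-thumb check: dynamic power ~ f * V^2, and V scales roughly with f, so P ~ f^3.
# This is a generic approximation, not Sony's measured power model.
def relative_power(freq_scale):
    return freq_scale ** 3

print(f"10% clock drop -> {1 - relative_power(0.90):.0%} power saved")   # -> 27%
print(f" 3% clock drop -> {1 - relative_power(0.97):.0%} power saved")   # -> 9%, i.e. "a few per cent" of clock buys ~10% power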


 

Journey

Banned
LOLOLOLOL

Digital Foundry vs. the Xbox One architects






"Everybody knows from the internet that going to 14 CUs should have given us almost 17 per cent more performance," he says, "but in terms of actual measured games - what actually, ultimately counts - is that it was a better engineering decision to raise the clock. There are various bottlenecks you have in the pipeline that can cause you not to get the performance you want if your design is out of balance."

MS: increasing TF by ~17% via more CUs was not as good as increasing the clock by 6.6%.


Xbox One leveraging clocks to match more CUs in PS4 = Lie

PS5 leveraging clocks to match more CUs in XSX = Lie

Thanks for the confirmation :messenger_beaming:



More CUs > higher clock speeds.
 

pawel86ck

Banned
From the horse’s mouth:
Cerny also discusses how developers can get even more performance by optimizing around power consumption:
I know SmartShift will massively improve power efficiency, but I want to know how such technology can make an unstable 2GHz clock stable at 2.2GHz. Didn't the PS5 GPU have too little power to run at 2GHz with a fixed-clock strategy?
 
That 15% extra power sure makes miracles happen, it seems.

In the PC space, can Nvidia cards with 15% more power and a little extra bandwidth quadruple resolution and double framerates?
It doesn't have to quadruple or double anything... just make it more consistent. If the difference is 60fps vs 50fps, it could make a big difference in enjoyment.
 
It doesn't have to quadruple or double anything... just make it more consistent. If the difference is 60fps vs 50fps, it could make a big difference in enjoyment.

Is it possible for both versions' framerates to end up the same?

I'm assuming developers will just drop the resolution to stabilize the framerates on the PS5.
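If developers did take that route, the cut would be fairly modest - a rough sketch assuming an ~18% theoretical compute gap and GPU cost scaling linearly with pixel count (real games rarely scale that cleanly):

Code:
# Toy estimate: resolution drop needed to offset an ~18% theoretical GPU gap,
# assuming frame cost scales linearly with pixel count.
compute_gap = 1.18
axis_scale = (1 / compute_gap) ** 0.5      # per-axis scale factor
print(f"~{2160 * axis_scale:.0f}p instead of 2160p ({1 - axis_scale:.0%} per axis)")
# -> ~1988p instead of 2160p (8% per axis)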
 

onQ123

Member
Xbox One leveraging clocks to match more CUs in PS4 = Lie

PS5 leveraging clocks to match more CUs in XSX = Lie

Thanks for the confirmation :messenger_beaming:



More CUs > higher clock speeds.

The problem with Xbox One vs PS4 was that the Xbox One only had 16 ROPs vs the PS4's 32 ROPs, and then there was DDR3 + ESRAM vs the PS4's GDDR5.

The PS5 actually gets real advantages from having the higher clock rate.
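On the ROP point specifically, the pixel fillrate gap was much wider than the TFLOPS gap - a quick sketch with the shipping specs (Xbox One: 16 ROPs at 853 MHz; PS4: 32 ROPs at 800 MHz):

Code:
# Peak pixel fillrate = ROPs x clock (GHz) -> gigapixels per second.
xbo_fillrate = 16 * 0.853   # ~13.6 Gpix/s
ps4_fillrate = 32 * 0.800   # ~25.6 Gpix/s
print(f"PS4 peak fillrate is {ps4_fillrate / xbo_fillrate:.2f}x Xbox One's")
# -> 1.88x, versus only ~1.4x in raw TFLOPS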
 
I know SmartShift will massively improve power efficiency, but I want to know how such technology can make an unstable 2GHz clock stable at 2.2GHz. Didn't the PS5 GPU have too little power to run at 2GHz with a fixed-clock strategy?

I think your question is addressed in the parts of the article I quoted. If you do not understand that, maybe you do not even understand what you are asking?

Higher clocks are possible because the processor runs more efficiently and therefore consumes less power, just like what you said SmartShift accomplishes. So, under a fixed-clock scenario (as mentioned in the article), without the ability to throttle, the APU could draw too much power, which leads to automatic shutdown.

Not only does this variable-frequency paradigm improve overall system performance, it opens up a new avenue for optimization: the more efficient use of power allows real-world performance to get closer to the PS5's theoretical capability.
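To make that concrete, here's a deliberately simplified toy model of a fixed-power-budget, variable-frequency scheme. Sony hasn't published its actual algorithm (it's driven by an internal activity/power model), so treat every number and the cube-law power assumption here as illustrative, not as PS5 behaviour:

Code:
# Toy model of a capped power budget with a variable GPU clock.
# NOT Sony's algorithm - just the general idea: the clock stays at peak unless the
# workload's estimated power demand exceeds the budget, then it backs off slightly.

POWER_BUDGET = 1.00   # normalized power cap
F_MAX = 2.23          # GHz, peak GPU clock

def gpu_clock(power_demand_at_fmax):
    """power_demand_at_fmax: estimated draw of the current workload at F_MAX (normalized)."""
    if power_demand_at_fmax <= POWER_BUDGET:
        return F_MAX                                             # typical case: stay at peak
    # rough cube-law assumption: power ~ f^3, so scale back just enough to fit the budget
    return F_MAX * (POWER_BUDGET / power_demand_at_fmax) ** (1 / 3)

for load in (0.70, 0.95, 1.10):
    print(f"workload {load:.2f} -> {gpu_clock(load):.2f} GHz")
# workload 0.70 -> 2.23 GHz
# workload 0.95 -> 2.23 GHz
# workload 1.10 -> 2.16 GHz  (a ~3% clock drop absorbs a ~10% power overshoot)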
 