
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

TrippleA345

Member
To counter confusion:
The concept of AMD SmartShift is as follows:
A control unit measures workloads. Based on those workloads, SmartShift decides to shift power between the CPU and GPU. When this happens, the frequencies of the CPU/GPU go down, but not by much. In total, more transistors are used more often, giving more performance without using more energy.

This year two devices will get AMD SmartShift: the PS5 and a laptop from Dell. AMD claims up to 14% more performance in games on the laptop (they still have to prove it). We'll have to wait and see how much this will be for the PS5.
Besides, AMD SmartShift is free. Free not in the sense of dollars, but in the sense that it takes up very little space on the device, and it is an automatic process handled by the system, so no software developer has to worry about it.
Only problem: it must be implemented in advance, i.e. when the device is developed, because it sits deep in the system.

By the way, the counterpart of AMD SmartShift from NVIDIA is called "Advanced Optimus".
These are both brand new technologies.
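To make the control loop described above concrete, here's a rough Python sketch. Everything here is hypothetical: the function name, the numbers, and the simple proportional policy are invented for illustration, not AMD's actual algorithm.

```python
def shift_power(total_budget_w, cpu_load, gpu_load):
    """Split a shared power budget proportionally to current load.

    cpu_load / gpu_load are utilisation fractions in [0, 1].
    Returns (cpu_watts, gpu_watts); the two always sum to the budget.
    """
    demand = cpu_load + gpu_load
    if demand == 0:
        # Idle system: split the budget evenly.
        return total_budget_w / 2, total_budget_w / 2
    cpu_w = total_budget_w * cpu_load / demand
    return cpu_w, total_budget_w - cpu_w

# GPU-heavy moment: most of the budget flows to the GPU.
cpu_w, gpu_w = shift_power(200.0, cpu_load=0.4, gpu_load=1.0)
```

The key property is that the two allocations always sum to the shared budget: power is moved between the units, never added on top.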
 

jose4gg

Member
Because CPU/GPU don't get 100% usage all the time. Simple as that. When CPU is at 80% and GPU at 100%, more performance goes to the GPU, less to the CPU and vice-versa, it's pretty simple.

Yea, and this can happen hundreds of times per second. This is not a situation where a specific "level" or scene uses more CPU/GPU... This happens per frame: what is happening in this specific frame, at this specific moment.
 

IntentionalPun

Ask me about my wife's perfect butthole
Because CPU/GPU don't get 100% usage all the time. Simple as that. When CPU is at 80% and GPU at 100%, more performance goes to the GPU, less to the CPU and vice-versa, it's pretty simple.
If your CPU is 80% utilized, and you lower the clocks, what is running on that CPU will lose performance.

If you could leave it at max clocks, to not lose that performance.. you would.

But.. you can't... because the PS5 is configured with clocks that aren't sustainable, on purpose..
 

geordiemp

Member
If your CPU is 80% utilized, and you lower the clocks, what is running on that CPU will lose performance.

If you could leave it at max clocks, to not lose that performance.. you would.

But.. you can't... because the PS5 is configured with clocks that aren't sustainable, on purpose..

No

The CPU and GPU share the bus to RAM; they take turns. The CPU tells the GPU what to do.

Why do posters think in averages? Think about what's happening at every clock cycle; then it will become clearer.

This is even more true of an APU. So much gets posted about TF and clocks, with not much consideration of what happens in a frame of rendering.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
Ok, saw your reply after posting. I think you're hung up on the "at all times" part, which is kind of a straw man. Nobody mentioned "at all times" (or at least I didn't see that). Depending on the workload, the CPU and GPU will run at max clocks, but in specific workloads there will be a power shift, in order to avoid certain bottlenecks.

There's no need to be at max clocks "at all times" because that would go against the logic of using SmartShift
There is no reason NOT to be at max clocks at all times if your hardware can maintain it.

That is the entire point of variable clocks.

How in the world am I getting "hung up on" the "at all times" thing? It's at the core of why you ever choose a variable clock setup.
 

Elog

Member
WTF are you talking about?

I've never heard of a game console doing thermal throttling. They are designed to avoid that.. and if they don't, they fry if anything.

Here you see a graph (credit to the beyond3d forum) where both power and temperature cause frequency variability on a fixed-frequency GPU. So if you do not think this happens on a fixed-frequency console, I do not know what to say.

[image: zQhkWM1.png]
 
Last edited:

jose4gg

Member
Whatever code is running on the CPU will lose performance if you downclock the CPU...

The PS5 decided which would be better to give increased power to, favoring the GPU, as it is more likely to have intense workloads.

But when it does that, whatever is running on the CPU will run slower.. AKA lose performance.

2 things...

1- Power does not scale linearly with clock speed.
2- If the CPU, in a specific frame (this is the most important thing), has ALREADY finished the operations it needed to complete, or the operations left do not need all the CPU's power because they are very simple operations, then no, you won't lose performance; you can redirect that power to the GPU and gain performance where it's really needed.
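Point 1 is worth making concrete. Dynamic power scales roughly with frequency times voltage squared, and voltage itself has to rise with frequency, so a cubic model is a common rough approximation. The numbers below are illustrative textbook physics, not measured console data.

```python
def relative_power(freq_ratio):
    """Approximate dynamic power, relative to the maximum, for a clock
    running at freq_ratio of its maximum.

    Rough model: P ~ f * V^2, and assuming V scales with f gives P ~ f^3.
    """
    return freq_ratio ** 3

# Under this model, a 10% downclock saves roughly 27% power:
saving = 1.0 - relative_power(0.9)
```

This is why a small clock reduction on one unit can free a disproportionately large slice of the shared power budget for the other unit.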
 
There is no reason NOT to be at max clocks at all times if your hardware can maintain it.

That is the entire point of variable clocks.

How in the world am I getting "hung up on" the "at all times" thing? It's at the core of why you ever choose a variable clock setup.

That is so wrong; you have no idea how wrong that assumption is. It cuts to the core of this entire misconception of seeing SmartShift as some "boost clock" like on typical GPUs.
 

Mr Moose

Member
Link?

Maybe he's badly paraphrasing things that are correct.. but what he said is absolutely not correct.

You can't magically not lower performance when lowering clocks.. like who would ever make that claim?

You can get higher average performance than the same design at fixed clocks though. But... the XSS clocks are fixed, higher than the max clocks of the PS5... so when comparing them, the XSS has the clear advantage at least overall theoretical TF.

PS5 has advantage for maximum clock speed per CU on the GPU side though.
*Cough* 👀
 

IntentionalPun

Ask me about my wife's perfect butthole
That is so wrong, you have no idea how wrong that assumption is. It cuts to the core of why this entire misconception of seeing SmartShift as some "boost clock frequency" of the typical GPUs.

Nope, it's not wrong.

(apparently this is valid here, since it's all anyone here has ever been able to say to me)
 

DrDamn

Member
I simply don't get this way of thinking. The XSX can run at its max clocks all the time. On a level playing field, variable clocks can't match fixed max clocks.

The point is about fixed vs variable, not PS5 vs XSX. If the XSX implemented SmartShift, it could go faster than it does, and balance between CPU and GPU power demands.

Let's say that the CPU is under heavy load due to AVX usage. As far as I know, the XSX CPU will handle the load and the PS5 would have to underclock to stay under the power budget. No?

See above, it's not about PS5 vs XSX. It's whether the tech is beneficial or not. Yes, it is.

The entire point of making clocks variable is that the same chip/setup/power budget/etc. could NOT handle max clocks at all times. He claims that isn't why they exist, but that isn't true at all. It's literally the reason you make clocks variable: to push beyond what would be possible by picking a static number. That was flat-out wrong, and his stuff about bottlenecks is just sort of overly confusing too.

Fixed clocks: Pick a static number that you believe all code can achieve without causing power/heat issues.
Variable clocks: Let the clocks go higher than they could "at all times" because there are specific times when that is feasible without causing issues.

SmartShift as implemented in the PS5 is about a power budget, not just speed. They can both run at full speed if the load allows. Full speed does not equal a fixed power draw; that depends on load too. This is the whole point of the tech. If the load is not maximum (and in the vast majority of cases it shouldn't be), a chip can run faster within the same power budget.
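The fixed-vs-variable trade-off both posters are circling can be shown with toy numbers. The power-per-GHz costs below are invented purely for illustration; nothing here is a real console figure.

```python
POWER_BUDGET = 100.0                 # shared power budget, arbitrary units
workload_cost = [30.0, 35.0, 50.0]   # invented power cost per GHz of three workloads

# Fixed clocks: pick one frequency that every workload can sustain,
# i.e. design for the worst (most power-hungry) case.
fixed_clock = POWER_BUDGET / max(workload_cost)

# Variable clocks: each workload runs as fast as the budget allows.
variable_clocks = [POWER_BUDGET / cost for cost in workload_cost]
```

Under this toy model the variable strategy never runs slower than the fixed one, and runs faster whenever the workload is lighter than the worst case, which is the part both sides of the argument actually agree on.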
 

jose4gg

Member
Yes

What magic would cause code running on a 1 GHz CPU to deliver the same performance as code running on a 1.5 GHz CPU?

You need an operation to finish in 1/30 of a second. If a small operation takes 1/100 of a second to complete, then yea, you don't care whether you are using a 1 GHz CPU or a 1.5 GHz CPU, because you finished on time in both cases. However, your comparison is extreme; the reality will be more like 1.3 GHz vs 1.5 GHz.
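The deadline argument above can be checked with simple arithmetic. The cycle count below is hypothetical; the point is only that both clock speeds finish the task well inside a 30 fps frame budget, so the downclock is invisible in that case.

```python
FRAME_BUDGET_S = 1 / 30      # deadline for one frame at 30 fps (~0.033 s)
WORK_CYCLES = 10_000_000     # hypothetical cycle cost of the CPU task

def task_time(clock_hz):
    """Seconds needed to run the task at a given clock speed."""
    return WORK_CYCLES / clock_hz

time_slow = task_time(1.0e9)   # at 1.0 GHz
time_fast = task_time(1.5e9)   # at 1.5 GHz
# Both finish far under the frame deadline, so neither clock misses a frame.
```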
 
Last edited:

geordiemp

Member
Yes

What magic would cause code running on a 1 GHz CPU to deliver the same performance as code running on a 1.5 GHz CPU?

Ok, let's give a silly example; maybe it'll sink in a little.

The CPU runs at 3.5 GHz while the GPU is at 0.1 GHz, does lots of calculations, and works out what to draw, accessing memory. It then tells the GPU what to draw, winds down to 0.1 GHz and has a nap, orders a pizza and watches TV.

The GPU, which has been out to the nightclub, wakes up, runs at 2.23 GHz, and draws the new information like a madman. Frame finishes.

2.23 and 3.5 GHz, but it's not sustained. :messenger_beaming: That's how this works. Now look at a frame in real life. Blue is CPU tasks, brown is GPU tasks. What is sustained, exactly?

[image: nVQSQys.png]
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
You need an operation to finish in 1/30 of a second. If a small operation takes 1/100 of a second to complete, then yea, you don't care whether you are using a 1 GHz CPU or a 1.5 GHz CPU. However, your comparison is extreme; the reality will be more like 1.3 GHz vs 1.5 GHz.
Sure, that's a best-case scenario, where the performance of your GAME is not affected because a specific piece of code doesn't need to be completed at the speed a higher clock would allow.

It's not always going to be the best-case scenario though. And either way, the performance of THAT code is lowered. So outside of best-case scenarios, it can affect a game's performance.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
Ok, let's give a silly example; maybe it'll sink in a little.

The CPU runs at 3.5 GHz while the GPU is at 0.1 GHz, does lots of calculations, and works out what to draw, accessing memory. It then tells the GPU what to draw, winds down to 0.1 GHz and has a nap, orders a pizza and watches TV.

The GPU, which has been out to the nightclub, wakes up, runs at 2.23 GHz, and draws the new information like a madman. Frame finishes.

2.23 and 3.5 GHz, but it's not sustained. :messenger_beaming: That's how this works.
What you said "no" to was absolutely false.

If you guys want to have a different convo, have a different convo. You can back off with this calling-me-wrong thing when I'm factually not, though.
 
Last edited by a moderator:

Elog

Member
No I'm not confusing power with frequency.

The PS5 lowers the CLOCK SPEEDS of the CPU and GPU, not just shifting power.

This conversation is about variable clocks. What conversation do you think is being had?

Fixed frequency system:

CPU at 70%, GPU at 100%, and the system taps out on power -> net result is downclocking of both CPU and GPU

Variable frequency system such as the PS5:

CPU at 70%, GPU at 100%, and the system taps out on power -> downclock the CPU by 20% with maintained frequency for the GPU

Unless you assume that the PS5 has an undersized cooling and power solution compared to the XSX (why would you assume that?), the ability to divert power is only upside.
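The two-scenario comparison above can be expressed as a tiny decision rule. This is a simplified sketch of the variable-frequency case only; the function and the 20% figure are illustrative, not Sony's actual controller.

```python
def resolve_over_budget(cpu_util, gpu_util, shed_fraction):
    """When the shared power budget is exceeded, lower only the clock of
    the less-utilised unit; the busier unit keeps full frequency.

    Returns (cpu_clock_scale, gpu_clock_scale) as fractions of max clock.
    """
    if cpu_util < gpu_util:
        return 1.0 - shed_fraction, 1.0   # downclock the CPU, keep the GPU
    return 1.0, 1.0 - shed_fraction       # downclock the GPU, keep the CPU

# The example above: CPU at 70%, GPU at 100% -> shed 20% from the CPU side.
cpu_scale, gpu_scale = resolve_over_budget(0.7, 1.0, 0.2)
```

Contrast with the fixed-frequency case, where hitting the cap forces both units down regardless of which one actually needed the power.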
 

Brudda26

Member
Whatever code is running on the CPU will lose performance if you downclock the CPU...

The PS5 decided which would be better to give increased power to, favoring the GPU, as it is more likely to have intense workloads.

But when it does that, whatever is running on the CPU will run slower.. AKA lose performance.
The clocks aren't important here; it's the power budgets. Different tasks require different amounts of power. The whole point is diverting excess power away from tasks that don't need it and putting that power to use someplace else.
 

ToadMan

Member
I found this quote from someone at Sony called M Cerny. Not sure if he knows his stuff or not but here’s what he had to say:

“There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower.”
 

jose4gg

Member
No I'm not confusing power with frequency.

The PS5 lowers the CLOCK SPEEDS of the CPU and GPU, not just shifting power.

This conversation is about variable clocks. What conversation do you think is being had?

I think you don't get the point of SmartShift, so let me explain it from a PC-world perspective.

You have two machines, EQUAL MACHINES:

- SAME CPU
- SAME GPU
- SAME RAM

The one with SmartShift will have performance benefits... It will run better. Why?

Because, on a frame-per-frame basis, it is distributing the power it has to where it's needed. Sometimes, in fact a lot of times, frame per frame, there is a lot of power left on the table that one of the components does not use.

SmartShift is not taking a PC with 10% less power and making it compete with a PC that is 8%/9%/10% stronger. It's taking the same PC and making it work better than its counterpart, because it knows how to optimize the power distribution between the components...
 
Last edited:
If your CPU is 80% utilized, and you lower the clocks, what is running on that CPU will lose performance.

If you could leave it at max clocks, to not lose that performance.. you would.

But.. you can't... because the PS5 is configured with clocks that aren't sustainable, on purpose..
Man, just read what you wrote.

You basically said:
If your CPU is running at 80%, then using more energy in the GPU affects the CPU.

That is not true.

The only moment when this happens is when the combined utilization of your GPU and CPU is
close to or equal to 100% and is above the power budget. Something like this can happen, but it is very rare;
it means the workload on both chips is the heaviest possible, and usually that kind of
workload happens only in very specific moments.

The key is to understand that your game is not a stress test.

When you are working on optimization, you always try to leave enough free headroom on your target device, because
if you don't, your game/software will miss its target as soon as something "unpredictable" happens.
 
Last edited:

geordiemp

Member
What you said "no" to was absolutely false.

If you guys want to have a different convo, have a different convo. You can fuck off with this calling me wrong thing when I'm factually not though.

Anyone who goes on about "sustained" I just laugh at; please yourself. Most don't even want to understand how the CPU and GPU work and are just trying to FUD the PS5, very poorly. It's not worth my time.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
Here you see a graph (credits to beyond3d forum) where both power and temperature gives variability of frequency on a fixed frequency GPU. So if you do not think this happens on a fixed frequency console I do not know what to say.

[image: zQhkWM1.png]
Yes, game consoles historically have had absolutely fixed frequencies.
I think you don't get the point of SmartShift, so let me explain it from a PC-world perspective.

You have two machines, EQUAL MACHINES:

- SAME CPU
- SAME GPU
- SAME RAM

The one with SmartShift will have performance benefits... It will run better. Why?

Because, on a frame-per-frame basis, it is distributing the power it has to where it's needed. Sometimes, in fact a lot of times, frame per frame, there is a lot of power left on the table that one of the components does not use.

SmartShift is not taking a PC with 10% less power and making it compete with a PC that is 8%/9%/10% stronger. It's taking the same PC and making it work better than its counterpart, because it knows how to optimize the power distribution between the components...

Why are you explaining shit to me that I already know and have already basically stated?

A variable-frequency clock system can push beyond the frequencies of that same system with fixed clocks.

Maybe read the 2-3 posts where I said that.
 
Watch every game console from here on have some kind of variable frequency/power budget, because console manufacturers will see it's actually a good thing for such a machine.

Not only that, but having variable clocks is going to help with power draw and cooling as well! I'm surprised that Microsoft didn't go that route too; it seems like a natural progression for the tech...
 

jose4gg

Member
Sure, that's a best-case scenario, where the performance of your GAME is not affected because a specific piece of code doesn't need to be completed at the speed a higher clock would allow.

It's not always going to be the best-case scenario though. And either way, the performance of THAT code is lowered. So outside of best-case scenarios, it can affect a game's performance.

The problem is you keep talking about the game, as if this works per game, or per scene, or per level, or by the number of NPCs on the map. This works per frame: what is happening in each frame, and even multiple times within the same frame. The idea that "it's a best-case scenario" is absurd, because we know from experience that games do not use the total of the resources available to them, not even on a per-frame basis. Look at any graph of game usage: the usage varies A LOT within a single second. That's why SmartShift exists.
 
Last edited:
I found this quote from someone at Sony called M Cerny. Not sure if he knows his stuff or not but here’s what he had to say:

“There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower.”
Oh, but I bet he (the one who designed the hardware) is wrong and IntentionalPun is right.
 
No I'm not confusing power with frequency.

The PS5 lowers the CLOCK SPEEDS of the CPU and GPU, not just shifting power.

This conversation is about variable clocks. What conversation do you think is being had?

I'm telling you, you really are mixing the two things. There is a relation between power and frequency, but that's not the only relation you need to consider.

Let me try and elaborate with an example.

Different workloads consume different amounts of power to sustain a specific frequency.

When you are rendering a frame there will be several calculations that need to occur, at both the CPU and GPU level. What you need is enough time to do all the calculations in order to render your target frame within the "FPS" you're aiming at (I don't want to use another frequency term here, just to avoid confusing things further). So, if the GPU needs to wait a few ms because the number of calculations required of the CPU is quite taxing, a power shift could occur in order to prioritize the CPU. A few ms later, once the CPU has finished its job, the GPU kicks in, and if it needs to do a ton of calculations (rays, particles, etc.), the power could shift once again to the GPU.

This is what people mean when they say it helps eliminate bottlenecks. A typical brute force approach, like the PC, requires a lot of overhead. You will nearly never have a 100% utilization of both CPU and GPU on any game, plus the power draw is a lot higher to feed those needs.

Sorry, I probably did a terrible job explaining this, but I can see where you're mixing stuff up. Probably other people will be able to clarify it better :)
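The per-frame shifting described above can be caricatured as two phases with different budget splits. The phase names, wattage, and percentages below are made up for illustration; a real frame interleaves far more finely than this.

```python
FRAME_BUDGET_W = 200.0  # shared CPU+GPU power budget, arbitrary units

# Invented splits for two coarse phases of a single frame.
PHASE_SPLITS = {
    "simulate": (0.7, 0.3),  # CPU-heavy: game logic, animation, draw-call prep
    "render":   (0.2, 0.8),  # GPU-heavy: rasterisation, shading, post effects
}

def budget_for_phase(phase):
    """Return (cpu_watts, gpu_watts) for a phase. The shares sum to 1,
    so total draw never exceeds the shared budget."""
    cpu_share, gpu_share = PHASE_SPLITS[phase]
    return cpu_share * FRAME_BUDGET_W, gpu_share * FRAME_BUDGET_W
```

The point of the sketch: the same total budget serves both phases, and the "bottleneck" unit of the moment is the one holding most of it.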
 

IntentionalPun

Ask me about my wife's perfect butthole
Anyone who goes on about "sustained" I just laugh at; please yourself. Most don't even want to understand how the CPU and GPU work and are just trying to FUD the PS5, very poorly. It's not worth my time.

Yes.. such FUD:

IntentionalPun said:
Variable clocks: Let the clocks go higher than they could "at all times" because there are specific times when that is feasible without causing issues.

I explained outright that variable frequencies allow a system to push farther than the same system with fixed clocks...

Never denied it... in fact it's at the VERY CORE OF THE FACT IVE BEEN REPEATING:

They are variable because the same chip COULDN'T maintain max clocks at all times. It's saying the exact same thing, and it isn't FUD:

The PS5 would be slower if they fixed the clocks as they'd have to fix the clocks at a lower speed than they can achieve with variable clocks.

You guys just knee-jerk assume everything is anti-PS5 FUD.
 
Last edited by a moderator:

kyliethicc

Member
Yes it is.

Why wouldn't Sony max the clocks at all times if it was possible within their power budget/not cause thermal issues?
For some games/instructions, the PS5 can run both CPU+GPU at full clocks. Call that "X".

Sony creates fans/heatsink to cool for X, so the console is reasonably quiet and doesn't draw too much power.

So what if a developer wants to run instructions, or creates a game, that is X+1?

You let them downclock to stay within the thermal limits of the cooling solution and the max draw.

It lets a game do X+1 while the console's fans and heatsink can be designed around X.
 

IntentionalPun

Ask me about my wife's perfect butthole
I'm telling you, you really are mixing the 2 things. There is a relation between power and frequency, but that's not the only relation you need to consider.
I'm telling you I'm not.

Here's your option:

Respond to the words I'm saying, or get ignored by me.

I'm well aware of Sony's workload-based method for deciding where to give power/frequency between the CPU/GPU.

That wasn't being discussed.

So read words, respond to words.. or have a nice day.
 

Brudda26

Member
The whole point of the variable clock solution is efficiency. With fixed clocks you're wasting power on stuff that may not need all that power. With variable clocks, it's about distributing the power needed for the tasks at hand. The CPU will not need max power to run at 3.5 GHz at all times; it's dependent on the tasks. While there will be scenarios where you may need to drop clocks a small % for very heavy tasks, that's the worst-case scenario. With a fixed system you would be wasting power on tasks that don't need it. So the GPU and CPU would exceed the power budget on a fixed system, and if that's left for too long you eventually get a thermal shutdown.

If your GPU is at 100% and your CPU at 80%, it requires less power to sustain that 80%, so the system can say: we have this excess power, let's divert it to the GPU so it's able to do more.

TL;DR: the PS5 is able to give tasks the power they require and doesn't leave excess power on the table to waste.
 

geordiemp

Member
Yes.. such FUD:



I explained outright that variable frequencies allow a system to push farther than the same system with fixed clocks...

Never denied it... in fact it's at the VERY CORE OF THE FACT IVE BEEN REPEATING:

They are variable because the same chip COULDNT maintain max at all times. It's saying the exact same thing and it isn't FUD:

The PS5 would be slower if they fixed the clocks as they'd have to fix the clocks at a lower speed than they can achieve with variable clocks.

You guys just knee jerk assume everything is anti-PS5 FUD because you are seriously kind of mentally ill.

It's counterintuitive, but processing dense geometry typically consumes less power than processing simple geometry, which I suspect is why Horizon's map screen, with its low triangle count, makes my PS4 Pro heat up so much.

The PlayStation 5 is especially challenging because the CPU supports 256-bit native instructions that consume a lot of power.

These are great here and there, but presumably only minimally used. Or are they? If we plan for major 256-bit instruction usage, we need to set the CPU clock substantially lower, or noticeably increase the size of the power supply and fan.

36 CUs at 2.23 GHz is 10.3 teraflops, and we expect the GPU to spend most of its time at or close to that frequency and performance.

Similarly, running the CPU at 3 GHz was causing headaches with the old strategy. But now we can run it as high as 3.5 GHz.

What this means (he gave examples) is that running gameplay, the PS5 will be at 2.23 and 3.5 GHz most of the time; the downclocks will occur on the CPU for AVX-256, and on the GPU when doing simple triangles in map screens, or game code written like Furmark.

THAT IS WHAT HE MEANT BY THE FIXED STRATEGY NOT WORKING: what if AVX + Furmark-style code is run? You would need to downclock, so fixed is not good. But who cares? The XSX would also heat up... but why?

Some think "the fixed strategy not working" must mean it can't sustain 2 GHz; that's just crap and shows NO UNDERSTANDING of anything, or basic reading skills. Cerny made it so simple, yet somehow people just don't get it.

If you wish to interpret it differently, it's up to you. Most either do not understand the above or seem to read it upside down; it's fucking mindboggling.
 
Last edited:
I'm telling you I'm not.

Here's your option:

Respond to the words I'm saying, or get ignored by me.

I'm well aware of Sony's workload-based method for deciding where to give power/frequency between the CPU/GPU.

That wasn't being discussed.

So read words, respond to words.. or have a nice day.

Ok... I'm pretty sure I'm addressing the specific points you raised, but I probably didn't express myself well. I didn't mean to get you upset over this.

I assure you there's no need to use any ignore here, I'll let it go, it's fine.
 