"Potentially". And you're confusing clocks with workloads. I can lock my CPU at max frequency in Windows. Even if I let the PC sit idle, it will run at its max clock, but will barely consume any power.
If I put the CPU workload at 100%, the power consumption rises significantly.
The same can be done for the GPU.
yes, "potentially" as in the potential is there if you want to use it
you can have your pc idle at full clocks, or idle at lower clocks
you can do something small with full clocks, or do the same small task with lower clocks
at the end of the day we intend to run a game, and games run in frames
And more importantly, Cerny mentioned they had trouble maintaining locked 3 GHz on the CPU and locked 2 GHz on the GPU, doing things the traditional way. But somehow the console can handle both the GPU and CPU at max workloads and clocks at the same time? Yeah right.
do you have the link?
That does not address the question in the slightest.
in question 1 you ask about profiles; those are for devkits
in question 2 you are mixing two things. when developers optimize, they don't want variable clocks because they want to test how many clock cycles an algorithm takes, and the devkit has profiles for that. when developers don't need to optimize, they can simply run the game without a profile and the system will automatically adjust itself to what they actually use, to save power: if they don't use much CPU it can be reduced, and if they require more CPU it will get more. they can test and run in a profile with specific clocks, or just as the system runs normally; it depends on what they need to optimize or check
games have a tendency to use more GPU than CPU
that is why AMD does this:
SmartShift Gaming Power
AMD SmartShift technology allows the processor and the graphics to consume power from a shared power budget by dynamically shifting power depending on the task at hand. In the above gaming performance example, the power has been shifted from the processor to the graphics to enable improved gaming performance.
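A toy sketch of that shared-budget idea: each unit draws what it demands, and whatever the CPU leaves unused spills over to the GPU. All names and wattages here are illustrative placeholders, not real AMD or PS5 figures.

```python
# Toy model of a SmartShift-style shared power budget.
# Numbers are made up for illustration, not actual hardware values.

def split_budget(total_budget_w, cpu_demand_w, gpu_demand_w):
    """Give each unit what it asks for; unused CPU budget spills to the GPU."""
    cpu_power = min(cpu_demand_w, total_budget_w)
    gpu_power = min(gpu_demand_w, total_budget_w - cpu_power)
    return cpu_power, gpu_power

# CPU lightly loaded: the GPU gets to draw from the leftover budget.
print(split_budget(200, 40, 180))   # -> (40, 160)

# CPU heavily loaded: less budget left over for the GPU.
print(split_budget(200, 120, 180))  # -> (120, 80)
```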
- The PS5 has a max power budget based on the PS5 cooling capabilities.
"We don't use the actual temperature of the die, as that would cause two types of variance between PS5s," explains Mark Cerny. "One is variance caused by differences in ambient temperature; the console could be in a hotter or cooler location in the room. The other is variance caused by the individual custom chip in the console, some chips run hotter and some chips run cooler. So instead of using the temperature of the die, we use an algorithm in which the frequency depends on CPU and GPU activity information. That keeps behaviour between PS5s consistent."
Inside the processor is a power control unit, constantly measuring the activity of the CPU, the GPU and the memory interface, assessing the nature of the tasks they are undertaking. Rather than judging power draw based on the nature of your specific PS5 processor, a more general 'model SoC' is used instead. Think of it as a simulation of how the processor is likely to behave, and that same simulation is used at the heart of the power monitor within every PlayStation 5, ensuring consistency in every unit.
"The behaviour of all PS5s is the same," says Cerny. "If you play the same game and go to the same location in the game, it doesn't matter which custom chip you have and what its transistors are like. It doesn't matter if you put it in your stereo cabinet or your refrigerator, your PS5 will get the same frequencies for CPU and GPU as any other PS5."
"As for the details of the cooling solution, we're saving them for our teardown, I think you'll be quite happy with what the engineering team came up with."
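A minimal sketch of the "model SoC" idea as Cerny describes it: the clock is derived from a deterministic power model fed by activity counters, never from a temperature sensor, so the same workload produces the same frequency on every unit. Every constant and name below is an assumption for illustration only.

```python
# Deterministic activity-based clock selection (sketch).
# All constants are invented; real PS5 values are not public.

MAX_CLOCK_GHZ = 2.23    # advertised GPU clock ceiling
POWER_BUDGET_W = 180.0  # assumed share of the budget for this unit
IDLE_W = 30.0           # modelled baseline power
PER_ACTIVITY_W = 200.0  # modelled extra power at 100% activity, max clock

def gpu_clock(activity):
    """activity in [0, 1]: fraction of the modelled worst-case switching.

    Note there is no temperature input: the estimate comes from the model,
    so every console computes the same clock for the same workload.
    """
    est_power = IDLE_W + PER_ACTIVITY_W * activity
    if est_power <= POWER_BUDGET_W:
        return MAX_CLOCK_GHZ
    # Power scales roughly with frequency, so scale the clock down until
    # the modelled power fits the budget.
    return MAX_CLOCK_GHZ * POWER_BUDGET_W / est_power
```

With these toy numbers, any activity up to 75% still yields the full clock; only heavier switching activity pulls the frequency down.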
- That power budget needs to be divided between the GPU and the CPU.
"The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two. If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU."
- The max clocks can be reached for both as long as that power budget is not exceeded.
"There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz"
if required they will run at their fastest clocks, and yes, that also means using them for calculations
- The actual power used by the system depends on the workload of the components.
we can say the same for a PC, microwave, refrigerator or even a washing machine: the amount of power used at a specific time depends on which parts of the hardware are in use
- If the power budget is at risk of being exceeded, the clocks of the hardware that has the lower workload, be it the CPU or the GPU, is lowered in order to not exceed the max power budget.
- It is rare that both the CPU and GPU are at 100% workload at all times.
- The system cannot handle both the CPU and the GPU having max workload, but it can handle both of them having max clocks.
"I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle. That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully."
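The capping rule from the bullets above, where the clock of whichever unit is less busy gets trimmed when the modelled total would exceed the budget, could be sketched like this. Every number is a made-up placeholder, not a real PS5 value.

```python
# Sketch of budget-aware clock capping. All figures are illustrative.

BUDGET_W = 200.0                    # assumed total SoC power budget
CPU_MAX_W, CPU_MAX_GHZ = 80.0, 3.5
GPU_MAX_W, GPU_MAX_GHZ = 180.0, 2.23

def modelled_power(activity, clock, max_clock, max_w):
    # Toy model: power proportional to activity and to clock.
    return max_w * activity * (clock / max_clock)

def adjust_clocks(cpu_act, gpu_act):
    cpu_clk, gpu_clk = CPU_MAX_GHZ, GPU_MAX_GHZ
    while (modelled_power(cpu_act, cpu_clk, CPU_MAX_GHZ, CPU_MAX_W)
           + modelled_power(gpu_act, gpu_clk, GPU_MAX_GHZ, GPU_MAX_W)) > BUDGET_W:
        # Trim the clock of whichever unit is less busy, in small steps,
        # so the busier unit keeps its frequency.
        if cpu_act < gpu_act:
            cpu_clk *= 0.98
        else:
            gpu_clk *= 0.98
    return round(cpu_clk, 3), round(gpu_clk, 3)
```

Under this toy model, light combined workloads leave both units at max clock; only the unrealistic "everything flipping at once" case forces the less-busy unit down, which mirrors the graceful handling described in the quote above.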
- Developers can choose to limit the workload on one to max out the clock speed of the other.
in the devkits, yes
"Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power."