
Stock vs Overclocked PS3

Gaiff

Member
So, a YouTuber used custom firmware on his PS3 (the OG fat model) to overclock the RSX, which results in substantially higher performance. To compensate for the higher temps, he ramped up the fan speed. I really like revisiting old hardware to see what can be done with it using modern tools and mods.


FPS Results:

Standard PS3:

Min: 15.00
Max: 60.00
Average: 39.40
1st Percentile: 28.00
5th Percentile: 30.00

Overclocked PS3:

Min: 17.00
Max: 60.00
Average: 46.25
1st Percentile: 32.00
5th Percentile: 35.00

Game runs at 1152x640 resolution, using the unlocked framerate option in the game settings.

The overclocked PS3 is running a custom firmware with the RSX clock speeds boosted from 500 MHz to 600 MHz on the core and from 650 MHz to 750 MHz on the memory.

Note: This only overclocks the RSX/GPU; the Cell CPU is left at default speeds.

17% higher average fps and 13% higher minimums.


FPS Results:

Standard PS3:

Min: 18.00
Max: 31.00
Average: 27.39
1st Percentile: 21.00
5th Percentile: 23.00

Overclocked PS3:

Min: 22.00
Max: 31.00
Average: 29.31
1st Percentile: 24.00
5th Percentile: 26.00


Game runs at 1152x640 resolution, using the unlocked framerate option in the game settings.

The overclocked PS3 is running a custom firmware with the RSX clock speeds boosted from 500 MHz to 600 MHz on the core and from 650 MHz to 750 MHz on the memory.

Note: This only overclocks the RSX/GPU; the Cell CPU is left at default speeds.

7% higher average (would be more but the fps is capped at 31). 22% higher minimum.


Standard PS3:

Min: 16.00
Max: 29.00
Average: 22.43
1st Percentile: 17.00
5th Percentile: 18.00

Overclocked PS3:

Min: 19.00
Max: 35.00
Average: 26.72
1st Percentile: 20.00
5th Percentile: 22.00

Game runs at 1280x720 resolution.

The overclocked PS3 is running a custom firmware with the RSX clock speeds boosted from 500 MHz to 600 MHz on the core and from 650 MHz to 750 MHz on the memory.

Note: This only overclocks the RSX/GPU; the Cell CPU is left at default speeds.

19% higher average and 19% higher minimum.

Overall, the difference isn't insane, but it certainly makes for a smoother experience, and you need all the frames you can get when a game is bottoming out at 15 fps.
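For anyone checking the math, the percentage gains above come straight from the listed averages and minimums. A quick sketch in Python (the "game 1/2/3" labels are placeholders, since the titles aren't listed in the text here):

```python
# Gains quoted above, recomputed from the listed (average, minimum) FPS stats.
results = {
    "game 1": {"stock": (39.40, 15.00), "oc": (46.25, 17.00)},
    "game 2": {"stock": (27.39, 18.00), "oc": (29.31, 22.00)},
    "game 3": {"stock": (22.43, 16.00), "oc": (26.72, 19.00)},
}

for game, r in results.items():
    avg_gain = (r["oc"][0] / r["stock"][0] - 1) * 100
    min_gain = (r["oc"][1] / r["stock"][1] - 1) * 100
    print(f"{game}: +{avg_gain:.0f}% average, +{min_gain:.0f}% minimum")
```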
 

adamsapple

Or is it just one of Phil's balls in my throat?
This reminds me of a topic ages ago on GAF where a user claimed to have a "Super PS3" with faster specs...
 
GTA IV 19fps on PS3 🤮🤦‍♂️
 

Klosshufvud

Member
That's actually a pretty big boost. I wonder how loud the fans get. If they aren't too loud, then Sony left a lot of free performance on the table here. RDR in particular is borderline unplayable due to its sub-30 performance.
 

SScorpio

Member
That's actually a pretty big boost. I wonder how loud the fans get. If they aren't too loud, then Sony left a lot of free performance on the table here. RDR in particular is borderline unplayable due to its sub-30 performance.
It's likely just due to the later 45nm RSX chips using less power. The first models had 90nm chips, and the heat and cooling cycles led to the YLOD. There are Frankenstein mods people are doing where you can install a 45nm RSX into an OG PS3 with full hardware PS2 backward compatibility.

A lot of hardware isn't running at its limits, as higher clocks start requiring much more power, which in turn means a lot more heat for minor improvements. The RTX 4090, for example, can be undervolted to lose only 1-3% in performance while drawing 25-30% less power.
 

Klosshufvud

Member
It's likely just due to the later 45nm RSX chips using less power. The first models had 90nm chips, and the heat and cooling cycles led to the YLOD. There are Frankenstein mods people are doing where you can install a 45nm RSX into an OG PS3 with full hardware PS2 backward compatibility.

A lot of hardware isn't running at its limits, as higher clocks start requiring much more power, which in turn means a lot more heat for minor improvements. The RTX 4090, for example, can be undervolted to lose only 1-3% in performance while drawing 25-30% less power.
That's a reasonable take. And yeah, it's always interesting to see how manufacturers balance power draw with performance. Sometimes the user profiles make more sense.
 

01011001

Banned
lol wtf? why did they lock the PS3 version of GTA4 to 31 fps while letting the 360 version run unlocked?

weird decision
 

Rudius

Member
The PS3 and 360 would have been much better consoles from a technical standpoint if we had stuck with CRT TVs at SD resolution. I played the 3 Uncharted games on a CRT and they looked beautiful. If games had targeted 480p, 60fps would have been the standard for that generation.

In fact, I think we should have stayed one generation behind in terms of TV resolution: 720p (with native 720p monitors) for the PS4 and 1080p for the PS5, but with all the other features, like HDR, OLED, 120Hz, etc., that are available today.

At least for the PS5 generation, the use of reconstruction technology is helping the consoles make reasonable use of 4K displays.
 

Majormaxxx

Member
That's actually a pretty big boost. I wonder how loud the fans get. If they aren't too loud, then Sony left a lot of free performance on the table here. RDR in particular is borderline unplayable due to its sub-30 performance.
I played it absolutely fine. Don't see how that is unplayable. Mind you, I copied the disc to the HDD. Other than that, it ran fine. The problem with RDR is the low resolution and textures, not the FPS.
 

radewagon

Member
I played it absolutely fine. Don't see how that is unplayable. Mind you, I copied the disc to the HDD. Other than that, it ran fine. The problem with RDR is the low resolution and textures, not the FPS.
Agreed. I enjoyed it on PS3 via disc. Also, I can confirm that the biggest problem is definitely the sub-HD resolution.
 

SScorpio

Member
That's a reasonable take. And yeah, it's always interesting to see how manufacturers balance power draw with performance. Sometimes the user profiles make more sense.
Dynamic clocks, with automatic up- and down-clocking of both the CPU and GPU to stay within a power budget, are too new a thing for the PS3/360 generation.

With consoles, you also want a standard performance level to target, though having unlocked-framerate options is great for running on more powerful next-gen hardware.
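To make the shared-power-budget idea concrete, here is a toy sketch in Python. The clocks, wattages, and cubic power-versus-clock scaling are all illustrative assumptions, not how any console firmware actually behaves:

```python
# Toy model of a shared power budget: downclock the less-busy unit and spend
# the freed watts on the busier one. All numbers here are made up.
BUDGET_W = 200.0          # hypothetical total package power budget (watts)
CPU_BASE = (3.5, 80.0)    # hypothetical base clock (GHz) and power (W)
GPU_BASE = (2.2, 120.0)

def power_at(base_clk, base_w, clk):
    # Rule of thumb: power scales roughly with the cube of the clock when
    # voltage tracks frequency. An approximation, not a measured figure.
    return base_w * (clk / base_clk) ** 3

def budget_split(cpu_load, gpu_load):
    """Return (cpu_clk, gpu_clk) that favour the busier unit within the budget."""
    cpu_clk, gpu_clk = CPU_BASE[0], GPU_BASE[0]
    if gpu_load >= cpu_load:
        cpu_clk *= 0.9    # drop the mostly idle CPU by 10%
        spare = BUDGET_W - power_at(*CPU_BASE, cpu_clk) - power_at(*GPU_BASE, gpu_clk)
        while power_at(*GPU_BASE, gpu_clk + 0.01) - GPU_BASE[1] <= spare:
            gpu_clk += 0.01   # boost the GPU until the spare budget is spent
    else:
        gpu_clk *= 0.9
        spare = BUDGET_W - power_at(*CPU_BASE, cpu_clk) - power_at(*GPU_BASE, gpu_clk)
        while power_at(*CPU_BASE, cpu_clk + 0.01) - CPU_BASE[1] <= spare:
            cpu_clk += 0.01
    return round(cpu_clk, 2), round(gpu_clk, 2)

# GPU-heavy scene: the CPU gives up clock so the GPU can boost past its base.
print(budget_split(cpu_load=0.3, gpu_load=0.9))
```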


The PS3 and 360 would have been much better consoles from a technical standpoint if we had stuck with CRT TVs at SD resolution. I played the 3 Uncharted games on a CRT and they looked beautiful. If games had targeted 480p, 60fps would have been the standard for that generation.

In fact, I think we should have stayed one generation behind in terms of TV resolution: 720p (with native 720p monitors) for the PS4 and 1080p for the PS5, but with all the other features, like HDR, OLED, 120Hz, etc., that are available today.

At least for the PS5 generation, the use of reconstruction technology is helping the consoles make reasonable use of 4K displays.

People were in the process of upgrading to HDTVs at the time and wanted something to show them off. While not every game runs perfectly, both the PS3 and 360 generally run well at 720p. By the time of their launches, PC gamers had been playing at 720p+ resolutions for over a decade, so consoles had always looked like blurry messes at 240p/480i on a CRT.

The PS4 was also perfectly good at 1080p, and the PS5 is fine at 4K as long as you don't do raytracing, since AMD still hasn't gotten their act together and supported it to Nvidia's level.

People always argue for higher framerates, but I still remember interviews with Insomniac about choosing 30fps instead of 60 for Ratchet and Clank. Their claim is still that they did focus testing and more people prefer 30fps with more complex graphics over 60fps with concessions. There will always be people who prefer more fps, but the majority of the market had VCRs that always flashed 12:00, didn't turn on their TV's game mode, and didn't disable motion interpolation. And now more than ever there is an option for people who demand the best: get a PC.
 

dave_d

Member
The YouTuber should have done a comparison of Dragon's Dogma. I tried that on the PS3 and it was kind of rough, so I eventually got it on PC, which has a much nicer framerate.
 

Majormaxxx

Member
Two of the roughest PS3 ports: FEAR 1 and Shadow of Mordor. FEAR, even overclocked, won't look better. But at least Mordor might finally be playable on PS3?
 

lmimmfn

Member
Unplayable

Why is quoting/replying to/editing posts on the site shit?
It re-embeds your changes into the quote of the post you're replying to, grrrr.
 

Rudius

Member
Dynamic clocks, with automatic up- and down-clocking of both the CPU and GPU to stay within a power budget, are too new a thing for the PS3/360 generation.

With consoles, you also want a standard performance level to target, though having unlocked-framerate options is great for running on more powerful next-gen hardware.


People were in the process of upgrading to HDTVs at the time and wanted something to show them off. While not every game runs perfectly, both the PS3 and 360 generally run well at 720p. By the time of their launches, PC gamers had been playing at 720p+ resolutions for over a decade, so consoles had always looked like blurry messes at 240p/480i on a CRT.

The PS4 was also perfectly good at 1080p, and the PS5 is fine at 4K as long as you don't do raytracing, since AMD still hasn't gotten their act together and supported it to Nvidia's level.

People always argue for higher framerates, but I still remember interviews with Insomniac about choosing 30fps instead of 60 for Ratchet and Clank. Their claim is still that they did focus testing and more people prefer 30fps with more complex graphics over 60fps with concessions. There will always be people who prefer more fps, but the majority of the market had VCRs that always flashed 12:00, didn't turn on their TV's game mode, and didn't disable motion interpolation. And now more than ever there is an option for people who demand the best: get a PC.
480p looks blurry on an HD TV, but on a CRT it looks perfectly fine. And take a look at some old Digital Foundry videos to see how badly many games ran on PS360, often failing at both 30 and 60; HD was too much for those machines.

As for the PS4, it's good for 1080p 30, but I'd prefer to play at 720p 60, as long as the TVs of the time were also native 720p panels of good quality in other respects, like the Switch OLED for example.
 
Dynamic clocks, with automatic up- and down-clocking of both the CPU and GPU to stay within a power budget, are too new a thing for the PS3/360 generation.

With consoles, you also want a standard performance level to target, though having unlocked-framerate options is great for running on more powerful next-gen hardware.


People were in the process of upgrading to HDTVs at the time and wanted something to show them off. While not every game runs perfectly, both the PS3 and 360 generally run well at 720p. By the time of their launches, PC gamers had been playing at 720p+ resolutions for over a decade, so consoles had always looked like blurry messes at 240p/480i on a CRT.

The PS4 was also perfectly good at 1080p, and the PS5 is fine at 4K as long as you don't do raytracing, since AMD still hasn't gotten their act together and supported it to Nvidia's level.

People always argue for higher framerates, but I still remember interviews with Insomniac about choosing 30fps instead of 60 for Ratchet and Clank. Their claim is still that they did focus testing and more people prefer 30fps with more complex graphics over 60fps with concessions. There will always be people who prefer more fps, but the majority of the market had VCRs that always flashed 12:00, didn't turn on their TV's game mode, and didn't disable motion interpolation. And now more than ever there is an option for people who demand the best: get a PC.
What? No they don't. Tons of Unreal Engine games have aged SO poorly on 360, and specifically on PS3, because they ran at nearly N64 levels of choppiness. Consoles shouldn't suffer because the average consumer is a boob. We can have both these days anyway with just a little effort from the devs, because the options are all already there on the platform games are developed for first: PC. The consoles fit pretty easily into a 1080-1440p 120fps mode and a 4K (whether reconstructed or native) 60fps mode. They're pretty well rounded, with the exception of the Series S.
 
Last edited:
480p looks blurry on an HD TV, but on a CRT it looks perfectly fine. And take a look at some old Digital Foundry videos to see how badly many games ran on PS360, often failing at both 30 and 60; HD was too much for those machines.

As for the PS4, it's good for 1080p 30, but I'd prefer to play at 720p 60, as long as the TVs of the time were also native 720p panels of good quality in other respects, like the Switch OLED for example.
But almost all high-end TVs at the start of that era were at least 1080p. Some 4K OLEDs were even around during the early PS4 days. No one is going to Craigslist a 720p Panasonic plasma to play PS4/Xbox One games... The Switch isn't a good example because pixel density isn't much of an issue there. At 50 inches or more, 720p is absolutely a problem.
 

Majormaxxx

Member
Bayonetta says Hi!

Picked it up for 3 euros a year or so after the PS4 release, was completely
Well, in my example, FEAR was playable, just ugly compared to the 360 and PC versions. Mordor, however, was extremely ugly and choppy, and I had to turn the game off. I've heard that Bayo was choppy as well. I just looked at some PS3 gameplay and saw screen tearing, which is not a deal breaker, but still...
 

LordOfChaos

Member
That's actually a pretty big boost. I wonder how loud the fans get. If they aren't too loud, then Sony left a lot of free performance on the table here. RDR in particular is borderline unplayable due to its sub-30 performance.

Not exactly. There's always a silicon lottery with chips: you pick the point where most of them work reliably and set that as the clock speed. Some chips would be able to clock higher, but doing so would cause differences in performance that customers wouldn't be happy about, and it would vastly reduce the number of chips you could viably use, because more of them wouldn't hit the higher clock speed. And back then, APIs and dev tools were more stringent about changing clock speeds.

When Microsoft combined the CPU and GPU in later 360s into a single package, they actually had to add artificial latency to keep things exactly the same. We're more flexible now.


And this is before considering that this is a later, die-shrunk RSX, not the first-generation chip that set the clocks for the generation to come.
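As a rough illustration of the silicon-lottery point, here is a toy simulation in Python. The clock distribution and sample size are made-up numbers, not real RSX binning data:

```python
import random

# Toy "silicon lottery": every chip off the line has its own maximum stable
# clock, and the shipped clock is picked so that nearly every chip passes.
random.seed(42)
max_stable_mhz = [random.gauss(620, 40) for _ in range(10_000)]  # per-chip limit

def yield_at(clock_mhz):
    """Fraction of sampled chips that run reliably at the given clock."""
    return sum(limit >= clock_mhz for limit in max_stable_mhz) / len(max_stable_mhz)

for clock in (500, 550, 600, 650):
    print(f"{clock} MHz -> {yield_at(clock):.1%} of chips usable")
```

With a distribution like this, shipping at the lower clock keeps almost every chip usable, while a 100 MHz bump scraps a large chunk of production, even though plenty of individual units (like the one in the video) have headroom to spare.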
 
Last edited:

Gaiff

Member
Not exactly. There's always a silicon lottery with chips: you pick the point where most of them work reliably and set that as the clock speed. Some chips would be able to clock higher, but doing so would cause differences in performance that customers wouldn't be happy about, and it would vastly reduce the number of chips you could viably use, because more of them wouldn't hit the higher clock speed. And back then, APIs and dev tools were more stringent about changing clock speeds.

When Microsoft combined the CPU and GPU in later 360s into a single package, they actually had to add artificial latency to keep things exactly the same. We're more flexible now.


And this is before considering that this is a later, die-shrunk RSX, not the first-generation chip that set the clocks for the generation to come.
Not to mention that you also significantly increase the failure rate of your components with higher clocks. It may not sound like much, but even a 1% higher failure rate is potentially thousands upon thousands of consoles, and that could cost millions of dollars.

They likely went with the safest and best-yielding settings. I just think devs were a bit too ambitious during the PS360 era and went "fuck performance as long as my game looks good".
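A rough back-of-the-envelope for that failure-rate point; every figure here is a hypothetical placeholder, not an actual Sony number:

```python
# Hypothetical numbers only: how a 1% bump in failure rate adds up.
units_sold = 5_000_000        # hypothetical install base for one model
extra_failure_rate = 0.01     # 1 percentage point more units failing
cost_per_rma = 150            # hypothetical repair/replacement cost (USD)

extra_failures = units_sold * extra_failure_rate
print(f"{extra_failures:,.0f} extra failed units")                      # 50,000
print(f"${extra_failures * cost_per_rma:,.0f} in extra warranty cost")  # $7,500,000
```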
 