Is the bolded really the case? I mean, PS3's SPUs yes, but the Xenon was pretty comparable if you ask me. Remember (well, you imply you're a programmer, so I reckon you know, haha) that Jaguar was a netbook core. Power/IPC deficiencies aside, PPC could hold its own.
I'm not a programmer; I've read a lot about it over the years, but I have no first-hand inside knowledge.
That said, Xenon and the PPE weren't comparable to Jaguar in general-purpose work. Benchmarking is of course difficult because you can't simply run AnTuTu or Geekbench on those machines, but you can attempt something.
For that you have to separate the two sides of CPU performance: Dhrystone and Whetstone. Dhrystone measures general-purpose (integer) performance, i.e. everyday CPU duties; Whetstone measures floating-point operations. So we need a metric that captures Dhrystone performance.
The best metric is DMIPS, because it was widely used by developers and even by console manufacturers themselves, so there's data that isn't subjective. I remember that with the Wii U we even managed, here at GAF, to get a developer to run those benchmarks for us.
Xenon and the PPE+SPEs were no slouches at floating point, but for that a GPU is more effective. General-purpose performance, though, was a mess. Per core they didn't even manage to double the general-purpose performance of the GameCube/Xbox, despite running at over 4 times the clock. That happened because they were in-order cores with no way to hide cache misses, on top of a pipeline with a lot of stages. It's everything we don't do now.
PS3 Cell PPE: 1879.630 DMIPS @ 3.2 GHz (SPEs not counted because they can't run general-purpose code)
X360 Xenon: 1879.630 DMIPS × 3 = 5638.89 DMIPS @ 3.2 GHz (each 3.2 GHz core performing the same as the PS3's PPE)
PowerPC G4: 2202.600 DMIPS @ 1.25 GHz
8-core Bobcat: 4260 × 8 = 34080 DMIPS @ 1.6 GHz (Bobcat is the CPU generation that preceded Jaguar: same foundation but worse IPC; Jaguar is, best case, 22% better, per the official figures)
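A quick sanity check of the per-core gap from those figures (the 22% Jaguar-over-Bobcat uplift is the official number quoted above, applied here as a rough estimate, since I don't have a direct Jaguar DMIPS run):

```python
# Per-core DMIPS from the figures above; Jaguar estimated as Bobcat +22%.
ppe = 1879.630            # Cell PPE / Xenon core @ 3.2 GHz
bobcat = 4260.0           # Bobcat core @ 1.6 GHz
jaguar = bobcat * 1.22    # rough estimate from the official IPC uplift

print(f"Bobcat core: {bobcat / ppe:.2f}x a PPE core")   # ~2.27x
print(f"Jaguar core: {jaguar / ppe:.2f}x a PPE core")   # ~2.77x
```

Which lines up with the "double to near triple per core" claim below.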
The 8-core Jaguar was shitty against anything else on the PC market when it launched in 2013 (one performance core on a current i7 beats eight Jaguar cores combined...), but at "CPU things" it was still leagues ahead of PS3 and Xbox 360. We're talking easily double the performance per core (possibly near 3 times for Jaguar), plus more cores: 8 vs 1 to 3.
Consider that the Xbox One was able to emulate the Xbox 360's CPU despite using a different architecture. You need quite a bit of performance headroom to do that.
In general-compute CPU prowess, the 7th generation was akin to a wet fart: it had volume, MHz, generated heat... but was conventionally crap.
Jaguar's floating-point power was also comparable to X360's Xenon, albeit in an 8-core vs 3-core scenario this time: Jaguar on PS4 peaks at 102 GFLOPS, Xenon's 3 cores at 115 GFLOPS, and Cell with the SPEs at 205 GFLOPS. But that was never an issue with the GPUs we got in PS4 and Xbox One; anything you were doing on the CPU could be moved elsewhere this time.
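Per core, the floating-point picture flips, which is what the 8-vs-3 caveat means; quick arithmetic from those peak figures:

```python
# Per-core peak GFLOPS derived from the totals above.
print(f"PS4 Jaguar: {102 / 8:.1f} GFLOPS per core")  # 12.8
print(f"X360 Xenon: {115 / 3:.1f} GFLOPS per core")  # 38.3
```

So each Xenon core pushed roughly 3x the peak floating point of a Jaguar core; Jaguar only catches up on throughput by having 8 of them.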
I agree with you on the latter parts though. The PS360 generation pioneered a lot of exciting tech and even achieved physically based rendering (PBR). The only thing remotely comparable to that envelope-pushing spirit since is the software-based raytracing solution Crytek employed on PS4 Pro/Xbox One X for Crysis Remastered.
Yes, developers were motivated, and nothing could deter them from trying things that generation: advanced physics, destructible objects, AI, deformable liquids, dynamic light sources running on the CPU's floating-point abilities, anti-aliasing on the CPU, you name it... all while running on potatoes. They were excited by the possibilities, I guess. With PS2/GC/Xbox we also saw some of that, with developers going really crazy with things that seemed next-gen: insane amounts of enemies on-screen, shaders on fixed-function hardware, deferred rendering, you name it.
With PS4/Xbox One we got very little of devs trying to do what the hardware couldn't, on any front bar special frame reconstruction/accumulation methods. It felt like an encore of the previous gen, without the need to spend thousands of hours doing something you're not supposed to. PS5/Xbox Series S/X is looking better already, but I still don't feel that eagerness to push the envelope in every direction. Developers seem comfortable with the process of how to make a game, and with doing it over and over in a linear fashion. No bucket list of things to implement (or no time).
I also consider Alien Isolation a graphical tour de force on PS360 for having these features: it was the only first-person game with PBR and PBS (yes, Black Ops did physically based shading, and games like Remember Me/Beyond Two Souls also attempted PBR, but Alien was the only first-person title). I was forever disappointed that Digital Foundry didn't pick up on it back then and hasn't in their revisits of the game. It's lower resolution, lower everything, but it showed that a physically based rendering model could work on these machines. The Cathode Engine was/is amazing.
Interesting. I avoided the X360 version because of the performance.
No PBS, but didn't Metal Gear Solid V pull off a very good PBR on X360/PS3? I felt it was the most balanced, best-performing implementation I saw back then.
It's impossible for most games not to be held back by last gen. Even if it's something as simple as object density (how many non-static objects are on screen), last-gen CPUs were turds at release and nothing is going to change that.
Those are easy to cut though, and nobody would complain.
We should be seeing PS5 games with lots of objects and PS4 versions with way fewer.
Cross-platform PS3 vs PS4 games often had a huge difference in foliage, for instance, for the same reasons.