
DF - RETRO: Sony PlayStation 3: Chasing the 1080p Dream (2006/2007) (UPDATE: PART 4)

PaintTinJr

Member
Says unavailable. Maybe they re-uploaded:



Another interesting retro video with some great takes by John. My only question: do we know if this is all brand-new footage captured on a modern Super Slim with the latest firmware - with a cheap £20 SSD for storage - or still on the old Phat test kit with the iffy mechanical HDD Richard used in the past?

From watching, it looks like the graphs don't really match the footage's stress points in places, and that this is new footage paired with Richard's old analysis graph data. Hopefully in part three John will add a bit more detail about his testing setup for this retro.
 

ZywyPL

Banned
Just caught up with both parts, and damn, so many great games, so many great memories.

And I have to say, I just can't fight the feeling that PS3 games had better lighting than PS4/PS5 titles; I can't recall a single game with the so-called "flat lighting" modern games are plagued with.
 

Redefine07

Member
I just got a PS3, since the support for it on PS5 is still streaming-only and looks like ASS. It was dirt cheap: $50 for a 500GB Super Slim and two controllers. So far I've got Killzone 2/3, Resistance 1/2/3, MotorStorm 1, God of War 1/2/3, Ascension and Volume 1/2, and Heavenly Sword. I'm trying to find Folklore, but it's nowhere to be found in my country >.<
 

PaintTinJr

Member
Two games (Hamsterball and Go! Sports Skydiving) are hopefully in the next part (2010/2011). They feel very responsive, so I expect them to be 60fps, and as they looked razor sharp, I expect them to be 1080p (or something around that). Both are at least proper 3D games that tax the hardware's fill-rate.

When he eventually gets to Rage, it's going to be really disappointing if he just does the demo level again, which was marred by a 360 better-or-equal contract clause in the marketing deal. I would really like to see the performance beyond that, where the game felt technically flawless from what I recall, and would have the latest art/code, so it should be well within rendering budget for 1080p60.
 

PaintTinJr

Member
Just caught up with both parts, and damn, so many great games, so many great memories.

And I have to say, I just can't fight the feeling that PS3 games had better lighting than PS4/PS5 titles; I can't recall a single game with the so-called "flat lighting" modern games are plagued with.
It depends what you play on, but overly aggressive use of Microsoft's BC texture compression isn't ideal, and it's even worse when combined with lossy-compressed normal maps, or worse still with lossy bump maps.

In games, compression used too liberally kills lighting fidelity, because the noise from the compression approaches the level of the signal from the lighting equations.

In the early days of S3TC/DXT5/3Dc, people used compressed textures when they needed the headroom, or when the native texture resolution was too low and a bigger compressed texture - occupying the same VRAM - was a definite win in signal quality.

Going by Microsoft's DirectX documentation, compression is now advised without any consideration of the rendering problem, and it's getting worse with techniques like variable rate shading. These techniques should only be used when they result in more signal and less noise.

Given how much more advanced the lighting is now, it speaks volumes that you see it as flatter, and suggests the compression is being used wrongly.

The PS3's RSX was the only GPU of the two consoles that gen to support 16-bit floating-point texture channels, so it regularly used higher-quality data for lighting, which IMO still looks very precise.
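To make the signal-vs-noise point concrete, here's a minimal sketch (my own illustration, not from the thread) that fakes block-compression loss with a crude 5/6/5-bit quantization of a normal map and measures how much noise that injects into a simple N·L diffuse term. A real BC1/BC5 encoder works differently, but the effect is the same in kind:

```python
# Hedged sketch: 5/6/5-bit quantization stands in for BC-style loss;
# it is NOT an actual BC1/BC5 encoder.
import numpy as np

rng = np.random.default_rng(0)

# Random unit normals standing in for a normal map.
normals = rng.normal(size=(10000, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)

def quantize(v, bits):
    """Quantize a [-1, 1] value to the given bit depth and back."""
    levels = (1 << bits) - 1
    return np.round((v * 0.5 + 0.5) * levels) / levels * 2.0 - 1.0

# Crude stand-in for block-compressed storage: 5/6/5 bits per channel.
q = np.stack([quantize(normals[:, 0], 5),
              quantize(normals[:, 1], 6),
              quantize(normals[:, 2], 5)], axis=1)
q /= np.linalg.norm(q, axis=1, keepdims=True)

light = np.array([0.3, 0.8, 0.52])
light /= np.linalg.norm(light)

ndl_ref = np.clip(normals @ light, 0.0, 1.0)   # "clean" lighting signal
ndl_q = np.clip(q @ light, 0.0, 1.0)           # lighting from lossy normals

noise = ndl_q - ndl_ref
print(f"signal RMS: {np.sqrt((ndl_ref**2).mean()):.4f}")
print(f"noise  RMS: {np.sqrt((noise**2).mean()):.4f}")
```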
 

mrMUR_96

Member
Those nonsense face-offs - reduced to 2D resolution and 0.5-percentile frame-rate dips - also never mention the difference in third-dimension resolution. In Batman, the 360 version uses less depth precision - like most 360 versions of games, and a common cheat on ATI hardware - and a slightly different near/far frustum plane setup to make it look almost identical, but it still results in a more claustrophobic world space than the PC and PS3 versions.

The difference in frustum setup also changes where the depth-cueing fog starts, so some mid-scene textures on the 360 might look sharper - than they should - because they aren't depth-cued correctly and may be projected closer to the near plane than on PS3 in comparative shots, either selecting a different mip or simply being projected onto a larger number of pixels on screen.
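For what it's worth, the near-plane effect on depth precision is easy to sanity-check. A hedged sketch (the plane values below are hypothetical, not Batman's actual setup): with a standard perspective projection, most depth-buffer codes cluster near the near plane, so where you put the near plane dominates how much precision is left mid-scene:

```python
# Post-projection depth in a standard perspective projection:
# z_ndc = (f / (f - n)) * (1 - n / z), mapping view distance z to [0, 1].

def ndc_depth(z, n, f):
    """Standard [0, 1] perspective depth for a view-space distance z."""
    return (f / (f - n)) * (1.0 - n / z)

f = 1000.0                       # far plane, metres (assumed)
for n in (0.1, 0.5):             # two hypothetical near-plane choices
    z, dz = 100.0, 1.0           # one metre of separation mid-scene
    step = ndc_depth(z + dz, n, f) - ndc_depth(z, n, f)
    codes = step * (2 ** 24)     # share of a 24-bit depth buffer's codes
    print(f"near={n} m: 1 m at z=100 m spans ~{codes:.0f} depth codes")
```

Pulling the near plane in from 0.5 m to 0.1 m cuts the mid-scene precision by roughly 5x in this toy setup, which is the kind of trade-off being described.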

It never did for quality pixels, and Heavenly Sword, Uncharted, MGS4, and The Last of Us were all UE3, modified to be their own "engines", even if large money was paid so they're never listed as UE.
"Heavenly Sword, Uncharted, MGS4, and The Last of Us were all UE3, modified to be their own 'engines', even if large money was paid so they're never listed as UE."
Uh, source?
 

assurdum

Banned
You'll need to list the ones you are referring to. I don't remember the major ones tearing on PS3, as opposed to having occasional slowdowns, because why would any competent dev do that on a system that can easily double or triple buffer?
Uh, Arkham Batman and Bulletstorm are the most notorious, if I'm not wrong, and also the Medal of Honor reboot on UE3. But I assure you there are many PS3 multiplats with soft vsync - probably more than there are triple-buffered ones (most of those being first party), I'd even bet on it - and practically all Ubisoft games tear more on PS3.
 

PaintTinJr

Member
"Heavenly Sword, Uncharted, MGS4, Last of Us were all UE3 and modified to be their own "engines" even if large money was paid so they never list them as UE."
Uh, source?
Back in the day, Develop magazine ran adverts (for over a year, IIRC) for Sony Cambridge Studio, with every job being for Unreal development, and the adverts listed it as work on PlayStation's core technology. Heavenly Sword would have needed its development restarted for it not to have been UE3 originally. Interestingly, some of the side sections with catapults from Heavenly Sword later got put into AC3 wholesale.

The other stuff has been discussed in earlier comments.
 

PaintTinJr

Member
Uh, Arkham Batman and Bulletstorm are the most notorious, if I'm not wrong, and also the Medal of Honor reboot on UE3. But I assure you there are many PS3 multiplats with soft vsync - probably more than there are triple-buffered ones (most of those being first party), I'd even bet on it - and practically all Ubisoft games tear more on PS3.
Bulletstorm had a 360 marketing deal IIRC, and as Xenos had to tear, I guess PS3 would too for parity, because the developers were clearly smart enough to know a hard vsync was the better option.

As for Batman, IIRC I only played it after the stereoscopic 3D patch, so I can't speak for before, but the game wasn't tearing - I'd have to buy a copy to check it again. But tearing in a game with stereoscopic 3D seems like a feature you wouldn't add if you didn't have the performance to render normally, as 3D was an additional ~10% GPU burden even when they used the technique of capturing both eyes with one frustum and then post-processing into two viewpoints.
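For readers following the vsync argument, here's a rough simulation (my own sketch, with made-up frame times) of the three strategies being debated: double-buffered hard vsync never tears but stalls to the next vblank, soft vsync tears when a frame misses, and triple buffering trades the tear for latency:

```python
# Hedged toy model of present strategies; render times are simulated,
# not real GPU measurements.
import math

VBLANK = 16.7  # ms between refreshes at 60 Hz

def simulate(render_times_ms, mode):
    """Return (display_times, tear_count) for a list of frame render times."""
    gpu_free, last_disp, tears, shown = 0.0, 0.0, 0, []
    for r in render_times_ms:
        done = gpu_free + r                   # when this frame finishes
        target = last_disp + VBLANK           # the vblank we were aiming for
        if mode == "soft" and done > target:
            disp = done                       # missed it: swap mid-scan
            tears += 1
        else:
            disp = math.ceil(done / VBLANK) * VBLANK  # wait for a vblank
        shown.append(disp)
        last_disp = disp
        # Double-buffered hard vsync stalls the GPU until the swap; soft
        # vsync and triple buffering let it start the next frame at once.
        gpu_free = disp if mode == "hard" else done
    return shown, tears

for mode in ("hard", "soft", "triple"):
    shown, tears = simulate([14.0, 18.0, 15.0, 20.0], mode)
    print(mode, [f"{t:.1f}" for t in shown], f"tears={tears}")
```

In this toy run, "hard" drops to a 30fps-style cadence on the slow frames, "soft" keeps the frame-rate up at the cost of a tear, and "triple" stays tear-free but displays the late frame a vblank later.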
 

GAF machine

Member
Sony literally had a PowerPoint slide on stage saying the PS3 has 2 teraflops of compute power, which is complete and utter nonsense of the highest order. Even if you combine CPU and GPU and then multiply by four, you're not at 2 TF.

Then there was that "rumble is a last-gen feature" nonsense, when the real reason was a patent/licensing issue - which also made them release rumble-less PS2 controllers for a period of time, lol.

Then there were all the fake trailers that were falsely claimed to be in-engine when they were literally done by animation studios.

Sony lied like crazy ahead of the PS3's launch

Eurogamer's report wasn't about any of the things you listed. It was about Kutaragi supposedly claiming that PS3 would run its games at 120 fps, at which you lol'd and which you called "some major bullsh!t". Gibson, with Eurogamer's blessing, put that bullsh!t in Kutaragi's mouth and helped propagate a fallacy.

Now, with regard to the things you listed: MS issued a press release from GDC '05 in which J. Allard lied, saying that the X360 would "deliver more than a teraflop of targeted computing performance". Bill Gates then repeated that lie in an E3 '05 skit, saying that the console had a "teraflop of targeting computer power". SIE countered MS's lie with one of its own. What's the problem?...
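As a rough, hedged sanity check of the quoted "multiply by 4" arithmetic - using commonly cited peak figures, which are my assumptions rather than anything from the thread:

```python
# Rough sanity check; the RSX figure in particular is one often-quoted
# programmable-shader estimate, and published estimates vary widely.
cell_gflops = 25.6 + 7 * 25.6   # PPE + 7 usable SPEs at 3.2 GHz ~= 204.8
rsx_gflops = 255.0              # assumed programmable-shader peak

combined_tf = (cell_gflops + rsx_gflops) / 1000.0
print(f"CPU+GPU ~= {combined_tf:.2f} TFLOPS; x4 = {combined_tf * 4:.2f} TFLOPS")
# -> ~0.46 TFLOPS combined; even x4 is ~1.84, still short of the 2 TFLOPS slide.
```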

Rumble was a DualShock 2 feature, so it was last gen. Sixaxis motion-sensing was next gen and that's where SIE wanted to focus as it worked to resolve the patent/licensing dispute with Immersion. SIE's dismissal of rumble as "a last gen feature" doesn't make the statement a lie.

As for the trailers, Jack Tretton said one thing. Phil Harrison said another:

Eurogamer: How representative of what we're actually going to be seeing in PS3 games were those videos?

Phil Harrison: I think very. I think, depending on the game, different games took a different approach to their way of expressing what the games are like - but clearly, something like MotorStorm uses more cinematic, replay-like cameras than you would ever enjoy in-game. So that makes a big difference... But everything is done to spec.

Tretton lied about the trailers, but I agree with Harrison that the games released for the trailers in question were very representative of the look and feel their respective trailers conveyed. The volumetric lighting, volumetric particle effects, procedural texturing, underlying simulations that affected movement, motion blur, lens flare, blended animations, ragdoll physics, etc. portrayed in some of the trailers were present in-game at a very high level and calculated in real time to spec (to the required level of PS3 performance).
 

GAF machine

Member
I'm sure it was possible, but they couldn't even get their flagships running at decent fps.

That's beside the point. The headline said 'PS3 could run at 120 fps'. I provided a one-off showing that it in fact could. The headline was true; but the claim Eurogamer attributed to Ken Kutaragi - that PS3 would run games at 120 fps, as if such performance would be commonplace - was false.
 

assurdum

Banned
Bulletstorm had a 360 marketing deal IIRC, and as Xenos had to tear, I guess PS3 would too for parity, because the developers were clearly smart enough to know a hard vsync was the better option.

As for Batman, IIRC I only played it after the stereoscopic 3D patch, so I can't speak for before, but the game wasn't tearing - I'd have to buy a copy to check it again. But tearing in a game with stereoscopic 3D seems like a feature you wouldn't add if you didn't have the performance to render normally, as 3D was an additional ~10% GPU burden even when they used the technique of capturing both eyes with one frustum and then post-processing into two viewpoints.
Dude, Epic's own Unreal Tournament also tears more on PS3... and yeah, both Arkham Asylum and Arkham City tear more on PS3. You should go check the old comparisons. Many PS3 multiplats had soft vsync, if not most of them.
 

Romulus

Member
That's beside the point. The headline said 'PS3 could run at 120 fps'. I provided a one-off showing that it in fact could. The headline was true; but the claim Eurogamer attributed to Ken Kutaragi - that PS3 would run games at 120 fps, as if such performance would be commonplace - was false.


I'm sure it could, it just wouldn't really matter. The hardware was so gimped that the best devs on the planet often couldn't pull off an acceptable 60fps, or even 30fps.
 

PaintTinJr

Member
Dude, Epic's own Unreal Tournament also tears more on PS3... and yeah, both Arkham Asylum and Arkham City tear more on PS3. You should go check the old comparisons. Many PS3 multiplats had soft vsync, if not most of them.
I think you're right that I'm misremembering about hard vsync on PS3 for Arkham, because so many games had a parity contract for 360 that the PS3 version must have had soft vsync too. And now I think about it, I believe Arkham only had anaglyph 3D, so it didn't carry the rendering cost of stereoscopic 3D.

What I would dispute is how much tearing there was in the PS3 version of Arkham. Richard at DF used to test the 5% demo-level slice of games and then extrapolate his conclusion to the full game, and he wasn't even consistent in how the games played out, so the triggering of tearing always seemed misrepresented in his videos. Holding up DF analysis and making blanket statements is still misleading IMO.

I played primarily on PS3 that gen and experienced very little tearing in beating most of the games I played, because in general I didn't buy games that tore on a system that didn't need to at 720p - unlike the 360, which had an eDRAM size designed around 1024x768 and was short on polygon rendering, so tearing was to be expected even below 720p in the 360 versions of games.
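A quick back-of-the-envelope on the eDRAM claim (my numbers, not the thread's: 32-bit colour plus 32-bit depth/stencil per sample against Xenos's 10 MB of eDRAM):

```python
# Hedged framebuffer-footprint check: whether a render target fits in
# Xenos's 10 MB eDRAM without predicated tiling.
EDRAM_MB = 10.0
BYTES_PER_SAMPLE = 4 + 4        # 32-bit colour + 32-bit depth/stencil

def fb_mb(w, h, msaa=1):
    return w * h * BYTES_PER_SAMPLE * msaa / (1024 * 1024)

for w, h in ((1024, 768), (1280, 720)):
    for msaa in (1, 2):
        size = fb_mb(w, h, msaa)
        fits = "fits" if size <= EDRAM_MB else "needs tiling"
        print(f"{w}x{h} {msaa}xMSAA: {size:.1f} MB -> {fits}")
# 1280x720 fits without MSAA (~7.0 MB), but 2xMSAA (~14.1 MB) overflows
# the 10 MB eDRAM, forcing tiling or a sub-720p resolution.
```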

At times it feels like people forget the state of tearing in 360 games in the year before it got HDMI - which was 12 months before the PS3 launched - and the amount of tearing and sub-HD resolutions deemed acceptable in the console's first-party games even 3-4 years on, when the PS3's first-party games were all 720p30, double-buffered with vsync on - and in KZ3's case with 3D stereo, Move controls, and split-screen support too.
 

assurdum

Banned
I think you're right that I'm misremembering about hard vsync on PS3 for Arkham, because so many games had a parity contract for 360 that the PS3 version must have had soft vsync too. And now I think about it, I believe Arkham only had anaglyph 3D, so it didn't carry the rendering cost of stereoscopic 3D.

What I would dispute is how much tearing there was in the PS3 version of Arkham. Richard at DF used to test the 5% demo-level slice of games and then extrapolate his conclusion to the full game, and he wasn't even consistent in how the games played out, so the triggering of tearing always seemed misrepresented in his videos. Holding up DF analysis and making blanket statements is still misleading IMO.

I played primarily on PS3 that gen and experienced very little tearing in beating most of the games I played, because in general I didn't buy games that tore on a system that didn't need to at 720p - unlike the 360, which had an eDRAM size designed around 1024x768 and was short on polygon rendering, so tearing was to be expected even below 720p in the 360 versions of games.

At times it feels like people forget the state of tearing in 360 games in the year before it got HDMI - which was 12 months before the PS3 launched - and the amount of tearing and sub-HD resolutions deemed acceptable in the console's first-party games even 3-4 years on, when the PS3's first-party games were all 720p30, double-buffered with vsync on - and in KZ3's case with 3D stereo, Move controls, and split-screen support too.
You really have a bad memory. Most multiplats had a lower resolution on PS3 compared to the 360. Only first-party games were 720p, and not even all of them, to be fair. On triple buffering, again, only first-party games had it; for multiplats I'd say there were more with soft vsync than triple-buffered, but at worst it's a tie. Just to say, I owned all three models of the PS3 and watched comparison videos regularly.
 
Last edited:
You'll need to list the ones you are referring to. I don't remember the major ones tearing on PS3, as opposed to having occasional slowdowns, because why would any competent dev do that on a system that can easily double or triple buffer?

While not a UE3 game, Assassin's Creed II comes to mind. It tore massively more on PS3 than its 360 counterpart, since its targeted 30 fps wasn't reached nearly as frequently as on the 360 version, and it had neither double nor triple buffering. I played both versions from start to finish. I started with the PS3 version when it came out and played it like crazy from A to Z (loved that game). I played the 360 version many years later, and this is something I instantly noticed. I then powered on my PS3 and inserted the disc just to be sure I wasn't going crazy, and sure enough, I wasn't. Make no mistake, the 360 version also drops frames, but it reaches 30 fps much more frequently than the PS3 version, and because of that it has significantly less tearing.

I've also found the old DF article discussing this:

Assassin's Creed II PS3 vs 360 Face-Off

The original Uncharted game is also a tearing fest, but it's not multiplatform so heh...

I have no doubt there's plenty more examples to be found.


Edit: Just wanted to say I love this type of thread. Comparisons between consoles of these generations and earlier are always fascinating because of the different approaches in their visions and architectures. The 7th-gen consoles truly were the last where things were very different from one manufacturer to another. I also love everyone's takes here, whether I agree with you or not. It's fun to talk about theoretical performance, revisit their historical performance against each other, etc. Bring in the love!
 

nowhat

Member
Just wanted to say I love this type of thread. Comparisons between consoles of these generations and earlier are always fascinating because of the different approaches in their visions and architectures. The 7th-gen consoles truly were the last where things were very different from one manufacturer to another.
Kind of a shame, innit? I think console warring, spec-wise, has become a bore. Sure, we can argue over minute specs, but in reality the differences are quite minimal when it comes to actually playing the games. Where's my blast processing?
 

GAF machine

Member
I'm sure it could, it just wouldn't really matter. The hardware was so gimped that the best devs on the planet often couldn't pull off an acceptable 60fps, or even 30fps.

I didn't say it would matter for games (plural), only that it mattered for one game.
 

PaintTinJr

Member
While not a UE3 game, Assassin's Creed II comes to mind. It tore massively more on PS3 than its 360 counterpart, since its targeted 30 fps wasn't reached nearly as frequently as on the 360 version, and it had neither double nor triple buffering. I played both versions from start to finish. I started with the PS3 version when it came out and played it like crazy from A to Z (loved that game). I played the 360 version many years later, and this is something I instantly noticed. I then powered on my PS3 and inserted the disc just to be sure I wasn't going crazy, and sure enough, I wasn't. Make no mistake, the 360 version also drops frames, but it reaches 30 fps much more frequently than the PS3 version, and because of that it has significantly less tearing.

I've also found the old DF article discussing this:

Assassin's Creed II PS3 vs 360 Face-Off

The original Uncharted game is also a tearing fest, but it's not multiplatform so heh...

I have no doubt there's plenty more examples to be found.


Edit: Just wanted to say I love this type of thread. Comparisons between consoles of these generations and earlier are always fascinating because of the different approaches in their visions and architectures. The 7th-gen consoles truly were the last where things were very different from one manufacturer to another. I also love everyone's takes here, whether I agree with you or not. It's fun to talk about theoretical performance, revisit their historical performance against each other, etc. Bring in the love!
For my sins, I was still using a CRT TV at the time I played UC1, so I didn't see the tearing running at 576i (upscaled to the TV's native 1280x720 panel @ 200Hz) on the Sony KD32-DX200 I had back then, but I take your word for it.

As for AC2 - and all the AC games, for that matter - the frustum/z-precision setups, fog equation, geometry levels, and lighting equations weren't an exact match on 360 (plus there's the obviously incorrect 360 gamma), and there is a stark difference in AC1, where the pseudo-HDR on 360 is just flat as a pancake, with zero sense of desert heat compared to the PS3 version. But then again, AC1 was before the parity clause - which had been in effect all along, and which we all found out about from Ubisoft in an interview with EG, when PS4 was required to match XB1's 900p settings.

So the tearing in PS3 AC2 only exists because Xenos tears at HD-Ready resolutions, IMO.
 

Tazzu

Member
PTJ reminds me of the Sony fanboys who couldn't accept until the very last days that multiplats were better on 360, and they were better for a reason. But somehow Oblivion disproved everything.
 

assurdum

Banned
For my sins, I was still using a CRT TV at the time I played UC1, so I didn't see the tearing running at 576i (upscaled to the TV's native 1280x720 panel @ 200Hz) on the Sony KD32-DX200 I had back then, but I take your word for it.

As for AC2 - and all the AC games, for that matter - the frustum/z-precision setups, fog equation, geometry levels, and lighting equations weren't an exact match on 360 (plus there's the obviously incorrect 360 gamma), and there is a stark difference in AC1, where the pseudo-HDR on 360 is just flat as a pancake, with zero sense of desert heat compared to the PS3 version. But then again, AC1 was before the parity clause - which had been in effect all along, and which we all found out about from Ubisoft in an interview with EG, when PS4 was required to match XB1's 900p settings.

So the tearing in PS3 AC2 only exists because Xenos tears at HD-Ready resolutions, IMO.
Can you stop posting nonsense console-war conspiracy theories? A lot of games tear on PS3, and most games run worse and at a lower resolution. Can we stop here, and not try to rewrite the history of this console's performance?
 

PaintTinJr

Member
Can you stop posting nonsense console-war conspiracy theories? A lot of games tear on PS3, and most games run worse and at a lower resolution. Can we stop here, and not try to rewrite the history of this console's performance?
Are you saying Ubisoft didn't confirm there was a better-or-equal arrangement for AC on Xbox, with the 900p debacle on PS4?
 

Chiggs

Member
I remember the comments on here and in the gaming media mocking Sony for supporting 1080p for the small percentage of people who had 1080p TVs in 2005. As short-sighted as ever.

I've been here since early 2005 and don't remember this being the general consensus whatsoever. I think people were saying that Nintendo was smart for not making its focus about resolutions that weren't widely adopted, but not that HD wouldn't eventually be the standard.
 