
DF - RETRO: Sony PlayStation 3: Chasing the 1080p Dream (2006/2007) (UPDATE: PART 4)

01011001

Banned
Bought the 360 on release - November 2005 I think it came out - and bought a PS3 in late 2008, mostly for the Blu-ray player.

December 2nd in Europe... I still remember the day it released. my cousin took a day off from work, I had school, I gave him my money that I saved up for months and months, and he went into the war that was the rush to buy the system.
we preordered it both of course.
he said the storm for the systems was ridiculous lol and they sold out immediately, and the store then bundled the Xbox 360 Core with the 20GB HDD for the same price the 360 Premium sold at... also everyone who preordered got a second wireless controller for free

after I came home from school, I instantly took my bike over to him to get my 360. the first thing we did was play Perfect Dark Zero over LAN and play some Kameo.

before having the system in my hands I was completely unaware of the fact that the Xbox orb in the middle was also a button! A HOME BUTTON... such a thing was so weird back then... backing out of a game that is in your console... without taking the game out... it was wild lol
 
Last edited:

8BiTw0LF

Banned
December 2nd in Europe... I still remember the day it released. my cousin took a day off from work, I had school, I gave him my money that I saved up for months and months, and he went into the war that was the rush to buy the system.
we preordered it both of course.

after I came home from school, I instantly took my bike over to him to get my 360. the first thing we did was play Perfect Dark Zero over LAN and play some Kameo.

before having the system in my hands I was completely unaware of the fact that the Xbox orb in the middle was also a button! A HOME BUTTON... such a thing was so weird back then... backing out of a game that is in your console... without taking the game out... it was wild lol
You're right! December 2nd 2005 I was waiting outside 'Merlin' (tech store) and was one of the last to get in, and I got the last Premium version. Bought NFS: Most Wanted and CoD 2 for it. I lived in a collective with 3 other friends and when people heard I bought a 360 we were probably 12-15 people gathered by the end of the night. Something magic about that generation - this was also before we started having kids - still did drugs and were genuinely excited about the future lol.
 

01011001

Banned
You're right! December 2nd 2005 I was waiting outside 'Merlin' (tech store) and was one of the last to get in, and I got the last Premium version. Bought NFS: Most Wanted and CoD 2 for it. I lived in a collective with 3 other friends and when people heard I bought a 360 we were probably 12-15 people gathered by the end of the night. Something magic about that generation - this was also before we started having kids - still did drugs and were genuinely excited about the future lol.

that generation is what basically formed modern gaming. and the Xbox 360 doesn't function fundamentally differently to any modern console, that's what was so fascinating about it I think.
it was such a BIG jump in terms of how you use a console... like I said, a freaking home button... or "guide button" of course, friends lists always accessible, cross-game voice chat, an easy-to-access store with downloadable games and demos right there for everyone to use, ACHIEVEMENTS!!!... all of it started here, and all of it is still how consoles work today. an Xbox 360 is basically just a PS5 or Series X with worse graphics.

the Xbox 360, in one sweep, basically brought in the switch from how classic consoles work to how modern consoles work.
and I think we will never have such a paradigm shift ever again.
 
Last edited:

PaintTinJr

Member
Hmm, okay, but I'm pretty sure that in terms of FLOPs (which does matter, more so back then) the Xenon had the advantage? And of course the PS3's split memory caused issues for a lot of developers.
Comparing Nvidia's superscalar architecture against ATI's micro-architecture back then was largely a matter of benchmarking optimised titles because ATI's cheat sheet used techniques like EarlyZ - to name one - that could make huge differences in non-optimised situations. Nvidia's solution provided dumb brute force predictable results compared to ATI's less predictable gains IIRC.

But if we take optimised rendering, like properly wound meshes made from large quad strips, Nvidia hardware always had the vert processing advantage. ATI cards like the 9700/9800 - used in the Apple G4 PowerMac the 360 copied - topped out around 400-600 Mpolys/s, whereas Nvidia cards like the RSX topped out at 1.1 billion polys/s, so in usable FLOPS the superscalar design typically won out.
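On the "properly wound quad strips" point, the vertex-sharing arithmetic is easy to sketch (generic mesh math, no hardware figures assumed): a strip of n triangles only needs n + 2 transformed vertices, versus 3n for independent triangles, so long strips approach one vertex of work per triangle.

```python
# Average vertices transformed per triangle for a single triangle strip:
# a strip of n triangles reuses shared edges and needs only n + 2 vertices.
def strip_verts_per_tri(n):
    return (n + 2) / n

# An unindexed triangle list always costs 3 vertices per triangle.
TRIANGLE_LIST_COST = 3.0

worst = strip_verts_per_tri(1)    # 3.0  - a one-triangle "strip" is no better than a list
good  = strip_verts_per_tri(100)  # 1.02 - long strips approach 1 vertex per triangle
```

Which is why peak polys/s numbers only show up with long, well-built strips; a naive submission can nearly triple the per-triangle vertex work before the two architectures are even compared.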
 

PaintTinJr

Member
Don't forget the Xbox GPU had unified shaders and "almost free" 2x MSAA. Most multiplatform games looked much worse on PS3 until developers started using the Cell's potential to offload RSX.
Given that almost no game on the 360 has accurate colour, you'd have to point to specific examples, as widespread statements like that have been made in a blanket fashion, over and over, directly as a result of Richard/DF coverage from the time that doesn't hold up now. Most 360 games don't even use exponential fog, because it costs more shader ops than the cheaper linear fog that games like AC used in its place.

Unified shaders is just a PC directX label for using the graphics hardware differently, much like today's DX RT. A console GPU customised for game devs not using that label isn't necessarily less capable, just providing the same or better access to their hardware under a different label. The PS2 had geometry shading in shader assembly before it was even given a name and before any PC GPUs had the feature in hardware, so that is a good example of how PC labels, are just labels in a console discussion.
 

PaintTinJr

Member
That is interesting. If the arcade hardware is literally a weaker version of the PS3 GPU architecture then I'm not at all surprised at how well the PS3 port turned out. You can see how well this works in the other direction, like the Switch version of Cruis'n Blast; it's 1/4 the power of the arcade GPU but the same Nvidia Maxwell GPU, and the graphics cutbacks are pretty minimal considering.

That just convinces me more of how much more work went into the 360 version, or how much less needed to be done on PS3 vs. 360.

Your point about the unified RAM on 360 would be valid, if it weren't for the fact that it's only on a 128-bit bus; the bandwidth is less than half the combined 128-bit pools on PS3, effectively making it a 256-bit bus vs 128. It's just that on PS3 it requires a bit of extra work to split work between the two pools.

The RSX GPU can use memory from the XDR pool as well as its own GDDR3.

Basically if you don't use the eDRAM on 360 it can't compete with PS3 bandwidth, but of course you're supposed to use the eDRAM.

If VF5 didn't use the eDRAM, combined with the nature of the original arcade hardware, then it's absolutely no wonder the PS3 version is superior. Even if it did use the eDRAM, that's kind of a dream, perfect scenario for a PS3 port.

I think it's pretty, pretty rare for arcade hardware to be a lesser version of a console hardware, right?
When we say it wasn't using the eDRAM - I mean to its fullest, optimally moving workloads in and out to maximise memory bandwidth when processing; obviously it holds the framebuffer, so it is getting used - but I did consider the question of bus width. These are PC-style games (VT3 and VF5) - discrete CPU and GPU, data loaded once to VRAM per level, and draw calls from the buffered objects being sent to the GPU that aren't going to overburden the bus - so I don't believe that makes any difference in this situation.

Comparing and contrasting with Virtua Tennis 3, which is 1080p60 on both consoles: polygon counts in active gameplay are significantly lower in VT3 - and the camera is largely static - because of the distance the camera is from the players and the simple geometry used to render a largely flat tennis court. So the main difference between VF5 and VT3 is that VF5 is maybe occasionally using more polygons per frame, between the characters and background, than Xenos can manage even with lower LoDs - hence the occasional screen tearing in a combo when the camera moves, IMO
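The bus-width argument above can be sanity-checked against the commonly cited spec-sheet clocks (a sketch using public figures, not measurements - and it deliberately ignores the 360's eDRAM, which is the whole point of the eDRAM caveat):

```python
# Peak bandwidth = bus width in bytes x effective data rate.
def peak_gb_s(bus_bits, effective_mt_s):
    return bus_bits / 8 * effective_mt_s * 1e6 / 1e9

ps3_gddr3  = peak_gb_s(128, 1300)   # ~20.8 GB/s (650 MHz GDDR3, double data rate)
ps3_xdr    = peak_gb_s(64, 3200)    # ~25.6 GB/s (XDR on a 64-bit bus)
x360_gddr3 = peak_gb_s(128, 1400)   # ~22.4 GB/s (700 MHz GDDR3, double data rate)

ps3_combined = ps3_gddr3 + ps3_xdr  # ~46.4 GB/s across both pools
```

So on main-memory buses alone the PS3's two pools do add up to roughly double the 360's single pool, which is the "256-bit vs 128" framing above; the 360's answer to that was framebuffer traffic living in the eDRAM.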
 

Von Hugh

Member
The generation where everyone I knew skipped PS3 altogether and just bought a 360. Then it changed back again with the PS4.
 

Panajev2001a

GAF's Pleasant Genius
Ken read the winds of the industry wrong. It sort of reminds me of Crytek with Crysis thinking clock speeds would continue to grow instead of the multi-threaded future we were heading towards. The GPU was an afterthought in the PS3 so maybe it's more akin to Sony's version of the Saturn.

yeah, it was kinda crazy to think that the original plan was to have 2 Cell Chips, with them also handling graphics... like god damn, imagine if that system released, I think Sony's gaming division would be a goner by now.

apparently the studios were telling them that the best they could do with a Cell-only console was HD PS2 graphics, and early showings were telling in this regard. look at the very first showcase where they had gameplay clips of Resistance... that game looked like absolute horseshit in those clips, and even the finished game was barely better than an HD Xbox 1 game in terms of graphical fidelity.
but Insomniac really turned that ship around last second it seems, I mean look at the 2005 footage 😬


The GPU was not an afterthought, Sony was working on a GS successor which followed the upgrade philosophy behind PS1 to PS2: take what was the old paradigm, fix some issues, and put it on steroids, and then add a new paradigm/new capabilities.

They were following that same strategy for the PS2 to PS3 transition, but it failed and they had to switch GPU supplier at the last minute, going by the credible rumors that have come out over the years (vs one super-early patent which had a second CELL plus - and this is what people keep missing in the original patents - a full set of HW fixed-function units, which is the core of what people see as defining GPUs: TMUs for fetching and filtering of data/textures, ROPs for blending, Z-testing, etc., and more; coincidentally similar to how Intel was planning LRB, just with much more power-hungry 512-bit vector units, which became AVX-512, and a more power-hungry yet dev-friendly cache).

Beside the very early patent, which again was not software rendering the way people imagine it but was essentially a GPU with SPE's doing the math work (coincidentally, look up what a modern DCU/CU looks like and what that patent was proposing the GPU to be: closer to a 1:1 mapping than people would like to admit, though sure, there are differences in some components, like the HW work distributor that was going to be software handled by the OS), the GPU Toshiba was preparing (original RS) was essentially both a GS on steroids (leaving vertex processing, T&L, geometry generation/tessellation to the CPU, and with gobs of eDRAM, more than Xenos's daughter die, as well as emphasis on very high pixel fillrate taking advantage of the very high local bandwidth) as well as a new paradigm (programmable pixel shaders).

Again on the CELL patent in a bit… ok now where were we?

RDNA2 CU: [diagram]

CELL SPE: [diagram]

RDNA architecture: [diagram]

SPE based GPU (patents only, what made it out of the labs was Toshiba's RS): [diagram]


There was a lot more HW in that GPU than the SPE’s to help with graphics as I mentioned before (PPU was there to run the OS driven work distributor, SPE’s would have been the Compute Units).
 
Last edited:
Given that almost no game on the 360 has accurate colour, you'd have to point to specific examples, as widespread statements like that have been made in a blanket fashion, over and over, directly as a result of Richard/DF coverage from the time that doesn't hold up now. Most 360 games don't even use exponential fog, because it costs more shader ops than the cheaper linear fog that games like AC used in its place.

Unified shaders is just a PC directX label for using the graphics hardware differently, much like today's DX RT. A console GPU customised for game devs not using that label isn't necessarily less capable, just providing the same or better access to their hardware under a different label. The PS2 had geometry shading in shader assembly before it was even given a name and before any PC GPUs had the feature in hardware, so that is a good example of how PC labels, are just labels in a console discussion.
I have never noticed the difference in colors between PS3 / X360 multiplatforms. I have however noticed that the majority of multiplatform games ran better on X360, at least till around 2008: COD3, Splinter Cell Double Agent, Assassin's Creed 1 (it had drops to 10fps, although they improved performance with patches later on), Bulletstorm, BioShock 1, Rainbow Six Vegas, Batman Arkham Asylum, Riddick Dark Athena, Lost Planet, Orange Box, Bayonetta, Red Dead Redemption, GTA 4, Dead Rising 2, Madden 2008, Skyrim, Fallout 3, FEAR, COD4, Modern Warfare 2-3, SF4, Mafia 2, Mass Effect trilogy, Ghostbusters.

There were a few games that played much better on PS3 as well (TES Oblivion, Ridge Racer). It seems PS3 was faster overall, but developers had to optimize their games for the Cell architecture.
 

8BiTw0LF

Banned
Multiplatform games were mostly equal by 2011-13; PS3 did catch up and even surpassed the X360 in more titles than people seem to remember.
By that time I'd had the 360 for 6 years. A little too late.
Which games were better on the PS3? - just out of curiosity - cause I remember every CoD game having a better framerate on 360 through the whole generation.
 
Last edited:
Considering the release dates of the PS3 and 8800GTX, I wonder what the PS3 could have been capable of if the RSX was based on that GeForce 8 rather than the 7800. The Cell might have been complicated, but when stuff done on GPUs could and had to be partially offloaded to the Cell, it was probably not the bottleneck for top programmers, while the GPU (and amount of RAM) certainly was.
I also wonder if a shrunken, up-to-date Cell (no idea what 45nm to 5nm means for power draw) could not have been used in the PS5 for PS3 games and, as added value for PS5 games, as a coprocessor for something, since it seems able to do CPU, some GPU and possibly PPU stuff just fine. A future PS6 might finally be able to emulate it, but maybe a HW-based solution would have been possible without adding much cost.
 

01011001

Banned
The PS3 was obviously the future of gaming. Blu-ray, HDD, WiFi and HDMI.

This is also the reason why they lost fucking billions in that gen.

the future in hardware but not in software. in terms of the user experience the PS3 was a full generation behind, especially at launch.
 

01011001

Banned
It could have been faster and more full-featured when used in game, but outside of games the XMB was actually really nice. Custom user-made video clips for videos, DVB-T recording, etc…

it was a great multimedia machine but not a good gaming machine. the lack of party chat alone made it feel way behind times
 

scydrex

Member
It could have been faster and more full-featured when used in game, but outside of games the XMB was actually really nice. Custom user-made video clips for videos, DVB-T recording, etc…

Exactly. As a media player it was excellent. The Blu-ray player and DVD player were very nice. My brother had a 360 and it didn't play DVDs so well. Of course in terms of software and games, especially at launch, the 360 was better. I loved my PS3, I used it for everything as a media player. Of course I wanted it to be better in software and UI.
 

01011001

Banned

120Hz support, a 2TF gaming machine - those were 2 claims for the PS3... that's one of the examples lol

or the PS4 was supposed to support 4K 30Hz, and the Trine 2 devs apparently even had an update in the works to support that
 
Last edited:

DeepEnigma

Gold Member
120Hz support, a 2TF gaming machine - those were 2 claims for the PS3... that's one of the examples lol

or the PS4 was supposed to support 4K 30Hz, and the Trine 2 devs apparently even had an update in the works to support that
You don't have to end every sentence with an, lol. Be confident, son.

We should revolt, and hold them over the coals for that.

P.S. I don't ever remember Sony saying the PS4 was going to be 4K, it wasn't even on the box. Maybe Trine 2's update was for the Pro's config?
 

01011001

Banned
You don't have to end every sentence with an, lol. Be confident, son.

We should revolt, and hold them over the coals for that.

P.S. I don't ever remember Sony saying the PS4 was going to be 4K, it wasn't even on the box. Maybe Trine 2's update was for the Pro's config?

I say lol because it's funny that they actually had the gall to claim the PS3 is a 2-teraflop machine

also no, the PS4 was apparently planned to have a 4K mode but that never happened. Trine 2 already had a 1080p60 mode and a 3D mode, so the game didn't really tax the PS4, they had a 4K mode in mind if the update ever came.
 

DeepEnigma

Gold Member
I say lol because it's funny that they actually had the gall to claim the PS3 is a 2-teraflop machine

also no, the PS4 was apparently planned to have a 4K mode but that never happened. Trine 2 already had a 1080p60 mode and a 3D mode, so the game didn't really tax the PS4, they had a 4K mode in mind if the update ever came.
I'm only teasing, bro, lol.

Ah yes, the Jeff Rigby threads for the same HDMI chip that had been used in 4K boxes and certified, but never updated. I remember those.
 
Last edited:
When we say it wasn't using the eDRAM - I mean to its fullest, optimally moving workloads in and out to maximise memory bandwidth when processing; obviously it holds the framebuffer, so it is getting used - but I did consider the question of bus width. These are PC-style games (VT3 and VF5) - discrete CPU and GPU, data loaded once to VRAM per level, and draw calls from the buffered objects being sent to the GPU that aren't going to overburden the bus - so I don't believe that makes any difference in this situation.

Comparing and contrasting with Virtua Tennis 3, which is 1080p60 on both consoles: polygon counts in active gameplay are significantly lower in VT3 - and the camera is largely static - because of the distance the camera is from the players and the simple geometry used to render a largely flat tennis court. So the main difference between VF5 and VT3 is that VF5 is maybe occasionally using more polygons per frame, between the characters and background, than Xenos can manage even with lower LoDs - hence the occasional screen tearing in a combo when the camera moves, IMO
Regarding the lower polygon-pushing power of Xenos vs. RSX, I have honestly never heard or seen that in any platform comparison. I have actually seen more aggressive LoD in a few PS3 versions of games but never the reverse. Well, apparently it's true for VF5, but I didn't know that one. Given what you have said I 100 percent believe it though.

PS3 exclusives like God of War Ascension or Uncharted do appear to use more polygons than equivalent 360 games, but it's probably because the Cell is doing a lot of the work, leaving the RSX more room to push polys. Late-gen PS3 stuff got pretty crazy, like MotorStorm Apocalypse too.

Do you have any examples of lower poly counts on 360 versions of games when both versions launched at the same time? Maybe I am forgetting something.
 
Considering the release dates of the PS3 and 8800GTX, I wonder what the PS3 could have been capable of if the RSX was based on that GeForce 8 rather than the 7800. The Cell might have been complicated, but when stuff done on GPUs could and had to be partially offloaded to the Cell, it was probably not the bottleneck for top programmers, while the GPU (and amount of RAM) certainly was.
I also wonder if a shrunken, up-to-date Cell (no idea what 45nm to 5nm means for power draw) could not have been used in the PS5 for PS3 games and, as added value for PS5 games, as a coprocessor for something, since it seems able to do CPU, some GPU and possibly PPU stuff just fine. A future PS6 might finally be able to emulate it, but maybe a HW-based solution would have been possible without adding much cost.
Forget it, PS3 build costs were already sky high. IMO the RSX (PS3 GPU) was still a very good GPU, but performance was cut in half because of its 128-bit bus.

The 8800 GTX/Ultra were very fast GPUs. I bought a PS3 and a PC with an 8800 Ultra in the same week (August 2007), and my PC was like 3-4 times faster in multiplatform games. For example, FEAR 1 on PS3 ran at 30 fps at 1280x720 (921,600 pixels), while my 8800 Ultra was able to deliver 60 fps at 1920x1440 (2,764,800 pixels).

 
Last edited:

Hoddi

Member
Regarding the lower polygon-pushing power of Xenos vs. RSX, I have honestly never heard or seen that in any platform comparison. I have actually seen more aggressive LoD in a few PS3 versions of games but never the reverse. Well, apparently it's true for VF5, but I didn't know that one. Given what you have said I 100 percent believe it though.
Xenos had vastly higher triangle throughput than RSX. Even the previous-gen midrange X1600 was faster than the 7800 GTX, and Xenos was almost twice as fast as that.

[triangle throughput benchmark chart]
 

PaintTinJr

Member
I have never noticed the difference in colors between PS3 / x360 multiplatforms.
Which sort of makes our discussion of all the technical rendering differences a bit pointless, because that's such an obvious one, even back in the day when the average CRT/LCD TV was 32" and Plasma was 36-40". And you can even see the 360 gamma in the direct comparison of VT3 in John's video in this thread.
 

PaintTinJr

Member
Xenos had vastly higher triangle throughput than RSX. Even the previous-gen midrange X1600 was faster than the 7800 GTX, and Xenos was almost twice as fast as that.

[triangle throughput benchmark chart]
That is a worst-case benchmark; the meshes are not optimised, correctly wound quad strips as they are in games, which is why it results in 1/10th of the theoretical value for the RSX. So it's not really helpful to the discussion StateofMajora and I are having.
 

fart town usa

Gold Member
Nice, it's strange, but I've become nostalgic for the PS360 era in a way that I didn't expect to. I think it's because these consoles were so formative to me as a smelly teen lol. I remember getting the 360 for Christmas in 2006, I got Gears of War with it. I didn't have an HD TV, but I played the shit out of it on my tiny CRT that I had in my bedroom, and even on that thing it was the best looking game I'd ever seen. So many amazing 360 memories: long nights playing Halo 3 custom games, the midnight launches for Gears 2 and 3, playing through all of RE5 coop, BioShock, discovering my love of Mass Effect in 2007 and cementing it as one of my all-time favorite games ever made, and learning to love real racing sims with Forza Motorsport 2 and 3

The PS3 was the first console I bought with my own cash in 2009. Uncharted 2 is what sold me on it; I saved every penny I made from my job bagging groceries, as well as my birthday money, to get one. By this point we had an "HD TV" - by that I mean it could display 720p and 1080i - but man, Uncharted 2 knocked my socks off. Then later that year and in 2010 I played inFamous, Ratchet and Clank: A Crack in Time, God of War 3, GT5, MGS4 and MotorStorm… man, the PS3 was such a rad fucking system


Hell yea, I love reading stories like yours. I didn't game much for the best years of the PS2 and at the time, the only modern system I had was a Wii, lol. I bought a PS3 for bluray, specifically Blade Runner. It was right when Dead Space and Silent Hill: Homecoming released.

It blew my friggin' mind how good Dead Space looked. The PS3 turned me into a total Sony diehard and reminded me of how much I enjoy gaming. That was a really fun generation. I have good memories too of playing GOW coop with my buddy, I have love for the 360 but I was all about the PS3 back then.
 

PaintTinJr

Member
Regarding the lower polygon-pushing power of Xenos vs. RSX, I have honestly never heard or seen that in any platform comparison. I have actually seen more aggressive LoD in a few PS3 versions of games but never the reverse. Well, apparently it's true for VF5, but I didn't know that one. Given what you have said I 100 percent believe it though.

PS3 exclusives like God of War Ascension or Uncharted do appear to use more polygons than equivalent 360 games, but it's probably because the Cell is doing a lot of the work, leaving the RSX more room to push polys. Late-gen PS3 stuff got pretty crazy, like MotorStorm Apocalypse too.

Do you have any examples of lower poly counts on 360 versions of games when both versions launched at the same time? Maybe I am forgetting something.
The other guy mentioned a whole lot of games, but Batman is probably the one I remember best. It definitely uses more polygons in the PS3 version, resulting in better silhouette edges, but it also results in marginally better lighting, because those extra polygons produce more fragments and so more shader lighting ops. The lighting is also using more fragment ops anyway - IIRC - because specular highlights cover bigger areas on the 360 version, which points to a cheaper lighting equation of at least 1 op per fragment.

I also seem to remember some discrepancies with texturing when I played the first level on the 360, which favoured the PS3 version: the 360 textures almost band in the darker shades and look muddy. Probably either a smaller mip was used, or one version is using uncompressed or higher-quality compressed mips while the other uses aggressive block compression; or possibly the PS3 is using normal mapping and the 360 cheaper bump mapping, making the lighting flatter in the calculations.
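On the block-compression guess: the memory trade-off is easy to quantify (generic DXT/BC figures, not anything measured from the game's actual assets). DXT1 stores each 4x4 texel block as two 16-bit 5:6:5 endpoint colours plus 2-bit indices, which is exactly where banding in dark gradients tends to creep in:

```python
# Bytes for a square texture at a given bits-per-texel rate (no mip chain).
def texture_bytes(size, bits_per_texel):
    return size * size * bits_per_texel // 8

rgba8 = texture_bytes(1024, 32)  # 4,194,304 bytes - uncompressed RGBA8
dxt1  = texture_bytes(1024, 4)   # 524,288 bytes - DXT1/BC1, 8:1 compression
dxt5  = texture_bytes(1024, 8)   # 1,048,576 bytes - DXT5/BC3, 4:1 with full alpha
```

An 8:1 saving per texture is a strong incentive on a console with 512 MB total, so it's plausible one port leaned on it harder than the other.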
 
Which sort of makes our discussion of all the technical rendering differences a bit pointless, because that's such an obvious one, even back in the day when the average CRT/LCD TV was 32" and Plasma was 36-40". And you can even see the 360 gamma in the direct comparison of VT3 in John's video in this thread.
I thought you were talking about colors (different color spaces, like Rec.709 vs RGB). Yes, I have noticed gamma differences in SOME multiplatform games, for example in Tomb Raider 2013; however, after in-game gamma correction I was able to restore the crushed blacks, so IDK why you say it was such a big problem. I played both versions on X360 and PS3, and I still thought the X360 version was better regardless of this gamma difference.
 
Last edited:
The other guy mentioned a whole lot of games, but Batman is probably the one I remember best. It definitely uses more polygons in the PS3 version, resulting in better silhouette edges, but it also results in marginally better lighting, because those extra polygons produce more fragments and so more shader lighting ops. The lighting is also using more fragment ops anyway - IIRC - because specular highlights cover bigger areas on the 360 version, which points to a cheaper lighting equation of at least 1 op per fragment.

I also seem to remember some discrepancies with texturing when I played the first level on the 360, which favoured the PS3 version: the 360 textures almost band in the darker shades and look muddy. Probably either a smaller mip was used, or one version is using uncompressed or higher-quality compressed mips while the other uses aggressive block compression; or possibly the PS3 is using normal mapping and the 360 cheaper bump mapping, making the lighting flatter in the calculations.
Arkham Asylum? I played both versions back then, and played the 360 version maybe a year ago. I never noticed fewer polygons in the character models on either version.

In fact the old Digital Foundry face-off doesn't mention the differences you describe, but the opposite - DF

Like pared-back textures and normal maps here and there. I also remembered the 360 version has 2x MSAA from seeing it on the Beyond3D list. No AA on PS3. Here's that list if you're interested btw. B3D 360 and PS3 resolution list

I'm not saying you're wrong on Batman but it's nothing I've seen or heard before. It would be nice if screenshot comparisons detailing what you say were out there.

I will say it seems unlikely to me, since UE3 favored 360 over PS3 architecture.
 
I thought you were talking about colors (different color spaces, like Rec.709 vs RGB). Yes, I have noticed gamma differences in SOME multiplatform games, for example in Tomb Raider 2013; however, after in-game gamma correction I was able to restore the crushed blacks, so IDK why you say it was such a big problem. I played both versions on X360 and PS3, and I still thought the X360 version was better regardless of this gamma difference.
I have been able to compensate for black crush on 360 as well. It's never been an issue on my modern TVs.
 
that would mean you have to...



CONNECT YOUR PC TO A TV!

that's what I did. connected my PC to a Pioneer PDP-4304 (43" 16:9 720p plasma). it had both VGA and HDMI inputs.
was my first time connecting my PC to a TV. took a risk and bought it used off some weird site.

never went back to a monitor as my primary display (used a Sony GDM-FW900 CRT for certain situations).

big TVs also let you use big speakers + big amplification, which is important because... every game you play has sound.
 

PaintTinJr

Member
Arkham Asylum? I played both versions back then, and played the 360 version maybe a year ago. I never noticed fewer polygons in the character models on either version.

In fact the old Digital Foundry face-off doesn't mention the differences you describe, but the opposite - DF

Like pared-back textures and normal maps here and there. I also remembered the 360 version has 2x MSAA from seeing it on the Beyond3D list. No AA on PS3. Here's that list if you're interested btw. B3D 360 and PS3 resolution list

I'm not saying you're wrong on Batman but it's nothing I've seen or heard before. It would be nice if screenshot comparisons detailing what you say were out there.
Those nonsense face-offs - reduced to 2D resolution and 0.5th-percentile frame-rate dips - also never mention the difference in third-dimension resolution, where in Batman the 360 version uses less depth precision - like most 360 versions of games, and it was a common cheat on ATI hardware - and has a slightly different near/far frustum plane setup to make it look almost identical, but it still results in a more claustrophobic world space than the PC and PS3 versions.

The difference in frustum setup also changes where depth-cueing fog starts, so some mid-scene textures on the 360 might look sharper - than they should - because they aren't depth-cued correctly and may be projected closer to the near plane than on PS3 in comparative shots, either selecting a different mip or just being projected to a larger number of pixels on screen.
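For readers unfamiliar with the depth-precision point: a perspective z-buffer spends most of its codes close to the near plane, so moving the near/far planes changes how finely distant geometry can be separated. A generic sketch (standard projection math, not the games' actual frustum values, which I don't know):

```python
# Perspective depth mapped to [0, 1] (OpenGL-style convention):
# d(z) = (far / (far - near)) * (1 - near / z)
def ndc_depth(z, near, far):
    return (far / (far - near)) * (1.0 - near / z)

# With near = 0.1, about 90% of the depth range is spent before z = 1.0...
d_at_1   = ndc_depth(1.0, 0.1, 1000.0)    # ~0.900
# ...so everything from z = 100 out to the far plane shares the last sliver.
d_at_100 = ndc_depth(100.0, 0.1, 1000.0)  # ~0.999
```

Which is why shuffling the near/far planes (or dropping z-buffer bit depth) can look identical in screenshots while behaving quite differently in the distance.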
I will say it seems unlikely to me, since UE3 favored 360 over PS3 architecture.
It never did for quality pixels, and everything from Heavenly Sword, Uncharted and MGS4 to The Last of Us was UE3, modified to be its own "engine", even if a lot of money was paid so they never list them as UE.
 

Panajev2001a

GAF's Pleasant Genius
Those nonsense face-offs - reduced to 2D resolution and 0.5th-percentile frame-rate dips - also never mention the difference in third-dimension resolution, where in Batman the 360 version uses less depth precision - like most 360 versions of games, and it was a common cheat on ATI hardware - and has a slightly different near/far frustum plane setup to make it look almost identical, but it still results in a more claustrophobic world space than the PC and PS3 versions.

The difference in frustum setup also changes where depth cueing fog starts, so some mid scene textures on the 360 might look sharper - than they should - because they aren't depth cued correctly and may be projected closer to the near plane than on PS3 in comparative shots and either selecting a different mip, or just projected to a large number of pixels on screen.

It never did for quality pixels, and everything from Heavenly Sword, Uncharted, MGS4, Last of Us were all UE3 and modified to be their own "engines" even if large money was paid so they never list them as UE.

Uncharted = modified Unreal Engine 3? [Citation Needed]
 
Forget it, PS3 build costs were already sky high. IMO RSX (the PS3's GPU) was still a very good GPU, but performance was cut in half because of its 128-bit bus.

8800 GTX/Ultra were very fast GPUs. I bought a PS3 and a PC with an 8800 Ultra in the same week (August 2007), and my PC was like 3-4x faster in multiplatform games. For example, FEAR 1 on PS3 was running at 30 fps at 1280x720 (921,600 pixels), while my 8800 Ultra was able to deliver 60 fps at 1920x1440 (2,764,800 pixels).
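A quick back-of-the-envelope check of the pixel math in that comparison (resolutions and frame rates taken from the post above):

```python
# Pixel-throughput comparison: FEAR on PS3 (1280x720 @ 30 fps)
# vs. an 8800 Ultra PC (1920x1440 @ 60 fps), per the post above.
ps3_pixels_per_frame = 1280 * 720    # 921,600
pc_pixels_per_frame = 1920 * 1440    # 2,764,800

# Per frame, the PC pushes 3x the pixels...
print(pc_pixels_per_frame / ps3_pixels_per_frame)   # 3.0

# ...and at double the frame rate, 6x the pixels per second.
print((pc_pixels_per_frame * 60) / (ps3_pixels_per_frame * 30))  # 6.0
```

So "like 3-4x faster" is roughly in line with the raw pixel-rate gap, before you even account for settings or driver overhead.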
Of course it would most likely add to the cost, depending on which version of an 8800 it would have been with a power draw comparable to what the RSX was allowed. Maybe just adding more RAM would have provided a significant improvement without adding the same cost. But IMHO the GPU is the most glaring bottleneck, so it would be on my alternate-history fantasy wishlist. Later consoles got some variant of the then up-to-date RDNA architecture with at least most of its features, downclocked and with some customisation. The ~7800 would have been more acceptable paired with the up-to-date Cell if the PS3 had released alongside the 360, i.e. earlier. But as is, it came with an already outdated GPU generation, a GPU that was obviously not on par with the ATI counterpart, probably because R&D was invested in getting the series 8 right.
No idea if my speculation comes even close to reality, but let's say a halved custom 8800 GTX could have been used (adding another 200 bucks...), but that would effectively already reduce the die size. Maybe then it would have been not only more energy efficient, but still more powerful and eventually even cheaper, at least once the process gets shrunk. Or it would have been a monetary disaster...
But I guess it is safe to assume that the 1080p dream would have been much more feasible with practically all games, and the PS4 would have been a much smaller upgrade than it already was, since the main differentiator, actual FHD, would have already been done.
 

Three

Member
lol, jesus Sony really talked some major bullshit ahead of the PS3 launch... god damn...
Ken said some crazy stuff, but what he was saying was actually true, just without any context. His 120hz/fps comment was true. Not many games aimed for 120hz, but PS3 3D games did ("4D" as Ken called it). The game would output at 120fps for a 60fps 3D game.
 

SkylineRKR

Member
The generation where everyone I knew skipped PS3 altogether and just bought a 360. Then it changed back again with the PS4.

At first everyone did seem to skip the PS3, but by the time the Slim launched and we went into the 2010s a lot of people jumped ship back to Sony, because Sony simply kept supporting the PS3 with good games, like The Last of Us etc. They kept supporting it until the PS4 was on the shelves, and even then some games like GT6 were PS3-only. Also a few JP studios started to abandon the 360, like ArcSys. Meanwhile MS sat with a Kinect in decline and gambled on multimedia.

Sony got their audience back and this is also why the PS4 was in such a good position when announced while MS had lost touch with the core audience.
 

PaintTinJr

Member
Uncharted = modified Unreal Engine 3? [Citation Needed]
In the original UC box there was a making-of video, on disc or as a download link, and I'm 99.9% sure it is mentioned that it is UE3 in the video, or on one of the screens in the studio when they are showing the advanced systems they've added to the game. That, or maybe it was from when Cambridge Studio was making Heavenly Sword and advertising in Develop magazine for staff; because they were PlayStation's core technology group for UE back then, IIRC Uncharted might have been listed as a project gaining from the core technology.

Given that they've had GDC/SIGGRAPH-type presentations over the years, and were sharing know-how and code examples that could easily be incorporated into other publishers' game projects, the list of engines it could be based on is either UE or Unity, no? You only need to look back at UE3 partnering with PlayStation on new rendering for cinematics at a pre-launch PS3 E3, then look at UC cinematics of the time and how they were using the SPUs for new systems like weather in UC1, to see it either had to be a custom engine like the PhyreEngine that the Souls games use - as Sony had sold their other (Neon?) engine to Codemasters - or UE, as Unity wasn't at that cinematic level back then. And do we really believe it uses PhyreEngine, given the difference in fidelity between DeS and UC1?

Short of me finding the old video to check, someone that worked on it breaking an NDA, or the PS3 emulator exposing UE libraries in UC1-4 or The Last of Us, a citation is going to be hard to produce for any debadged-engine game, sadly.
 

JackMcGunns

Member
I remember the comments on here and in the gaming media mocking Sony for supporting 1080p for the small percentage of people who had 1080p tvs in 2005. As short sighted as ever.

If I remember correctly, it wasn't about whether 1080p should be supported; it was about Sony saying they were making it a "standard", claiming that only the PS3 was capable of FullHD™, and the toxicity that ensued after that. Sony execs referred to the 360 as a mere "Xbox 1.5", and at the end of the day 720p was the best way to go during the 360/PS3 era.

Ironically it was the Xbox 360 that worked out for my display at the time. I had a TV that supported upconverting 720p broadcasts to 1080i when needed, but would NOT upconvert 720p content connected via component cables, so PS3 games that were 720p native had to be played in 480p on my TV and many gamers' TVs at the time; in fact it applied to early adopters more than anyone else. Xbox 360 engineers had the foresight to include a chip that would upconvert 720p games to 1080i: you just needed to select 1080i in the video settings and voila. The PS3 did not offer such a basic function, so even if you selected 1080i, games that were 720p would go down to 480p.
 
Last edited:

Romulus

Member
Sony probably knew the 360's specs years in advance. They should have just used their formula, but more powerful in every regard (since it was newer), or gone outright x86. If it weren't for the stupidly expensive and inefficient Cell, the PS3 likely could have had a full GB of RAM with higher clock speeds and better performance everywhere.
 

adamsapple

Or is it just one of Phil's balls in my throat?
I had a 768p panel when I owned my PS3. Was gutted that the 360 actually had 768p as an output option.

The 360 was an incredible machine for supporting all kinds of outputs lol.

One of my first ever 360 experiences was with a standard 4:3 PC monitor, and the console had native support for 4:3 resolutions.

Many games would also scale to it properly without needing black bars. GTA IV was one of those games I played in 4:3 on the 360.
 

01011001

Banned
Sony probably knew the 360s specs years in advance. They should have just used their formula, but more powerful in every regard(since it was newer) or outright x86. If it weren't for the stupid expensive and inefficient Cell, the ps3 likely could have had a full GB of RAM with higher clock speeds and performance everywhere.

it's hard to say if they knew. back then it seemed like secrets like these were not leaked as much.

they probably knew at least roughly about the CPU of the 360, given that Microsoft basically went to IBM and asked for the main CPU core design of the Cell Chip to be used in their console.

but the GPU is harder to tell if they knew, especially because they didn't work with ATi at the time.

when it comes to memory, back then consoles always used really small RAM pools compared to PCs, so I think anything more than 512MB was never in the cards as they most likely would have thought that that would be pointless.
 
The 360 Xenos is inferior to the RSX in high quality rendering and polygon throughput, the only advantage it had was the tiny edram buffer bandwidth - that couldn't fit 1280x720 at RGBA8888 with a 32bit zbuffer in double buffered mode - but was ideal for accumulation buffer fx.
Even if it was inferior in raw numbers such as polygon throughput, it wasn't in features. The X360 GPU, unlike its desktop counterpart, had forward-looking features such as unified shaders and a tessellation unit (mostly unused); it could communicate with the CPU directly with less latency than RSX/Cell, and the unified memory pool further reduced a bottleneck versus the PS3. Also features like MSAA: Nvidia's solution was definitely behind, and Quincunx AA definitely cost more. High-quality rendering is useless if you're rendering fewer pixels because your hardware is stalling. They also supported compression in several parts of the rendering pipeline - GPU and CPU could use compression when exchanging data - and even the sRGB gamut was "lossy" versus PC and PS3 on purpose. Fewer coordinates and less precision = less data spent. These were all good decisions at the end of the day.

The fact the PS3 lacked a full scaler function (it could only stretch horizontally in hardware) also fucked it massively against the X360, which is why a lot of games rendered differently, like 1280x720 on Xbox 360 against shit like 880x1080 on PS3.
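A small sketch of why that 880x1080 shape shows up (assuming, as the post says, that the PS3 scaler could only stretch horizontally, so a game targeting a 1080-line output had to render all 1080 rows and save fill rate by cutting columns instead):

```python
# Sub-native resolutions from the post: 1280x720 on 360 (hardware-scaled
# in both axes to 1080p) vs. 880x1080 on PS3 (stretched horizontally only).
xbox_pixels = 1280 * 720    # 921,600
ps3_pixels = 880 * 1080     # 950,400 - actually slightly MORE pixels

native_1080p = 1920 * 1080  # 2,073,600
print(ps3_pixels / native_1080p)   # ~0.46 of a native 1080p frame
print(ps3_pixels > xbox_pixels)    # True
```

Interestingly the pixel counts are close; the usual explanation for the PS3 looking worse here is that horizontal-only stretching produces more visible scaling artifacts than a uniform two-axis upscale.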

If the extra grunt was there they could have sidestepped that, either pushing more pixels without being at a disadvantage or managing a better framerate in multiplatform games. They didn't manage either, so specs on paper are meaningless.
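The eDRAM claim quoted above checks out arithmetically; a minimal sketch (buffer formats as named in the quote; note this only tests the raw sizes, not how real titles tiled or resolved their buffers):

```python
# Can a double-buffered 1280x720 RGBA8888 target plus a 32-bit Z buffer
# fit in the Xbox 360's 10 MB of eDRAM? (Formats from the quoted post.)
EDRAM_BYTES = 10 * 1024 * 1024       # 10 MB

pixels = 1280 * 720
color_buffer = pixels * 4            # RGBA8888: 4 bytes/pixel = 3,686,400
z_buffer = pixels * 4                # 32-bit Z:  4 bytes/pixel = 3,686,400

double_buffered = 2 * color_buffer + z_buffer
print(double_buffered)                  # 11,059,200 bytes
print(double_buffered <= EDRAM_BYTES)   # False: ~0.55 MB over budget

# A single color buffer plus Z does fit:
print(color_buffer + z_buffer <= EDRAM_BYTES)   # True
```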
Had Microsoft needed a direct financial return on project Xbox some day - like all normal gaming companies like Atari, Sega, Nintendo, Sony - the 360 would have never launched when it did at those specs. It certainly wouldn't have had 512MB of unified RAM that Epic begged Microsoft to put in the console - instead of the original 256MB - and they would have clocked the Xenon much lower to avoid RRoD - instead of upping it to 3.2GHz to pressure the poor yields of Cell BE to remain at 3.2GHz. The PS3 original design was dual Cell BE processors, 512MB of unified XDR memory and a powerful 2D GPU (similar in flexibility to the Reality Synth of the PS2 produced by Toshiba). The last minute revised design halving the Cell BE count, halving the XDR and licensing an expensive nvidia GTX 9700 grade GPU - with no cost reduction benefit over the generation - is all why we ended up trapped with double/triple buffered deferred renderer 720p25-30 gaming on PS3 for most games - other than sim racers and fighting games that typically went forward renderer with simpler lighting and 1080p60 was used from time to time.
If they had gone through with that, the disaster would have been even bigger, so I don't know if Microsoft really tilted them towards the change more than the simple fact that it didn't work with two Cells strapped together.

SPEs weren't magic bullets either; having more of them would actually be worse for most devs.


The GPU wasn't "nvidia GTX 9700 grade"; it was between 7600 GT and 7800 GTX spec-wise (a 7800 GTX with the same ROPs as a 7600 GT) and not forward-looking for the time (no 8800 GTX features). That, along with other things, cost them a lot.

I really don't think RSX was better than Xenos tbh.
 
Last edited:

JackMcGunns

Member
The 360 Xenos is inferior to the RSX in high quality rendering and polygon throughput, the only advantage it had was the tiny edram buffer bandwidth - that couldn't fit 1280x720 at RGBA8888 with a 32bit zbuffer in double buffered mode - but was ideal for accumulation buffer fx.

Had Microsoft needed a direct financial return on project Xbox some day - like all normal gaming companies like Atari, Sega, Nintendo, Sony - the 360 would have never launched when it did at those specs. It certainly wouldn't have had 512MB of unified RAM that Epic begged Microsoft to put in the console - instead of the original 256MB - and they would have clocked the Xenon much lower to avoid RRoD - instead of upping it to 3.2GHz to pressure the poor yields of Cell BE to remain at 3.2GHz. The PS3 original design was dual Cell BE processors, 512MB of unified XDR memory and a powerful 2D GPU (similar in flexibility to the Reality Synth of the PS2 produced by Toshiba). The last minute revised design halving the Cell BE count, halving the XDR and licensing an expensive nvidia GTX 9700 grade GPU - with no cost reduction benefit over the generation - is all why we ended up trapped with double/triple buffered deferred renderer 720p25-30 gaming on PS3 for most games - other than sim racers and fighting games that typically went forward renderer with simpler lighting and 1080p60 was used from time to time.


Wrong! Early games reflected how inferior RSX was to Xenos. It was only later in the generation, when developers started using the SPUs to offload work from RSX for speed improvements, that PS3 games finally started to keep up with the 360. But despite all the effort with techniques like "SPU-based deferred shading" (which was no easy task, btw), the PS3 failed to beat the 360 counterparts even in the best-case scenarios, whereas the 360 was achieving it on Xenos alone and was still winning head-to-heads late in the generation. Developers described working on the Cell as "sweating blood" just to achieve anything above what the 360 was already doing without the extra effort, and we still haven't even touched some of the widely unused special features in Xenos like MEMEXPORT.

Even John speaks about this at the 17:05 mark of the video.
 
Last edited:
it's hard to say if they knew. back then it seemed like secrets like these were not leaked as much.

they probably knew at least roughly about the CPU of the 360, given that Microsoft basically went to IBM and asked for the main CPU core design of the Cell Chip to be used in their console.

but the GPU is harder to tell if they knew, especially because they didn't work with ATi at the time.
They might as well have worked with ATI. Working with Nvidia means royalties if you emulate games later on hardware that isn't theirs (Microsoft paid fees to Nvidia because the Xbox 360 emulated the original Xbox's GeForce 3, and perhaps that's one of the reasons Sony hasn't delved much into PS3 emulation).

And RSX definitely felt like an afterthought being there, but still way better than a second Cell processor.

I'd say Microsoft only chose a CPU similar to the Cell due to price/GHz and being similar to the direct competition, as the CPU was quite honestly not that good even for the time (general performance per core could, on a good day, only double that of the Xbox's Pentium 3 clocked at 733 MHz), but it was definitely better to have three of them than one with 7 SPEs bolted on.
when it comes to memory, back then consoles always used really small RAM pools compared to PCs, so I think anything more than 512MB was never in the cards as they most likely would have thought that that would be pointless.
Memory prices were huge. I think they all started designing their systems around 256 MB, but then reached 512 with different solutions: Microsoft added more memory to the shared pool, while Sony decided to split it and source two types of memory (crazy).
Wrong! early games reflected how inferior RSX was to Xenos, it was only later in the generation when developers started to use SPU's to offload work off of RSX in order to make speed improvements (...)
He was talking about theoretical performance. I'm not even going to get into that, as Nvidia usually inflates their specs. But even if you took both GPUs, made PC counterparts of them, removed drivers as a factor, and the X360 GPU took a beating in that scenario, it was totally different when they were working embedded in their respective systems.

The X360 behaved like an SoC; the PS3 behaved like an old car with a second-hand nitro turbo bolted on.
If anything the PS3 seems to be a bit more powerful than the 360 when fully pushed (to truly know, we'd have to see what Sony devs like Guerrilla and ND could have achieved on 360), but it was a Frankenstein design which required far too much effort to extract performance equal to the Xbox 360's.
If you could use the SPEs to supercharge your game's renderer, then suddenly you had an excess of what the X360 GPU could ever offer.

But you had less freedom in what to do with it, which is why it was tricky. It's not like you could do most things you wanted in the game, no... you had to find something that you could offload to the SPEs and then write a very low-level program to take advantage of it. You could get free water dynamics, free physics, free AI, free sand deformation, free global lighting or free post-processing that way, freeing more resources elsewhere, but often you were doing "next gen stuff" simply because the SPEs were otherwise unused.
Sony literally just should have gotten an identical gpu as 360 from ati or better yet, not accepted Nvidia's old crap and opted for the 8xxx series chipset which would have smoked 360 even without leveraging the cell for graphics.
That wouldn't solve the internal IO clusterfuck or the lack of general-purpose cores.

Rendering would improve a lot (perhaps they could have pulled 1280x720 with ease in multiplats), but a lot of third-party games would have continued to run slow despite that, because they would be CPU-bound. The PPE is more at fault than RSX a lot of the time.
 
Last edited:

Romulus

Member
it's hard to say if they knew. back then it seemed like secrets like these were not leaked as much.

they probably knew at least roughly about the CPU of the 360, given that Microsoft basically went to IBM and asked for the main CPU core design of the Cell Chip to be used in their console.

but the GPU is harder to tell if they knew, especially because they didn't work with ATi at the time.

when it comes to memory, back then consoles always used really small RAM pools compared to PCs, so I think anything more than 512MB was never in the cards as they most likely would have thought that that would be pointless.

Gears devs were already pushing MS for more RAM well before the 360's release. I can't remember the devs, but they had the idea. It would only be common sense at that point a year later. Even the previous generation, what seemed like a lot with 64MB in the Xbox, was already very taxed by the middle of the generation. Everything was moving fast.
Completely cutting the R&D cost of the Cell and its subsequent production could have afforded a beast of a PS3 machine had they gone a different route. Hell, the studio development cost of getting it running decently would have been in the low hundreds of millions considering all the studios and titles.
 
Last edited:

Riky

$MSFT
I had a retro night on my 360 a couple of days ago, the likes of Titanfall, Rise Of The Tomb Raider and Forza Horizon 2 still hold up really well.
It was great from start to finish.
 