That math can't be right. The Nvidia Tegra K1 slide I posted lists the PS3's GPU at 192 GFLOPS.
Multi-way programmable parallel floating-point shader pipelines
Independent pixel (550 MHz) / vertex shader (500 MHz) architecture
24 parallel pixel-shader ALU pipes clocked @ 500 MHz
5 ALU operations per pipeline, per cycle
- 2 vector4
- 2 scalar / dual / co-issue and fog ALU
- 1 Texture ALU
27 floating-point operations per pipeline, per cycle
8 parallel vertex pipelines @ 500 MHz
2 ALU operations per pipeline, per cycle
- 1 vector4
- 1 scalar, dual issue
10 floating-point operations per pipeline, per cycle
Floating Point Operations:
364.0 Gigaflops
((24 pixel pipes x 27 FLOPs x 500 MHz) + (8 vertex pipes x 10 FLOPs x 500 MHz))
68.0 billion shader operations / s
((24 pixel pipes x 5 ALUs x 500 MHz) + (8 vertex pipes x 2 ALUs x 500 MHz))
24 texture filtering units (TF)
8 vertex texture addressing units (TA)
24 filtered samples per clock
Peak texel fillrate:
12.0 GigaTexels per second
(24 textures x 500 MHz)
32 unfiltered texture samples per clock
(8 TA x 4 texture samples)
8 Render Output units / pixel rendering pipelines
Peak pixel fillrate:
4.4 Gigapixel / s
(8 ROPs x 550 MHz)
Peak Z sample rate:
8.0 GigaSamples / sec
(2 Z-samples x 8 ROPs x 500 MHz)
Support for PSGL (OpenGL ES 1.1 + Nvidia Cg)
(note: TechPowerUp says it is clocked at 500 MHz, but most sources say 550 MHz)
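As a sanity check, the peak throughput figures can be recomputed from the per-pipeline numbers in the list above. This sketch assumes the 500 MHz clock that the post's own formulas use (see the note about the 500 vs 550 MHz discrepancy); at 550 MHz the pixel-side numbers scale up accordingly.

```python
# Recompute RSX peak figures from the per-pipeline numbers listed above,
# assuming the 500 MHz clock used in this post's formulas.
MHZ = 1_000_000
pixel_pipes, pixel_clock = 24, 500 * MHZ
vertex_pipes, vertex_clock = 8, 500 * MHZ

# FLOPs per pipeline per cycle, as listed: 27 (pixel) and 10 (vertex)
gflops = (pixel_pipes * 27 * pixel_clock
          + vertex_pipes * 10 * vertex_clock) / 1e9

# Shader operations: 5 ALU ops per pixel pipe, 2 per vertex pipe
shader_ops = (pixel_pipes * 5 * pixel_clock
              + vertex_pipes * 2 * vertex_clock) / 1e9

texel_fill = 24 * pixel_clock / 1e9   # 24 texture filtering units, 1 sample/clock each
z_rate = 2 * 8 * pixel_clock / 1e9    # 2 Z-samples x 8 ROPs

print(gflops, shader_ops, texel_fill, z_rate)
# -> 364.0 GFLOPS, 68.0 billion shader ops/s, 12.0 GTexels/s, 8.0 GSamples/s
```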
Well, to be fair, digging up all this information was never the goal ... I was just answering the previous post's question of "why do manufacturers only choose AMD and not Nvidia?"
And Nvidia had to promote its Tegra ...
The following are relevant facts about the RSX ...
Little Endian
8 vertex shaders at 500 MHz
28 pixel shaders (4 redundant, 24 active) at 550 MHz
28 texture units (4 redundant, 24 active)
8 Raster Operations Pipeline units (ROPs)
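For comparison, the 192 GFLOPS on the Tegra K1 slide happens to equal what you get by counting only the pixel pipes' two vector4 ALUs as MADDs at 500 MHz, ignoring the scalar units and the vertex shaders entirely. This is an assumption about how Nvidia arrived at its figure, not something the slide states:

```python
# One plausible derivation of the slide's 192 GFLOPS (assumption, not confirmed):
# only the 24 pixel pipes' 2 vector4 ALUs, counted as MADDs, at 500 MHz.
pixel_pipes = 24
vec4_alus = 2          # vector4 ALUs per pixel pipe
lanes = 4              # components in a vector4
flops_per_lane = 2     # one multiply-add = 2 FLOPs
clock_hz = 500 * 1_000_000

gflops = pixel_pipes * vec4_alus * lanes * flops_per_lane * clock_hz / 1e9
print(gflops)  # -> 192.0
```

If that is how the figure was counted, it understates the RSX relative to the fuller 27-FLOPs-per-pipe accounting used in the spec sheet above.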