I'll need someone like a tech vetted dev on GAF to elaborate more on this.
I'm not vetted, and really just an ass hat in an office with a big mouth, but here goes
I'm not explaining the basics, since your posts imply you already get it
Teraflops are great and all for measuring raw computation at the hardware level, but they're REALLY only part of the equation
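To show what that teraflops number actually measures, here's a back-of-envelope sketch. The formula (shader cores × clock × FLOPs per core per cycle) is the standard way theoretical peak is quoted; the specific core count and clock below are made-up example figures, not any real console's specs:

```python
def peak_tflops(shader_cores: int, clock_ghz: float, flops_per_cycle: int = 2) -> float:
    """Theoretical peak compute in TFLOPS.

    flops_per_cycle=2 assumes one fused multiply-add (FMA) per core per
    cycle, which is the usual assumption behind marketing TFLOPS numbers.
    """
    # cores * GHz gives billions of cycles across all cores per second;
    # multiply by FLOPs per cycle, divide by 1000 to go from GFLOPS to TFLOPS
    return shader_cores * clock_ghz * flops_per_cycle / 1000.0

# Hypothetical GPU: 2304 shader cores at 1.8 GHz
print(peak_tflops(2304, 1.8))  # 8.2944 theoretical TFLOPS
```

Note this is a *theoretical ceiling*: real software never sustains it, which is exactly why the number alone tells you so little.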
The software running on the hardware is going to be a major factor, as is how the operating system runs and talks to the hardware
For example: you could have a fully 3D-rendered game, exceptionally coded, with great graphics, and because the code is so good at using the hardware in particular ways, the hardware never even bats an eye. Put an equally well-coded Bitcoin miner on it? It's gonna crank out that heat, son
Then you have to take into account that, over the lifespan of the product, earlier games won't utilize the hardware as well as later ones, because knowledge of how the hardware works is still being built up
You can argue this and that about hardware and teraflops and whatever; a huge component is still that you need firmware, an OS, and software running to actually utilize that hardware, and until we get live units there's really not much to discuss
Software + firmware + OS + hardware = system; the software, firmware, and OS all play a role too, so ignoring them in the discussion would be silly
Even when we can do benchmarks, it's not something that can really be answered until the product comes out and has been studied for some time
tl;dr Don't use teraflops alone as a basis for how well a system will perform; it's silly and ignores a bunch of other factors in the equation