But my favourite quote - just for out of context potential...
PS4 one million times faster than Xbox One confirmed.
http://youtu.be/kro3Caums2c
The point is that you are again making up arguments based on numbers that aren't directly comparable, while at the same time dismissing all arguments based on actual comparisons at the same settings. Don't you see what's off about this?
For the comparisons I found in this thread, most were either about the new current-gen hardware or used graphics cards well above the G3D score of the 360 GPU. And all ran on a beefier (CPU + RAM) system. The resolution was significantly higher in most (all?) cases - not just a bit higher as in the Rage example. The performance graphs posted did not even include the 360 data. Maybe I'm really blind, but I just don't see what data you're trying to point me to.
If you want Xbox 360 data, just search for 'digital foundry [game name] face off' on Google.
I don't know anymore if you're acting or just being ignorant.
You were asking for 360 performance tests to compare with the low-end PC examples we posted.
Now you want a site that compares 7-8-year-old PCs with past-gen consoles in new releases? Who in their right mind would do that?
You can't even buy such parts officially anymore.
Maybe use a roughly 2x faster PC and see if the game runs about the same.
But I already provided such examples a few pages ago with Mass Effect 3, Crysis 2 and Battlefield 3.
A C2D + 8800GT is around two times faster in gflops than the past-gen consoles.
http://www.neogaf.com/forum/showpost.php?p=127284908&postcount=156
And you've even quoted part of this post...
http://www.neogaf.com/forum/showpost.php?p=127286636&postcount=161
The data can be found in numerous benchmarks and YouTube videos where similarly specced GPUs, the 8700/8800 cards, can and do run games at similar or better settings than a last-gen console.
And equivalent or near-equivalent GPU silicon on PC runs current-gen console multi-plats about as well as the current-gen consoles do.
API overhead resides mostly on the CPU, which is where consoles can squeeze out a lot more 3D rendering performance (note this is STRICTLY an increase in 3D-rendering-related performance - not overall performance). The reason this isn't apparent against a run-of-the-mill gaming PC is that most of those PCs sport CPUs that are 2-4 times more powerful than what's in a PS4/Xbone.
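That CPU-side-only point can be sketched with a toy frame-time model (the numbers below are illustrative assumptions, not measurements): a frame is done when the slower of the CPU and GPU finishes, so cutting CPU submission overhead only raises the frame rate when the CPU is the bottleneck.

```python
# Toy model: frame time is set by whichever side (CPU or GPU) finishes last.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound case: halving the CPU submission cost changes nothing.
print(fps(cpu_ms=20, gpu_ms=33), fps(cpu_ms=10, gpu_ms=33))    # both ~30.3 fps
# CPU-bound case: the same optimisation helps a lot.
print(fps(cpu_ms=33, gpu_ms=20), fps(cpu_ms=16.5, gpu_ms=20))  # ~30.3 -> 50.0 fps
```

This is why the low-overhead advantage shows up against weak CPUs but vanishes once the GPU is the limiting side.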
http://www.tomshardware.com/reviews/geforce-8800-gts-512-mb,1743-2.html
http://en.wikipedia.org/wiki/Xenos_(graphics_chip)
8800GT - 336 gflops
Xenos - 240 gflops
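For reference, both gflops figures fall out of the usual peak-throughput formula (shader units × clock × FLOPs per unit per cycle), using the commonly cited specs for each chip; the 8800GT figure here counts only the MAD, not the extra MUL.

```python
# Theoretical peak shader throughput in GFLOPS.
def peak_gflops(units, clock_ghz, flops_per_cycle):
    return units * clock_ghz * flops_per_cycle

# 8800GT: 112 stream processors at a 1.5 GHz shader clock, MAD = 2 FLOPs/cycle.
print(peak_gflops(112, 1.5, 2))   # 336.0
# Xenos: 48 vec4+scalar ALUs at 0.5 GHz, MADD on all 5 lanes = 10 FLOPs/cycle.
print(peak_gflops(48, 0.5, 10))   # 240.0
```

Note this is theoretical peak only; as argued later in the thread, it says little about delivered in-game performance.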
So you're telling me they are using an i7 to eliminate CPU bottlenecks, but there aren't any? Yeah, I really should learn how THAT sort of thing works. I should use that reasoning in my own evaluations. It would save me from a lot of headaches.

The i7 is there to eliminate CPU bottlenecks in the benchmark, but both Crysis 2 and Mass Effect 3 would be GPU limited by the 8800GT before being CPU limited on a C2D.
BF3 would need a quad-core, or a very high-frequency C2D, to get to 60fps, because it's very CPU heavy in multiplayer. But if you increased the resolution and graphical settings to stick to 30Hz, the CPU would not bottleneck the 8800GT.
And I'm done explaining the basics to you. If you still don't understand the comparison, you need to learn how all of this works.
Yes. People still don't know that comparing PC and console parts is a completely meaningless exercise? Developers can push performance a whole lot further on consoles than with comparable PC parts. Isn't this common knowledge by now?
So you're telling me they are using an i7 to eliminate CPU bottlenecks, but there aren't any? Yeah, I really should learn how THAT sort of thing works. I should use that reasoning in my own evaluations. It would save me from a lot of headaches.
You know damn well that the gflop metric in no way represents the actual performance difference of the two parts. When it comes to actual performance, Nvidia's 8800 line is roughly 2.5 to 3x faster than the GPU in the Xbox 360, depending on whether it's the GT/S/X.
No, it exactly shows that you can't get 100% utilization out of console GPUs, and that PC GPU utilization is not in the 50% range.
It shows that architecture matters a lot, and that bottlenecks can force developers to design their tech more carefully.
It also shows that late-gen ports were hit quite hard by both CPU and GPU limits, so even if you had spare cycles on the GPU, some CPU task could diminish your performance. On PC you are generally limited from one side only - GPU or CPU.
And it's two times faster in games.
JGF, your entire argument has now wound down to https://www.youtube.com/watch?v=KX5jNnDMfxA
You still cling to somehow getting 2x the performance out of a console compared to an equivalently specced PC, despite it being explained (incredibly patiently, by absolute saints) that any low-overhead "to the metal" optimisation only applies to the CPU cost of 3D rendering, and does nothing tangible for the GPU side of the rendering. (You know, the actual shading of those 3D objects the CPU submits, and silly stuff like deciding what colour the pixels should be or how many pixels you can show.)
<rant goes on>
There aren't any at 30Hz with an 8800GT. Sure, there will be CPU bottlenecks on a C2D when you put in a GTX 570 and want to run at 100Hz.
Remember, it was a benchmark. No one would pair a 570 with a C2D, because then they wouldn't be testing the 570, they'd be testing the C2D bottleneck.
It's common logic.