
DF: Leadbetter interviews 4A's Oles Shishkovstov about current gen consoles + PC

jgf

Member
The point is that you are again making up arguments based on numbers that are not directly comparable, while at the same time dismissing all arguments based on actual comparisons at the same settings. Don't you see what's off about this?

I tried to lay out to the best of my ability how I came to this conclusion for the Rage benchmark. For example, I'm unsure if the G3D number is a good metric to use in this context, or if Wikipedia tells the truth that an X1800 is a card similar to the 360's GPU. If anyone can make a good argument against this, my whole comparison falls apart.

I argued about the higher resolution in the PC benchmark. And I think it's a fair guess that, with an auto-balancer in place, the resolution will need to drop quite frequently to get from 32fps to 60fps. That's somewhat made up, granted. But I think it's a fair guess. If anyone can explain to me why it doesn't work that way, then I have no problem accepting that.
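To make that guess concrete, here is a rough back-of-the-envelope sketch (assuming the frame is purely GPU-bound and that cost scales linearly with pixel count, which is a simplification):

```python
# Rough sketch: how far would an auto-balancer have to cut the pixel count
# to turn a 32 fps result into 60 fps, assuming the frame is purely
# GPU-bound and cost scales linearly with pixels (a simplification).

measured_fps = 32.0   # fixed-resolution benchmark result
target_fps = 60.0     # what the auto-balancer aims for

frame_time_now = 1000.0 / measured_fps     # ~31.3 ms
frame_time_budget = 1000.0 / target_fps    # ~16.7 ms

pixel_scale = frame_time_budget / frame_time_now   # ~0.53 of the pixels
axis_scale = pixel_scale ** 0.5                    # ~0.73 per axis

print(f"Keep ~{pixel_scale:.0%} of the pixels "
      f"(~{axis_scale:.0%} per axis) to hit {target_fps:.0f} fps.")
```

So under those assumptions the balancer would have to throw away roughly half the pixels, which is why I don't think the two results are comparable as they stand.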

Also I don't use that as an argument that the console version obviously must have that famous "roughly 2x performance gain". I just say that I can't use the result to draw the conclusion that PC and 360 perform about the same. If anything, it looks like the 360 is punching above its weight in that comparison.

You say that the numbers are not directly comparable. I never disputed that. In fact, that's what I was saying in the beginning. You can't really measure whether that gain exists without running comparisons on the same hardware at the same settings. In the course of the discussion I agreed that it makes sense to at least use them as an indication, so I went for it.

So in my comparison the settings are not the same and the hardware is not the same - I get that. That's a fair point. But you're telling me that I ignore actual comparisons at the same settings. I really don't know which comparison you are talking about. In a previous post I listed all the comparisons I found in this thread. Maybe there are some very famous other comparisons that I'm not aware of. Just point me to the thread/article/Google keywords to read and I'll have a look at them.

Of the comparisons I found in this thread, most were either about the new current gen hardware or used graphics cards way above the G3D score of the 360 GPU. And all ran on a beefier (CPU + RAM) system. The resolution was significantly higher in most (all?) cases - not just a bit higher as in the Rage example. The performance graphs posted did not even include the 360 data. Maybe I'm really blind, but I just don't see what data you are trying to point me to.
 

KKRT00

Member
Of the comparisons I found in this thread, most were either about the new current gen hardware or used graphics cards way above the G3D score of the 360 GPU. And all ran on a beefier (CPU + RAM) system. The resolution was significantly higher in most (all?) cases - not just a bit higher as in the Rage example. The performance graphs posted did not even include the 360 data. Maybe I'm really blind, but I just don't see what data you are trying to point me to.

If you want Xbox 360 data, just search Google for 'digital foundry [game name] face off'.
 

jgf

Member
If you want Xbox 360 data, just search Google for 'digital foundry [game name] face off'.

So I just googled for a "Doom 3" comparison, as it's another game from Carmack that I know of. I found this: http://www.eurogamer.net/articles/digitalfoundry-doom-3-bfg-edition-face-off

It contains a comparison between 360, PS3 and some PC. I read there that "The £300 Digital Foundry PC runs the game beautifully at either of our test resolutions". So far so good. What I'm missing here are the specifications of said £300 PC. I could not find them.

So I googled "£300 Digital Foundry PC" and found this one (the date suggests it's from June 2012, the comparison is from October 2012, so that might be the right PC): http://www.eurogamer.net/articles/df-hardware-introducing-the-digital-foundry-pc

It contains a Pentium G840, 8GB of RAM and a Radeon HD 6770 with a G3D score of 1673. Am I supposed to compare that to the 360 with its XCPU, 512MB of RAM and a GPU with (supposedly) a G3D score of 140?

Maybe I googled for the wrong comparison. Any suggestions?
 

KKRT00

Member
I don't know anymore if you are acting or just ignorant.

You were asking for 360 performance tests to compare against the low-end PC examples we posted.
Now you want a site that compares 7-8 year old PCs with past gen consoles in new releases? Who in their right mind would do that?
You can't even officially buy such parts anymore.
 

jgf

Member
I don't know anymore if you are acting or just ignorant.

You were asking for 360 performance tests to compare against the low-end PC examples we posted.
Now you want a site that compares 7-8 year old PCs with past gen consoles in new releases? Who in their right mind would do that?
You can't even officially buy such parts anymore.

I'm not trying to be ignorant and I'm not acting. I'm honestly trying to give it my best shot. I have always been asking for the same thing. We were talking about said 2x performance statement. So in my (somewhat bizarro?) world that would mean you take two similarly specced systems, a PC and a console, run tests and then compare the results. If that's not possible, you try to get as close as you can to similar specs. Maybe you use a roughly 2x faster PC and see if the game runs about the same. That should work too. If you use a significantly faster PC for the comparison, you at least have to argue why, and in what way, its result says anything about the 2x performance statement.

Everybody says there is so much data that debunks the 2x performance myth. So where is it? I argued why I think the two comparisons I found don't work. You are free to argue why they still do, or to point me to a comparison that does. So if I'm being ignorant, where in my line of reasoning is the ignorance?
 

Kinthalis

Banned
The data can be found in numerous benchmarks and YouTube videos where similarly specced GPUs, the 8700/8800 cards, can and do run games at similar or better settings than a last gen console.

And that equivalent or near equivalent GPU silicon on PC runs current gen console multi-plats about as well as current gen consoles.

API overhead resides mostly on the CPU, which is where consoles can squeeze a lot more 3D rendering performance out of the hardware (note this is STRICTLY an increase in 3D-rendering-related performance - not overall performance). The reason this isn't apparent vs a run of the mill gaming PC is that most run of the mill gaming PCs sport CPUs that are 2-4 times more powerful than what's in a PS4/Xbone.
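To illustrate what that CPU-side overhead means, here is a toy sketch - the per-call costs below are invented placeholders, not measurements of any real driver or console SDK:

```python
# Toy model of CPU-side draw-call submission cost per frame. The per-call
# costs are invented placeholders purely to illustrate the argument, not
# measurements of any real driver or console SDK.

draw_calls_per_frame = 3000

cost_per_call_pc_us = 10.0       # hypothetical thick-API/driver path
cost_per_call_console_us = 2.0   # hypothetical thin, "to the metal" path

pc_cpu_ms = draw_calls_per_frame * cost_per_call_pc_us / 1000.0
console_cpu_ms = draw_calls_per_frame * cost_per_call_console_us / 1000.0

print(f"CPU submission cost per frame: PC ~{pc_cpu_ms:.0f} ms, console ~{console_cpu_ms:.0f} ms")
# The saving is on the CPU side only; the GPU still has to shade the same
# pixels, which is why a desktop CPU that is 2-4x faster simply hides it.
```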
 

jgf

Member
But I provided such examples already a few pages ago, with Mass Effect 3, Crysis 2 and Battlefield 3.
A C2D + 8800GT is around 2 times faster in gflops than the past gen consoles.
http://www.neogaf.com/forum/showpost.php?p=127284908&postcount=156

This post contains a graph about the performance of Mass Effect 3, running on an i7 at max settings in 1680x1050 and 1920x1080 - that's what the top of the chart says. So no C2D there. The 8800GT shows a G3D score of 756 (http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+8800+GT), which is roughly 5x the score of a Radeon X1800 - the card Wikipedia says is comparable to the 360 GPU. The G3D score may be flawed, use some weird non-linear scale, or not be representative at all. But if that is the case, you have to tell me why. The 9600GT has a G3D score of 751 but performs worse relative to the 8800GT than the score might suggest. So maybe it is not the right metric to use at all?

As it stands - at least for the Mass Effect graph - I don't see that the PC used is only 2x faster. It looks a lot faster to me.

The next graph, about Crysis 2, does not contain any PC specs at all - only graphics cards. Again the slowest card is the 8800GT. I'll assume they use the same PC as in the Mass Effect 3 graph.

Since Kinthalis also tells me that an 8800GT is similar to the 360 GPU, it would be nice of you to explain why its G3D score is so vastly different, why this score is bogus, or what metric to use instead. If it is in fact comparable to the 360 GPU, then this benchmark obviously has some weight. \\\edit\\\ Sorry, my bad. You told me that the PC is 2x as performant as the 360, not similar. So in your opinion the 8800GT should be about 2x as fast as the 360 GPU. While being less specific, Kinthalis tells me that the 8700/8800 series are roughly comparable. Please keep in mind that the difference between similar performance and 2x performance is quite critical when arguing about a 2x performance gain. Nevertheless, the G3D score suggests quite a bit more than a 2x performance lead over a Radeon X1800. \\\edit\\\ Also, I would be more convinced by a PC that does not contain an i7.


Yes, I quoted it, and I even referred to it as performance graphs of vastly superior PCs in another post. Now you're telling me that the PC used for these graphs is a C2D, while the very title of the graph says it's an i7 - at least in the Mass Effect 3 graph.


The data can be found in numerous benchmarks and YouTube videos where similarly specced GPUs, the 8700/8800 cards, can and do run games at similar or better settings than a last gen console.

And that equivalent or near equivalent GPU silicon on PC runs current gen console multi-plats about as well as current gen consoles.

API overhead resides mostly on the CPU, which is where consoles can squeeze a lot more 3D rendering performance out of the hardware (note this is STRICTLY an increase in 3D-rendering-related performance - not overall performance). The reason this isn't apparent vs a run of the mill gaming PC is that most run of the mill gaming PCs sport CPUs that are 2-4 times more powerful than what's in a PS4/Xbone.

As you are the second person telling me that the 8800GT is equivalent to the 360 GPU, I'm inclined to believe you. But how do you explain the discrepancy between the G3D scores of the Radeon X1800 and the 8800GT? Is the Wikipedia article flawed (it wouldn't be the first), or is the score plain nonsense? Why is an 8800GT considered similar to the 360 GPU?

The second part of your post shifts the argument to a pure GPU view. I think that may be a critical issue. To me, a similarly specced system is not the same as a system with a similar GPU. I get that in modern PCs gaming performance is GPU bound (at least in current games), as most modern CPUs are "fast enough", so GPU performance equals in-game performance. That may be a valid position - provided the 360 CPU is also already fast enough to be a non-issue in terms of game performance. It may be, but I don't know, and so far nobody has said so or pointed me to any source that supports this claim.

In conclusion: if the 8800GT's performance really is comparable to the 360 GPU, and using an old PowerPC XCPU instead of an i5 or i7 does not change the in-game performance in any significant way (not to mention the amount of RAM), then I can get behind the result of these comparisons.
 

KKRT00

Member
http://www.tomshardware.com/reviews/geforce-8800-gts-512-mb,1743-2.html
http://en.wikipedia.org/wiki/Xenos_(graphics_chip)

8800GT - 336 gflops
Xenos - 240 gflops
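For reference, those two figures line up with the usual back-of-the-envelope calculation, assuming the commonly quoted ALU counts and clocks (my assumption, not taken from the linked pages):

```python
# Back-of-the-envelope peak GFLOPS from commonly quoted specs (an assumption
# on my part, not figures pulled from the linked articles).

def peak_gflops(alus, clock_ghz, flops_per_alu_per_clock):
    return alus * clock_ghz * flops_per_alu_per_clock

# GeForce 8800GT: 112 scalar ALUs @ 1.5 GHz shader clock, 1 MADD = 2 flops
print("8800GT:", peak_gflops(112, 1.5, 2), "GFLOPS")   # 336.0

# Xenos: 48 ALUs @ 0.5 GHz, each vec4 + scalar MADD = 10 flops per clock
print("Xenos: ", peak_gflops(48, 0.5, 10), "GFLOPS")   # 240.0
```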

The i7 is there to eliminate CPU bottlenecks in the benchmark, but with an 8800GT both Crysis 2 and Mass Effect 3 would be GPU limited before they were CPU limited on a C2D.
BF3 would need a quad-core or a very high-frequency C2D to get to 60fps, because it's very CPU heavy in multiplayer, but if you increased the resolution and graphical settings to stick to 30Hz, the CPU would not bottleneck the 8800GT.

And I'm done explaining the basic stuff to you. If you still don't understand the comparison, you need to learn how all of this works.
 

jgf

Member

In other words, the Wikipedia article that suggests the X1800 XT, with around 83 gflops, as a similar card is flawed. Going by gflops, Xenos is indeed comparable to an 8800GT - the 8800GT is only around 1.4x as fast.

The i7 is there to eliminate CPU bottlenecks in the benchmark, but with an 8800GT both Crysis 2 and Mass Effect 3 would be GPU limited before they were CPU limited on a C2D.
BF3 would need a quad-core or a very high-frequency C2D to get to 60fps, because it's very CPU heavy in multiplayer, but if you increased the resolution and graphical settings to stick to 30Hz, the CPU would not bottleneck the 8800GT.

And I'm done explaining the basic stuff to you. If you still don't understand the comparison, you need to learn how all of this works.
So you're telling me they are using an i7 to eliminate CPU bottlenecks, but there aren't any? Yeah, I really should learn how THAT sort of thing works. I should use that reasoning in my own evaluations. It would save me from a lot of headaches.

Aside from that, if the 360 CPU is not relevant (i.e. not a bottleneck) to the game's performance, and the 360 version performs equal or worse, then this benchmark really suggests that the 2x performance gain does not exist in practice.
 
http://www.tomshardware.com/reviews/geforce-8800-gts-512-mb,1743-2.html
http://en.wikipedia.org/wiki/Xenos_(graphics_chip)

8800GT - 336 gflops
Xenos - 240 gflops

The i7 is there to eliminate CPU bottlenecks in the benchmark, but with an 8800GT both Crysis 2 and Mass Effect 3 would be GPU limited before they were CPU limited on a C2D.
BF3 would need a quad-core or a very high-frequency C2D to get to 60fps, because it's very CPU heavy in multiplayer, but if you increased the resolution and graphical settings to stick to 30Hz, the CPU would not bottleneck the 8800GT.

And I'm done explaining the basic stuff to you. If you still don't understand the comparison, you need to learn how all of this works.

You know damn well that the gflop metric in no way represents the actual performance difference of the two parts. When it comes to actual performance, Nvidia's 8800 line is roughly 2.5 to 3x faster than the GPU in the Xbox 360, depending on whether it's the GT, GTS or GTX.
 
People still don't know that comparing PC and console parts is a completely meaningless exercise? Developers can push performance on consoles a whole lot further than with comparable PC parts. Isn't this common knowledge by now?
 

ethomaz

Banned
People still don't know that comparing PC and console parts is a completely meaningless exercise? Developers can push performance on consoles a whole lot further than with comparable PC parts. Isn't this common knowledge by now?
Yes.

But some PC users want to fight back.

Console hardware will run a game better than a similar PC, and a PC will destroy it with better hardware.
 
JGF, your entire argument has now wound down to https://www.youtube.com/watch?v=KX5jNnDMfxA

You still cling to somehow getting 2x performance out of a console compared to an equivalently specced PC, despite it being explained (incredibly patiently, by absolute saints) that any low-overhead "to the metal" optimisation only applies to the CPU cost of 3D rendering, and does nothing tangible for the GPU side of the rendering (you know, the actual shading of those 3D objects, and silly stuff like deciding what colour the pixels should be or how many pixels you can show).
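To put toy numbers on that (all invented, purely to show the shape of the argument): the CPU and GPU work in parallel, so the frame rate is set by whichever side is slower, and halving the CPU's submission cost does nothing once the GPU is the longer pole.

```python
# Toy frame-time model (all numbers invented for illustration): CPU submission
# and GPU rendering overlap, so frame time is set by whichever takes longer.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 30.0  # GPU-bound frame on a console-class GPU

print(fps(cpu_ms=20.0, gpu_ms=gpu_ms))  # 33.3 fps with "thick API" CPU cost
print(fps(cpu_ms=10.0, gpu_ms=gpu_ms))  # still 33.3 fps after halving the CPU cost
print(fps(cpu_ms=20.0, gpu_ms=15.0))    # 50.0 fps: with a faster GPU, the CPU finally becomes the limit
```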

After being told that, you say we can't know because there is no way to measure it and compare. So NeoGAF, believe.
When posters give you multiple real-world comparisons that are as close as you'll get, and point to a whole generation of games without a single title showing any meaningful (let alone 2x) performance difference between console and PC hardware, you won't dignify it with an answer and instead try to squirm out of it.

Your argument then turns to how more efficient use of the CPU could be used to free up GPU resources on PS4, similar to how the Cell did it on PS3.
Yes man... that anemic rollerskate of a Jaguar CPU, which more than negates any CPU efficiency benefit compared to 4-8x more powerful desktop CPUs, is going to take over work from the GPU (the HD 7850 being the disproportionately more competent part of the console)...
...while the console manufacturers are talking about doing the exact opposite (using GPU compute to take some load off those poor overworked Jaguar cores).

If ifs and buts were candy and nuts, we'd all have a merry Christmas.
But reality is all that matters: last gen consoles did not outperform equivalent PC GPUs, GPUs that were on paper 4x stronger did in fact render games 4x faster (at higher settings too), and something like a GTX 680, which on paper was 10x faster, would render DMC at 300 fps at higher res and settings vs the 30 fps console version.
Current gen games so far have shown the same results when compared to a 260X or 270.

Now you can hold on to your hope and frame that contextless Carmack quote above your bed, while ignoring everything Durante has told you and all the evidence the games provide, and hope that maybe some day down the line a PS4 can outperform an HD 7850 (long after it or its equivalent successor stops being stocked by retailers).
You're just doing yourself a disservice, as well as everyone you try to convince.
Accept these consoles for what they are and judge their value based on what they are. It's a lot easier, and there is a lot less buyer's remorse and disappointment on that path.
 

KKRT00

Member
So you're telling me they are using an i7 to eliminate CPU bottlenecks, but there aren't any? Yeah, I really should learn how THAT sort of thing works. I should use that reasoning in my own evaluations. It would save me from a lot of headaches.

There aren't any at 30Hz with an 8800GT. Sure, there will be CPU bottlenecks on a C2D when you put in a GTX 570 and want to run at 100Hz.
Remember, it was a benchmark. No one would pair a 570 with a C2D, because they would end up testing the C2D bottleneck instead of the 570.
It's common logic.

---
You know damn well that the gflop metric in no way represents the actual performance difference of the two parts. When it comes to actual performance, Nvidia's 8800 line is roughly 2.5 to 3x faster than the GPU in the Xbox 360, depending on whether it's the GT, GTS or GTX.
No, it shows exactly that you can't get 100% utilization out of console GPUs, and that PC GPU utilization is not down in the 50% range.
It shows that architecture matters a lot, and that bottlenecks can force developers to design their tech more carefully.
It also shows that late gen ports were both CPU and GPU limited quite hard, so even if you had cycles to spare on the GPU, some CPU task could still diminish your performance. On PC you are generally limited from one side only - GPU or CPU.
And the 8800GT is two times faster in games.
 
No, it shows exactly that you can't get 100% utilization out of console GPUs, and that PC GPU utilization is not down in the 50% range.
It shows that architecture matters a lot, and that bottlenecks can force developers to design their tech more carefully.
It also shows that late gen ports were both CPU and GPU limited quite hard, so even if you had cycles to spare on the GPU, some CPU task could still diminish your performance. On PC you are generally limited from one side only - GPU or CPU.
And the 8800GT is two times faster in games.

So you think the 8800GT is only 40% faster than Xenos, as those gflop metrics would imply?
 

jgf

Member
JGF, your entire argument has now wound down to https://www.youtube.com/watch?v=KX5jNnDMfxA

You still cling to somehow getting 2x performance out of a console compared to an equivalently specced PC, despite it being explained (incredibly patiently, by absolute saints) that any low-overhead "to the metal" optimisation only applies to the CPU cost of 3D rendering, and does nothing tangible for the GPU side of the rendering (you know, the actual shading of those 3D objects, and silly stuff like deciding what colour the pixels should be or how many pixels you can show).

<rant goes on>

You got me wrong on one point. I never clung to the 2x performance gain. I merely said that I can't rule it out. That's a huge difference. In that sense I went from 25:75 in favour of 2x, to 50:50, and have now arrived at 75:25 likely a myth. Please don't confuse me with someone who jumps to conclusions based on personal preference. I try to keep an open mind, and I don't like it when judgement is passed without definitive evidence. You may discuss with console trolls very frequently; that may be the cause of this extreme reaction, IMHO. I always tried to present my argument together with my line of reasoning, and I never ruled out the possibility that any part of the argument may be flawed. I think everybody should do this.

See, I have no buyer's remorse. I've got a pretty potent PC (i7, GTX 680) myself. That was never my point.

There aren't any at 30Hz with an 8800GT. Sure, there will be CPU bottlenecks on a C2D when you put in a GTX 570 and want to run at 100Hz.
Remember, it was a benchmark. No one would pair a 570 with a C2D, because they would end up testing the C2D bottleneck instead of the 570.
It's common logic.

Yeah, I get why no PC user would be interested in that type of benchmark. So, to put this unholy discussion to an end: as I said before, so far I feel it is very likely that the 2x performance gain is not present in the last gen. I would feel better if at least one of the big engine devs went on record that it's not true, and/or if a benchmark even closer to the actual console specs existed - one that also includes CPU and memory. I really wonder why neither has happened so far (to my knowledge).
 