
DF: Leadbetter interviews 4A's Oles Shishkovstov about current gen consoles + PC

coastel

Member
I thought that on PC you need a CPU about two times better than the PS4's to run the equivalent graphics in a game, most of the time. But I find the whole comparison pointless. If PC is like-for-like with console, why can't a PC rig comparable to the PS4 run the same things?

Yeah, comparing an i7 @ 4.3GHz to the PS4 CPU is a joke. Crap article, very fanboyish.
 

teokrazia

Member
is entirely different from a whole team like 4A working on porting an already-x86 game to other x86 platforms, dialing down a few features and putting in better environmental assets for certain areas.

Oh my, how ungenerous this sounds. In 2033 they overhauled TONS of stuff, both from a technical and an artistic standpoint, and basically recast the game in a different UI and gameplay environment.

Oh, also Spartan mode.
 

coastel

Member
Isn't the point of the article to look at the GPU and not the CPU? Hence the CPU's clock and type?

Yeah, comparing a £100 graphics card to an entire console with a different CPU and then expecting it to be a 1:1 comparison... tell me you don't see something wrong with that?
 
Yeah, comparing a £100 graphics card to an entire console with a different CPU and then expecting it to be a 1:1 comparison... tell me you don't see something wrong with that?

I'd see something wrong with that if it were implying that the hardware is perfectly analogous, but I see nothing wrong with comparing the GPU's unbound performance against the console's to prove a point about the difference in GPU optimization.
 

Durante

Member
Yeah, comparing a £100 graphics card to an entire console with a different CPU and then expecting it to be a 1:1 comparison... tell me you don't see something wrong with that?
There's nothing wrong with it if the purpose is to compare GPU efficiency in both settings. What it tells us is that PC APIs don't introduce overhead in pure GPU calculations. I know from experience that this is not obvious to everyone, and is in fact often highly contested, so it has value as a finding.

It would only be wrong to derive any ideas about CPU overhead from this comparison.
 
In this analogy, the core benefit of the console is hassle-free knowledge that you will be served for the extent of the generation with a relatively low initial investment (i.e. whatever about efficiency across multiplats, you do tend to get good bang for buck, and relatively brilliant bang for buck in the higher investment exclusives). It's low risk, with a known support window (i.e. the entirety of the generation), and there's no comparison envy within the scope of that platform. My PS4 bought now will play games just as well in 5 years time as a PS4 bought then. It's just all low low risk and stability, relatively cheap, and that's all very attractive to a lot of consumers. Consoles are popular for good reason.

Yup. The constant feeling that I could run my PC games just a little better, and that I was missing out on the 'optimal' experience by not upgrading, is one of the reasons I prefer gaming on console. As long as the game runs solidly enough and looks good enough, I'm happy to buy the PS4 version of a multiplatform game. I understand not all of us think the same way, though.
 
Isn't the point of the article to look at the GPU and not the CPU? Hence the CPU's clock and type?

Even most PC enthusiasts (sadly) don't understand how a benchmark works or how to read and interpret the graphs properly, so it's probably too much to ask of someone on the console side of things. :\

e.g. in an AMD vs Intel thread you'd have some idiot pointing to that CPU chart from a GPU-bottlenecked scenario (this one: http://www.techspot.com/articles-info/734/bench/CPU_01.png) as some sort of evidence that an AMD FX is as good as an i7 for gaming. :p
You used it right, though: it shows that in this case (BF4 at console settings) the game is GPU-bottlenecked with plenty of CPU performance headroom, so the 260X benchmark is representative of its performance compared to the PS4's GPU plus 'coding to the metal'.

But again, most people don't understand why you would want to benchmark a gpu by ruling out any cpu bottlenecks, and why you'd test a cpu by doing the opposite.
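
To make the idea concrete, here's a toy sketch (the frame times below are invented for illustration, not real measurements): a benchmark only tells you about the component that is actually the limiter, so you push the load onto the part you want to measure.

# Toy model of why you isolate one component when benchmarking.
# All frame times are made up for illustration, not real measurements.

def fps(cpu_ms, gpu_ms):
    # Frame rate is limited by whichever side takes longer per frame.
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU test: crank resolution/settings so the GPU is always the limit.
# Swapping in a slower CPU barely moves the result, so the chart measures the GPU.
print(fps(cpu_ms=8.0, gpu_ms=16.0))    # fast CPU   -> 62.5 fps
print(fps(cpu_ms=14.0, gpu_ms=16.0))   # slower CPU -> 62.5 fps

# CPU test: drop to 720p/low so the GPU is never the limit.
# Now the same two CPUs separate clearly, so the chart measures the CPU.
print(fps(cpu_ms=8.0, gpu_ms=4.0))     # 125 fps
print(fps(cpu_ms=14.0, gpu_ms=4.0))    # ~71 fps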
 
Even most PC enthusiasts (sadly) don't understand how a benchmark works or how to read and interpret the graphs properly, so it's probably too much to ask of someone on the console side of things. :\

e.g. in an AMD vs Intel thread you'd have some idiot pointing to that CPU chart from a GPU-bottlenecked scenario as some sort of evidence that an AMD FX is as good as an i7 for gaming. :p

Yep. :/
The CPU in the PS4 is bad, yet paired with the GPU it produces results comparable to a 260X and an i7.

Hence why Battlefield 4 on PS4 is probably GPU-bound, which is what its 900p resolution tells us. This further points towards the test's legitimacy.
 

StevieP

Banned
Yup. The constant feeling that I could run my PC games just a little better, and that I was missing out on the 'optimal' experience by not upgrading, is one of the reasons I prefer gaming on console. As long as the game runs solidly enough and looks good enough, I'm happy to buy the PS4 version of a multiplatform game. I understand not all of us think the same way, though.

I don't get that. You're generally getting even less with the ps4 than you do on an enthusiast rig, though.
 

AmyS

Member
Digital Foundry: Xbox 360 and PS3 were highly ambitious designs for the 2006/7 era.

That's a bit off.

Xbox 360 hw (Xenon) was designed between 2002 and 2004, launching in 2005.
PS3 was designed, starting with CELL, between 2001 and 2005, launching in 2006.
 

coastel

Member
Yep. :/


Hence why Battlefield 4 on PS4 is probably GPU-bound, which is what its 900p resolution tells us. This further points towards the test's legitimacy.

So am I correct to say that a 260X with an FX 4100 will not get lower average FPS than a 260X with an i7? Really curious, as I'm about to get a PC and I can save cash if this is true, offline and online.
 
So am I correct to say that a 260X with an FX 4100 will not get lower average FPS than a 260X with an i7? Really curious, as I'm about to get a PC and I can save cash if this is true, offline and online.
Well, I have not run the test, so it is hard to say. It would depend on the OS and game. But why the FX 4100 for a console CPU comparison? Why not something with a similar core count?

I would definitely never recommend the 4100 for any DX9 or single-threaded DX11 title (it would of course run them, but not in a way I would find great), especially if you are aiming for a constant 60fps in multiplayer gaming at "PC gamer" settings.

In general you are currently better off with one of the cheaper, less-threaded Intel chips. With DX12, newer OpenGL games, or Mantle games you might see this change a bit.
 
So am I correct to say that a 260X with an FX 4100 will not get lower average FPS than a 260X with an i7? Really curious, as I'm about to get a PC and I can save cash if this is true, offline and online.

In SP, in this game, with this GPU, at these settings: yes.

In Saints Row IV, AC games, or most MMOs you'd be stuck at 30fps (just like the consoles with their equally low-end CPUs).
No one buys a CPU for one game; don't buy an FX 4100 for gaming. ;p

Go to this thread for information: http://www.neogaf.com/forum/showthread.php?t=835397
If you are on a super strict budget, the Anniversary Edition Pentium is much better than the AMD FX; if you care about 60fps, you want nothing less than a quad-core i5.
(Similarly, don't buy a PS4 for multiplatform games if you care about 60fps.)
 
I don't get that. You're generally getting even less with the ps4 than you do on an enthusiast rig, though.

With the PS4 there is no option of upgrading, so there's less insecurity on my part about missing out on the 'best' experience by not spending more money to upgrade my PC. I know, it's my fault for being like that.
 

belmonkey

Member
Yeah, comparing an i7 @ 4.3GHz to the PS4 CPU is a joke. Crap article, very fanboyish.

The CPU in the PS4 is bad, yet paired with the GPU it produces results comparable to a 260X and an i7.

As people (myself included) have mentioned, singleplayer is GPU-bound, meaning the CPU isn't terribly important past a certain point (anything above a quad-core FX 4100, or probably even the $80 quad-core Athlon 750K, doesn't offer much benefit). They could have run their benchmark on a ~$400 PC with a 260X and an Athlon 750K and the result probably wouldn't have been much different. That i7 would be a lot bigger help in multiplayer, though.
 
So am I correct to say that a 260X with an FX 4100 will not get lower average FPS than a 260X with an i7? Really curious, as I'm about to get a PC and I can save cash if this is true, offline and online.

I think I managed to parse this. The answer is that it depends on the game, but the quote you're responding to isn't suggesting that a CPU cannot be the primary bottleneck. He's just suggesting that the choice not to go with a 1080p render resolution indicates that they were making compromises to deal with a GPU bottleneck in this one situation.
 

coastel

Member
As people (myself included) have mentioned, singleplayer is GPU-bound, meaning the CPU isn't terribly important past a certain point (anything above a quad-core FX 4100, or probably even the $80 quad-core Athlon 750K, doesn't offer much benefit). They could have run their benchmark on a ~$400 PC with a 260X and an Athlon 750K and the result probably wouldn't have been much different. That i7 would be a lot bigger help in multiplayer, though.

Yeah, there will surely be an FPS drop in SP and a big one in MP, which I don't think they compared and which wouldn't matter anyway, as an i7 is great for online games too. I'm trying to say it's wrong to compare, and a fair comparison is basically impossible. I couldn't care less whether a console is or isn't 2x the power of the same sort of PC build (I've stated before I don't believe it); I just find the comparisons pointless. The article's headline was just fanboy bait attached to what is a good card to test.
 

jgf

Member
To try to take a different tack on this: you did ask about his experience with this stuff, as though that answer was relevant to the debate (which it's not). Durante's too modest to respond so I think I'll step in for him a bit.

The person you're talking to has his PhD in Computer Science and is a researcher working in, IIRC, compiler optimization for high-end parallel computing on heterogeneous processors. He's most famous on the internet for writing code that fixed the PC port of Dark Souls and, more recently, for a tool that allows users to use supersampling with any game they want. I remember from earlier posts that he also has quite a decent amount of experience writing drivers.

So, knowing absolutely nothing about this conversation, I would say that Durante has the kind of experience profile associated with being able to make general claims on this subject, even though he doesn't work in gaming.
I know that he did the popular mod for Dark Souls, and I never questioned his competence. Still, I think the person whose quote we're discussing is at the very least equally competent in this area.
 

coastel

Member
I think I managed to parse this. The answer is that it depends on the game, but the quote you're responding to isn't suggesting that a CPU cannot be the primary bottleneck. He's just suggesting that the choice not to go with a 1080p render resolution indicates that they were making compromises to deal with a GPU bottleneck in this one situation.

I see what you mean now, but why compare a 260X with an OC'd i7 and 16GB of RAM to a £350 console and say they are similar specs, just to prove the point he was trying to make in the first place, that console hardware doesn't have benefits over the same sort of PC hardware? Which, again, is never possible to test 1:1.
 
but in an age where not even Naughty Dog can run its last-gen title consistently at 1080p60 on PS4

That is just a shitty dig. Converting an end-of-cycle console game to the next gen is not as easy as they think it is. That game was optimized to death on the PS3, using every trick suited to its architecture. Porting everything to a PS4 with a totally different architecture is not easy. I saw a video where they talked about porting the Jak games from PS2 to PS3 and how hard that was for the same reasons.
 

JaggedSac

Member
Getting a game engine that ran on a low-level PPC/Cell architecture to run indistinguishably the same on an x86 box is less challenging than porting and optimizing a very nice x86 engine for fixed x86 hardware?

I'm going to say no.

Moving all the rendering off the Cell onto the GPU alone was probably a shitton of work.

And that's not a dig at 4A's wizardry. It's more a dig at the PS3's reliance on exotic architecture and pie-in-the-sky thinking about actual software development.

4A used the console versions as the baseline, not the x86 one, as can be seen in their discussion of CPU differences.
 

Durante

Member
I know that he did the popular mod for Dark Souls, and I never questioned his competence. Still, I think the person whose quote we're discussing is at the very least equally competent in this area.
You are ignoring the part where I explained why argument by authority is a logical fallacy, why we should be discussing facts and not people, and why taking a far too generalized statement as the basis for anything is at best misleading.

I'll put it here again before we go in circles:
The reason I argue so vehemently against your position is that I simply don't believe that arguing on the basis of the relative prestige of whoever made a statement has much merit, especially not if there is objective data available instead. Sure, such data may not be perfect, but dismissing it out of hand seems like throwing away your best chance to make a reasoned assessment.

Basically, the problem is this. One developer can say that they get a factor of 10 performance improvement on console compared to an equivalently specced PC. Another can say they see less than 2% improvement.

Both can very well be right. At the same time. Maybe the former is looking at a draw-call limited scenario and comparing DX9 code on PC to very low level code on console. And the other is simply measuring the time it takes for his main deferred shading pixel shader to run.

And that's the true issue with the "2x" quote and its ilk: it's simply not specific enough to be of any value. It doesn't reduce down to a "truth" or a "lie" as you would seem to believe. Because of this ambiguity, these quotes are dragged out in every argument, regardless of their applicability to the scenario, component or bottleneck being discussed. What you get then is a simple appeal to authority, and discussions that go in circles and never get to the actual issues.
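
As a toy illustration of that ambiguity (every number below is invented, not a measurement from any real game or platform): the same optimization reports wildly different "speedups" depending on which slice of the frame you time.

def frame_time(draw_call_ms, shader_ms, other_ms):
    # In this toy model the frame is just the sum of its parts.
    return draw_call_ms + shader_ms + other_ms

# Hypothetical PC frame: 10 ms of draw-call/API overhead,
# 15 ms of pixel-shader work, 5 ms of everything else.
pc = frame_time(10.0, 15.0, 5.0)        # 30 ms -> ~33 fps

# Low-level console path that cuts draw-call overhead by 10x
# but leaves the GPU shader work untouched.
console = frame_time(1.0, 15.0, 5.0)    # 21 ms -> ~48 fps

print(f"whole frame: {pc / console:.2f}x faster")          # ~1.43x
print(f"draw-call submission: {10.0 / 1.0:.0f}x faster")   # 10x
print(f"pixel shader: {15.0 / 15.0:.0f}x faster")          # 1x, i.e. no gain

# Depending on which of these three numbers a developer quotes,
# "the console advantage" is 10x, 1.4x, or nothing at all.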
 

ethomaz

Banned
Is ESRAM expensive? Is there a reason Microsoft couldn't have gone with 64 or 128MB?
eSRAM is a big part of the chip die... close to 30%... so going to 64MB or 128MB would make that block 2-4x bigger... I've never seen a CPU/GPU chip that big... it could cost over $1000 per unit, if they even have the tech to produce a chip like that.

Yeap... expensive as hell.
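
A back-of-the-envelope check of that scaling, taking the ~30% figure above as the assumption and treating the eSRAM block as growing linearly with capacity (rough numbers, not official specs):

esram_fraction = 0.30                    # assumed share of the die used by 32MB of eSRAM
logic_fraction = 1.0 - esram_fraction    # everything else on the die

for esram_mb in (32, 64, 128):
    scale = esram_mb / 32                # eSRAM block grows roughly linearly with capacity
    relative_die = logic_fraction + esram_fraction * scale
    print(f"{esram_mb:>3} MB eSRAM -> ~{relative_die:.2f}x the original die area")

# 32 MB -> 1.00x, 64 MB -> ~1.30x, 128 MB -> ~1.90x the die area.
# Cost rises faster than area, since yields drop as dies get bigger.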
 

belmonkey

Member
I see what you mean now, but why compare a 260X with an OC'd i7 and 16GB of RAM to a £350 console and say they are similar specs, just to prove the point he was trying to make in the first place, that console hardware doesn't have benefits over the same sort of PC hardware? Which, again, is never possible to test 1:1.

Simply to show how strong the GPU is in the absence of bottlenecks.

Apparently, in a similar situation with bottlenecks removed (this time with old hardware), an 8800 GTS from 2007 (similar to the 8800 GTX from 2006) can play BF4 SP at 900p medium settings at over 30 FPS.

https://www.youtube.com/watch?v=NtXRKUj_fGc
 

coastel

Member
Simply to show how strong the GPU is in the absence of bottlenecks.

Apparently, in a similar situation with bottlenecks removed (this time with old hardware), an 8800 GTS from 2007 (similar to the 8800 GTX from 2006) can play BF4 SP at 900p medium settings at over 30 FPS.

https://www.youtube.com/watch?v=NtXRKUj_fGc

Yeah, again, it's not a 1:1 comparison and never will be. I'm with the "2x more powerful is BS" group. It's just wrong to compare an apple and an orange and call them both apples.
 

jgf

Member
So, if you can't do it by looking at the same game on multiple platforms, how would you go about measuring this claim?
As I previously said, I would take an unoptimized multiplatform game (in the sense of not being optimized for a specific platform), rewrite it from scratch for the target platform to the best of my ability, and then measure the difference. I fully agree with you that this is not practical at all.

The reason I argue so vehemently against your position is that I simply don't believe that arguing on the basis of the relative prestige of whoever made a statement has much merit, especially not if there is objective data available instead. Sure, such data may not be perfect, but dismissing it out of hand seems like throwing away your best chance to make a reasoned assessment.
I'm not so much arguing about the prestige of said developers. I don't believe them because I think they are magical wizards who cannot fail; rather, I assume that they actually tried to optimize a game for a fixed platform and have some internal benchmark results that show said performance gain.

Also I'm not completely dismissing the performance data of multiplatform games. I just don't think that it translates 1:1.

Basically, the problem is this. One developer can say that they get a factor of 10 performance improvement on console compared to an equivalently specced PC. Another can say they see less than 2% improvement.

Both can very well be right. At the same time. Maybe the former is looking at a draw-call limited scenario and comparing DX9 code on PC to very low level code on console. And the other is simply measuring the time it takes for his main deferred shading pixel shader to run.

And that's the true issue with the "2x" quote and its ilk: it's simply not specific enough to be of any value. It doesn't reduce down to a "truth" or a "lie" as you would seem to believe. Because of this ambiguity, these quotes are dragged out in every argument, regardless of their applicability to the scenario, component or bottleneck being discussed. What you get then is a simple appeal to authority, and discussions that go in circles and never get to the actual issues.

I hope this clarifies things.
I completely agree with you that it's a very broad claim. I also see it as a basic estimation and not some number set in stone. At some points they see no gain, at others maybe 10x. After all is said and done, it basically settles (in the developer's experience) around a 2x performance gain compared to a straightforward port of a game that uses no platform-specific optimization. I can imagine that knowing the exact timings of each operation, the cache size, the CPU count, the amount of RAM available and so forth can help you immensely already during the design phase of the game, where you think about which code runs where, which algorithms to use, etc. I know that things like automatic parallelization based on available resources are very hard to pull off generically, the way a generic library would need to. So I find his claim believable.
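
A rough sketch of what "designing for known hardware" can look like in practice (the console numbers here are made-up placeholders, not real specs): on fixed hardware you can bake the core count and cache budget into how you carve up work, instead of querying a generic machine at runtime and guessing.

import os

# Illustrative assumptions only; not the specs of any real console.
CONSOLE_WORKER_THREADS = 6                # cores assumed to be reserved for the game
CONSOLE_CACHE_BUDGET = 2 * 1024 * 1024    # assumed per-batch cache budget, in bytes

def partition_for_console(items, item_size):
    # Batches are sized to the known cache budget, then dealt out
    # round-robin to the known, fixed number of worker threads.
    per_batch = max(1, CONSOLE_CACHE_BUDGET // item_size)
    batches = [items[i:i + per_batch] for i in range(0, len(items), per_batch)]
    return [batches[i::CONSOLE_WORKER_THREADS] for i in range(CONSOLE_WORKER_THREADS)]

def partition_generic(items):
    # Generic PC path: core count only discovered at runtime,
    # cache size unknown, so batching has to be conservative.
    workers = os.cpu_count() or 4
    return [items[i::workers] for i in range(workers)]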
 

ethomaz

Banned
But the fact is we haven't begun to fully utilise all the computing power we have. For example we have not utilised parallel compute contexts due to the lack of time and the 'alpha' state of support on those consoles at that time. That means that there is a lot of untapped performance that should translate into better visuals and gameplay as we get more familiar with the hardware.

This part makes me feel like devs are not fully using the PS4's GPU power yet...
 

martino

Member
We need the full context of the 2x power quotes... I'm sure it would add something like "up to 2x in best-case scenarios"... "down to none in the worst"... depending on the game, an average somewhere between the two. ^^

ND games on PS3 weren't small-scale level designs with few NPCs for nothing... even more so when you can use the SPEs for graphics purposes.

Edit: what I'm sure of is that you can't have the PS3 GPU perform better than a PC equivalent unless you also use the additional compute power from the Cell.
 
We need the full context of the 2x power quotes... I'm sure it would add something like "up to 2x in best-case scenarios"... "down to none in the worst"... depending on a game's needs, an average somewhere between the two more often than not. ^^

I think the part everyone but the OP is skipping is pretty telling:

And let's not forget that programming close to the metal will usually mean that we can get 2x performance gain over the equivalent PC spec. Practically achieving that performance takes some time, though!

It's 2x at best. Practically you might not be able to achieve this given that budgets are not infinite.
 
I'm not always convinced that people are comparing apples with apples (though that says more about my ignorance than anything else). The next several years should settle this dispute unequivocally. Everyone seems to agree that paired examples are important. Hardware similarities will surely allow for more accurate comparison in future (certainly PS4).
 

Spongebob

Banned
You only need to look at games like The Order: 1886 and UC4 (running at a cool 1080p/60fps) to know that coding to the metal is legit.
 

Durante

Member
I completely agree with you that it's a very broad claim. I also see it as a basic estimation and not some number set in stone. At some points they see no gain, at others maybe 10x. After all is said and done, it basically settles (in the developer's experience) around a 2x performance gain compared to a straightforward port of a game that uses no platform-specific optimization.
And see, that is once again the kind of generalization and dumbing down which helps no one, and is in most cases outright wrong.
 
More like the OP trying to start a faux controversy.

No, Leadbetter just brought them into it with absolutely no need to do so. As Percy also said, of course Leadbetter just had to bring them into it to take a shot at them unnecessarily.
And that's despite ND's game using all of the end-of-gen tricks on the PS3, which are almost the opposite of PC/PS4, given how much they've talked about using the Cell.

Both did really well, I think.

That is just a shitty dig. Converting an end-of-cycle console game to the next gen is not as easy as they think it is. That game was optimized to death on the PS3, using every trick suited to its architecture. Porting everything to a PS4 with a totally different architecture is not easy. I saw a video where they talked about porting the Jak games from PS2 to PS3 and how hard that was for the same reasons.

This.

I mean, how do you easily port a game that even used tricks like leaning on the PS1 internals inside the PS2 for extra performance? Just hearing about it in the behind-the-scenes video, it sounds challenging. They no doubt did crazy tricks with the unique setup of the PS3, which is very different from PS4/PC.
 

RoboPlato

I'd be in the dick
They had less than six months with console dev kits? Damn impressive port job, then. Really enjoying it on PS4.
 
That's all it takes eh? Who would have thought?!

Of course coding to the metal isn't all it takes, but these guys always say it helps a ton whenever the topic comes up. Why is it hard to believe that not having to write and test your game to work well on tons of different PC configurations, and being able to develop and test on the exact hardware, would make a huge difference?
 
Even if you can get 2x performance out of the PS4's GPU, that's still less powerful than a Radeon 7970, a top-end (single-GPU) graphics card from January 2012.
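
For anyone who wants to sanity-check that against the commonly cited theoretical numbers (peak shader throughput only, which ignores bandwidth and everything else, so treat it as a very rough comparison):

# Commonly cited peak single-precision figures; theoretical only.
ps4_gpu_tflops = 1.84      # 18 CUs x 64 lanes x 2 ops x 800 MHz
radeon_7970_tflops = 3.79  # 2048 shaders x 2 ops x 925 MHz

print(f"PS4 GPU x2: {ps4_gpu_tflops * 2:.2f} TFLOPS")   # ~3.68 TFLOPS
print(f"HD 7970:    {radeon_7970_tflops:.2f} TFLOPS")   # ~3.79 TFLOPS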
 