
DF: Leadbetter interviews 4A's Oles Shishkovstov about current gen consoles + PC

Of course coding to the metal isn't all it takes, but these guys always say it helps a ton whenever the topic comes up. Why is it hard to believe that not having to write and test your game to work best across tons of different PC configurations at different performance levels would make a huge difference?

Define "huge". There lies the problem.
 

jgf

Member
And see, that is once again the kind of generalization and dumbing down which helps no one, and is in most cases outright wrong.

But at that point we are discussing how helpful such a single "performance gain" number is, and not whether it really is what the developer expects it to be. I thought we were talking about the quote from the 4A developer. He said that in his case he expects (those potentially very variable gains) to be around 2x. I argued that I believe this was his honest estimation and that he probably already has some tricks in mind that may result in said gain. As I understood you, you said that he can't make such a statement and that he is wrong.

But now your argument seems to be that regardless of which actual performance-gain number he gave, it would still be pointless because it's a broad generalization. That's a bit different from what I'm arguing about.
 

El_Chino

Member
even if you can get 2x performance out of PS4's GPU, that's still less powerful than a Radeon 7970, a top end (single GPU) graphics card from Jan 2012
Is that how we should interpret that quote, though? So, for example, X1's GPU could theoretically perform as well as a 2.6TF GPU?
 

Seanspeed

Banned
I completely agree with you that it's a very broad claim. I also see it as a basic estimation and not some number set in stone. At some points they see no gain, at others maybe 10x. After all is said and done, it basically settles (in the developer's experience) around a 2x performance gain compared to a straightforward port of a game that uses no platform-specific optimization.
Is this based on personal experience or personal knowledge of what developers achieve? You're making a very specific claim that things 'average out' to a 2x performance gain now, in response to Durante's hypothetical argument. Specific claims really need a bit of specific proof, or at least specific reasoning. Find us a quote that says developers find an 'average' of 2x performance gain by coding on consoles.

I hate to be harsh here, but all you are doing is trying to argue a case for which, in the end, all you have is an argument from authority. You can't very well elucidate on the subject when it's not really based on personal understanding in the first place.
 

Kezen

Banned
Is that how we should interpret that quote, though? So, for example, X1's GPU could theoretically perform as well as a 2.6TF GPU?

I find this hard to believe even considering DX11's age. And I really don't believe we will ever need two times the GPU power once DX12 becomes the standard.
 

RoboPlato

I'd be in the dick
The one thing that I think is going to be a bigger deal in optimizing for consoles this gen is the amount of control you have over CPU, GPU, and RAM usage, particularly once HSA functionality is taken into account. The ability to fill in gaps and monitor performance more carefully will allow for a lot of gains but it's a different type of gain compared to previous gens.
 

jgf

Member
Is this based on personal experience or personal knowledge of what developers achieve? You're making a very specific claim that things 'average out' to a 2x performance gain now, in response to Durante's hypothetical argument. Specific claims really need a bit of specific proof, or at least specific reasoning. Find us a quote that says developers find an 'average' of 2x performance gain by coding on consoles.

I hate to be harsh here, but all you are doing is trying to argue a case for which, in the end, all you have is an argument from authority. You can't very well elucidate on the subject when it's not really based on personal understanding in the first place.

I'm specifically talking about the quote from the 4A developer in the article this thread is about:
And let's not forget that programming close to the metal will usually mean that we can get 2x performance gain over the equivalent PC spec. Practically achieving that performance takes some time, though!

I read it as: in his experience and for his use case, he expects a 2x gain. That's all.
 
The one thing that I think is going to be a bigger deal in optimizing for consoles this gen is the amount of control you have over CPU, GPU, and RAM usage, particularly once HSA functionality is taken into account. The ability to fill in gaps and monitor performance more carefully will allow for a lot of gains but it's a different type of gain compared to previous gens.

I'll believe it when I see it.
 

ethomaz

Banned
Is that how we should interpret that quote, though? So, for example, X1's GPU could theoretically perform as well as a 2.6TF GPU?
No.

It's like going from using 40% of the GPU to using 80%.

The GPU still has a theoretical 1.31TF of raw performance.

No game, no matter how close to the metal it is coded, will ever use 100% of the FLOPs found in a GPU... on consoles or PC. PC hardware is under-utilized because of the crazy number of abstraction layers.
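To put rough numbers on the utilization point above (the 40%/80% split is just the poster's illustration, not measured data, and 1.31TF is the commonly cited Xbox One GPU peak):

\[ \text{effective throughput} = \text{utilization} \times \text{peak} \]
\[ 0.40 \times 1.31\,\text{TF} \approx 0.52\,\text{TF} \qquad\qquad 0.80 \times 1.31\,\text{TF} \approx 1.05\,\text{TF} \]

Doubling utilization roughly doubles what you actually get out of the chip - about what a ~2.6TF part would deliver at the lower utilization - while the hardware itself is still a 1.31TF GPU.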
 

Easy_D

never left the stone age
After the direction they took with Last Light, no thanks. I need someone to make a Stalker game, not someone that will make a CoD game with Stalker paint.

Are you saying Last Light is a CoD game that looks like Metro 2033? Or what? I don't understand.
 

KKRT00

Member
You only need to look at games like The Order:1886 and UC4 (running at a cool 1080p/60fps) to know that coding to the metal is legit.

The Order isn't even that impressive for the resolution and framerate it's running at; I'm much more impressed by The Division, to be honest.
And I have yet to see anything from Uncharted except a godly-looking character model in a cutscene.
 
we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One (and that's surely on the 'mono' driver with several fast-path calls utilised). Then, the same scenario on PS4, it was actually difficult to find those draw-calls in the profile graphs, because they are using almost no time and are barely visible as a result.

In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.

Pretty damning stuff from Oles Shishkovstov.
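For readers wondering what "a few DWORDs written into the command buffer" looks like next to an API that does per-draw bookkeeping, here is a minimal, hypothetical C++ sketch. The packet layout, struct names and bookkeeping steps are invented for illustration; this is not the actual PS4 command format or the DX11 driver path, just the shape of the difference Shishkovstov describes:

#include <cstdint>

// Hypothetical ring buffer that the CPU fills with GPU command packets.
struct CommandBuffer {
    uint32_t* cursor; // next free DWORD in the ring
};

// "Close to the metal" style: a draw call is literally a handful of DWORD writes.
// (Opcode and field layout are made up; real command processors differ.)
inline void draw_indexed_direct(CommandBuffer& cb, uint32_t indexCount, uint32_t firstIndex) {
    *cb.cursor++ = 0xC0001000u;  // hypothetical "draw indexed" packet header
    *cb.cursor++ = indexCount;   // number of indices to draw
    *cb.cursor++ = firstIndex;   // offset into the currently bound index buffer
    // A few CPU cycles total: no validation, no driver round trip.
}

// Abstracted-API style: the same draw funnels through layers that validate state,
// track resource hazards and defer packet construction to the driver - the
// per-call bookkeeping that can saturate a CPU core when draw counts get high.
struct ApiContext {
    void validate_pipeline_state()  { /* check shaders, formats, bindings... */ }
    void track_resource_hazards()   { /* residency, read/write dependencies... */ }
    void record_for_driver(uint32_t, uint32_t) { /* build the real packet later */ }
};

inline void draw_indexed_api(ApiContext& ctx, uint32_t indexCount, uint32_t firstIndex) {
    ctx.validate_pipeline_state();
    ctx.track_resource_hazards();
    ctx.record_for_driver(indexCount, firstIndex);
}

The point isn't the exact code; it's the per-draw cost. A few memory writes scale to tens of thousands of draws per frame without the CPU noticing, while a stack of validation and tracking work per call is exactly what shows up as one core "fully loaded just by issuing draw-calls".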
 

Shin-Ra

Junior Member
That's too reasonable to be included in the PC GAF lore. Try again.



A fact that gets ignored more and more. It's almost like some are trying to revise history when downplaying these comparisons and forgetting the many, many problems last-gen games had. Even the fabled NGods couldn't do without major pop-in, sub-HD resolutions and noticeable framerate fluctuations.
I don't think any of these apply to Uncharted 2, and none of their games were sub-HD.
 
"Yes it is true, that the maximum theoretical bandwidth - which is somewhat comparable to PS4 - can be rarely achieved (usually with simultaneous read and write, like FP16-blending)..."

I guess we can put that BS to bed.
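For context on why that peak is "rarely achieved": the Xbox One's headline eSRAM figure counts reads and writes happening in the same cycle. Using the commonly cited specs (not numbers from this article), roughly 109GB/s in each direction:

\[ \text{combined peak} \approx 109\,\text{GB/s (read)} + 109\,\text{GB/s (write)} \approx 218\,\text{GB/s} \]

A workload that only reads or only writes tops out around 109GB/s; only operations that read and write simultaneously, like FP16 alpha blending, get anywhere near the combined figure. That is why it is "somewhat comparable" to the PS4's 176GB/s on paper but rarely in practice.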
 

Cuyejo

Member
The Order isn't even that impressive for the resolution and framerate it's running at; I'm much more impressed by The Division, to be honest.
And I have yet to see anything from Uncharted except a godly-looking character model in a cutscene.

Come on now...
Considering you found Ryse impressive, if I remember correctly.
 
"we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One (and that's surely on the 'mono' driver with several fast-path calls utilised). Then, the same scenario on PS4, it was actually difficult to find those draw-calls in the profile graphs, because they are using almost no time and are barely visible as a result.

In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does."

This is what happens when you design your console's architecture with multitasking as your first priority and not the actual gaming itself.
 

Metfanant

Member
I think people are really reading too much into the "2x" performance comment...

...though I will say it's no surprise who some of the people arguing so vehemently against the idea of viewing console performance positively in comparison to PCs turn out to be...

Let's be honest... consoles punch above their weight when it comes to performance... this always has been, and always will be, the truth... obviously it's impossible to put a concrete number on just how much that is...

But this particular dev seems to think ~2x is accurate in his experience... but he also clearly states that you can't always practically extract all that extra performance...
 

KKRT00

Member
Come on now...
Considering you found Ryse impressive, if I remember correctly.

Impressive for the Xbone, yeah, and for a launch title; it has more tech features than The Order from what they've shown so far.
Ryse showed battles with 70+ NPCs, geocache tech, tessellation of the world and water, a new lighting technique, advanced SSDO, POM on almost every surface plus geometry shelling, improved vegetation, ultra settings for DoF and motion blur, etc.
I stopped looking at art in graphics comparisons a long time ago; that's why I find The Division much more impressive.

---
Look, I know nobody at GAF takes you seriously, but c'mon bro...
Does it look two times better than Ryse to you from a technical point of view? Be honest, because this is exactly what the post I quoted implied - big coding-to-the-metal advantages over PC hardware/ports - which would point to a game like, for example, Ryse, which runs on the Xbone, so on even weaker hardware than the PS4.
 
Ryse isn't really indicative of the overall hardware power of the 2 consoles; I'd imagine better looking and performing games will be just around the corner. Rather, it is more indicative of the advanced shader and compute feature sets found on the GPU architecture common to both the X1 and the PS4. So far, Crytek has been the only dev with the middleware to even begin to fully leverage these advanced sets found on the GPUs.
 

Seanspeed

Banned
I'm specifically talking about the quote from the 4A developer in the article this thread is about:


I read it as: in his experience and for his use case, he expects a 2x gain. That's all.
I get that, but you're trying to turn it into an argument that it's an 'average' of 2x performance. But that's not from personal experience or personal knowledge. That's just you reinterpreting the same argument from authority you're already using. You don't know if that's what they mean or not. You're just guessing based on what suits your argument.
 

Chobel

Member
Impressive for the Xbone, yeah, and for a launch title; it has more tech features than The Order from what they've shown so far.
Ryse showed battles with 70+ NPCs, geocache tech, tessellation of the world and water, a new lighting technique, advanced SSDO, POM on almost every surface plus geometry shelling, improved vegetation, ultra settings for DoF and motion blur, etc.
I stopped looking at art in graphics comparisons a long time ago; that's why I find The Division much more impressive.

When you say The Division, do you mean this year's E3 gameplay or last year's reveal?

And personally, I think The Witcher 3 is more impressive than The Division.
 

Chobel

Member
I mean the tech trailer.
And this E3 gameplay had most features enabled, from what I saw of the crappy-quality footage.

Ah OK. The tech trailer was really impressive.

However, this year's E3 gameplay looked worse than last year's; many assumed it was Xbone footage when it was actually PC footage.
 

RoboPlato

I'd be in the dick
Given the XB1 hardware, Ryse does look much better than most would expect. It does not run very well, though.
Yeah, Ryse looks really impressive for a launch game given the hardware. If we can get most games looking like that at a steady framerate at 900p on XBO and 1080p on PS4, I will be quite happy. That's part of the reason I want cross-gen games to die off soon. I want games tailored to PC/PS4/XBO without having to deal with legacy code or 512MB RAM limitations. I think we'll see a big jump once devs start focusing on the more capable platforms.
 

Cuyejo

Member
Impressive for the Xbone, yeah, and for a launch title; it has more tech features than The Order from what they've shown so far.
Ryse showed battles with 70+ NPCs, geocache tech, tessellation of the world and water, a new lighting technique, advanced SSDO, POM on almost every surface plus geometry shelling, improved vegetation, ultra settings for DoF and motion blur, etc.
I stopped looking at art in graphics comparisons a long time ago; that's why I find The Division much more impressive.

You said The Order wasn't impressive for the resolution or the framerate it's running at, while ignoring that:

1. The game is not done yet.

There are signs the framerate has been improved between builds; if I recall, screen tear was prominent in the first showing and was completely solved in subsequent builds, plus there's still enough time left for optimization.

BTW, Ryse was nowhere near a locked 30fps experience.

2. The Order is running at a higher resolution than Ryse and using 4xMSAA plus temporal AA, with no upscaling either, which means much cleaner IQ than Ryse.

3. Character models are as detailed as Ryse's, if not more so, and the SSS is top-notch too.

I reckon we'll have to wait for a more extensive analysis of what The Order is doing to fully compare it to Ryse's features, but I don't get how you can say The Order doesn't look impressive while showering Ryse in flattery.
 

KKRT00

Member
You said The Order wasn't impressive for the resolution or the framerate it's running at, while ignoring that:

1. The game is not done yet.

There are signs the framerate has been improved between builds; if I recall, screen tear was prominent in the first showing and was completely solved in subsequent builds, plus there's still enough time left for optimization.

BTW, Ryse was nowhere near a locked 30fps experience.

2. The Order is running at a higher resolution than Ryse and using 4xMSAA plus temporal AA, with no upscaling either, which means much cleaner IQ than Ryse.

3. Character models are as detailed as Ryse's, if not more so, and the SSS is top-notch too.

I reckon we'll have to wait for a more extensive analysis of what The Order is doing to fully compare it to Ryse's features, but I don't get how you can say The Order doesn't look impressive while showering Ryse in flattery.
It's running on better hardware, it's already been in production longer than Ryse ever was [only 18 months], and it will launch more than a year after the console launch, not day one.
So yeah, I don't find it that impressive as a showcase of 'coding to the metal' advantages when it hardly looks better than a game from a year ago on a weaker platform.

And you are forgetting that it doesn't use POM, it hasn't shown any tessellation [yet], it uses quarter-resolution bokeh, etc. I mean, it's a top-notch-looking game, but it's not that impressive a showcase of hardware utilization, especially in comparison to the multiplatform The Division.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
The thing that will make the most difference in terms of visuals this gen is shader quality. The models in Ryse looked so good because of how advanced the shaders were. You can see the same in the UC4 trailer, and 1886.

Shader complexity has improved by leaps and bounds since last generation, and modern GPUs like those in the Xbox One and PS4 support many features the PS3 and 360 didn't even have to begin with.

Games like Second Son use GPU compute for every bit of particle effects in the game. And games like The Tomorrow Children use the same voxel-based GI that was scrapped from UE4, also by utilizing the GPU's asynchronous compute resources.

Whether or not you want to agree with 4A on the nature of closed-platform optimization, the fact is that such a thing definitely does exist, and it makes a difference.
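As a rough illustration of what "GPU compute for particle effects" means in practice: on the consoles this kind of update typically runs as a compute shader with one GPU thread per particle, scheduled asynchronously alongside rendering. The C++ below is only a conceptual, single-threaded analog of such a per-particle kernel; the struct, names and the simple Euler integration are invented for illustration and not taken from any of the games mentioned:

#include <vector>
#include <cstddef>

struct Particle {
    float x, y, z;    // position
    float vx, vy, vz; // velocity
    float life;       // remaining lifetime in seconds
};

// Conceptual analog of a per-particle compute kernel: each iteration of this
// loop corresponds to one GPU thread updating one particle.
void update_particles(std::vector<Particle>& particles, float dt, float gravity) {
    for (std::size_t i = 0; i < particles.size(); ++i) {
        Particle& p = particles[i];
        p.vy -= gravity * dt;  // simple Euler integration
        p.x  += p.vx * dt;
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
        p.life -= dt;          // expired particles would be culled or respawned
    }
}

Because each particle is independent, the work maps cleanly onto thousands of GPU threads, and with asynchronous compute it can fill shader-core idle time left over by the graphics workload rather than competing with it.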
 

RoboPlato

I'd be in the dick
It's running on better hardware, it's already been in production longer than Ryse ever was [only 18 months], and it will launch more than a year after the console launch, not day one.
So yeah, I don't find it that impressive as a showcase of 'coding to the metal' advantages when it hardly looks better than a game from a year ago on a weaker platform.

And you are forgetting that it doesn't use POM, it hasn't shown any tessellation [yet], it uses quarter-resolution bokeh, etc. I mean, it's a top-notch-looking game, but it's not that impressive a showcase of hardware utilization, especially in comparison to the multiplatform The Division.
I agree with you to a point, but considering Ready at Dawn haven't worked on powerful hardware before, as opposed to Crytek, who have always been at the forefront of tech, I'm still super impressed with The Order (plus I LOVE the art and setting, which I know you don't care about). I think Uncharted will be the first game that shows what focused PS4 development can really do. That will be the first game from Sony's WWS that has had ample time after launch to really dig in.

EDIT: Whoops, I forgot about DriveClub
 

jgf

Member
I get that, but you're trying to turn it into an argument that it's an 'average' of 2x performance. But that's not from personal experience or personal knowledge. That's just you reinterpreting the same argument from authority you're already using. You don't know if that's what they mean or not. You're just guessing based on what suits your argument.

See, I'm not guessing anything. I did not make up an arbitrary number or anything along those lines. The quote from the developer is pretty straightforward. IMHO it leaves very little room for interpretations like "oh, he certainly only meant draw calls, etc." As we already said, it's a rough estimation and a broad generalization, yet I grant him the benefit of the doubt and say that he might be right - simply because the rest of his statements made sense, he has no obvious incentive to lie at that point, and he is qualified enough to make an educated guess about how an optimized game engine may perform. What personal knowledge do you have that allows you to outright dismiss his statement?
 
Let's be honest... consoles punch above their weight when it comes to performance... this always has been, and always will be, the truth...

Great, I agree with all of this. Now, could we maybe get some proof - any, really - for all that above-their-weight punching?

Anyone who has a basic understanding of technology and gaming understands that being able to focus on a specific piece of tech and having a lot of time to optimise your code for it brings some benefits. Hell, I vividly remember being impressed by some of the demo scene's projects for home computers like the Amiga.

The thing is, demos are not actual games. Today's triple-A games cost tens of millions to produce, they are released on multiple platforms, and they are the result of the effort of hundreds of developers. Here's the business reality of the situation: no one is going to spend any significant amount of time and money optimising a game for a specific platform just to eke out a bit more performance. The reality of multiplatform development is such that game code has to be as platform-agnostic as possible, because any delay causes the loss of huge amounts of money. Carmack and his company tried to go the optimization route, and that direction practically destroyed id Software.

Then there's also another fact that has to do with the pace of technology advancement. Let's say for the sake of argument that RAGE performed 2x better on console than on a similarly specced PC. It didn't, of course, but let's say it did. Who cares? By the time this super-optimised game came out, PC tech had leaped so far ahead of consoles that it was piss-easy to get much better than console performance from even the cheapest available graphics cards. In the end, that amount of optimization is utterly pointless in the context of a PC/console comparison.
 

DieH@rd

Banned
They only had less than six months with console dev kits? Damn impressive port job then. Really enjoying it on PS4

Don't forget, they had both games up and running on PC for ages, which helped them a lot in preparing the Metro 2033 asset move to the Last Light engine. They used those few months just to optimize the engine for the console hardware.
 

Game4life

Banned
The Order: 1886 is the best-looking game I have ever seen. No PC game looks even remotely close, and no, it's not because of the art.
 

Arulan

Member
See, I'm not guessing anything. I did not make up an arbitrary number or anything along those lines. The quote from the developer is pretty straightforward. IMHO it leaves very little room for interpretations like "oh, he certainly only meant draw calls, etc." As we already said, it's a rough estimation and a broad generalization, yet I grant him the benefit of the doubt and say that he might be right - simply because the rest of his statements made sense, he has no obvious incentive to lie at that point, and he is qualified enough to make an educated guess about how an optimized game engine may perform. What personal knowledge do you have that allows you to outright dismiss his statement?

The problem with your argument, and why many have attempted to respond to you, is that you're committing large logical fallacies in making it. You keep coming back to what someone said, asking for "proof" that he is lying, essentially turning the burden of proof around and making impossible demands. You're creating an unfalsifiable scenario in which it is impossible for your argument and/or opinion to be disputed. This is simply a very poor argument and cannot be met with any objective or rational discussion because of the way you've framed it. It also heavily implies that you do indeed have a bias for the outcome.
 

RoboPlato

I'd be in the dick
Great, I agree with all of this. Now, could we maybe get some proof - any, really - for all that above-their-weight punching?

Anyone who has a basic understanding of technology and gaming understands that being able to focus on a specific piece of tech and having a lot of time to optimise your code for it brings some benefits. Hell, I vividly remember being impressed by some of the demo scene's projects for home computers like the Amiga.

The thing is, demos are not actual games. Today's triple-A games cost tens of millions to produce, they are released on multiple platforms, and they are the result of the effort of hundreds of developers. Here's the business reality of the situation: no one is going to spend any significant amount of time and money optimising a game for a specific platform just to eke out a bit more performance. The reality of multiplatform development is such that game code has to be as platform-agnostic as possible, because any delay causes the loss of huge amounts of money. Carmack and his company tried to go the optimization route, and that direction practically destroyed id Software.

Then there's also another fact that has to do with the pace of technology advancement. Let's say for the sake of argument that RAGE performed 2x better on console than on a similarly specced PC. It didn't, of course, but let's say it did. Who cares? By the time this super-optimised game came out, PC tech had leaped so far ahead of consoles that it was piss-easy to get much better than console performance from even the cheapest available graphics cards. In the end, that amount of optimization is utterly pointless in the context of a PC/console comparison.
The bolded isn't entirely true. Games aren't developed in a bubble, and collaboration between teams does happen, hence events like GDC. If someone has a breakthrough on a specific platform and shares it, other studios can start using that same technique. This is especially true with engine licensees. Lionhead has already developed a new GI technique that was incorporated into UE4. This is part of the reason EA is using Frostbite for pretty much everything now; research one team does can help all the others.
 

jgf

Member
The problem with your argument, and why many have attempted to respond to you, is that you're committing large logical fallacies in making it. You keep coming back to what someone said, asking for "proof" that he is lying, essentially turning the burden of proof around and making impossible demands. You're creating an unfalsifiable scenario in which it is impossible for your argument and/or opinion to be disputed. This is simply a very poor argument and cannot be met with any objective or rational discussion because of the way you've framed it. It also heavily implies that you do indeed have a bias for the outcome.

So assuming his quote is wrong until he is proven right is more logical and unbiased? Sorry, but I don't get that.

I'm not asking anybody to blindly believe that statement. But you should at the very least give him the benefit of the doubt that he may know what he is talking about. I'd call that unbiased.

I don't have a secret console agenda or anything along those lines. If anything, I'm trying to defend the developer. I think it was a great interview where he spoke quite freely about the problems that occur when developing for the new consoles. He seems to be a pretty knowledgeable guy who talks straight. Plainly dismissing his statements does not seem fair to me.
 

Mr Vast

Banned
The Order: 1886 is the best-looking game I have ever seen. No PC game looks even remotely close, and no, it's not because of the art.

Even Crysis 1 with mods looks better than The Order, and it was released years ago. It's also not rendered at a resolution lower than my phone's screen and at 30fps. https://www.youtube.com/watch?v=tZApOsCc1oI

I mean, nothing special here. Characters lack detail, textures are nice but flat, and then there are the immersion-breaking black bars while you watch a chain of QTEs.
 
Even Crysis 1 with mods looks better than The Order, and it was released years ago. It's also not rendered at a resolution lower than my phone's screen and at 30fps. https://www.youtube.com/watch?v=tZApOsCc1oI

I mean, nothing special here. Characters lack detail, textures are nice but flat, and then there are the immersion-breaking black bars while you watch a chain of QTEs.

This thread is pretty awful, but this post is really awful.

It seems some PC fanboys just can't fathom or tolerate a good-looking console game.
 

luca_29_bg

Member
Even Crysis 1 with mods looks better than The Order, and it was released years ago. It's also not rendered at a resolution lower than my phone's screen and at 30fps. https://www.youtube.com/watch?v=tZApOsCc1oI

I mean, nothing special here. Characters lack detail, textures are nice but flat, and then there are the immersion-breaking black bars while you watch a chain of QTEs.

Myopia can be cured, trust me. The Order looks like CG in motion, and it's really the first game to do this. Crysis does not. Simple as that.
 
Am I the only one who thinks inFAMOUS and Killzone SF look better than Ryse?

Hm, I don't know, CryEngine is really impressive. I haven't seen Ryse in person, but I did play Crysis 3 maxed out and that game had some amazing tech. I wasn't impressed by Killzone Shadow Fall at all, to be honest.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
Even Crysis 1 with mods looks better than The Order, and it was released years ago. It's also not rendered at a resolution lower than my phone's screen and at 30fps. https://www.youtube.com/watch?v=tZApOsCc1oI

I mean, nothing special here. Characters lack detail, textures are nice but flat, and then there are the immersion-breaking black bars while you watch a chain of QTEs.

This is probably one of the worst posts I've seen in this thread.


You go on to compare two completely different games in an entirely biased and subjective manner with literally no actual evidence backing up your claims.

In my personal opinion, The Order can look damn near close to older CGI cutscenes. I hardly interpret that as anywhere near your hyperbolic claims.
 

Corine

Member
The Order: 1886 is the best-looking game I have ever seen. No PC game looks even remotely close, and no, it's not because of the art.

I say the same thing about Star Citizen. I haven't seen a console game come close to the graphics that game is pumping out. Plus, Star Citizen is an open-ended game with those crazy graphics, which makes it more impressive.
 
I say the same thing about Star Citizen. I haven't seen a console game come close to the graphics that game is pumping out. Plus, Star Citizen is an open-ended game with those crazy graphics, which makes it more impressive.

Star Citizen hasn't impressed me one bit; I think The Order looks better. To each his own.
 