
Beyond3D Xenos Article

gofreak

GAF's Bob Woodward
Hajaz said:
Yeah, but that goes two ways. For all we know, PS3 might actually be weaker than X360 graphically. Sony or NV certainly wouldn't tell us.

Of course.

Hajaz said:
You've got to admit that the lie about NV30's number of pipes and the Futuremark scandal were too far over the top to be considered simply "presenting a product in the best possible way". That was outright misleading of consumers, and damaging to the reputation of another company (Futuremark).

I don't disagree, but I don't see how this makes any company more or less inclined toward "telling the truth" re. relative performance vs. other cards. Either company could have a dog's dinner of a card, but neither would admit as much, at least while it is marketable.
 

Hajaz

Member
The Quack thing was a bug that was fixed in the next driver update. Mind you, it was Nvidia who informed Tom's Hardware about it.
 

Blaster1X

Banned
gofreak said:
I don't disagree, but I don't see how this makes any company more or less inclined toward "telling the truth" re. relative performance vs. other cards. Either company could have a dog's dinner of a card, but neither would admit as much, at least while it is marketable.
I for one wouldn't believe anything from Nvidia, unless benchmarked.
 

Nerevar

they call me "Man Gravy".
Hajaz said:
The Quack thing was a bug that was fixed in the next driver update. Mind you, it was Nvidia who informed Tom's Hardware about it.

Didn't NVidia do the same thing with their 5200 line of cards (the ones that were woefully outclassed by ATi's)? Detecting which executable was running and taking shortcuts in the shaders, or something along those lines. I remember it being a pretty big uproar, and very ironic after they "outed" ATi for doing the same thing.
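
To illustrate the kind of detection being described - a minimal C sketch of the idea. GetModuleFileNameA is the real Win32 call; the helper name and the benchmark list here are invented for illustration:

#include <windows.h>
#include <string.h>

/* Illustrative sketch of app-specific driver behavior: check which
   executable loaded us, and enable "optimized" (lower-quality) paths
   for known benchmarks. */
static int is_known_benchmark(void)
{
    char path[MAX_PATH];
    GetModuleFileNameA(NULL, path, sizeof(path)); /* path of the host .exe */

    /* Hard-coded name matching is exactly why renaming quake3.exe
       to quack.exe changed the rendering. (Case handling omitted.) */
    return strstr(path, "quake3.exe") != NULL
        || strstr(path, "3dmark03.exe") != NULL;
}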
 

Fafalada

Fafracer forever
Well, if you want to point fingers, ATI was fudging benchmarks and such before NVidia was even a major player in the graphics market, and it's not like they were alone in doing so either. ;)
Nor is this somehow limited to the graphics market - CPU makers have been playing "games" with numbers for decades... Heck, finding ways to cheat on standardized CPU benchmarks practically became a science at one point.
 

HokieJoe

Member
Fafalada said:
The tiling approach they use isn't PVR's, and it's not really related to procedural synthesis either. You should be looking at MEMEXPORT for connections to PS.

Speaking of which, since when was procedural synthesis "Microsoft's"? :)


Although I sense a well-placed bit of tongue in cheek here, :) my reference was the Ars Technica article, which in part detailed procedural synthesis.

WRT the tiling technique, I just assumed that maybe they had shoehorned some of PVR's tech in there, since ATI bought them, or bought the technology(?). Cripes, I'm gonna have to read those articles several more times, because I get lost with this stuff sometimes.

What I really need is a video game programmer's dictionary for dummies. :)
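
For anyone else lost: "procedural synthesis" just means generating geometry from a formula at runtime instead of storing it on disc. A toy C sketch (names and formula invented for illustration) of what a CPU thread, or a Xenos shader writing results out via MEMEXPORT, might fill a vertex buffer with:

#include <math.h>
#include <stddef.h>

typedef struct { float x, y, z; } Vec3;

/* Toy procedural synthesis: build a bumpy (n x n) terrain patch from
   a formula instead of stored mesh data; the GPU then draws the
   resulting vertex buffer normally. */
static void synthesize_patch(Vec3 *verts, size_t n, float freq, float amp)
{
    for (size_t i = 0; i < n; i++) {
        for (size_t j = 0; j < n; j++) {
            Vec3 *v = &verts[i * n + j];
            v->x = (float)i;
            v->z = (float)j;
            /* the height comes from a formula, not from stored data */
            v->y = amp * sinf(freq * (float)i) * cosf(freq * (float)j);
        }
    }
}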
 

Pimpwerx

Member
midnightguy said:
I don't think RSX has much of a transistor advantage, if any at all now.
RSX: 300M transistors, mostly logic and caches, no eDRAM.

Xenos is looking to be in the 312 to 382M range: 232M, plus 80M of eDRAM, plus possibly up to 70M more of logic.


Yes, RSX will have some advantages over Xenos, and Xenos will have some advantages over RSX. I almost agree with J Allard that it is "a wash", at least as far as graphics go.

And even though Cell trounces XeCPU in floating point, XeCPU might have better general-purpose CPU performance, i.e. integer performance. As far as pure cache goes, XeCPU has more: 1 MB, compared to Cell's 512K cache for the PPE. The 2 MB, er, actually more like 1.75 MB (256K x 7), of Local Storage on Cell gives Cell more total on-chip memory than XeCPU, but LS has some disadvantages compared to cache, which XeCPU has more of.
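
The practical difference, roughly: a cache fetches lines automatically on a miss, while an SPE's Local Store only ever holds what software explicitly DMAs into it. A C sketch of the contrast; dma_get here is a hypothetical stand-in for the real SPE DMA-and-wait sequence:

#include <stddef.h>
#include <stdint.h>

/* Hypothetical helper standing in for the SPE's DMA intrinsics. */
extern void dma_get(void *local, uint64_t main_mem_addr, size_t bytes);

/* Cached core (XeCPU-style): hardware pulls lines in on demand. */
float sum_cached(const float *data, size_t n)
{
    float s = 0.0f;
    for (size_t i = 0; i < n; i++)
        s += data[i];            /* misses serviced transparently */
    return s;
}

/* Local-store core (SPE-style): only what you staged into the 256K
   is addressable, so the code must double as its own cache. */
float sum_local_store(uint64_t src, size_t n)
{
    enum { CHUNK = 4096 };
    static float buf[CHUNK];     /* stand-in for part of Local Store */
    float s = 0.0f;
    for (size_t done = 0; done < n; done += CHUNK) {
        size_t c = (n - done < CHUNK) ? (n - done) : CHUNK;
        dma_get(buf, src + done * sizeof(float), c * sizeof(float));
        for (size_t i = 0; i < c; i++)
            s += buf[i];
    }
    return s;
}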

OK, enough rambling. It boils down to this: both PS3 and Xbox 360 are very powerful consoles; each will have advantages over the other, and weaknesses compared to the other. It'll be up to developers to maximize the strengths of each and hide the weaknesses.

Xenos' unified shaders are gonna be "fatter" than the separate VS and PS arrangement in RSX. I don't know how to factor in the logic on Xenos. The eDRAM is kinda weird. If you want to count the logic on there, that's fine; it probably closes the gap, but it's kinda hard to tell in what way.

Honestly, RSX is a later product with a larger transistor budget for logic and a quicker clock. If Xenos outperforms RSX, I'd wonder WTF Nvidia's been doing all this time. eDRAM and unified shaders aren't needed to outpower Xenos. Some think R520 might (I don't believe it will, though). And again, there are compromises and limitations on what Xenos can do. RSX will have its weaknesses too. But will the areas where RSX falls behind make as big a difference as the areas in which it excels? And I'm only talking about RSX here, not throwing Cell into the mix.

BTW, Cell has better integer performance, although I think the 3 PPEs are better at general code. But I'm discussing graphics solely here, not computational power. Cell's the better CPU, period. The applications for gaming are limitless, and it'll clobber the XeCPU in most of them. But again, there are different philosophies going on there. Xenos is the heart of the 360; Cell is still the heart of the PS3. It's not surprising they spent more making it more capable. But for the GPUs, I'd say they'd be equal if they were made at the same time. But I see no reason why 6 months shouldn't give Sony the advantage in that area as well. PEACE.
 

Pimpwerx

Member
Fafalada said:
Well, if you want to point fingers, ATI was fudging benchmarks and such before NVidia was even a major player in the graphics market, and it's not like they were alone in doing so either. ;)
Nor is this somehow limited to the graphics market - CPU makers have been playing "games" with numbers for decades... Heck, finding ways to cheat on standardized CPU benchmarks practically became a science at one point.
Hell, it is a science. It was taught to us in our OS class as well as our computer design lab. :lol PEACE.
 
AltogetherAndrews

Pimpwerx said:
But for the GPUs, I'd say they'd be equal if they were made at the same time. But I see no reason why 6 months shouldn't give Sony the advantage in that area as well.

It has taken at least that long for NVidia to catch up with ATi before.
 

Pimpwerx

Member
dorio said:
What do you see as the main weaknesses compared to other next-gen cards?
Certain things it can't do, or apparently has problems doing. I think it will match up very well against the PC cards from this year. I don't see it being a weak part by any means. Vague? Yeah. I shouldn't even bother making such remarks anyway since there are probably gonna be some shortcomings for the RSX too. I don't feel like saying more than that, so I'll leave it alone. I basically think the time differential will make a difference. PEACE.
 

Pimpwerx

Member
AltogetherAndrews said:
It has taken at least that long for NVidia to catch up with ATi before.
How many times has ATI been ahead? I prefer their cards on the PC side, but how many times have they been the clear performance leader? NVidia's spent a good deal of time leading the way too, so I don't think you can draw conclusions from that. Again, you don't need a crazy new design to outpower an older part. PEACE.
 

Tenacious-V

Thinks his PR is better than yours.
Pimpwerx said:
Certain things it can't do, or apparently has problems doing. I think it will match up very well against the PC cards from this year. I don't see it being a weak part by any means. Vague? Yeah. I shouldn't even bother making such remarks anyway since there are probably gonna be some shortcomings for the RSX too. I don't feel like saying more than that, so I'll leave it alone. I basically think the time differential will make a difference. PEACE.

That sounds like a book report answer from a kid who didn't read the book.........

And the time differential thing, IMO, is very questionable. Take the NV30, for instance: it released what, like a year after the 9700? And yet it was still a POS, beaten in every way by ATi's year-older offering. Time doesn't always mean better; there are a ton of factors at play.

And IMO ATi would have had the lead with the 8500 as well if it weren't for their god-awful drivers at the time. The 8500 was a sweet piece of hardware run by shoddy drivers. If you look back, it was released as a direct competitor to the GF3!!! Not the GF4. As time progressed and the drivers improved, it became a pretty good contender to the GF4 Ti series and blew the pants off the GF3.
 

Pimpwerx

Member
Tenacious-V said:
That sounds like a book report answer from a kid who didn't read the book.........

And the time differential thing, IMO, is very questionable. Take the NV30, for instance: it released what, like a year after the 9700? And yet it was still a POS, beaten in every way by ATi's year-older offering. Time doesn't always mean better; there are a ton of factors at play.

And IMO ATi would have had the lead with the 8500 as well if it weren't for their god-awful drivers at the time. The 8500 was a sweet piece of hardware run by shoddy drivers. If you look back, it was released as a direct competitor to the GF3!!! Not the GF4. As time progressed and the drivers improved, it became a pretty good contender to the GF4 Ti series and blew the pants off the GF3.
First of all, I think people misunderstand what I'm saying. Both machines will do some things much better than the other. We know about a few of Xenos's. For instance, it'll probably handle 4xAA a lot better than RSX, or any other GPU for that matter. It's also got higher efficiency due to its 3 selectable VS/PS blocks.

But there are some next-gen functions that might be problematic. I believe RSX is supposed to support 128-bit blends, as should many of the new cards. But there might be another feature that could be a problem if next-gen graphics engines make heavy use of it. Maybe it can be fixed using MEMEXPORT and the XeCPU. The specifics of the problem, I won't mention. And with any luck, it'll be proven wrong anyway. The more power, the better. PEACE.
 

MetalAlien

Banned
All the problems (if any) will be worked out in time. Look at GT4: how in the hell did they get that resolution with only 4 megs of video RAM? The fact that the PS2 did a detailed game like GT4 in 1080i should prove that, given enough time, they can do just about anything.
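
Back-of-envelope numbers hint at how: interlacing means you only render one field at a time. A quick C sketch of the arithmetic, assuming a 16-bit color buffer; GT4's actual buffer layout isn't public, so treat this as illustrative:

#include <stdio.h>

/* Can one interlaced 1080i field fit in the PS2 GS's 4 MB of eDRAM?
   Assumes 16-bit color; Z and textures have to be juggled around it. */
int main(void)
{
    int w = 1920, field_h = 1080 / 2;   /* one field = 540 lines */
    int bpp = 2;                        /* 16-bit color */
    double mb = (double)(w * field_h * bpp) / (1024 * 1024);
    printf("one 1080i field: %.2f MB of 4 MB\n", mb);  /* ~1.98 MB */
    return 0;
}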

Whatever shortcomings the XB360 and PS3 have will be minimal, so if one is better at something than the other, it will all be worked out in software in time.
 

dorio

Banned
Pimpwerx said:
The specifics of the problem, I won't mention. And with any luck, it'll be proven wrong anyway. The more power, the better. PEACE.
I don't understand why you need to be so coy. Are you under some kind of NDA?
 

Pimpwerx

Member
dorio said:
I don't understand why you need to be so coy. Are you under some kind of NDA?
I'm under no NDA. And like I said, it might not end up being anything at all, for all I know. But if you had read all the posts on R500/Xenos from when DaveB said he was gonna be asking ATI questions, you'd have seen it mentioned and, I believe, briefly discussed. Sorry for being vague. But just think: what has been said about the PPP so far? Hardly anything substantive, yet its functions are known. Not everything has been revealed yet, for whatever reason, about Xenos. PEACE.
 

dorio

Banned
Pimpwerx said:
I'm under no NDA. And like I said, it might not end up being anything at all, for all I know. But if you had read all the posts on R500/Xenos from when DaveB said he was gonna be asking ATI questions, you'd have seen it mentioned and, I believe, briefly discussed. Sorry for being vague. But just think: what has been said about the PPP so far? Hardly anything substantive, yet its functions are known. Not everything has been revealed yet, for whatever reason, about Xenos. PEACE.
The PPP? What's the danger of just spelling out your concern? No one's going to ridicule you. Give us fellow board members a heads-up so we won't be disappointed later with this thing.
 

MetalAlien

Banned
Pimpwerx said:
I'm under no NDA. And like I said, it might not end up being anything at all, for all I know. But if you had read all the posts on R500/Xenos from when DaveB said he was gonna be asking ATI questions, you'd have seen it mentioned and, I believe, briefly discussed. Sorry for being vague. But just think: what has been said about the PPP so far? Hardly anything substantive, yet its functions are known. Not everything has been revealed yet, for whatever reason, about Xenos. PEACE.

It's full of anthrax, isn't it? I knew it!
 

Pimpwerx

Member
dorio said:
The PPP? What's the danger of just spelling out your concern? No one's going to ridicule you. Give us fellow board members a heads-up so we won't be disappointed later with this thing.

Programmable Primitive Processor. It's another name for the HOS tessellator, but it's got more capabilities than just that. That's as far as I've heard, so it's not like all the info yet to be revealed is bad. Anyway, I honestly shouldn't have made that original post b/c it just stirred the pot and got this thread off track. Suffice to say, when some dev/eng mentions it in an interview or something, I'll point it out. Maybe it won't be featured prominently in next-gen engines, or maybe there's a workaround. Devs haven't had final hardware yet to figure out tricks. PEACE.
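
For reference, a tessellator just turns a coarse primitive into many finer ones. A toy C sketch of a single subdivision step on one triangle; the real PPP would do this (plus displacement and so on) in hardware:

typedef struct { float x, y, z; } V3;

static V3 midpoint(V3 a, V3 b)
{
    V3 m = { (a.x + b.x) * 0.5f, (a.y + b.y) * 0.5f, (a.z + b.z) * 0.5f };
    return m;
}

/* One tessellation step: split triangle (a,b,c) into 4 smaller ones
   by inserting edge midpoints. Repeating this is the essence of what
   a HOS tessellator does before any displacement is applied. */
static void subdivide(V3 a, V3 b, V3 c, V3 out[4][3])
{
    V3 ab = midpoint(a, b), bc = midpoint(b, c), ca = midpoint(c, a);
    V3 tris[4][3] = {
        { a, ab, ca }, { ab, b, bc }, { ca, bc, c }, { ab, bc, ca }
    };
    for (int i = 0; i < 4; i++)
        for (int j = 0; j < 3; j++)
            out[i][j] = tris[i][j];
}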
 

dorio

Banned
Pimpwerx said:
Programmable Primitive Processor. It's another name for the HOS tessellator, but it's got more capabilities than just that. That's as far as I've heard, so it's not like all the info yet to be revealed is bad. Anyway, I honestly shouldn't have made that original post b/c it just stirred the pot and got this thread off track. Suffice to say, when some dev/eng mentions it in an interview or something, I'll point it out. Maybe it won't be featured prominently in next-gen engines, or maybe there's a workaround. Devs haven't had final hardware yet to figure out tricks. PEACE.
I thought you mentioned a weakness; why would a PPP be a weakness? And devs do have final hardware, it's just not clocked at the final speed.
 

HokieJoe

Member
Pimpwerx said:
How many times has ATI been ahead? I prefer their cards on the PC side, but how many times have they been the clear performance leader? NVidia's spent a good deal of time leading the way too, so I don't think you can draw conclusions from that. Again, you don't need a crazy new design to outpower an older part. PEACE.


IIRC, the difference between the R300/9700 and NV30/FX cards was an example of ATI's more intelligent design vs. Nvidia's brute-force approach. That's a broad brush, so if I'm wrong, please correct me.

The R300 really leapfrogged ATI over Nvidia for the performance crown. It wasn't until the NV40 that Nvidia took the crown back. Before that, it had been Nvidia's crown to lose for the lion's share of the time.

The funny thing is, I'm more of an Xbox fan, but historically I've always used Nvidia cards. I think ATI's cards had better-looking output in general, but their driver support sucked way too much ass for my tastes. Thank God things have improved with their Catalyst drivers. :)
 

mrklaw

MrArseFace
The 4xAA approach on Xenos may well kick RSX's ass and give the X360 generally better image quality than the PS3. It's notionally free, and will pretty much be used on all titles.

RSX will need to use some of its own power to do that, and it can't call on massive eDRAM bandwidth to help out. AA will suck up both power and bandwidth.

It'll be interesting to see how the RSX architecture deals with FSAA. At 720p resolutions, that could be a big differentiator.
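
The "notionally free" part does come with a catch: 10 MB of eDRAM isn't enough for a full 720p 4xAA frame, so Xenos has to render in tiles. A quick C sketch of the arithmetic, assuming 4 bytes of color and 4 bytes of depth/stencil per sample:

#include <stdio.h>

/* How many eDRAM tiles does 1280x720 with 4xMSAA need? */
int main(void)
{
    long w = 1280, h = 720, samples = 4;
    long bytes_per_sample = 4 + 4;            /* color + Z/stencil */
    long frame = w * h * samples * bytes_per_sample;
    long edram = 10L * 1024 * 1024;           /* the daughter die */
    long tiles = (frame + edram - 1) / edram; /* round up */
    printf("%.1f MB -> %ld tiles\n", frame / (1024.0 * 1024.0), tiles);
    return 0;  /* ~28.1 MB -> 3 tiles */
}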
 

Nostromo

Member
I fear RSX might not be able to deal with FSAA at all with floating-point render targets... (except with supersampling...)
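
The supersampling fallback is easy to picture, and the sketch also shows why it's expensive: you shade 4x the pixels, then average them down. A minimal C box-filter resolve, assuming an RGBA float buffer rendered at twice the width and height:

/* 2x2 ordered-grid supersampling resolve: average each 2x2 block of
   the oversized buffer (2w x 2h, RGBA) into one output pixel (w x h).
   The real cost is rendering 4x the fragments in the first place. */
static void resolve_ssaa_2x2(const float *src, float *dst, int w, int h)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            for (int c = 0; c < 4; c++) {
                float s = src[((2*y)     * 2*w + 2*x)     * 4 + c]
                        + src[((2*y)     * 2*w + 2*x + 1) * 4 + c]
                        + src[((2*y + 1) * 2*w + 2*x)     * 4 + c]
                        + src[((2*y + 1) * 2*w + 2*x + 1) * 4 + c];
                dst[(y * w + x) * 4 + c] = s * 0.25f;
            }
        }
    }
}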
 

Hajaz

Member
Fafalada said:
Well, if you want to point fingers, ATI was fudging benchmarks and such before NVidia was even a major player in the graphics market, and it's not like they were alone in doing so either. ;)

Source, please.

I'd also like to point out that the 8500 was a lot cheaper than the GF3 series, which it competed with.
Take a look at the Battlefield 2 compatibility list: the Radeon 8500 is on there, the GeForce 3/4 aren't.
No 1.4 shaders.
 

skrew

Banned
And the 6800 was a better card than the Radeon X800, and ATi didn't have SM3.0.

What does that prove? Not a goddamn thing. Both companies made cards that kicked the competition's ass, but that doesn't have anything to do with how their future cards are going to match up.
 

Hajaz

Member
I'd disagree about the 6800 being the better card.
Most effects that can be done with SM3.0 can be done with SM2.0, if the developers bother to code them. The X800 has the raw power advantage.
 

Tenacious-V

Thinks his PR is better than yours.
skrew said:
And the 6800 was a better card than the Radeon X800, and ATi didn't have SM3.0.

What does that prove? Not a goddamn thing. Both companies made cards that kicked the competition's ass, but that doesn't have anything to do with how their future cards are going to match up.

Not true here: the R420 series is the better gaming card. Save for Doom 3 (where IMO ATi has pulled basically neck and neck performance-wise), ATi wins in every other game. nVidia's only better through SLI, and of course ATi now has its counter to that with CrossFire. SM3.0 didn't matter at the time, and most games used 2.0, which was fine for the X800/X850 cards. Now R520 will have full SM3.0 support when games are actually "beginning" to emerge with it - well, not even really yet - so you can understand why it wasn't important in the last round of cards.
 

gofreak

GAF's Bob Woodward
Tenacious-V said:
Not true here: the R420 series is the better gaming card. Save for Doom 3 (where IMO ATi has pulled basically neck and neck performance-wise), ATi wins in every other game. nVidia's only better through SLI, and of course ATi now has its counter to that with CrossFire. SM3.0 didn't matter at the time, and most games used 2.0, which was fine for the X800/X850 cards. Now R520 will have full SM3.0 support when games are actually "beginning" to emerge with it - well, not even really yet - so you can understand why it wasn't important in the last round of cards.

How much the hardware got used, i.e. how many games supported a feature, is kind of irrelevant to its technical merits, however.

I guess the point is, for every NV30 that can be pointed at, people can point at other examples to support the opposite claim. The 6800 Ultra came out 6 months prior to ATi's card, and ATi's performance edge wasn't exactly staggering despite the gap. So one could equally wonder how ATi might release a more powerful chip 6 months prior to NVidia when they could arguably only just about do so 6 months after with the last set of chips. Historically, prior to the NV30/9700 Pro, NVidia had held a more consistent performance lead vs. ATi too.

The point is, there are points of history that can be brought up to support either side of the argument.
 

gofreak

GAF's Bob Woodward
Hajaz said:
Uh... the 6800 launched in May.
So did the X800 series.

The X800 was in shorter supply, though.

My point was that the X850 coming out 6 months later didn't thoroughly trounce the 6800 Ultra Extreme, indeed in some benches the latter retained a lead (though as always, you can probably tell whether NVidia or ATi will win in a benchmark by just looking at the name of the game ;)). It was a more powerful chip of course, and that showed in some benches, but not all-encompassingly so.

The X800 series did indeed launch with the 6800 Ultra and Ultra Extreme, and whilst they stayed the pace, NVidia did have an edge overall in most reviews I've seen. And the 3.0 support, if we're examining technical merit.
 

Tenacious-V

Thinks his PR is better than yours.
gofreak said:
My point was that the X850 coming out 6 months later didn't thoroughly trounce the 6800 Ultra Extreme, indeed in some benches the latter retained a lead (though as always, you can probably tell whether NVidia or ATi will win in a benchmark by just looking at the name of the game ;)). It was a more powerful chip of course, and that showed in some benches, but not all-encompassingly so.

The X800 series did indeed launch with the 6800 Ultra and Ultra Extreme, and whilst they stayed the pace, NVidia did have an edge overall in most reviews I've seen. And the 3.0 support, if we're examining technical merit.

The Ultra Extreme was nothing more than a benchmarking card to try and get bragging rights for top spot, and even then the XT PE beat it. The 6800 Ultra is their highest widely available card and ATi bested it. As for the 3.0, name some substantial games that fully utilize it right now..............exactly. Now that games are starting to trickle out, and by that I mean sllllllllllooooooooooooowwwwwwwwwwlllllllllllyyyyyyyyyy, R520 will be here with full SM3.0 when it matters.
 

gofreak

GAF's Bob Woodward
Tenacious-V said:
The Ultra Extreme was nothing more than a benchmarking card to try and get bragging rights for top spot, and even then the XT PE beat it.

ATi quantities weren't exactly bountiful either, IIRC.

The (X800) XT beat it in some benchmarks, but the Ultra E beat it in more, at least looking at the Anandtech review. The X850 review is the one I'm referring to above though. The 6800 Ultra still beats it in a decent number of benches.

Tenacious-V said:
As for the 3.0, name some substantial games that fully utilize it right now..............exactly.

Again, this is irrelevant to technical merit on a hardware level.

At the very least, the NV40 generation was a marked return to form for NVidia, and it certainly stands as an answer to references to NV3x. They met the performance challenge in addition to bringing some new features beyond ATi, so given how close performance was, I can understand why some might place them ahead technically for the last cycle of cards.
 

Tenacious-V

Thinks his PR is better than yours.
gofreak said:
ATi quantities weren't exactly bountiful either, IIRC.

The (X800) XT beat it in some benchmarks, but the Ultra E beat it in more, at least looking at the Anandtech review. The X850 review is the one I'm referring to above though. The 6800 Ultra still beats it in a decent number of benches.

That's true, but at least to this day you can go out and buy an XT PE, making it an actual "real" card. The Ultra Extreme is nowhere to be found, because it was nothing more than a benchmarking card.

gofreak said:
Again, this is irrelevant to technical merit on a hardware level.

At the very least, the NV40 generation was a marked return to form for NVidia, and it certainly stands as an answer to references to NV3x. They met the performance challenge in addition to bringing some new features beyond ATi, so given how close performance was, I can understand why some might place them ahead technically for the last cycle of cards.

I don't see it so much as a return to form as a NEED to redeem themselves from that catastrophe. They were seen as cheats, liars, and incompetent because of NV3x. They thought they could dictate to the industry however they liked just because they were the market leader, and ATi leapfrogged them because of it. The reason NVidia had more features with NV40 is that ATi isn't exactly so incredibly rich that they can come out with a completely new architecture every 6 months. They did what they had to do, which was boost performance out the ass from the R300 architecture. R300, R420, R520: all fundamentally R300. SM3.0 has an FP32 minimum requirement, while when R300 was released MS stated FP24 as the minimum. ATi went this route and won. You can't go from FP24 to FP32 just like that; it takes a lot of modifications. Hence R420 was the intermediary, and a damn good one at that. R520 is how long it took to get to SM3.0. That technically is their "18 month" or so product cycle, just in time for R600 to coincide with Longhorn.

Now how did NV get to SM3.0 faster? Because NV3x was designed with FP32 support, although they believed nobody would use it, so their card was really an FP16 monster. Or a partial-precision beast, in other words.
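
To put rough numbers on the FP16/FP24/FP32 difference: precision is set by mantissa width, about 10, 16, and 23 bits respectively under the DX9-era definitions. A quick C sketch of the relative rounding error each implies:

#include <stdio.h>
#include <math.h>

/* Relative error ~ 2^-mantissa_bits, the step between adjacent
   representable values. Assumed widths: FP16 = 10, FP24 = 16
   (DX9's s16e7 minimum), FP32 = 23. */
int main(void)
{
    const char *name[] = { "FP16", "FP24", "FP32" };
    int mant[] = { 10, 16, 23 };
    for (int i = 0; i < 3; i++)
        printf("%s: ~%g\n", name[i], pow(2.0, -mant[i]));
    return 0;  /* ~1e-3 vs ~1.5e-5 vs ~1.2e-7 */
}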

But back on track: yes, there are excuses for both sides. But I believe ATi has been better since the 8500 (a GF3 contender that surpassed it in every way to become a GF4 contender) once the drivers kicked in. 8500, 9700, X800, R520. Not to say NV isn't competitive, though.
 

Hajaz

Member
gofreak said:
My point was that the X850 coming out 6 months later didn't thoroughly trounce the 6800 Ultra Extreme

But that's not what you said at all...
You said that the "6800 Ultra came out 6 months prior to ATi's card".
You didn't specify the X850 XT or the Ultra Extreme edition, so it was quite a misleading statement.


Anyway, I can remember X800 XT PE / 6800 Ultra reviews being even, with a slight edge for the X800...
 

gofreak

GAF's Bob Woodward
Hajaz said:
But that's not what you said at all...
You said that the "6800 Ultra came out 6 months prior to ATi's card".
You didn't specify the X850 XT or the Ultra Extreme edition, so it was quite a misleading statement.

It was not my intention to mislead; somehow the word "last" got lost between my brain and the keyboard. I was referring to the last cards from NVidia and ATi respectively (the 6800 Ultra/Ultra Extreme and the X850).
 

Hajaz

Member
Well, the Ultra Extreme is identical to a 6800 Ultra except for clocks (better yields).

An X850 is actually a redesign of the board (a refresh) that allows for much higher clocks.
 

rastex

Banned
gofreak said:
It was not my intention to mislead; somehow the word "last" got lost between my brain and the keyboard. I was referring to the last cards from NVidia and ATi respectively (the 6800 Ultra/Ultra Extreme and the X850).

I think what this also implies is that the 6-month gap between the X360 and PS3 can't necessarily be equated to a power advantage for the PS3. From the examples you've pointed out, the trend doesn't always follow, and thus it's not a very strong argument for PS3's power advantage (and time is really the main argument I've seen bandied about).
 
Off-topic:

Man, I cannot wait to see GSCube-like configurations of RSX and Xenos used for realtime rendering.

I cannot wait to see how close next-gen renderboxes can get to various levels of pre-rendered CGI :D
 
Tenacious-V said:
Not true here: the R420 series is the better gaming card. Save for Doom 3 (where IMO ATi has pulled basically neck and neck performance-wise), ATi wins in every other game. nVidia's only better through SLI, and of course ATi now has its counter to that with CrossFire. SM3.0 didn't matter at the time, and most games used 2.0, which was fine for the X800/X850 cards. Now R520 will have full SM3.0 support when games are actually "beginning" to emerge with it - well, not even really yet - so you can understand why it wasn't important in the last round of cards.


agreed :)
 