
ATI R520 and NVIDIA G70 Coming

http://beyond3d.com/#news23487

ATI R520 and NVIDIA G70 Coming
30-May-2005, 03:12.52 Reporter : DaveBaumann

As if we need any further evidence that new graphics chips are approaching, DigiTimes has posted a report suggesting just that.

The report indicates that ATI's R520 chip has been produced in small
volumes on TSMC's 90nm process and that ASE has received orders for
packaging the chip. Meanwhile, ASE suggests that NVIDIA may place orders
with them for the packaging and testing of G70.

What the report doesn't indicate is the volumes, and hence how far along
they are in production. The "small volumes" noted for R520 could
merely be engineering samples that are still being used for evaluation
purposes.



http://www.digitimes.com/news/a20050530A2009.html

ATI and Nvidia to boost ASE 3Q sales
Amy Lee, Taipei; Carrie Yu, DigiTimes.com [Monday 30 May 2005]


Advanced Semiconductor Engineering (ASE) is expected to have a strong
third quarter due to orders from ATI Technologies and Nvidia, sources
stated.

ATI's R520 graphics chips have been produced in small volumes at
Taiwan Semiconductor Manufacturing Company (TSMC) using 90nm technology,
and sources indicated that ASE has received orders for related wafer
sorting and flip-chip (FC) packaging.

In addition, sources at JP Morgan pointed out that ATI's R500 chips
will be embedded in Microsoft's new Xbox 360 and are expected to be
produced at TSMC, with packaging and testing to be handled by ASE.

Furthermore, Nvidia may also place orders with ASE for FC packaging
and testing for its G70 chips.
 
Meanwhile, Nvidia's G70 is at Computex 2005, but ATI's R520 is not.

The G70 will apparently be on store shelves soon.


http://www.anandtech.com/tradeshows/showdoc.aspx?i=2431&p=2

NVIDIA’s G70 at the Show

NVIDIA plans on sticking it to ATI with G70 and offering widespread availability of their new GPU very soon. Manufacturers at the show have already indicated that the first shipments of G70 boards will be in their hands by the second week of June.

G70.jpg


Boards are on display at the show, behind closed doors of course. But in Taiwan nothing is ever secret, and thus we’ve had the ability to play around with a number of G70 cards at the show. We can’t say much about the G70 as we are bound by NDA, but all of the cards at the show are single-slot solutions, which is refreshing.



Only a handful of NVIDIA’s closest partners have been given G70 designs to show off at Computex; the rest have been told to wait until further notice.

We also heard about a new ATI card, but not the R520 we’ve all been waiting for, rather a replacement for the X300 - the Radeon X550. Like the X300, the X550 is a 4-pipe GPU but now running at 400MHz. The GPU is also paired with a 128-bit 200MHz DDR memory bus, but little is known beyond those specs.
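
For anyone curious what those X550 numbers work out to, here's a quick back-of-the-envelope calculation in Python. It uses only the specs quoted above (4 pipes at 400MHz, 128-bit 200MHz DDR); the results are theoretical peaks, and real-world throughput would be lower.

# Rough peak figures from the quoted X550 specs (theoretical only).
pixel_pipes = 4
core_clock_hz = 400e6           # 400MHz GPU clock
bus_width_bits = 128
mem_clock_hz = 200e6            # 200MHz DDR -> 400MT/s effective

fillrate_mpix = pixel_pipes * core_clock_hz / 1e6                # Mpixels/s
bandwidth_gbs = (bus_width_bits / 8) * (mem_clock_hz * 2) / 1e9  # GB/s

print(f"Fillrate: {fillrate_mpix:.0f} Mpixels/s")        # 1600 Mpixels/s
print(f"Memory bandwidth: {bandwidth_gbs:.1f} GB/s")     # 6.4 GB/s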

So far the R520 is nowhere to be found at the show; it’s looking like the rumors of a late R520 release may be true. ATI’s focus at Computex 2005 seems to be their multi-GPU chipset, which is due to be launched at the official opening of the show.
 
http://www.hardwarezone.com/articles/view.php?id=1603&cid=3&pg=1

The New NVIDIA G70

As Computex Taipei draws near, the Hardware Zone team here is hard at work bringing you the latest news and happenings right in the heart of all the excitement itself. Today, we'll whet your appetite with pictures of the latest NVIDIA G70 graphics card. Yes guys, it's here and it's coming! Time to throw out the old and prepare for a greater and more fluid graphics experience.

g70-front.jpg

The new NVIDIA G70 graphics card, which will be based on the PCI Express interface. The design of the card allows for up to 512MB of frame buffer, though this card has only 256MB of memory. Other components are pretty much similar to the current generation of graphics cards.

g70-powerconn.jpg

The card uses the less conventional 6-pin power connector. According to sources, this connector supplies up to 75W of power to run this baby. Better stock up on a high-powered PSU if you want one of these in your system.

g70-chip_memory.jpg

Here's a look at the chip and memory chips without the fan and heatsink. As said earlier, the card is designed for up to 16 memory chips (eight in front and another eight on the back). Currently this card uses only eight, which gives it a total of 256MB.
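
The memory math above is easy to sanity-check: 16 pad sites for a 512MB maximum implies 32MB per chip, so eight populated chips give 256MB. A tiny Python sketch of that arithmetic, assuming all sites take identical-density chips:

# Memory capacity arithmetic from the board description above
# (assumes all 16 pad sites take identical-density chips).
max_chips = 16                               # 8 front + 8 back
max_capacity_mb = 512                        # per the board design
per_chip_mb = max_capacity_mb // max_chips   # 32MB per chip
populated_chips = 8
onboard_mb = populated_chips * per_chip_mb   # 256MB

print(f"{per_chip_mb}MB per chip, {onboard_mb}MB on this card")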

g70-chip.jpg

Here's a close-up view of the future NVIDIA G70 graphics chip that would probably view Half Life 2 and Doom 3 as just 'a walk in the park'.

According to our sources, the card is hot, so hot that we did not even manage to perform a live run. The card is obviously damaged due to heat: upon closer inspection, we saw brown burn marks around the card, near the PCI Express connector and even at the fan plug. The 10-layer PCB which holds the GPU also suffered heat damage, and we saw blistering of the PCB in areas near the GPU.

The card will draw, *drumroll*, a total of 150W. That's even more than any regular desktop CPU we know. 75W will be drawn from the PCI Express interface, while another 75W will be supplied through an extra power connector on the card. This will pose a challenge to graphics card manufacturers as they start designing appropriate cooling solutions for their cards. It doesn't look like it's going to work as a single-slot solution, but we'll never know for sure. If NVIDIA is going to deliver the G70 in bulk, they'll need to get their manufacturing partner to further refine the manufacturing process and decrease its heat output. Sources tell us that the G70 is manufactured using a 110nm process and not 90nm as earlier speculated.
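
The 150W figure is simply the sum of the two supply paths described above. A minimal sketch of that power budget, using only the numbers reported in the article:

# Power budget per the article: PCIe slot plus auxiliary 6-pin connector.
slot_watts = 75         # drawn from the PCI Express interface
aux_6pin_watts = 75     # supplied through the extra power connector
total_board_watts = slot_watts + aux_6pin_watts   # 150W

print(f"Total board power: {total_board_watts}W")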
 

SantaC

Member
They need to make the graphics cards for PC gaming cheaper. I don't want to plunk down $400 for a new graphics card when the next-gen consoles will come in cheaper than that. I bought my GF6800 GT for $370 and I haven't really used it that much.
 
Chips like this drawing a whopping 150 watts tends to make one wonder about what next-generation systems are going to need to run. Take Xenos with its 48 pipelines, for example. What do you think it'll draw? 200 watts by itself, maybe more? The three 3.2GHz chips are certainly not going to be cheap power- and heat-wise either. 500-600 watt PSUs probably in both PS3 and X360 for sure, and who knows what the heat dissipation will be like.

Makes you think about what these guys will be losing initially on their manufacturing and what defect risks they're taking.
 

Lord Error

Insane For Sony
Chips like this drawing a whopping 150 watts tends to make one wonder about what next-generation systems are going to need to run. Take Xenos with its 48 pipelines, for example. What do you think it'll draw? 200 watts by itself, maybe more?
Those are not pipelines in the traditional sense. The 48 pipes used in R500 are different, and should roughly compare to G70's implementation (24+8 pipes) in performance, according to an ATI spokesperson. Just so that the comparison makes more sense: in terms of executed shader instructions, each of the 48 pipes in R500 can execute two instructions per cycle (vector + scalar), while each of the 24 pixel pipes in G70 can do 4+1 instructions (4x ALU op + 1x texture op). Also, the embedded RAM will help R500 achieve lower heat output, as writes to main RAM will be much reduced.
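
To make that per-clock comparison concrete, here's a small sketch using only the counts given above. It deliberately ignores G70's 8 vertex pipes and any clock-speed differences, so treat it as shader-op bookkeeping, not a performance claim:

# Per-clock shader instruction counts as described in the post above.
r500_pipes, r500_ops_per_pipe = 48, 2      # vector + scalar per pipe
g70_pixel_pipes, g70_ops_per_pipe = 24, 5  # 4x ALU op + 1x texture op per pipe

r500_per_clock = r500_pipes * r500_ops_per_pipe        # 96 ops/clock
g70_per_clock = g70_pixel_pipes * g70_ops_per_pipe     # 120 ops/clock (pixel pipes only)

print(f"R500: {r500_per_clock} ops/clock, G70: {g70_per_clock} ops/clock")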

The three 3.2GHz chips are certainly not going to be cheap power- and heat-wise either.
It's one 3.2GHz chip with three cores.
 

Mrbob

Member
150 watts for the power consumption? :lol Any elegance at all in the G70 design, Nvidia? :)


SantaCruZer said:
They need to make the graphics cards for PC gaming cheaper. I don't want to plunk down $400 for a new graphics card when the next-gen consoles will come in cheaper than that. I bought my GF6800 GT for $370 and I haven't really used it that much.

Don't. No reason you have to upgrade at the moment. I plan on sticking with my 9800PRO for the foreseeable future.
 

Pimpwerx

Member
It's a fact of life. GPUs are highly parallelized, and you strive to have most of those pipes running all the time. So the whole damn die gets hot. The clock speed is lower than a CPU's, so just wait until we hit 1GHz on these beasts.

BTW, I guess this pic answered my question about VRAM temps:

http://www.hardwarezone.com/img/data/articles/2005/1603/g70-front.jpg

That fansink covers the memory too. My GF2 Pro didn't even have heatsinks on the memory. Now I guess mere heatsinks aren't enough, so they need to blow air on them too? Goddamn, that's a hot card. I would expect the GPUs in the PS3 and 360 to be power-hungry beasts too. This is why I was arguing that the Rev's form factor is gonna be a problem for power. You can't expect this level of performance to come for free. Xenos and RSX will be on a smaller process, but they're both 500+MHz parts. They may be cooler, but we're talking relative to a very hot card here. In the coming years, expect support brackets and full-blown copper fansinks, on top of heat pumps (heat "pipes" if you will). I expect Intel will need to come up with a new case form factor and a new PSU standard sometime next year. Simply plopping one of these cards into the top slot is no longer gonna cut it. You need a dedicated flow of air for these cards, separate from the one the GPU will get. I'd guess future case standards will feature special ducted internals. PEACE.
 
Marconelly said:
Those are not pipelines in the traditional sense. The 48 pipes used in R500 are different, and should roughly compare to G70's implementation (24+8 pipes) in performance, according to an ATI spokesperson. Just so that the comparison makes more sense: in terms of executed shader instructions, each of the 48 pipes in R500 can execute two instructions per cycle (vector + scalar), while each of the 24 pixel pipes in G70 can do 4+1 instructions (4x ALU op + 1x texture op). Also, the embedded RAM will help R500 achieve lower heat output, as writes to main RAM will be much reduced.

I knew that this was roughly the case, and that the G70 didn't have general-purpose pipelines like the Xenos, but figured it was still a less complex part, plus 110nm vs. 90nm. I'd assume the Xenos GPU will still draw as much power as this, and likely more, but I'm still not sure how much; probably pretty similar, from your description. That clears things up a lot, thanks! :)

I'd think the Xenos GPU would still be more powerful in terms of general horsepower as well, mostly due to its on-die cache, although I can't imagine it being by much at all from what you're describing here.

It's one 3.2GHz chip with three cores.

That'll cool things down a bit and save some power consumption, but you're still talking about pretty serious wattage anyway. Maybe 110-150 watts, depending on efficiency and number of transistors? (A dual-core A64 X2 at 2.6GHz on 90nm is about 110 watts, according to Tom's Hardware.)

So, maybe 250-300 watts before you even add the system LSI and any peripherals! You're obviously ahead of me on the tech aspects, but that's a hell of a lot regardless.
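
For what it's worth, the estimate in this post adds up like this. All of these numbers are thread speculation, not measured figures:

# Speculative next-gen console power budget from the figures in this thread.
gpu_watts = 150                              # G70-class draw reported above
cpu_watts_low, cpu_watts_high = 110, 150     # tri-core 3.2GHz guess from this post
total_low = gpu_watts + cpu_watts_low        # 260W
total_high = gpu_watts + cpu_watts_high      # 300W

print(f"Estimated CPU+GPU draw: {total_low}-{total_high}W before other components")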
 