
NVIDIA Allegedly Begins Testing Its Fastest Next-Gen GPU, The AD102, For GeForce RTX 4090 Graphics Card, Features 24 Gbps GDDR6X Memory

tusharngf

Member
NVIDIA's next-gen AD102 GPU is currently being tested for the GeForce RTX 4090 graphics card, as reported by Kopite7kimi.

NVIDIA GeForce RTX 4090 Rumors: Flagship AD102 GPU Enters Test Phase, 24 Gbps GDDR6X Memory Ready

There has been speculation that NVIDIA might go for a GeForce RTX 50 series naming scheme instead of the expected GeForce RTX 40 series branding, but Kopite7kimi now states that NVIDIA has decided to stick with the 40 series naming. Beyond that, one big milestone is that NVIDIA may have already started testing and evaluating its flagship Ada Lovelace GPU, the AD102, which will power a series of graphics cards such as the RTX 4090 and the RTX 4080.



The leaked PCB design shows 12 memory solder points on the board, all compatible with Micron's GDDR6X memory. Higher-end cards might go with single-sided, dual-capacity modules, since those offer the best power/temperature balance, enabling capacities of up to 24 GB at higher speeds (up to 24 Gbps). As for the mainstream segment, we are likely to see 20 Gbps+ designs in 8 GB and up to 16 GB flavors, which can help reduce power since memory power regulation drops to 3 VRM phases.
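For context on those speeds: 12 GDDR6X modules give a 384-bit bus (each module has a 32-bit interface), and peak bandwidth is just pin speed times bus width. A minimal sketch of that arithmetic in Python, using the rumored figures above (not confirmed specs):

```python
# Peak GDDR6X bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte.
# The 24 Gbps and 12-module figures are the rumored values, not confirmed specs.

def peak_bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Return peak memory bandwidth in GB/s."""
    return pin_speed_gbps * bus_width_bits / 8

modules = 12                 # rumored solder points on the PCB
bus_width = modules * 32     # 32-bit interface per GDDR6X module -> 384-bit

print(peak_bandwidth_gb_s(24, bus_width))  # 1152.0 GB/s at 24 Gbps
print(peak_bandwidth_gb_s(21, 384))        # 1008.0 GB/s -- RTX 3090 Ti, for comparison
```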

As for cooling these monster PCBs, NVIDIA is reportedly going to reuse its triple-slot BFGPU design, while board partners will utilize 3.5-slot and even quad-slot cooling solutions weighing over 2 kg. Most AIBs might just end up utilizing AIO and hybrid cooling designs, something you will first see on the RTX 3090 Ti. The flagship NVIDIA GeForce RTX 4090 is expected to feature up to a 28-phase VRM design, so all that extra cooling will be put to good use.


NVIDIA Ada Lovelace & Ampere GPU Comparison

| Ada Lovelace GPU | SMs | CUDA Cores | Top SKU | Memory Bus | Ampere GPU | SMs | CUDA Cores | Top SKU | Memory Bus | SM Increase (% Over Ampere) |
|---|---|---|---|---|---|---|---|---|---|---|
| AD102 | 144 | 18432 | RTX 4090? | 384-bit | GA102 | 84 | 10752 | RTX 3090 Ti | 384-bit | +71% |
| AD103 | 84 | 10752 | RTX 4070? | 256-bit | GA103S | 60 | 7680 | RTX 3080 Ti | 256-bit | +40% |
| AD104 | 60 | 7680 | RTX 4060? | 192-bit | GA104 | 48 | 6144 | RTX 3070 Ti | 256-bit | +25% |
| AD106 | 36 | 4608 | RTX 4050 Ti? | 128-bit | GA106 | 30 | 3840 | RTX 3060 | 192-bit | +20% |
| AD107 | 24 | 3072 | RTX 4050? | 128-bit | GA107 | 20 | 2560 | RTX 3050 | 128-bit | +20% |
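The "% Over Ampere" column is just the ratio of SM counts; a quick sketch to reproduce it from the rumored figures above:

```python
# Recompute the "SM Increase (% Over Ampere)" column from the SM counts in the table.
pairs = {
    "AD102 vs GA102":  (144, 84),
    "AD103 vs GA103S": (84, 60),
    "AD104 vs GA104":  (60, 48),
    "AD106 vs GA106":  (36, 30),
    "AD107 vs GA107":  (24, 20),
}

for name, (ada_sms, ampere_sms) in pairs.items():
    increase = (ada_sms / ampere_sms - 1) * 100
    print(f"{name}: +{increase:.0f}%")   # +71%, +40%, +25%, +20%, +20%
```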


NVIDIA GeForce RTX 4090 'AD102 GPU' Graphics Card PCB Design - What We Know

The NVIDIA GeForce RTX 40 series graphics cards equipped with the AD102 GPU are expected to offer TDPs of up to 600W. That is at least what the current BIOS shipping to board partners is rated at, so the rumors of 450-600W TDPs might be true, but we haven't yet seen final figures. Power ratings are usually on the high side during the testing phase, so they could be optimized by the time the cards actually launch. The cards will be outfitted with the PCIe Gen 5 power connector and will ship with a 4 x 8-pin to 1 x 16-pin adapter to support the huge power draw. The upcoming GeForce RTX 3090 Ti itself will ship with a 3 x 8-pin to 1 x 16-pin adapter.
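The adapter arithmetic lines up with the rumored TDPs: each 8-pin PCIe connector is rated for 150W, and the 16-pin 12VHPWR connector for up to 600W. A quick sketch (slot power of up to 75W left out for simplicity):

```python
# Power deliverable through the rumored N x 8-pin to 1 x 16-pin adapters,
# using standard connector ratings (8-pin PCIe: 150W; 16-pin 12VHPWR: up to 600W).
PIN8_WATTS = 150

def adapter_budget(num_8pin: int) -> int:
    """Watts deliverable through an N x 8-pin to 1 x 16-pin adapter."""
    return num_8pin * PIN8_WATTS

print(adapter_budget(4))  # 600W -- matches the rumored RTX 4090 ceiling
print(adapter_budget(3))  # 450W -- the RTX 3090 Ti's adapter
```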

More : https://wccftech.com/nvidia-geforce...ics-card-testing-begins-24-gbps-gddr6x-rumor/
 

SlimySnake

Flashless at the Golden Globes
I was hoping for a complete redesign, but they seem to be just throwing more CUDA cores at the problem. We know from Turing to Ampere that increasing CUDA cores by 3x only resulted in a performance increase of around 2x. 18k is nuts. 600 watts is nuts. The 3080 12 GB is already 400 watts. AMD matched the 3080's standard rasterization performance with just 4,600 shader cores. Even with that large Infinity Cache, it routinely held a 50 watt differential, staying around 250-270 watts in most games.
 
Last edited:

M1chl

Currently Gif and Meme Champion
The TDP is insane, I hope they will optimize it a bit. I already have the space heater called 3090, and beyond the electricity bill, what's sometimes a real problem for me is getting the heat out of the case properly. Over longer periods, when I do some ML work, I put it on an open bench... but then it's fucking noisy since it's out in the open, and reeeeee
 

Pagusas

Elden Member
So excited, my 3090 is sweating knowing this is coming. Planning on swapping the 3090 into an eGPU enclosure for my wife's gaming laptop, then building a new ITX build with a water-cooled 4090 + Intel or AMD's next best chip. Time to turn this 5950X into a satellite rendering processor in the closet.

The TDP is insane, I hope they will optimize it a bit. I already have the space heater called 3090, and beyond the electricity bill, what's sometimes a real problem for me is getting the heat out of the case properly. Over longer periods, when I do some ML work, I put it on an open bench... but then it's fucking noisy since it's out in the open, and reeeeee

The 3090 LOVES large water blocks, and I'm sure the 4090 will too. Hours of 4K gaming and my load temp never cracks 50 degrees. Throw your next one on water.
 
Last edited:

FireFly

Member
I was hoping for a complete redesign, but they seem to be just throwing more CUDA cores at the problem. We know from Turing to Ampere that increasing CUDA cores by 3x only resulted in a performance increase of around 2x. 18k is nuts. 600 watts is nuts. The 3080 12 GB is already 400 watts. AMD matched the 3080's standard rasterization performance with just 4,600 shader cores. Even with that large Infinity Cache, it routinely held a 50 watt differential, staying around 250-270 watts in most games.
Last time they doubled the number of CUDA cores per SM. This time they are keeping the ratio the same, but increasing the number of SMs.
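Concretely: Ampere went from 64 to 128 FP32 cores per SM, and the rumored Ada specs keep 128 per SM while scaling the SM count. A quick sketch with full-die figures (the AD102 numbers are the rumored ones from the table above):

```python
# CUDA core count = SMs * FP32 cores per SM (full-die figures).
print(72 * 64)    # 4608  -- TU102 (Turing): 64 FP32 cores per SM
print(84 * 128)   # 10752 -- GA102 (Ampere): doubled to 128 cores per SM
print(144 * 128)  # 18432 -- AD102 (rumored): same 128 per SM, ~71% more SMs
```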
 

M1chl

Currently Gif and Meme Champion
So excited, my 3090 is sweating knowing this is coming. Planning on swapping the 3090 into an eGPU enclosure for my wife's gaming laptop, then building a new ITX build with a water-cooled 4090 + Intel or AMD's next best chip. Time to turn this 5950X into a satellite rendering processor in the closet.



The 3090 LOVES large water blocks, and I'm sure the 4090 will too. Hours of 4K gaming and my load temp never cracks 50 degrees. Throw your next one on water.
Like AIO or custom loop?
 

M1chl

Currently Gif and Meme Champion
I run a custom loop because they are fun to build, especially when doing ITX builds, but even an AIO would do wonders on any modern GPU. The large die sizes and spread-out nature of the boards make them love large block cooling.
Do you have some recommendation on a block? Preferably one that does not leak?

I am going to keep this card, probably for longer than it's going to be my main card. I do a lot of things that are continuous and long-running and require the GPU. I can build things, so why not a custom loop, but I totally don't know what I should be buying. Do you know some good brands of components? I did a "custom loop" in my new house (heating radiators in the rooms), so meh, I can do this easily.

So do you have some recommendations on hand?
 
Going to get the best GPU that will be solid with my 750W PSU.

So the 4070 is probably my limit. Hopefully NVIDIA doesn't cheap out on VRAM capacity this time, because the only generous SKU in that regard was my 12 GB 3060.
 

RoboFu

One of the green rats
[GIF: electric meter spinning]
 

clem84

Gold Member
I was hoping for a complete redesign, but they seem to be just throwing more CUDA cores at the problem. We know from Turing to Ampere that increasing CUDA cores by 3x only resulted in a performance increase of around 2x. 18k is nuts. 600 watts is nuts. The 3080 12 GB is already 400 watts. AMD matched the 3080's standard rasterization performance with just 4,600 shader cores. Even with that large Infinity Cache, it routinely held a 50 watt differential, staying around 250-270 watts in most games.
How hot will this thing run with 600W of power pumping through it? I have a feeling this might shorten the GPU's lifespan.
 

Pagusas

Elden Member
Do you have some recommendation on a block? Preferably one that does not leak?

I am going to keep this card, probably for longer than it's going to be my main card. I do a lot of things that are continuous and long-running and require the GPU. I can build things, so why not a custom loop, but I totally don't know what I should be buying. Do you know some good brands of components? I did a "custom loop" in my new house (heating radiators in the rooms), so meh, I can do this easily.

So do you have some recommendations on hand?

I've never had a single block leak, like ever. Just be smart, and if you are building a custom loop, leak test it by pressurizing the loop for a few hours and making sure no air leaks out.

For prebuilts, it will entirely depend on what make of GPU you get, but for custom loops I tend to buy EVGA GPUs and love EKWB blocks. CPU blocks are all over the place and most are within a few degrees of each other, so buy what you think looks nice and fits your layout. I personally loved my Tech-N block but moved to an Optimus water block. Get a D5 pump if you can fit it, and off you go.

I'm tearing my O11 Mini apart right now as I type this to clean it out and swap in a solid liquid.
 
Last edited:

M1chl

Currently Gif and Meme Champion
I've never had a single block leak, like ever. Just be smart, and if you are building a custom loop, leak test it by pressurizing the loop for a few hours and making sure no air leaks out.

For prebuilts, it will entirely depend on what make of GPU you get, but for custom loops I tend to buy EVGA GPUs and love EKWB blocks. CPU blocks are all over the place and most are within a few degrees of each other, so buy what you think looks nice and fits your layout. I personally loved my Tech-N block but moved to an Optimus water block. Get a D5 pump if you can fit it, and off you go.

I'm tearing my O11 Mini apart right now as I type this to clean it out and swap in a solid liquid.
I only buy EVGA normally, but for the 3090 I own the FE version, so hopefully this won't be a bitch to find.

I am heading to the shops based on this, thank you very much. May your temps be low and framerate high, my friend.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Jesus.
Over a TB/s of memory bandwidth + nearly 100 MB of L2 cache.
Paired with a good implementation of DirectStorage... damn damn damn, games are gonna be flying.
That thing is going to cost an arm and a leg.

If the xx70 really is AD103, imma shit a brick.
Cuz that's gonna be some serious gains gen-on-gen.
64 MB of L2 cache... the rumors might be right that the xx70 could beat the 3090.
Here's to hoping mining has another crash just in time for launch.


 

Pagusas

Elden Member
Jesus.
Over a TB/s of memory bandwidth + nearly 100 MB of L2 cache.
Paired with a good implementation of DirectStorage... damn damn damn, games are gonna be flying.
That thing is going to cost an arm and a leg.

If the xx70 really is AD103, imma shit a brick.
Cuz that's gonna be some serious gains gen-on-gen.
64 MB of L2 cache... the rumors might be right that the xx70 could beat the 3090.
Here's to hoping mining has another crash just in time for launch.
Yeah seriously, that L2 cache is going to cost a metric ton even for the manufacturers; that 4090 100% will have a $1999+ price if those specs are true.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
So **80 is AD102 again. 🤞
The xx80 is almost certainly bad-yield AD102s that can't meet the xx90 and xx90 Ti standard.
It would be silly to have AD102 power only the range-topping cards, cuz they'd never be able to meet yield, so might as well have 70-80% AD102s power the xx80.
Nvidia is like Apple when it comes to being stingy. Up the VRAM for fuck's sake.
From the rumblings, the lowest launch(-window) cards will be 12 GB... then 16 GB for 256-bit... then 20 GB for the xx80.
The xx50 might be the only below-8 GB card... but the xx60 might also be 8 GB, which I know people will call a backslide versus the 3060, but it will be much faster than the 3060.
 
The xx80 is almost certainly bad-yield AD102s that can't meet the xx90 and xx90 Ti standard.
It would be silly to have AD102 power only the range-topping cards, cuz they'd never be able to meet yield, so might as well have 70-80% AD102s power the xx80.

From the rumblings, the lowest launch(-window) cards will be 12 GB... then 16 GB for 256-bit... then 20 GB for the xx80.
The xx50 might be the only below-8 GB card... but the xx60 might also be 8 GB, which I know people will call a backslide versus the 3060, but it will be much faster than the 3060.
That's awesome considering how close the 3080 was to the very top cards; there were rumors that the new **80 would be AD103 instead of an AD102 with a few SMs disabled, and half(?) the VRAM.
 

Reallink

Member
The xx80 is almost certainly bad-yield AD102s that can't meet the xx90 and xx90 Ti standard.
It would be silly to have AD102 power only the range-topping cards, cuz they'd never be able to meet yield, so might as well have 70-80% AD102s power the xx80.

From the rumblings, the lowest launch(-window) cards will be 12 GB... then 16 GB for 256-bit... then 20 GB for the xx80.
The xx50 might be the only below-8 GB card... but the xx60 might also be 8 GB, which I know people will call a backslide versus the 3060, but it will be much faster than the 3060.

LOLNO, you're living in Candyland if you believe selling out of $1000+ GPUs within 5 seconds for 2 years hasn't altered their 4XXX product stack. Take the "historical norm" expectation and shift it down a tier. The AD103 is the 4080. Full AD102s will be either a Titan or a 4090, while faulty AD102s will be either a 4090 (under the Titan) or a 4080 Ti (under the 4090). There is no reality where NVIDIA is going to sell a card twice as fast as a 3080 for "$699", or even "$799" or "$899". The full AD102 card will be $2000, the cut-down AD102 card $1500. The AD103 Founders Edition will likely be unveiled at "$799" with 10 units available, while the remaining 99.9% of inventory are AIB cards priced between $900 and $1000. And they will sell every card they can make while posting record profits.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That's awesome considering how close the 3080 was to the very top cards; there were rumors that the new **80 would be AD103 instead of an AD102 with a few SMs disabled, and half(?) the VRAM.
Yeah, I saw the AD103 rumors for the xx80.
But that would be a huge, huge gap between the xx80 and the xx80 Ti/xx90, or whatever numbering they go with.
If NVIDIA is being greedy and wants to claim the "greatest Ti ever" by gimping the xx80 down to AD103, that's super fucked up.
It would also mean a pretty small gen-on-gen jump for the xx80 in terms of CUDA cores... AD103 at full tilt has 10752, while the RTX 3080 has 8960 (round it off to 9000)... assuming some headroom for yield, an AD103 xx80 would basically gain only about 1000 CUDA cores over the previous gen.
And it means the xx70 probably won't be as huge a leap as I'm hoping for.
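His arithmetic checks out; a quick sketch of it (the ~4% disabled-for-yield figure is an illustrative assumption, not from the rumor):

```python
# Gen-on-gen CUDA core gain if the xx80 moved to a cut-down AD103.
ad103_full = 10752   # rumored full AD103 (see table above)
rtx_3080   = 8960    # RTX 3080 12 GB (cut-down GA102)

yield_cut  = 0.96    # assume ~4% of cores disabled for yield (illustrative)
ad103_xx80 = int(ad103_full * yield_cut)

print(ad103_full - rtx_3080)  # 1792 -- the gain even at full tilt
print(ad103_xx80 - rtx_3080)  # ~1361 -- roughly the "only ~1000 cores" he describes
```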

LOLNO, you're living in Candyland if you believe selling out of $1000+ GPUs within 5 seconds for 2 years hasn't altered their 4XXX product stack. Take the "historical norm" expectation and shift it down a tier. The AD103 is the 4080. Full AD102s will be either a Titan or a 4090, while faulty AD102s will be either a 4090 (under the Titan) or a 4080 Ti (under the 4090).
Bookmarking this post.
Got another similar post bookmarked, cuz I truly believe NVIDIA is not going to fuck us by making the xx80 AD103 and the xx80 Ti/90/Titan AD102.
If I'm wrong, I'll be sure to tag you.
I'm not one to shy away from admitting when I dun goofed.

RDNA3 is coming with the heat... NVIDIA can't afford to drop the ball. AMD will have improved their RT implementation, and they are already ahead in terms of raster performance... RDNA3 is supposedly 2.5x RDNA2... NVIDIA shouldn't play around by gimping the xx80.

P.S. As long as AD103 is still the xx70, I think I'll be okay.
 
It would also mean a pretty small gen-on-gen jump for the xx80 in terms of CUDA cores... AD103 at full tilt has 10752, while the RTX 3080 has 8960 (round it off to 9000)... assuming some headroom for yield, an AD103 xx80 would basically gain only about 1000 CUDA cores over the previous gen.
Yeah, that would look extremely poor on their side.
 

jigglet

Banned
Insane. I want to upgrade soon; I usually buy the xx60 of a new generation. It never makes sense to spend 3x for an xx90-series card that you hold onto with an iron grip for 5 years when you could replace an xx60 card every 2-3 years and still end up financially ahead.
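As a rough sanity check on that cadence math, a sketch with illustrative prices (the $1500/$400 MSRPs and the resale value are assumptions for the example, not figures from the thread):

```python
# Annualized cost of two upgrade strategies (illustrative prices, no resale for the xx90).
XX90_PRICE, XX90_YEARS = 1500, 5     # buy a flagship once, keep it 5 years
XX60_PRICE, XX60_YEARS = 400, 2.5    # replace a mid-range card every 2-3 years
XX60_RESALE = 150                    # assumed resale value of each old xx60

xx90_per_year = XX90_PRICE / XX90_YEARS                  # 300.0
xx60_per_year = (XX60_PRICE - XX60_RESALE) / XX60_YEARS  # 100.0

print(xx90_per_year, xx60_per_year)  # the xx60 cadence comes out well ahead
```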
 

Reallink

Member
Yeah, I saw the AD103 rumors for the xx80.
But that would be a huge, huge gap between the xx80 and the xx80 Ti/xx90, or whatever numbering they go with.
If NVIDIA is being greedy and wants to claim the "greatest Ti ever" by gimping the xx80 down to AD103, that's super fucked up.
It would also mean a pretty small gen-on-gen jump for the xx80 in terms of CUDA cores... AD103 at full tilt has 10752, while the RTX 3080 has 8960 (round it off to 9000)... assuming some headroom for yield, an AD103 xx80 would basically gain only about 1000 CUDA cores over the previous gen.
And it means the xx70 probably won't be as huge a leap as I'm hoping for.

Bookmarking this post.
Got another similar post bookmarked, cuz I truly believe NVIDIA is not going to fuck us by making the xx80 AD103 and the xx80 Ti/90/Titan AD102.
If I'm wrong, I'll be sure to tag you.
I'm not one to shy away from admitting when I dun goofed.

RDNA3 is coming with the heat... NVIDIA can't afford to drop the ball. AMD will have improved their RT implementation, and they are already ahead in terms of raster performance... RDNA3 is supposedly 2.5x RDNA2... NVIDIA shouldn't play around by gimping the xx80.

P.S. As long as AD103 is still the xx70, I think I'll be okay.

Yeah, that would look extremely poor on their side.

They already did this with the 2080, which ironically was also a series developed and planned during a crypto boom amid perpetual GPU sell-outs and shortages. There is no downside for NVIDIA here. If demand flatlines for whatever reason and sales tank, they can self-correct with price drops or Super/Ti rebadges of everything to get back in line. AMD holds single-digit dGPU market share; they would have to release cards 50-100% faster than the 4XXXs at several hundred dollars less to claw even 20% or 30% away from NVIDIA. NVIDIA is a literal monopoly in dedicated gaming GPUs; people don't realize how utterly insignificant AMD truly is in this space, they effectively don't even exist.
 
Last edited:

Celcius

°Temp. member
Hmm, I may just keep my 3090 and skip next gen, since my current card is still a beast at 4K and has enough VRAM to last a long time.

Do you guys plan to get the 4090 at launch or wait for the inevitable 4090 Ti?
 

LiquidMetal14

hide your water-based mammals
I am BlackRock and can afford every card as it comes.

And I can offset the cost by selling my 3090. 5th-world problems here on the imperial ship.
 

ahtlas7

Member
Hmm, I may just keep my 3090 and skip next gen, since my current card is still a beast at 4K and has enough VRAM to last a long time.

Do you guys plan to get the 4090 at launch or wait for the inevitable 4090 Ti?

Rumor is, the 4090 will not be at launch.
 
Last edited: