
Next-Gen NVIDIA GeForce RTX 4090 With Top AD102 GPU Could Be The First Gaming Graphics Card To Break Past 100 TFLOPs

kiphalfton

Member
Every generation people get their hopes up. I expect it will be like Turing and it will be a small improvement. Especially since, like Pascal, Ampere is coming hot off the heels of a huge cryptomining surge.
 

Celcius

°Temp. member
I'm still wondering why we don't have video RAM in the 64GB range or even higher.
I think that alone would solve most people's problems for certain things.
That would make the cards much more expensive to produce, and much more expensive to buy.
I paid $2k for my RTX 3090, and that's with only 24GB of VRAM. Would you pay $4,000 MSRP?
Plus, Nvidia wants to make as much profit as they can, so they want to give you as little as they can to get the job done.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That would make the cards much more expensive to produce, and much more expensive to buy.
I paid $2k for my RTX 3090, and that's with only 24GB of VRAM. Would you pay $4,000 MSRP?
Plus, Nvidia wants to make as much profit as they can, so they want to give you as little as they can to get the job done.
The A6000 is 48GB and costs about 5000 dollars.
Quadros have the Quadro tax applied; I'd think a 48GB RTX 40x0 should cost less than 3000 dollars (realistically it should be 2000 dollars).
Considering people were/are willing to spend 2000 dollars on a GPU that's marginally faster than the RTX 3080, I don't see why they wouldn't for a 48+GB VRAM card.
 

Buggy Loop

Member
I'm still wondering why we don't have video RAM in the 64GB range or even higher.
I think that alone would solve most people's problems for certain things.

That's going against the trend of the industry, where SSD + IO management drip-feeds exactly what is needed for the scene and keeps barely any assets waiting in VRAM.

Movie studios have special cards for their needs, so can you clarify « most people's problems »?
 

Clear

CliffyB's Cock Holster
If your rig is pulling a kilowatt at peak load then the running costs are going to be pretty severe. Especially if it's hooked up to a big TV and a surround setup... Yikes.
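Back-of-the-envelope (the 4 hours/day of gaming and the $0.15/kWh rate here are assumptions; prices vary a lot by region):

# Rough running-cost sketch; the draw, hours, and rate are all assumed figures.
draw_kw = 1.0        # peak system draw in kilowatts
hours_per_day = 4    # assumed daily gaming time
rate_per_kwh = 0.15  # assumed electricity price in $/kWh

monthly_cost = draw_kw * hours_per_day * 30 * rate_per_kwh
print(f"~${monthly_cost:.2f}/month")  # ~$18.00/month at these assumptions

The cost scales linearly with the rate, so at current European prices that number can easily double or triple.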
 

Klik

Member
If the RTX 4080 ends up being as powerful as 2x a 3080, is it still a good idea to go for a 1440p/144Hz monitor?

I mean, 4K is nice, but I think 4K at 60-100fps with the RTX 4080 will not be possible (for new games). Maybe in 2 years with the RTX 5080.
 
Last edited:

Dream-Knife

Banned
The A6000 is 48GB and costs about 5000 dollars.
Quadros have the Quadro tax applied; I'd think a 48GB RTX 40x0 should cost less than 3000 dollars (realistically it should be 2000 dollars).
Considering people were/are willing to spend 2000 dollars on a GPU that's marginally faster than the RTX 3080, I don't see why they wouldn't for a 48+GB VRAM card.
Like the 24GB in the 3090, it would largely just be a waste.

I wouldn't be surprised if Nvidia doesn't bump the VRAM in these next cards. Casuals are attracted to bigger numbers, and AMD did well with that.
 

DukeNukem00

Banned
If the RTX 4080 ends up being as powerful as 2x a 3080, is it still a good idea to go for a 1440p/144Hz monitor?

I mean, 4K is nice, but I think 4K at 60-100fps with the RTX 4080 will not be possible (for new games). Maybe in 2 years with the RTX 5080.

A 3080 can do 4K/60 on pretty much every game released, with some tweaks where required. If the 4080 is twice as fast, that means the games where the 3080 does 60, the 4080 can do 120. We're also expecting much improved ray tracing. So we'll see. 4K at 100 frames should definitely be doable in most next-gen games. I can't wait for these cards to come out.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Like the 24gb in the 3090 it would largely just be a waste.

I wouldn't be surprised if Nvidia doesn't bump the vram in these next cards. Casuals are attracted to bigger numbers, and amd did well with that.
Yeah, all the rumors point towards the xx80 and xx70 getting VRAM bumps but with a reduced memory interface. The new caches are massive, though, so they should make up for the narrower bus in gaming scenarios. Hopefully that also means the cards aren't that good for mining, considering the bumped-up TDPs... so we are hopefully good.
The xx60 is supposedly going to be on a 128-bit interface and will likely only have 8GB of VRAM.

 

Hezekiah

Banned
If the RTX 4080 ends up being as powerful as 2x a 3080, is it still a good idea to go for a 1440p/144Hz monitor?

I mean, 4K is nice, but I think 4K at 60-100fps with the RTX 4080 will not be possible (for new games). Maybe in 2 years with the RTX 5080.
What? Absolutely no chance. Where did you hear that, an Nvidia press release?
 
Last edited:

GreatnessRD

Member
Whilst this is exciting for most, I'm much more intrigued to see the performance per watt and how it compares to AMD and Apple.
It will be intriguing to see, for sure. Nvidia is just gonna brute-force the powah! I personally thought AMD was gonna outright win this time around with RDNA3, but I've backed down from that claim, lol.
 

Hezekiah

Banned
Hmm, I guess with the RTX 4080 we can hope for about a 40%-50% increase over the RTX 3080?
Yeah, something like that.

Also, we know teraflop count isn't really relevant when comparing.

I'm desperate to get a 4080 (for under a grand, anyway), but I'm expecting a significant though not massive jump from the 3000 series.
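For anyone curious why the headline number balloons anyway: FP32 TFLOPs is just 2 ops per core per clock, times cores, times clock. A minimal sketch with the rumored AD102 figures (18,432 CUDA cores at ~2.75 GHz boost; neither is confirmed) reproduces the "100 TFLOPs" claim:

# FP32 TFLOPs = 2 (an FMA counts as two ops) x cores x clock (GHz) / 1000.
# The AD102 numbers below are rumors, not confirmed specs.
def fp32_tflops(cores: int, clock_ghz: float) -> float:
    return 2 * cores * clock_ghz / 1000

print(fp32_tflops(18432, 2.75))  # ~101.4 -> the "100 TFLOPs" headline
print(fp32_tflops(8704, 1.71))   # ~29.8  -> RTX 3080, for scale

That's a 3.4x paper gap that nobody expects to show up in frame rates, which is the point.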
 

DukeNukem00

Banned
What? Absolutely no chance. Where did you hear that, an Nvidia press release?



Leaks have consistently pointed to at least 2x over Ampere for more than a year. This is from yesterday. Of course it's more than possible. This isn't even the full chip. We're getting a massive node jump, massive cache, and other architectural improvements. Jumps of double or more in performance have happened multiple times in history.

A jump of only 40% over the 3080 is literally impossible. A 3090 Ti is right NOW 25% over the 3080; a +40% 4080 would be a mere ~12% faster than that (1.40 / 1.25 ≈ 1.12). Not even Turing was that bad, and it was only ~30% over the 1080 Ti. The node jump alone, plus the extra power draw, would give more than 40% discounting everything else. Relax, this launch is gonna be one of the biggest jumps since GPUs have existed.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Yeah, something like that.

Also, we know teraflop count isn't really relevant when comparing.

I'm desperate to get a 4080 (for under a grand, anyway), but I'm expecting a significant though not massive jump from the 3000 series.
If they make the MSRP 1000 dollars, we are fucked.
Assuming they don't goof and have low stock, prices should be close to MSRP not too long after launch.

If the whole July paper-launch rumor is true, then by the end of the year I should have saved enough to get a 4080... assuming it's enough of a jump over the 3080, 'cuz you know people are gonna offload those in droves. I could live with a 3080, as I "only" play at 3440x1440.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not


Leaks have consistently pointed to at least 2x over Ampere for more than a year. This is from yesterday. Of course it's more than possible. This isn't even the full chip. We're getting a massive node jump, massive cache, and other architectural improvements. Jumps of double or more in performance have happened multiple times in history.

A jump of only 40% over the 3080 is literally impossible. A 3090 Ti is right NOW 25% over the 3080; a +40% 4080 would be a mere ~12% faster than that (1.40 / 1.25 ≈ 1.12). Not even Turing was that bad, and it was only ~30% over the 1080 Ti. The node jump alone, plus the extra power draw, would give more than 40% discounting everything else. Relax, this launch is gonna be one of the biggest jumps since GPUs have existed.

The gap between the 4080 and 4090 is absolutely massive.
Nvidia is seemingly going back to xx102 being reserved for Ti/Titan/90-class cards.
The 4080 is just over half a 4090.
The 4080 is highly unlikely to match the gen-on-gen improvement that the 3090-to-4090 jump might enjoy.

As is, the 4080 might only just beat the 3090 Ti.
We will have to see what the new large cache does for actual performance.
 

Knightime_X

Member
That's going against the trend of the industry, where SSD + IO management drip-feeds exactly what is needed for the scene and keeps barely any assets waiting in VRAM.

Movie studios have special cards for their needs, so can you clarify « most people's problems »?
Maxing out sliders.
You hit the VRAM limit pretty fast in some games, like the Resident Evil 2 remake.
 

Buggy Loop

Member
Maxing out sliders.
You hit the VRAM limit pretty fast in some games, like the Resident Evil 2 remake.

No.

Especially not on the basis of the in-game meter (totally borked).

All settings maxed, texture quality high, 4K.

The game will tell you it requires 13.8GB.

Oh my god! I better buy an AMD then!!!

Oh wait, MSI Afterburner shows 7851 MB allocated and only 7001 MB actually used.

Even less for RE3, which hovers at 5.7GB utilization.

There are a ton of examples like this.

And again, this is before DirectStorage + Sampler Feedback + RTX IO; right now games basically have their VRAM bogged down with assets sitting idle. We're moving away from that. It's clear.
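If you want to check this yourself outside of Afterburner, here's a minimal sketch using NVIDIA's NVML Python bindings (pip install pynvml). It reads back what the driver has actually committed on the device, not the game menu's estimate:

# Minimal VRAM readout via NVML; note this is device-level, so it covers
# everything resident on the GPU, not just one game's allocations.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"used {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()

Run it while the game is up and compare against the in-game estimate.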
 

Hezekiah

Banned


Leaks have consistently pointed to at least 2x over Ampere for more than a year. This is from yesterday. Of course it's more than possible. This isn't even the full chip. We're getting a massive node jump, massive cache, and other architectural improvements. Jumps of double or more in performance have happened multiple times in history.

A jump of only 40% over the 3080 is literally impossible. A 3090 Ti is right NOW 25% over the 3080; a +40% 4080 would be a mere ~12% faster than that (1.40 / 1.25 ≈ 1.12). Not even Turing was that bad, and it was only ~30% over the 1080 Ti. The node jump alone, plus the extra power draw, would give more than 40% discounting everything else. Relax, this launch is gonna be one of the biggest jumps since GPUs have existed.

You know we're talking about the 4080, right?

I am really looking forward to seeing what they deliver, but I don't see any chance the 4080 offers double the performance of the 3080.
 

Knightime_X

Member
OK, what does that have to do with wanting max settings?
I didn't include resolution because everyone is using something different, be it 1080p to 4K or whatever.
For me, I want to see all sliders to the right while running with excellent performance.
I get fairly annoyed when I hit VRAM limits, which happens more and more often, especially in newer games.

But whatever.
The way to solve this is to keep upgrading your GPU (by which I mean grab whatever is newest).
What caps you out today will be the minimum requirement tomorrow.
 
Last edited:

Hezekiah

Banned
If they make the MSRP 1000 dollars, we are fucked.
Assuming they don't goof and have low stock, prices should be close to MSRP not too long after launch.

If the whole July paper-launch rumor is true, then by the end of the year I should have saved enough to get a 4080... assuming it's enough of a jump over the 3080, 'cuz you know people are gonna offload those in droves. I could live with a 3080, as I "only" play at 3440x1440.
I play at 3440x1440 as well, but any 3080 would have to be the 12GB version, at way under £649, even if the 4080 sees a price bump.

And I would have reservations about a used one, as I don't like buying second-hand electronics; who knows how much it's been ragged.
 

Buggy Loop

Member
OK, what does that have to do with wanting max settings?
I didn't include resolution because everyone is using something different, be it 1080p to 4K or whatever.
For me, I want to see all sliders to the right while running with excellent performance.
I get fairly annoyed when I hit VRAM limits, which happens more and more often, especially in newer games.

But whatever.
The way to solve this is to keep upgrading your GPU (by which I mean grab whatever is newest).
What caps you out today will be the minimum requirement tomorrow.

Are you reading the replies? The game's estimated VRAM is wrong. There are tools to measure it correctly, and games are typically way off. There's also a big difference between ALLOCATED and UTILIZED.

Resolution does affect VRAM utilization, but not by much. 4K is basically the default « all I'll ever need current gen » VRAM usage. The lower the resolution, the lower the required memory.
 
Last edited:

BlueHawk

Neo Member
It will be intriguing to see, for sure. Nvidia is just gonna brute-force the powah! I personally thought AMD was gonna outright win this time around with RDNA3, but I've backed down from that claim, lol.
I guess my reasoning on why I prefer performance per watt over brute power is that I feel more people will eventually want smaller, more efficient devices; perhaps the Steam Deck could lead the way. Big-tower, high-power PC gaming has been on a decline for a while, and for it to succeed surely performance per watt is the way to go?
 

Reallink

Member


Leaks have consistently pointed to at least 2x over Ampere for more than a year. This is from yesterday. Of course it's more than possible. This isn't even the full chip. We're getting a massive node jump, massive cache, and other architectural improvements. Jumps of double or more in performance have happened multiple times in history.

A jump of only 40% over the 3080 is literally impossible. A 3090 Ti is right NOW 25% over the 3080; a +40% 4080 would be a mere ~12% faster than that (1.40 / 1.25 ≈ 1.12). Not even Turing was that bad, and it was only ~30% over the 1080 Ti. The node jump alone, plus the extra power draw, would give more than 40% discounting everything else. Relax, this launch is gonna be one of the biggest jumps since GPUs have existed.


The 2080 literally traded blows with a 1080 Ti in rasterization. The 3080 was a 3090 chip with a few faulty cores disabled. The 4080 is the 3070-class chip, because you idiots bought out $1500+ GPUs like they were free candy. You get arguably the largest piece of the puzzle in the CUDA core count: +20% cores, plus a probable if not guaranteed clock bump, plus memory changes and architectural improvements. A multi-game average of +40% is optimistic if anything; performance doesn't scale linearly with more cores and higher clocks.
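To put rough numbers on that last point, a toy model (the +20% cores is the rumor above; the clock bump and the 0.7 "fraction of paper gains that shows up in games" factor are pure assumptions for illustration):

# Toy sublinear-scaling model; the spec deltas and efficiency are assumptions.
core_gain = 1.20    # rumored +20% CUDA cores for the 4080
clock_gain = 1.15   # assumed clock bump
scaling_eff = 0.7   # assumed fraction of raw gains realized in games

raw = core_gain * clock_gain              # ~1.38x on paper
realized = 1 + (raw - 1) * scaling_eff    # ~1.27x in actual games
print(f"paper: {raw:.2f}x, realized: ~{realized:.2f}x")

Which is exactly why a multi-game +40% average can be "optimistic" even when the paper specs look bigger.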

I guess my reasoning on why I prefer performance per watt over brute power is that I feel more people will eventually want smaller, more efficient devices; perhaps the Steam Deck could lead the way. Big-tower, high-power PC gaming has been on a decline for a while, and for it to succeed surely performance per watt is the way to go?

High-power PC gaming has actually been on a dramatic incline for several years now. The Steam Deck is still in the low hundreds of thousands of units, while 3070-and-higher-tier desktop chips (i.e. new $800+ high-end GPUs) have sold around 6 million units. The share of Steam Deck buyers who also own a "high power PC" is probably more than 95%; they're the same customer, in other words. The Steam Deck hasn't even begun, or pretended, to demonstrate demand outside of hardcore PC gamers.
 
Last edited:
Yeah, all the rumors point towards the xx80 and xx70 getting VRAM bumps but with a reduced memory interface. The new caches are massive, though, so they should make up for the narrower bus in gaming scenarios. Hopefully that also means the cards aren't that good for mining, considering the bumped-up TDPs... so we are hopefully good.
The xx60 is supposedly going to be on a 128-bit interface and will likely only have 8GB of VRAM.

F, are they really gonna try to sell the 4070 as a 4080 and the 4060 as a 4070? If true, then it's a really greedy move :messenger_angry: I'd expect a 4080 Ti to come out the door very soon after.
 

Dream-Knife

Banned
F, are they really gonna try to sell the 4070 as a 4080 and the 4060 as a 4070? If true, then it's a really greedy move :messenger_angry: I'd expect a 4080 Ti to come out the door very soon after.
The x80 series has historically been the xx104 chip.

The 3070 Ti is actually the real 3080.
 

Pagusas

Elden Member
I was planning on buying this card for Starfield. With Starfield delayed, I can't think of any game I'd play with it. I wonder if I'm buying it just to buy it... Hmmm... And by the time Starfield comes out, a Ti version of the 4090 might be out.
 
The x80 series has historically been the xx104 chip.

The 3070 Ti is actually the real 3080.
And the xx80 has also historically been the very top xx102 chip. In the end it's just naming.

Everything points to the 4080 being another step down in tier by Nvidia. Just like what they did with the GTX 680, which was supposed to be the GTX 670, while the real 680 suddenly became the 780. Watch the real 4080, the 4080 Ti, suddenly show up with a 320-bit setup just like the 3080.

Just because AMD could not compete that year, greedia decided to move a tier up, and a mid-range x70 card with a 256-bit bus was suddenly sold as an x80 card. The GTX 580 and others before it were 384-bit, the RTX 3080 was 320-bit, but now the 4080 is again a mid-range 256-bit configuration moved up and sold as a higher-tier card.

Wait, I just noticed the 4070 is on a 192-bit config now? What a joke. Man, I still remember the day when a second-tier card, the GTX 570, was a $330 card! Now the second-tier RTX 3080 or RTX 3080 Ti is ~$1000, roughly 3x the price. Greed really has no limits.
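The bus widths matter because peak bandwidth is just bus width times per-pin data rate. A quick sketch (the 21 Gbps GDDR6X rate on the 40-series parts is an assumption; only the 3080's 19 Gbps is a shipped spec):

# Peak memory bandwidth in GB/s = (bus width / 8 bits per byte) x Gbps per pin
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(bandwidth_gbs(384, 21))  # 1008.0 -> classic 384-bit flagship
print(bandwidth_gbs(320, 19))  # 760.0  -> RTX 3080 (19 Gbps GDDR6X)
print(bandwidth_gbs(256, 21))  # 672.0  -> rumored 256-bit 4080
print(bandwidth_gbs(192, 21))  # 504.0  -> rumored 192-bit 4070

So on raw bandwidth alone the rumored configs do look like a tier shift; the open question is how much the big cache claws back.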
 
Last edited:

mitchman

Gold Member
Eh, got me one of those glass cubes (disabled basically all the RGB).


5x120mm in
5x120mm out

Wish it was all 140mm, though.
Thermals must be bad if you need that many fans. But it seems like there are basically no gaps to suck air into the case, so I guess you do need that many. You get negative pressure inside the case, which should be fine, I suppose, for certain setups.
 

Dream-Knife

Banned
And the xx80 has also historically been the very top xx102 chip. In the end it's just naming.

Everything points to the 4080 being another step down in tier by Nvidia. Just like what they did with the GTX 680, which was supposed to be the GTX 670, while the real 680 suddenly became the 780. Watch the real 4080, the 4080 Ti, suddenly show up with a 320-bit setup just like the 3080.

Just because AMD could not compete that year, greedia decided to move a tier up, and a mid-range x70 card with a 256-bit bus was suddenly sold as an x80 card. The GTX 580 and others before it were 384-bit, the RTX 3080 was 320-bit, but now the 4080 is again a mid-range 256-bit configuration moved up and sold as a higher-tier card.

Wait, I just noticed the 4070 is on a 192-bit config now? What a joke. Man, I still remember the day when a second-tier card, the GTX 570, was a $330 card! Now the second-tier RTX 3080 or RTX 3080 Ti is ~$1000, roughly 3x the price. Greed really has no limits.
You mean the 80 Ti. That's also ignoring Titan and workstation cards.

The smaller bus is allegedly made up for with cache, like AMD did with RDNA2. We will see, I guess.
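For intuition on how that works, a toy model: reads that hit the on-die cache never touch DRAM, so effective bandwidth blends the two. Every number here (hit rate, cache bandwidth) is an illustrative assumption, not a leaked spec:

# Toy effective-bandwidth model for a big last-level cache.
def effective_bw(dram_gbs: float, cache_gbs: float, hit_rate: float) -> float:
    # hits are served at cache speed, misses fall through to DRAM
    return hit_rate * cache_gbs + (1 - hit_rate) * dram_gbs

print(effective_bw(672, 2000, 0.55))  # ~1402 GB/s effective vs 672 GB/s raw

This is the same trick AMD marketed as "effective bandwidth" with Infinity Cache on RDNA2.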
 
You mean the 80 Ti. That's also ignoring Titan and workstation cards.

The smaller bus is allegedly made up for with cache, like AMD did with RDNA2. We will see, I guess.
You're right; I'm more concerned about gaming cards and exclude the workstation ones. But simply by looking at the leaked specs (if they're accurate) you can tell the performance gap is so historically ginormous that clearly a card is missing between them.

Clearly the listed 4080 specs look like nothing more than what a 4070 should be, and the real 4080 is not coming at launch but soon after, named 4080 Ti instead, to put a higher-than-usual price on the xx80 card.
 
Last edited:

Nickolaidas

Banned
You can also use it as a nuclear reactor, as a bonus!

But before buying one, I advise you to get the OK from your local town's mayor, to let him know you're going to be needing half your neighborhood's power supply...
 
As a reminder, Tim Sweeney said that a console with roughly 40 TF of performance should be able to provide photorealistic visuals. It will be a few gens before this type of card sits in a console box.
 

GreatnessRD

Member
I guess my reasoning on why I prefer performance per watt over brute power is that I feel more people will eventually want smaller, more efficient devices; perhaps the Steam Deck could lead the way. Big-tower, high-power PC gaming has been on a decline for a while, and for it to succeed surely performance per watt is the way to go?
I'm kinda there with you. My 6800 XT draws less power than a 3080 and beats/matches it at my resolution (1440p). If I do go AMD for my next GPU, I hope they keep their performance-per-watt approach, honestly.
 