
Rumor: NVIDIA GeForce RTX 4070 Graphics Card Specs, Performance, Price & Availability (300W + 36TF performance)

The high end got ridiculous in the past couple of years. I hope for a 3650 GTX (like the 1650 GTX), something that formerly would have easily been mid-range but would now be considered entry level.
 

sendit

Member
[GIF: "damn right" Walter White reaction, Breaking Bad]


I am running a 5950X / 3090 in my main gaming PC and I am all in on a 4090. With how Bethesda optimizes their games (especially at launch), I might get 4K 60 out of Starfield

;)

Bought the 3090 (paired with a 5800x) specifically for Cyberpunk 2077 (we all know how that turned out). I was barely able to get ~60 with DLSS + Ray Tracing at 4K. I'm hoping the 4000 series exponentially accelerates ray tracing performance in comparison to the 3000 series.
 

Kenpachii

Member
No way, the 3090 is barely faster than the 3080. If what you say came true, then the 4080 will only be 10% faster than the 3080. This has never happened in GPU history and would be a massive disappointment.

It all depends on how competitive AMD is. The only reason the 3080 existed was because of AMD, or else it would have straight up been called a 3090 and the 3080 canned completely.
 

Athreous

Member
Can I ask what games they are? I have an 11GB 1080 Ti and I have only seen one game go over 8GB of VRAM usage, at 9GB. Pretty much all the other games I have played have been well under 8GB.
Oh wait, is it a ray tracing thing? Does using RT increase your VRAM usage considerably?
The new Resident Evil games, if I'm not wrong.
 

Rickyiez

Member
It all depends on how competitive AMD is. The only reason the 3080 existed was because of AMD, or else it would have straight up been called a 3090 and the 3080 canned completely.
A 3090 sold as the 3080 would still trash the 2080 in performance, it just wouldn't be as affordable. My point is, a generational leap within the same tier has never been a mere 10%.

3080/3090 ~ 50% over 2080
2080 ~ 30% over 1080
1080 ~ 50% over 980
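To put those steps in perspective, here's a quick bit of compounding arithmetic. This is just a sketch using the rough percentages quoted above, not benchmark data:

```python
# Rough compounding of the generational uplifts quoted above (illustrative
# figures from the post, not measured benchmark numbers).
gains = {"980 -> 1080": 1.50, "1080 -> 2080": 1.30, "2080 -> 3080/3090": 1.50}

index = 1.0  # performance index, 980 = 1.0
for step, gain in gains.items():
    index *= gain
    print(f"{step}: cumulative {index:.2f}x over a 980")

# A hypothetical 10% step vs the usual ~50% step for the next same-tier jump:
print(f"4080 at +10%: {index * 1.10:.2f}x | 4080 at +50%: {index * 1.50:.2f}x")
```

Run with those assumed numbers, a mere 10% step lands around 3.2x over a 980 where a typical ~50% step would land near 4.4x, which is the gap being argued about.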

But he says prices won't drop more because it has never happened before, yet he also admits that this pricing situation never happened before. He does not know.

Yea, Jay2C talks shit out of his arse, his takes should only be taken with a grain of salt.
 

Kenpachii

Member
A 3090 sold as the 3080 would still trash the 2080 in performance, it just wouldn't be as affordable. My point is, a generational leap within the same tier has never been a mere 10%.

3080/3090 ~ 50% over 2080
2080 ~ 30% over 1080
1080 ~ 50% over 980



Yea, Jay2C talks shit out of his arse, his takes should only be taken with a grain of salt.

Numbers like that mean nothing to Nvidia, mate. While you look at one metric, it gets disproved by another.

Go look at the increases from x80 > x80 Ti and then look at the Ampere version of it. Yea, suddenly your whole point is moot. Also with naming: x90 cards were originally dual-GPU cards, yet now they aren't anymore and the performance increase is also laughable. Why? Nvidia names whatever they want however they can sell it; they have no loyalty towards numbers like you do. And anybody with a bit of experience with Nvidia knows this.

Let's look at the 2000 series.

Nvidia builds cards towards the competition. The 2000 series was an absolute joke when they released it, not even worthy of a generational upgrade, especially with the flagship 2080 Ti being ridiculously expensive for what it delivered. If AMD had been competing, the 2080 Ti would have been the 2080. That's the point I am making. And also with pricing of the 2000 series: the 2080 was priced at the same price as a 1080 Ti and was only 5% faster. Honestly, naming with Nvidia is a total joke.

Like I said, the 3070 would have been the 3080, which also makes total sense if you look at the chip they used for the 3070, and compare the 1080 Ti vs 2080 and the 2080 Ti vs 3070.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Can confirm.

At 3440x1440, on max/ultra settings, RE2 and RE3 are 14+ GB of VRAM.
If you are taking that number from the in game menu, DONT.
If you are taking it from Allocated VRAM, DONT.
If you are taking it from Usage VRAM, you likely have a memory leak somewhere.

With a 10GB 3080 at the exact same resolution I just barely go above 7GB of used VRAM.

I haven't checked RE3R so I just googled it and it seems to use about the same as RE2R.
[image: vram.png]
 
If you are taking that number from the in game menu, DONT.
This is exactly where I'm taking it from. Why shouldn't I!? If this was somehow so dramatically incorrect, you'd think gamers would have thrown a huge fit by now (as they tend to do for issues much smaller than this). You'd think Capcom would've fixed it by now.

So... I'm gonna have to research this.

With a 10GB 3080 at the exact same resolution I just barely go above 7GB of used VRAM.
At 3440x1440, maxed out? (And by that I also mean the "Image Quality" setting ABOVE 100%)

I haven't checked RE3R so I just googled it and it seems to use about the same as RE2R.
[image: vram.png]
Is this at maxed out settings?

Now, to be fair, I'm not saying that it's impossible -- the VRAM reporter in RE Engine could be broken or something, or it could be a "recommendation." After all Red Dead 2 at 2560x1440 at practically Ultra settings is somewhere in the 8 GB VRAM range -- maybe even less from what I remember.

Either tonight or tomorrow I'll look at an alternative source for this VRAM number, maybe the AMD Adrenalin tool gives me an independent reading or something...
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
This is exactly where I'm taking it from. Why shouldn't I!? If this was somehow so dramatically incorrect, you'd think gamers would have thrown a huge fit by now (as they tend to do for issues much smaller than this). You'd think Capcom would've fixed it by now.
So... I'm gonna have to research this.
At 3440x1440, maxed out? (And by that I also mean the "Image Quality" setting ABOVE 100%)
Is this at maxed out settings?
Now, to be fair, I'm not saying that it's impossible -- the VRAM reporter in RE Engine could be broken or something, or it could be a "recommendation." After all Red Dead 2 at 2560x1440 at practically Ultra settings is somewhere in the 8 GB VRAM range -- maybe even less from what I remember.

Either tonight or tomorrow I'll look at an alternative source for this VRAM number, maybe the AMD Adrenalin tool gives me an independent reading or something...
Yes, the in-game RE2R and RE3R VRAM usage graphs are wrong, and there was a pretty big stink about it. I guess not big enough to make headlines, but it was talked about quite extensively... it doesn't affect performance, so gamers generally don't care.

Use Afterburner and set it to display VRAM allocation and per-process VRAM usage; in a lot of games you'll notice a huge disparity between the two.

Some games allocate VRAM, as in they simply request from the system "hey, I might need to use xx amount of VRAM", but never actually use it.
There's a game that escapes my memory (lol) right now that will literally request exactly 1GB less than your total VRAM: if you have a 10GB card it asks for ~9GB, if you have a 12GB card it asks for ~11GB, but when you look at per-process memory it's using something like 5GB. RE2R/RE3R likely use some backwards method of estimating VRAM requirements; I believe there's just some borked setting which doubles whatever the number actually should be, so if it's asking you for 14GB it likely needs 7GB.

The allocation number is pretty much useless.
Process VRAM usage, however, is exactly how much an app is actually using, and is much more accurate than most in-game telemetry.
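If you want to sanity-check those two numbers outside of Afterburner, you can read them straight from the driver. A minimal sketch, assuming an NVIDIA card and the third-party pynvml package is installed; note that per-process figures may show as unavailable on some Windows driver setups:

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Device-level view: memory in use across everything on the card
# (this is the inflated "allocation"-style number).
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device VRAM used: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GB")

# Per-process view: how much each running app actually holds.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_gb = (proc.usedGpuMemory or 0) / 1024**3  # may be None/0 if the driver hides it
    print(f"  pid {proc.pid}: {used_gb:.1f} GB")

pynvml.nvmlShutdown()
```

Comparing the device-level figure against the per-process figure for the game is the same disparity Afterburner shows between allocation and actual usage.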
 
Yes, the in-game RE2R and RE3R VRAM usage graphs are wrong, and there was a pretty big stink about it. I guess not big enough to make headlines, but it was talked about quite extensively... it doesn't affect performance, so gamers generally don't care.

Use Afterburner and set it to display VRAM allocation and per-process VRAM usage; in a lot of games you'll notice a huge disparity between the two.

Some games allocate VRAM, as in they simply request from the system "hey, I might need to use xx amount of VRAM", but never actually use it.
There's a game that escapes my memory (lol) right now that will literally request exactly 1GB less than your total VRAM: if you have a 10GB card it asks for ~9GB, if you have a 12GB card it asks for ~11GB, but when you look at per-process memory it's using something like 5GB. RE2R/RE3R likely use some backwards method of estimating VRAM requirements; I believe there's just some borked setting which doubles whatever the number actually should be, so if it's asking you for 14GB it likely needs 7GB.

The allocation number is pretty much useless.
Process VRAM usage, however, is exactly how much an app is actually using, and is much more accurate than most in-game telemetry.
Very useful info, I'll try this out. Thanks! 👍🏾
 

SF Kosmo

Al Jazeera Special Reporter
Bought the 3090 (paired with a 5800x) specifically for Cyberpunk 2077 (we all know how that turned out). I was barely able to get ~60 with DLSS + Ray Tracing at 4K. I'm hoping the 4000 series exponentially accelerates ray tracing performance in comparison to the 3000 series.
I think Cyberpunk in particular is very CPU limited compared to most games.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Specs have been updated again (subject to change, as before, since they are still testing it). This is looking good boys! https://videocardz.com/newz/nvidia-...now-rumored-with-more-cores-and-faster-memory


The full AD104 is the RTX4070??
Well that's a lot better than the original rumored specs.

It should match an RTX3090 in this trim.

This also bodes well for the RTX4060Ti and RTX4070Ti.
The original RTX4070 spec will likely now be the RTX4060Ti.
And the RTX4070Ti is likely to be a cut-down RTX4080, hopefully on the 256-bit bus with 16GB of VRAM.

If I were to hazard a guess, completely off the cuff:

RTX 4060 - 10GB GDDR6 (I don't think 8GB is gonna be a thing anymore when this card will likely match a 3070+)
RTX4060Ti - 10GB GDDR6
RTX4070 - 12GB GDDR6X
RTX4070Ti - 16GB GDDR6X
RTX4080 - 16GB GDDR6X
RTX4080Ti - 20GB GDDR6X
RTX4090 - 24GB GDDR6X
RTX4090Ti - 48GB GDDR6X

Pretty much every board manufacturer is gonna have to up their game, cuz if basically the full lineup is GDDR6X those EVGA fuckers can't keep skimping on the backplate... it needs to be metal with thermal pads.
Much cheaper brands do a better job.

EVGA FTW3 backplate:
[image: cooler4.jpg]

^The FTW3 is their range topping card.....no thermal pads my bro?

Palit GOC Backplate:
[image: cooler4.jpg]

^Palit's budget card... their range-topper's build is even more legit.
 
There's just nothing game-wise you'd even need these for.
Simply not true. As stated here in this thread by another gamer, right now a 3090 cannot hold 60 fps at ultra with ray tracing even with the assistance of DLSS. Those of us who want to game at 4K 120 (shooters) and 4K 60 (single-player open world) don't have an option in the video card market right now. Gaming is the only thing I spend big money on. I want to have the option to spend lots of money to push the standard forward. I can't go back to 1440p, I can't go back to below 120fps. I need a significantly more powerful card made available for MY particular gaming goals, and I'm not alone.
 

Crayon

Member
I am super excited to see what Nvidia and AMD cards can do in the next few months. On the other hand, there is no way I'm going to spend above the 4050/7600 tier with this kind of performance. I'm worried about how long I'll be waiting. Being able to force FSR1 has given my card a last gasp, so hopefully no really heavy PC-only games come out between now and the lower-range cards. I can get most games on the PS5 in the meantime. Except Starfield, if it's worth getting in on early. Let's face it though, the patient ones who can get it on sale when it's been patched up a bit are going to have a better time, so I shouldn't let that rush me.
 

TheGecko

Banned
Simply not true. As stated here in this thread by another gamer, right now a 3090 cannot hold 60 fps at ultra with ray tracing even with the assistance of DLSS. Those of us who want to game at 4K 120 (shooters) and 4K 60 (single-player open world) don't have an option in the video card market right now. Gaming is the only thing I spend big money on. I want to have the option to spend lots of money to push the standard forward. I can't go back to 1440p, I can't go back to below 120fps. I need a significantly more powerful card made available for MY particular gaming goals, and I'm not alone.

I don't mean this in a derogatory sense, but people like you just care about running a game at said fps and said resolution and settings and tweaking; you don't really play games. If you did, then Nvidia wouldn't be able to sell mediocre GPUs at such high prices.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
So pretty much 4070 will be 3090 perf w/o enough memory for native 4k.
12GB of VRAM?
Native 4K60 would be doable with current games.
Add in DLSS and even 10GB would suffice.

The chip won't be powerful enough to do native 4K60 with RT enabled anyway, so the added VRAM cost of RT would be irrelevant cuz you'd be playing a slideshow even if it had 100GB of VRAM.

And with next-gen games the xx70 is gonna be relegated again to being a 1440p card even if it has 3090 levels of power inside it.


You could of course just NOT play with every setting maxed out and probably glide through the generation at 4K, easy work.
 

Kenpachii

Member
Did we discuss this?



We should have some good prices if any of it is actually true... maybe


Nvidia can easily hold back a large part of the supply and store it, then drop production straight away to just drip-feed it, rebrand a new series with Tis behind it or whatever, and produce more whenever they need to.

So no, I don't think Nvidia will flood the market.
 

Sanepar

Member
12GB of VRAM?
Native 4K60 would be doable with current games.
Add in DLSS and even 10GB would suffice.

The chip won't be powerful enough to do native 4K60 with RT enabled anyway, so the added VRAM cost of RT would be irrelevant cuz you'd be playing a slideshow even if it had 100GB of VRAM.

And with next-gen games the xx70 is gonna be relegated again to being a 1440p card even if it has 3090 levels of power inside it.


You could of course just NOT play with every setting maxed out and probably glide through the generation at 4K, easy work.
12GB is not enough for native 4K, even for current games. Many games ask for more than 12GB on ultra settings.

When next-gen games arrive next year, 16GB will be the minimum for native 4K and ultra.
 
Why do I see people say the current cards can't push 4K/60fps? You can do so many games at the moment, starting from the 3070 Ti up to the 3090. I own 3 of the 30-series cards below the 3090 Ti, and I found all 3 capable of doing 4K/60 on most games. The top 3 games that gave problems were Resident Evil Village, Star Citizen and Cyberpunk 2077 (this game is the most flexible of the 3). If we factor in 4K/60 fps settings in emulators, Xenoblade Chronicles 3 is one of the top GPU hogs atm.

One example where a game asks for more than 12GB, say... Resident Evil 2 Remake. Does it even add anything after being maxed out? You can technically max out this game at 4K/60fps from start to finish. I think the engine lets you push GPU usage further, but that mainly eats up VRAM. Saw no difference when using more VRAM past 12GB.
 
I don't mean this in a derogatory sense, but people like you just care about running a game at said fps and said resolution and settings and tweaking; you don't really play games. If you did, then Nvidia wouldn't be able to sell mediocre GPUs at such high prices.
People like me don't really play games? That's awfully presumptuous of you. I play an insane amount of games. Playing rounds of Apex on my Steam Deck right now. I do enjoy tinkering though!

The ppl buying "mediocre" GPUs are playing at 1080p/60 fps. If I were just playing at 4K/60fps, I would require a GPU that's 4 times as powerful as theirs. Now factor in that I aim for 120fps, so I need a GPU that's 8 times as fast. Like I said, there is no GPU that can do AAA games at 4K 120fps consistently, so while I know it's an obscene luxury, I am eagerly awaiting the 4090/4080.

PS- be careful how broad a brush you use to paint those that disagree with you :) I play A LOT of hours of games a week (yes I also make time for playing with settings tho) cheers!
 

hlm666

Member
12GB is not enough for native 4K, even for current games. Many games ask for more than 12GB on ultra settings.

When next-gen games arrive next year, 16GB will be the minimum for native 4K and ultra.
They allocate more; it doesn't mean the game needs it. Also, why are next-gen games going to need 16GB of VRAM? The consoles have 16GB total, and some is reserved for the OS and some is also used for the actual running game program, which the PC will keep in system (DDR4 etc.) memory.



the difference explained.

 

Stuart360

Member
They allocate more; it doesn't mean the game needs it. Also, why are next-gen games going to need 16GB of VRAM? The consoles have 16GB total, and some is reserved for the OS and some is also used for the actual running game program, which the PC will keep in system (DDR4 etc.) memory.



the difference explained.


I think he might be confusing it with system RAM, maybe? I have only seen one game that used more than 8GB of VRAM, and that game had a VRAM problem that was fixed eventually. Most modern games I have played are in the 3-6GB VRAM range at max settings, although that's at 1440p, but 4K only adds 1-2GB on top. Ghost Recon Breakpoint has the highest VRAM usage of any modern game I have played, and that was 7.5-8GB of VRAM at 1440p and 'Extreme' settings, which are future-proof settings.
And like you said, if you have a card with a lot of VRAM, a game will show it as using more than it actually needs.
And yeah, the consoles only have around 13GB of total RAM for games, split between VRAM and system RAM. So yeah, any card with 10+GB of VRAM should be good for this gen; hell, 8GB will probably be enough for a lot of games this gen.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
12GB is not enough for native 4K, even for current games. Many games ask for more than 12GB on ultra settings.

When next-gen games arrive next year, 16GB will be the minimum for native 4K and ultra.
What?
Many?
And by "ask" do you mean in the menu, or actual usage?
Name 10 games that not only use more than 12GB of VRAM but also actually chug when paired with 12GB cards.

These are easy benchmarks to find you know that right?
The RTX 3080Ti and RTX 3090 are effectively the same chip.
The RTX3090 has double the VRAM of the RTX3080Ti, so which of these "many" games runs significantly better on the RTX3090 due to VRAM limitations?

I won't even have you waste your time finding those games, because it simply isn't true.
They don't exist.

Allocation and process memory aren't the same, and games like Cold War, which seemingly have a memory leak or fill VRAM with the whole game's textures, are the exception, not the rule.

And with DirectStorage on the horizon, having massive amounts of VRAM won't be as relevant; games will switch data fast enough that they "should" be coded to not lock out the most popular segment of the market... sub-16GB cards.
Hell, even the 4080 is probably a measly 16GB.
 

ZywyPL

Banned
The ppl buying "mediocre" GPUs are playing at 1080p/60 fps. If I were just playing at 4K/60fps, I would require a GPU that's 4 times as powerful as theirs. Now factor in that I aim for 120fps, so I need a GPU that's 8 times as fast. Like I said, there is no GPU that can do AAA games at 4K 120fps consistently, so while I know it's an obscene luxury, I am eagerly awaiting the 4090/4080.

The performance doesn't scale linearly with resolution though; to go from FHD to 4K you need "just" 2-2.5x more computing power, depending on the title. So 4K120 in reality needs about 4-5x more power than 1080p60, which is still a lot, but the upcoming RTX 4000 and RDNA3 cards should have enough of it.
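Back-of-the-envelope version of that math, as a sketch; the 2-2.5x scaling factor is the rough figure quoted above, not a measured constant:

```python
# 4K pushes 4x the pixels of 1080p...
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
print(f"Pixel ratio: {pixels_4k / pixels_1080p:.0f}x")  # 4x

# ...but real per-frame GPU cost scales sub-linearly with resolution.
cost_per_frame = (2.0, 2.5)   # assumed 1080p -> 4K cost factor from the post
fps_factor = 120 / 60         # targeting 120 fps instead of 60

low, high = (c * fps_factor for c in cost_per_frame)
print(f"4K120 needs roughly {low:.0f}-{high:.0f}x the GPU power of 1080p60")
```

So the jump is closer to 4-5x than 8x, because frame cost does not grow one-to-one with pixel count.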
 

Sanepar

Member
What?
Many?
And by "ask" do you mean in the menu, or actual usage?
Name 10 games that not only use more than 12GB of VRAM but also actually chug when paired with 12GB cards.

These are easy benchmarks to find you know that right?
The RTX 3080Ti and RTX 3090 are effectively the same chip.
The RTX3090 has double the VRAM of the RTX3080Ti, so which of these "many" games runs significantly better on the RTX3090 due to VRAM limitations?

I won't even have you waste your time finding those games, because it simply isn't true.
They don't exist.

Allocation and process memory aren't the same, and games like Cold War, which seemingly have a memory leak or fill VRAM with the whole game's textures, are the exception, not the rule.

And with DirectStorage on the horizon, having massive amounts of VRAM won't be as relevant; games will switch data fast enough that they "should" be coded to not lock out the most popular segment of the market... sub-16GB cards.
Hell, even the 4080 is probably a measly 16GB.
Recent games, I remember 4 that use more than 12GB: Sniper Elite 5, Cyberpunk 2077, Forza Horizon 5, Halo Infinite, all with max settings.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Recent games, I remember 4 that use more than 12GB: Sniper Elite 5, Cyberpunk 2077, Forza Horizon 5, Halo Infinite, all with max settings.
None of those games need more than 12GB of VRAM.
None of those games even sweat a base 3080.

  • Cyberpunk 2077 runs just fine on a 3080Ti, literally the exact same frame rate as a 3090
  • Forza Horizon 5 runs on a potato at 4K no problem
  • Halo Infinite is super CPU-bound, so it's hard to even say what's causing it to stutter; even 10900Ks can hitch.
  • Sniper Elite 5 has/had a memory leak; it would eat whatever VRAM was available but will run just fine on 10GB, let alone 12GB. If you have a 10GB card it will eat 10GB, if you have a 12GB card it will eat 12GB, if you have a 16GB card it will eat 16GB.


Receipts:
[benchmark screenshots: vram.png, vram.png, Sniper-Elite-5-GPU-benchmarks-2.png, iEkZRVb.png]




Phew!
I thought you were going to mention Dying Light 2 at god settings, which can eat 11GB of VRAM, but when you consider all the effects going off it makes sense... but the chips give up before the VRAM does anyway.
 

Jayjayhd34

Member
Recent games, I remember 4 that use more than 12GB: Sniper Elite 5, Cyberpunk 2077, Forza Horizon 5, Halo Infinite, all with max settings.
Cyberpunk does not use more than 10 gigs of VRAM, and I'm sure I maxed out Halo Infinite with an 8GB RTX 2080.

edited for missing words
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Cyberpunk does not use more than 10 gigs of VRAM, and I'm sure I maxed out Halo Infinite with an 8GB RTX 2080.

edited for missing words
Correct.

The VRAM fear-mongering is still high, it seems.
Even at the beginning of the Ampere generation we had people telling us, oh no, the 3080 only has 10GB of VRAM, the 3070 only 8GB.
They will die before the generation ends.
AMD fanboys were celebrating their glorious VRAM that to this day they haven't touched. The only reason those cards came with 16GB is because 8GB would have been an insult and AMD was committed to the 256-bit bus.

What's even funnier is RTX 3060 fans trying to justify their purchases by telling me "my 3060 will outlast your 3070 cuz it will have more VRAM for longer". Bitch, that 3060 can't even play games at 1440p60; what makes you think in future generations it will handle 4K60?

I told people then, and I'll say it again: the most likely outcome is the chip runs outta gas before the VRAM limitations become evident.
The only consolation I got from all this, really, is that my "prophecy" came true.
Games that genuinely go over 10GB of VRAM don't kill a 3080 because it lacks VRAM; they kill it because at those settings the chip simply can't render that shit.

Dying Light 2 and Cyberpunk don't run badly at max settings with RT because they are out of VRAM; they run harshly because they are stressing the chip elsewhere.
 

hlm666

Member
I fear Far Cry 7.
We should probably point out FC6 was an AMD-sponsored game, and without downloading the HD texture pack it uses like 8GB of VRAM. It was like Godfall: questionable VRAM usage and shit RT to play to their strengths and weaknesses. Ubisoft has been taking money from Nvidia and AMD for years and letting them screw each other over in their games. You could throw up Watch Dogs 2 and make some negative arguments towards AMD, for instance.
 

01011001

Banned


I fear Far Cry 7.


? not sure I understand what you mean by that...

but FarCry 6 running at 4K ~70fps at max settings with RT is surprisingly good.

it also ran really well on my 3060ti at 1440p when I tried it during that free weekend Ubisoft had. with RT and decently high settings I easily hit above 60fps.

I found it disappointing that it didn't support DLSS tho, as the game's TAA is a fucking mess and I bet DLSS would look way cleaner
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not

NVIDIA GeForce RTX 4080 Allegedly Gets Spec Bump: 9728 Cores, 16 GB GDDR6X Memory, 420W TBP, Around 30% Faster Than 3090 Ti​



So they really want RTX 3080 owners to NOT buy the RTX 4080.
It was bad enough with the old specs, but this is getting insulting.
They are cutting up that chip some more?

They better not try to be clever and say the 4080Ti is the full AD103.
The 4080Ti better be AD102 and at least... at least 320-bit.


P.S

This is not a spec bump... it's a spec dump.
The old specs had the RTX 4080 with 10240 CUDA cores; this drops the CUDA count down to 9728.
 