
RTX 4000 could release in July or August

Elog

Member
I have a feeling that the next generation will be more about what performance per USD Nvidia and AMD can actually offer than about peak performance (which we often talk about more than effectiveness). This has of course always been true to some extent, but given the absolute cost of modern flagship products, I think the winner of the coming generation, more than ever before, will be whoever has the dominant lower mid-range card in terms of cost/performance.
 

Mister Wolf

Member
I feel that.

The question is: are there any games you play where your x3D and 3080ti combo leaves you wanting more right now?

My problem with the 3060 is that yes, I can do 120fps, but sometimes only at 1080p... I'll be satisfied with 1440p120 minimum, personally.

No, it's just a thirst for more power, especially when you know that power is available. Where it struggles is when you introduce games with multiple ray tracing effects. Raytraced reflections combined with RTGI will humble even a 3080ti. All of the showcase games moving forward (Stalker 2) will definitely have options to enable both on PC. I don't even bother turning on raytraced shadows either; that's asking too much of a GPU at the framerates I desire. For instance, in Dying Light 2 I've already resorted to using 4K DLSS Performance mode with RT lighting and AO. No raytraced shadows or reflections.
 
Last edited:

HeisenbergFX4

Gold Member
The release is too far away… but the moment it's out: day 1, sold!
 
No, it's just a thirst for more power, especially when you know that power is available. Where it struggles is when you introduce games with multiple ray tracing effects. Raytraced reflections combined with RTGI will humble even a 3080ti. All of the showcase games moving forward (Stalker 2) will definitely have options to enable both on PC. I don't even bother turning on raytraced shadows either; that's asking too much of a GPU at the framerates I desire. For instance, in Dying Light 2 I've already resorted to using 4K DLSS Performance mode with RT lighting and AO. No raytraced shadows or reflections.
That's true, rt is going to get more and more expensive.

Hopefully the 4070 will do high/1440p (or dlss performance when available/necessary) at 120fps for a few years.

I'd like to run older games like Witcher 3 at 4k120 though.
 
Last edited:

Dream-Knife

Banned
No, it's just a thirst for more power, especially when you know that power is available. Where it struggles is when you introduce games with multiple ray tracing effects. Raytraced reflections combined with RTGI will humble even a 3080ti. All of the showcase games moving forward (Stalker 2) will definitely have options to enable both on PC. I don't even bother turning on raytraced shadows either; that's asking too much of a GPU at the framerates I desire. For instance, in Dying Light 2 I've already resorted to using 4K DLSS Performance mode with RT lighting and AO. No raytraced shadows or reflections.
What framerates?

Even if these cards are 2x, it still won't be enough for 4K. Maybe with the MCM designs in the 5000 series.
 

Rival

Gold Member
I’ll put my money down for a 4080 if they are available and not $1500. My 3080ti isn’t quite cutting it at 4K.
 

skneogaf

Member
My 3090ti and i7 PC, plus my 83" OLED and 7.2.6 Denon AV receiver, are only a few hundred watts away from 3000W, which is as much as UK plugs can handle, so the equivalent 4000 series card needs to not be more power hungry for me to upgrade.
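The circuit-budget arithmetic above can be sketched roughly. All the component wattages below are illustrative assumptions, not measured figures for this setup:

```python
# Rough household-socket budget check for a high-end gaming/AV setup.
# Every load figure here is a hypothetical estimate, not measured data.

UK_SOCKET_LIMIT_W = 3000  # roughly 13 A at 230 V on a standard UK plug

# Assumed peak draw per component (illustrative numbers only)
loads_w = {
    "pc_3090ti_i7": 850,   # GPU + CPU + rest of the system under load
    "83in_oled_tv": 450,
    "avr_7_2_6": 700,      # AV receiver driving a 7.2.6 speaker layout
}

total = sum(loads_w.values())
headroom = UK_SOCKET_LIMIT_W - total
print(f"total draw: {total} W, headroom: {headroom} W")
```

With these made-up numbers the setup sits a few hundred watts under the limit, which is exactly the situation the post describes: a meaningfully hungrier GPU would eat the remaining headroom.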
 

SenaxxNL

Member
I would rather see real-world gaming benchmarks and see how it goes. They'd better have a reason for that big power consumption. If it's an fps or two better than a 3080, well... then no.
Everybody is still quoting that 900W? That was a full AD102 SKU with 48GB of 24Gbps GDDR6X. That's probably not a consumer version. Rumors say the 4080 is expected to be around 350W.

 

Laptop1991

Member
I haven't had much use out of my 3 series yet, and the 4 series won't be cheap, so I'll wait until there are some new top games to play, then get a 5 or 6 series one!
 

BigBooper

Member
My body is ready, but I'm not sure if my PC is. What do you all think? I have a 6700K and a 1000W PSU.
Will my CPU bottleneck everything so much that a 4000 might not be worth it? Will my PSU be able to power it?
 
Don't need it. Down the line will get one of course.

My 6900 XT should last me a good few/couple of years at 3440x1440 (whenever I finally get around to buying that monitor). At 300W, it's also perfect for me so that I don't have to replace my 750W PSU.

I know, I know... "LOL AMD, no DLSS and weak sauce ray tracing." Honestly at this point I still don't care about ray tracing, and as AMD keeps making improvements to FSR (already 2.0 is a big improvement over 1.0), I should be fine and won't be craving DLSS.

If for some reason Unreal Engine 5/Nanite games start making current gen GPUs spontaneously combust or something, then at that point I'll consider my options...
 

zcaa0g

Banned
In that context: Vulkan support for DCS is supposedly right around the corner. According to rumors it could mean a 3xxx -> 4xxx-like jump in performance by itself. We'll see.


Yeah, that should be big, but Vulkan support being around the corner has been the case for quite some time now. We'll see.
 

VFXVeteran

Banned
What about native non-DLSS 4K + RTX + every setting on ultramax?

I mean, I can settle with less than that, but I guess some people might not.
You can do that now, just not at 60FPS. I don't think the 4000 series is going to put a dent in performance bottlenecks using RT. DLSS is still going to be required. I don't see really fast RT performance at native 4K for a while. This generation of cards is too soon for a big jump like 2000 -> 3000. I won't be buying one, for sure. Maybe the 5000 series of boards in a few years.
 
Last edited:

mitchman

Gold Member
Yeah, that should be big, but Vulkan support being around the corner has been the case for quite some time now. We'll see.
I'd rather see DLSS/FSR support as a higher priority; it should yield a better performance increase than Vulkan alone.
 
I don't think there's a point in rushing to sell your old GPU anymore with such a limited supply from both vendors at launch. If the last few years are anything to go by, we might be able to swap a 30 series for a 40 series free of charge :messenger_horns:.
 

Dream-Knife

Banned
I don't think there's a point in rushing to sell your old GPU anymore with such a limited supply from both vendors at launch. If the last few years are anything to go by, we might be able to swap a 30 series for a 40 series free of charge :messenger_horns:.
Not with crypto crashing. If anything, 30's will be cheap due to people with FOMO.
 

supernova8

Banned
Rumors/some techtubers (grain of salt obviously) suggest that RDNA3 and Lovelace will be pretty neck and neck (or neckbeard and neckbeard) in terms of performance (plus NVIDIA may have less of a lead with raytracing, and AMD now has FSR2.0 so NVIDIA cannot gloat as much about DLSS).

Would make sense for NVIDIA to want to get their card out much earlier in the hopes that all the enthusiasts will get it and not wait for RDNA3. Plus, the quicker they launch the RTX 40 series, the more stupid the Intel ARC launch looks (whenever it actually launches) since even the top ARC card is only going to be around a 3070.

Complete speculation but I wouldn't be surprised if NVIDIA had a set of RTX 40 super cards ready to deploy around Jan-March depending on what the performance gap is with RDNA3 (which I think is due for October-ish?). Working backwards would mean NVIDIA should launch the main RTX 40 cards around June/July to have roughly 9 months gap between that and the super cards (launching within 6 months would probably annoy everyone who bought the first batch... but hey NVIDIA probably doesn't give a shit).

I just hope AMD will pummel NVIDIA in terms of price/performance and gain more market share. We saw how Intel pulled its finger out when AMD started wiping the floor. Competition improves things and everyone wins (read: I don't actually care if AMD wins, I just want NVIDIA to feel the pressure so that they offer better products at relatively lower prices).
 
Last edited:

TheStam

Member
I've thought for years that I'd upgrade my 2080ti when the 4 series arrives, but right now I feel like I can wait another generation. We'll see how it goes, but I can still max out pretty much any game at 1440p above 60 fps. I usually tend to upgrade when even High settings are a challenge. Especially since DLSS is such a life saver, I feel like I can't really justify the cost just yet. It would be nice though.
 

DukeNukem00

Banned
I've thought for years that I'd upgrade my 2080ti when the 4 series arrives, but right now I feel like I can wait another generation. We'll see how it goes, but I can still max out pretty much any game at 1440p above 60 fps. I usually tend to upgrade when even High settings are a challenge. Especially since DLSS is such a life saver, I feel like I can't really justify the cost just yet. It would be nice though.

If you wait that long, that 2080TI will be near worthless for reselling. The best course of action when buying high-end parts is to sell them immediately after the next gen comes out, so you can recover as much money as possible. Then the new card might pay for itself, or you only need to add a small amount. By doing this, the only time you shell out a large sum of money is the very first time you buy a flagship GPU. After that you'll recoup 80% of your money if you resell while it still has value and power.
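The upgrade-cycle math in that post can be sketched like this. The prices and the 80% recovery rate are made-up examples matching the post's claim, not real market data:

```python
# Sketch of the "sell the old flagship as soon as the new gen launches" math.
# All prices and the recovery rate are illustrative assumptions.

def net_upgrade_cost(new_price: float, old_price: float,
                     recovery_rate: float) -> float:
    """Out-of-pocket cost of upgrading if you resell the old card promptly."""
    resale_value = old_price * recovery_rate
    return new_price - resale_value

# e.g. a flagship bought at $1200, resold at 80% of its price when the
# next gen lands, with the new flagship also costing $1200:
cost = net_upgrade_cost(new_price=1200, old_price=1200, recovery_rate=0.80)
print(cost)  # 240.0
```

The point being made: after the initial outlay, each generational upgrade only costs the depreciation, provided you sell before the old card's resale value collapses.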
 

nani17

are in a big trouble


Looks like he's disappointed with what he knows about RDNA3, but the 4090 at 450W... wow.
 
Last edited:

TheStam

Member
If you wait that long, that 2080TI will be near worthless for reselling. The best course of action when buying high-end parts is to sell them immediately after the next gen comes out, so you can recover as much money as possible. Then the new card might pay for itself, or you only need to add a small amount. By doing this, the only time you shell out a large sum of money is the very first time you buy a flagship GPU. After that you'll recoup 80% of your money if you resell while it still has value and power.

That's probably the best course of action financially. But I have a cabin / man cave pretty much in the wilderness, and I tend to pass down parts from the main rig to the cabin rig. Of course, the low-end rig's parts are the last stop and don't hold much value after that point. Now with the Steam Deck I might not even need a second rig, really.
 

Dream-Knife

Banned
If you wait that long, that 2080TI will be near worthless for reselling. The best course of action when buying high-end parts is to sell them immediately after the next gen comes out, so you can recover as much money as possible. Then the new card might pay for itself, or you only need to add a small amount. By doing this, the only time you shell out a large sum of money is the very first time you buy a flagship GPU. After that you'll recoup 80% of your money if you resell while it still has value and power.
I don't really understand this. You end up just renting the thing due to FOMO.

The beauty in buying a high end card is that you don't have to deal with that bs every gen. Otherwise you might as well go midrange.
 

DukeNukem00

Banned
I don't really understand this. You end up just renting the thing due to FOMO.

The beauty in buying a high end card is that you don't have to deal with that bs every gen. Otherwise you might as well go midrange.

You don't understand getting your money back instead of losing it? Seriously? How did you get FOMO out of me saying that you can get your money back? How is buying a product once every several years "dealing with bs"? You simply buy a new product every 2 years, then sell the old one to minimize the financial impact. Jesus, man, turning something so normal into FOMO.
 

Dream-Knife

Banned
You don't understand getting your money back instead of losing it? Seriously? How did you get FOMO out of me saying that you can get your money back? How is buying a product once every several years "dealing with bs"? You simply buy a new product every 2 years, then sell the old one to minimize the financial impact. Jesus, man, turning something so normal into FOMO.
A GPU isn't an investment, it's a depreciating asset. Selling on eBay is hassle enough, especially now that they report yearly sales over $600 to the IRS. It's way easier to just keep your card until you need to upgrade, then do it and keep the old one in a backup system or as a spare. These things aren't crazy expensive anyway.
 
May - AMD's 7nm RX 6X50 refresh
June - Intel's 6nm ARC GPUs for desktop
July - NVIDIA's 4nm RTX 4000
August - AMD's 5nm Zen 4 CPU
September - Intel's 7nm 13th Gen CPU
October - AMD's 6nm RX 7000

That's the most realistic scenario in my opinion.
Intel is going to get manhandled by Nvidia. Those Intel Arc GPUs had better be dirt cheap.
 

Griffon

Member
As far as I'm concerned, the GPU makers haven't released any new GPUs in six years. I only buy GPUs at about €400 max, and performance in that price range has been stagnating ever since.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
RTX 3080 owners basically only have the 4090 to look forward to... or the 4080Ti, assuming it's going to be an AD102 chip.




That new cache had better make up for the reduced memory interface... and hopefully it also means these cards won't be particularly good for mining.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Everybody is still quoting that 900W? That was a full AD102 SKU with 48GB of 24Gbps GDDR6X. That's probably not a consumer version. Rumors say the 4080 is expected to be around 350W.


Ha!
Not a chance the 4080 comes in at a TDP of 350W... the 3080 has a TDP of 320W but hovers around 330-340W.
It's all but guaranteed that these cards have a ~70W TDP increase over Ampere, so I'd wager somewhere in the 400W range.
Basically every card has gone up a tier in terms of TDP:
4060 eats like a 3070
4070 eats like a 3080
4080 eats like a 3090

Funny that the 4090 might end up having the smallest increase in TDP gen on gen.

 