RTX 4000 could release in July or August

winjer

Gold Member

Both AMD and NVIDIA are revising their TSMC orders, reports DigiTimes. The website, citing industry sources, claims that Apple, AMD and NVIDIA all wish to change their orders. AMD reportedly wants to lower its 7/6nm wafer orders, while NVIDIA is facing an oversaturated GPU market and possibly lower demand for next-gen GPUs.

NVIDIA now has a large stock of GeForce RTX 30 graphics cards for sale, yet the company is not willing to lower prices just yet. This is despite the second-hand market being flooded with mining cards that are no longer profitable to use and keep. There are simply too many cards, which paints a grim outlook for RTX 40 demand.

The most interesting details from the DigiTimes article have been translated by RetiredEngineer:


NVIDIA is one of the companies that made prepayments to TSMC for their 5nm wafers. Unfortunately for NVIDIA, TSMC is not willing to make concessions. At best, NVIDIA can count on delaying wafer shipments by up to two quarters, but it is NVIDIA's problem to find customers for the vacated TSMC production capacity, which may be very hard given how demand has dropped across the whole sector.

NVIDIA is now expected to use 5nm-class (TSMC 4N) technology for its upcoming GeForce RTX 40 series, codenamed Ada Lovelace. The company is already using this process node for its Hopper (H100) data-center architecture.

The market situation is reflected in the lower stock prices of AMD and NVIDIA. Both stocks have dropped by almost 50% in just six months.


Without mining, there will be much lower demand for GPUs.
These tech companies are going to get desperate once they can't rip off consumers like they did for the last 2 years.
 

winjer

Gold Member



The NVIDIA GeForce RTX 4090 will use 128 of the 144 SMs for a total of 16,384 CUDA cores. The GPU will come packed with 96 MB of L2 cache and a total of 384 ROPs, which is simply insane. The graphics card is expected to feature the latest TSMC 4N process node, an optimized version of the 5nm process node, and will rock some impressive clock speeds. The leaker mentions a base clock of 2235 MHz, a boost clock of 2520 MHz, and actual max boost clocks of over 2.75 GHz, which might be a confirmation of the 2.8 GHz+ clock speed rumors. It looks like custom models might be pushing this over the 2.9 GHz range too. That's a:

  • 60% Increase In Base Clock (2235 MHz RTX 4090 vs 1395 MHz RTX 3090)
  • 49% Increase In Boost Clock (2520 MHz RTX 4090 vs 1695 MHz RTX 3090)
  • 33% Increase In Max Clocks (~2800 MHz RTX 4090 vs ~2100 MHz RTX 3090)
That gives us slightly over 90 TFLOPs of compute horsepower, so the rumored 100 TFLOPs figure may also be possible with a full-fat AD102 configuration rocking similar or higher clocks.
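As a quick sanity check of those figures (my own back-of-envelope math, using the standard FP32 formula of CUDA cores × 2 ops per clock × clock speed and the leaked numbers above):

```python
# Back-of-envelope FP32 throughput from the leaked figures above.
# FP32 TFLOPs = CUDA cores x 2 ops/clock (an FMA counts as two) x clock (GHz) / 1000.
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000

print(fp32_tflops(16384, 2.75))  # rumored RTX 4090 (128 SMs): ~90.1 TFLOPs
print(fp32_tflops(18432, 2.75))  # full-fat AD102 (144 SMs):   ~101.4 TFLOPs
```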

 

Rickyiez

Member
I'm going to wait and see who brings the best performance per dollar. I'll go back to AMD if they offer value as good as the 5700 XT did.

I have no allegiance.
Same, I will buy from whoever makes the next RTX 3080. I got mine at $750 back then and it's damn awesome.

From the leak, however, it doesn't look good for the 4080. I'm hoping AMD will offer a much better high end.
 

FingerBang

Member



Without mining, there will be much lower demand for GPUs.
These tech companies are going to get desperate once they can't rip off consumers like they did for the last 2 years.
I hope TSMC doesn't let them cut their orders, so 1) they can go fuck themselves and 2) their shit will go down in price quickly.

Still excited for next gen. I've got a 4K/high refresh rate TV. The more power, the better.
 

Orta

Banned
Time to put my faithful 1070 out to pasture and get me a 4070. Hopefully we'll see a very noticeable jump in performance over the next gen consoles.
 

winjer

Gold Member
Oh, I didn't see that. Considering that the 3060 has 12GB, I hope that part is false.

Because of this bus width reduction and only 10GB of VRAM, this 4070 might be somewhat limited at higher resolutions, like 4K.
By this leak, it should have TFLOPs similar to the 3080 12GB. But that card has 12GB and a 384-bit memory bus.
 

01011001

Banned
True. Even worse. That 4070 better have 12GB+.

I have a 3060 Ti and so far haven't really had much issue with the 8GB of VRAM, tbh.

At 1440p this is hard to hit.
And IMO, as games get optimised for consoles these days and more games make use of DirectStorage down the line, the VRAM pool will be less of an issue, especially if you don't need 4K and are fine with 1440p.
 

Filben

Member
So many people are using mics and chat (text/voice); I really feel like I'm part of this big community, whilst on PS I always felt a bit "isolated" since very few use mics, and there's no chat, discussions, etc.
Wait until you learn about the toxic communities 😅 Seriously though, my preference has changed to the opposite. I still mainly play on PC, although the lack of a dedicated UI, the lack of DualSense support, issues with potentially a hundred different causes and settings, and the stutter problem in many DX12 games have me playing more on PS5 than before.

But I like the laid-back attitude of console gaming. In many games on PC you'll encounter sweats and try-hards taking this whole gaming business waaay too seriously, hence I find myself disabling text and voice chat by default more and more. People just annoy me more than they did 10 to 15 years ago. I like the isolated feeling as much as I like my somewhat isolated home and living place. I just want to have my peace and gaming.

That being said, I still love PC gaming for many reasons. Hope you'll enjoy the transition and you're having fun :)
 

Sanepar

Member
I have a 3060 Ti and so far haven't really had much issue with the 8GB of VRAM, tbh.

At 1440p this is hard to hit.
And IMO, as games get optimised for consoles these days and more games make use of DirectStorage down the line, the VRAM pool will be less of an issue, especially if you don't need 4K and are fine with 1440p.
The 4070 has the horsepower for 4K but not enough VRAM.
 

rodrigolfp

Haptic Gamepads 4 Life
I have a 3060 Ti and so far haven't really had much issue with the 8GB of VRAM, tbh.

At 1440p this is hard to hit.
And IMO, as games get optimised for consoles these days and more games make use of DirectStorage down the line, the VRAM pool will be less of an issue, especially if you don't need 4K and are fine with 1440p.
We hope. Flight Simulator already eats more than 10GB (at 4K) without any RT effects. I can only imagine what some future current-gen games with all the RT effects will use.
 

winjer

Gold Member
I have a 3060 Ti and so far haven't really had much issue with the 8GB of VRAM, tbh.

At 1440p this is hard to hit.
And IMO, as games get optimised for consoles these days and more games make use of DirectStorage down the line, the VRAM pool will be less of an issue, especially if you don't need 4K and are fine with 1440p.

But consoles have a 16GB unified pool of memory, meaning a bit less wasted memory compared to PC, where some data gets duplicated across two pools of memory. And they also have high-performance SSDs and I/O systems, so that part levels out between PC and consoles.

Then we can add the higher quality assets on PC, ranging from higher texture resolutions to higher quality shaders, higher LODs, etc., and soon we'll have a lot more VRAM usage.
And of course, with ray tracing becoming standard, VRAM usage is bound to increase. All those reflections of off-screen geometry don't come free, and neither do the buffers for real-time global illumination and shadows.

At this point, especially at the high end, GPUs should have 16GB. There is no excuse for an expensive card like the 4070, which will probably cost $500-600, to have just 10GB.
This is nVidia just being greedy, offering the least amount of memory on very expensive GPUs, and making those GPUs obsolete faster than they would be otherwise.

Some still wonder why Sony and MS chose AMD for their consoles.
The answer is simple: with nVidia they would have had to pay more for smaller chips and lower performance.
 

Sanepar

Member
I expect the 4070 to come in at $699 and the 4080 at $899, so 10GB for $699 is a joke.

Nvidia historically forces people who buy xx70 cards to upgrade every GPU generation by skimping on VRAM.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The 4070 at 160-bit?
What the what?

They already have a 4070 Ti with 192-bit and 12GB, you can already smell it.
You ain't gonna trick me, Nvidia.

The extra cache better make up for the drop, cuz the 4070 doesn't look much better than the 3070 Ti, and that 4080 on paper doesn't seem to truly trump the 3080 12GB.

If I can't afford a 4090, I'll be skipping the generation.
 

GymWolf

Member
The 4070 at 160-bit?
What the what?

They already have a 4070 Ti with 192-bit and 12GB, you can already smell it.
You ain't gonna trick me, Nvidia.

The extra cache better make up for the drop, cuz the 4070 doesn't look much better than the 3070 Ti, and that 4080 on paper doesn't seem to truly trump the 3080 12GB.

If I can't afford a 4090, I'll be skipping the generation.
Any thoughts on the 4080?
 

IFireflyl

Gold Member
420W TDP for a 4080...

Hard pass. I already have a room heater.

Extremely this. I have a 3090. It is ridiculous that they're putting all of their R&D into increasing power and not power efficiency. Especially with inflation. I don't want to pay higher electric bills each month in addition to having my room heat up in the summer.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Any thoughts on the 4080?
On paper I'm not that impressed (spoilt by the 3080 12GB and 3080 Ti being GA102 chips).

But it's really hard to even gauge performance, because Nvidia seem to be backing a large L2 cache instead of brute-forcing their way to the top (cept the 4090, that thing looks mad).

We will only really know if the perf gain from the large cache is worth it when we can actually bench these cards with actual games.
Synthetic benches might not even be worth it as they are written right now, and/or we will get a whole new batch of synth benches that take Infinity Cache and Nvidia's cache into account.

If software can utilize the large cache, the low VRAM probably won't even be much of an issue, due to how fast everything gets in and out.

Nvidia surely are cooking a bunch of other chips that they are just choosing to keep hidden till they know what AMD is doing.
A 4070 Ti 12GB is all but a given; we are likely also getting a 4080 Ti that's AD102-based with 20GB of VRAM.
 

GymWolf

Member
On paper I'm not that impressed (spoilt by the 3080 12GB and 3080 Ti being GA102 chips).

But it's really hard to even gauge performance, because Nvidia seem to be backing a large L2 cache instead of brute-forcing their way to the top (cept the 4090, that thing looks mad).

We will only really know if the perf gain from the large cache is worth it when we can actually bench these cards with actual games.
Synthetic benches might not even be worth it as they are written right now, and/or we will get a whole new batch of synth benches that take Infinity Cache and Nvidia's cache into account.

If software can utilize the large cache, the low VRAM probably won't even be much of an issue, due to how fast everything gets in and out.

Nvidia surely are cooking a bunch of other chips that they are just choosing to keep hidden till they know what AMD is doing.
A 4070 Ti 12GB is all but a given; we are likely also getting a 4080 Ti that's AD102-based with 20GB of VRAM.
Do you think it's an OK upgrade from a 2070 Super if I want to play everything at 4K60?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Do you think it's an OK upgrade from a 2070 Super if I want to play everything at 4K60?
The 4080?
It should easily handle 4K60.
And over a 2070 Super that's a huge, huge upgrade.

My only "beef" with the current paper specs is that for people with RTX 3070 Tis and RTX 3080s, the only real worthwhile upgrade that you'll "feel" is going a tier up.
3070 Ti owners will likely need 4080s.
3080 owners will likely need 4090s.

The gen-on-gen leap isn't as big "on paper" as I would have hoped; the 4090 is nigh literally twice as powerful as the 3090, but the 4070 and 4080 aren't double the 3070 Ti and 3080 respectively.
Nvidia are basically forcing everyone who is doing a gen-on-gen upgrade to buy up, because the 3080 was on such a good chip.

I'm praying for prices to be reasonable and for the 4080 Ti to come sooner rather than later.
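For what it's worth, a rough paper-spec comparison behind that "twice as powerful" remark (leaked 4090 cores and clocks, public 3090 specs; real-game scaling will of course differ):

```python
# Paper-spec FP32 comparison: leaked RTX 4090 figures vs the RTX 3090.
specs = {
    "RTX 3090": (10496, 1.695),  # CUDA cores, boost clock (GHz)
    "RTX 4090": (16384, 2.520),
}
tflops = {name: cores * 2 * ghz / 1000 for name, (cores, ghz) in specs.items()}
print(tflops)                                   # ~35.6 vs ~82.6 TFLOPs
print(tflops["RTX 4090"] / tflops["RTX 3090"])  # ~2.3x on paper
```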
 
I would rather see real-world gaming benchmarks and see how it goes. They'd better have a reason for that big power consumption. If it's an fps or two better than a 3080, well... then no.
Is there any big difference from one model to the next that makes everyone need to upgrade? Every electronic device needs a generation or two to mature and evolve. So follow the same concept and buy a new graphics card every 2 to 3 years, because right now there is nothing that needs that kind of power.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
so the 4060 will have less VRAM than the 3060
Almost certainly.
They've gimped the 4070 to 160-bit, so the 4060 is likely 10GB and based on a cut-down 4070, or even worse, 8GB at 128-bit.
Basically a drop-in replacement for the 3070 Ti.
Is there any big difference from one model to the next that makes everyone need to upgrade? Every electronic device needs a generation or two to mature and evolve. So follow the same concept and buy a new graphics card every 2 to 3 years, because right now there is nothing that needs that kind of power.
2-3 year upgrade cycles with Nvidia equal gen-on-gen upgrades.
Nvidia releases new GPUs every 2 years.

If you are someone who plays at native 4K or higher, Ada will certainly be worth it at the top end.
But realistically, with DLSS/FSR you could probably float the generation on an Ampere card.

I just need to see the prices and benchmarks of upcoming Unreal Engine 5 games to decide if I'm gonna be upgrading to Ada.
 

winjer

Gold Member
With nVidia messing around so much with memory bus width, it's difficult to guess what they'll do.

For those who don't know, the amount of VRAM is very dependent on the bus configuration.
This is because, normally, only one memory chip connects to each 32-bit channel.
So, for example, if a GPU has a 160-bit bus, it can have 5 memory chips. A GPU maker like nVidia can then use 2GB chips to get to 10GB, or double up chips in clamshell mode to get to 20GB.

Remember, the 3060 Ti has 8GB of VRAM. This is because it has a 256-bit bus, so we got 8 chips of 1GB.
But the 3060 has 12GB of VRAM. This is because it has a 192-bit bus, so we got 6 chips of 2GB.
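A minimal sketch of that rule of thumb (assuming the usual one chip per 32-bit channel; clamshell configurations double these numbers):

```python
# VRAM capacities implied by a bus width, per the explanation above.
# Each GDDR6 chip normally occupies one 32-bit channel.
def vram_options(bus_width_bits: int, chip_sizes_gb=(1, 2)) -> dict:
    channels = bus_width_bits // 32  # one chip per channel
    return {f"{size}GB chips": f"{channels * size}GB" for size in chip_sizes_gb}

print(vram_options(160))  # rumored 4070: {'1GB chips': '5GB', '2GB chips': '10GB'}
print(vram_options(256))  # 3060 Ti:      {'1GB chips': '8GB', '2GB chips': '16GB'}
print(vram_options(192))  # 3060:         {'1GB chips': '6GB', '2GB chips': '12GB'}
```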
 

Calverz

Member
Should I consider the upgrade if I already have a 3070?

Thing is, I play a lot on the TV which is 4K, so I guess I would notice a big improvement when playing at that res.


Man... I'd fuck your GPU, ngl.
That's what I'm wondering. I have a 3070, but I'll be getting an LG 42C2 later in the year, so I need to hit 4K.
 
With nVidia messing around so much with memory bus width, it's difficult to guess what they'll do.

For those who don't know, the amount of VRAM is very dependent on the bus configuration.
This is because, normally, only one memory chip connects to each 32-bit channel.
So, for example, if a GPU has a 160-bit bus, it can have 5 memory chips. A GPU maker like nVidia can then use 2GB chips to get to 10GB, or double up chips in clamshell mode to get to 20GB.

Remember, the 3060 Ti has 8GB of VRAM. This is because it has a 256-bit bus, so we got 8 chips of 1GB.
But the 3060 has 12GB of VRAM. This is because it has a 192-bit bus, so we got 6 chips of 2GB.
Yeah. Hopefully the rumors are wrong and the 4070 actually has either 12GB or, at least if it's 10GB, a 320-bit bus...

Still not buying a 10GB card though; that's less than my 3060!
 

winjer

Gold Member
Yeah. Hopefully the rumors are wrong and the 4070 actually has either 12GB or, at least if it's 10GB, a 320-bit bus...

Still not buying a 10GB card though; that's less than my 3060!

I doubt it would have a 320-bit bus. That's for higher tier cards.
The 1070, 2070, and 3070 all had a 256-bit bus.
 
I doubt it would have a 320-bit bus. That's for higher tier cards.
The 1070, 2070, and 3070 all had a 256-bit bus.
You're probably right, but damn, how pathetic is a 160-bit bus for an xx70 tier card?!?

I think the 4060 will have 12GB and a 192-bit bus again, but with significantly faster GDDR6. That's probably the only card I'll consider, along with the RDNA3 cards.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Yeah. Hopefully the rumors are wrong and the 4070 actually has either 12GB or, at least if it's 10GB, a 320-bit bus...

Still not buying a 10GB card though; that's less than my 3060!
It's def not 320-bit.
Nvidia have reduced the width because they are planning on making it up with their new, larger cache.
Just like AMD with Infinity Cache, which meant they never needed 320+ bit bus widths.

Unless the leaker got the specs wrong and the 4070 is actually 192-bit, the 10GB makes sense.
If it is 192-bit, we can hope for 12GB, and that kinda makes sense with the cards all going down a tier in terms of memory bus width.

But I suspect Nvidia have a 4070 Ti with 192-bit that has 12GB of VRAM.
 

winjer

Gold Member
You're probably right, but damn, how pathetic is a 160-bit bus for an xx70 tier card?!?

I think the 4060 will have 12GB and a 192-bit bus again, but with significantly faster GDDR6. That's probably the only card I'll consider, along with the RDNA3 cards.

Supposedly, these new nVidia cards will have a big increase in cache.
But I doubt it will be enough to make up for such a reduction in memory bus width.
Even the 6800 series, which had 128MB of L3 cache, had a 256-bit bus.

The best we can hope for is that it has a 192-bit bus and 12GB. That would be less bad.
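To illustrate why a big cache can partially offset a narrow bus (purely an illustration with made-up hit rates, not leaked figures): every L2 hit is an access that never touches VRAM, so effective bandwidth scales roughly with 1 / (1 - hit rate).

```python
# Illustrative only: effective bandwidth when a fraction of accesses hit the L2.
def effective_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float, l2_hit_rate: float) -> float:
    raw = bus_width_bits * gbps_per_pin / 8  # raw VRAM bandwidth in GB/s
    return raw / (1 - l2_hit_rate)           # cache hits never reach VRAM

# A 160-bit bus with 21 Gbps GDDR6X is 420 GB/s raw; with a 50% L2 hit rate
# it services roughly the traffic of an 840 GB/s card that has no such cache.
print(effective_bandwidth_gbs(160, 21, 0.50))
```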
 

RespawnX

Member
Remember when the 1070 was 150W.

A 300W 4070 is an absolute fucking joke.
It will be nice paying 2 bucks every gaming evening just for electricity. At this rate of power increase I'm out of gaming by 2025, just because of energy consumption and prices.
Gonna hire a real hooker instead of playing GTA and still save some money :messenger_downcast_sweat:.
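For reference, the electricity math is easy to check (the hours and price below are my assumptions, not the poster's):

```python
# Rough cost of one gaming session: watts -> kWh -> money.
def session_cost(watts: float, hours: float, price_per_kwh: float) -> float:
    return watts / 1000 * hours * price_per_kwh

print(session_cost(300, 4, 0.40))  # 300W GPU alone, 4h at 0.40/kWh: ~0.48
print(session_cost(600, 4, 0.40))  # a 600W full system for 4h:     ~0.96
```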
 
Supposedly, these new nVidia cards will have a big increase in cache.
But I doubt it will be enough to make up for such a reduction in memory bus width.
Even the 6800 series, which had 128MB of L3 cache, had a 256-bit bus.

The best we can hope for is that it has a 192-bit bus and 12GB. That would be less bad.
Agreed. With 12GB/192-bit I would consider the 4070.
 