
Nvidia RTX 30XX |OT|

Nydus

Member
I still wonder if it's PSU dependent, but I don't think so. New Corsair RM750x without any extensions and it still sometimes whines.
The card is so quiet, it's easier to hear. I'm fine with this :)
I'll report if the Seasonic changes anything in comparison to the Corsair, but I doubt it. Limiting the FPS helps. I also have extensions on the way, I'll see if they make it worse lol
 

rofif

Can’t Git Gud
I'll report if the Seasonic changes anything in comparison to the Corsair, but I doubt it. Limiting the FPS helps. I also have extensions on the way, I'll see if they make it worse lol
Which extensions have you ordered?
 

waylo

Banned
Do you guys consider coil whine a deal breaker, or just live with it?
Depends on how bad it is. Typically if I can't hear it over my headphones, I don't mind it. However, I have had some cards in the past where it's so loud and high pitched that I could actually hear it over music. In that case, it's time for an RMA.
 

GHG

Member
Do you guys consider coil whine a deal breaker, or just live with it?

At my regular playing framerates (60-120fps) it's a deal breaker. I can put up with a bit of fan noise but something about the frequency of coil whine drives me mad.
 
Last edited:

CrustyBritches

Gold Member

NVIDIA has revised its plans for the upcoming RTX 30 series ‘refresh’. If you were planning (or trying) to buy a high-end GPU, then you should definitely wait for December. Our sources are confident that NVIDIA will launch two SKUs that month: the 3080 20GB and the 3070 16GB.

The plans to launch the 3070 Ti with 6144 CUDA cores have allegedly been scrapped. No reason was provided, but we believe that the yield of GA104 could’ve been an issue (the PG142 SKU 0 required the full GPU to be operational). Thus, we should not expect GA104-400 until the SUPER refresh next year.

The GeForce RTX 3070 8GB will be updated to 16GB of memory. This SKU carries the codename PG142 SKU 05 and is also expected to launch in December. Meanwhile, in the mid-range space we have the RTX 3060 Ti, which has now been pushed back to mid-November, our sources claim.

According to the information we gathered this week, AMD is set to launch three SKUs based on Navi 21 with 16GB memory. In response, NVIDIA is adding SKUs with higher memory capacity.

So the rumored 3070 Ti has seemingly been scrapped/delayed until next year as a Super refresh. Instead, the 3070 8GB and 16GB will both have 5888 CUDA cores and use GDDR6(non-x). Allegedly the 3070 16GB and 3080 20GB launch in December. Notably, after Cyberpunk releases.

I'd prefer a 3070 16GB, but it'll be a bummer if Cyberpunk releases and I don't have a new card yet. I'll have to be patient, which of course sucks.

P.S.- That bit at the end about Navi 21 getting 3 SKUs with 16GB memory is new to me as well.
 
Last edited:

VFXVeteran

Banned
OK. 3090 is completely worth the price of admission..

Check it:

Big bandwidth hogging games are now running 4k/60FPS - Horizon ZD now a true 60FPS flawlessly. Marvel Avengers doesn't have to be capped to 30FPS anymore.

The big dawgs though! Crysis: Remastered is ridiculous on "Can it Run Crysis" settings!! Holy shit! Capped that to 30FPS/4k and the lighting/shading is INSANE!!

FS2020 - full on ULTRA settings @4k capped to 30FPS looks beautiful!

VRAM usage!!

Crysis - 16G VRAM!!!!
FS2020 - 9G VRAM
Avengers - 11G VRAM
Horizon - 10G VRAM
Death Stranding - 8G VRAM
Control - 8G VRAM
 

BluRayHiDef

Banned
OK. 3090 is completely worth the price of admission..

Check it:

Big bandwidth hogging games are now running 4k/60FPS - Horizon ZD now a true 60FPS flawlessly. Marvel Avengers doesn't have to be capped to 30FPS anymore.

The big dawgs though! Crysis: Remastered is ridiculous on "Can it Run Crysis" settings!! Holy shit! Capped that to 30FPS/4k and the lighting/shading is INSANE!!

FS2020 - full on ULTRA settings @4k capped to 30FPS looks beautiful!

VRAM usage!!

Crysis - 16G VRAM!!!!
FS2020 - 9G VRAM
Avengers - 11G VRAM
Horizon - 10G VRAM
Death Stranding - 8G VRAM
Control - 8G VRAM
Your post proves that 10GB isn't going to cut it for next-gen games at 4K with ultra quality textures. Texture quality will have to be dropped to attain 60 frames per second at 4K.
 

regawdless

Banned
OK. 3090 is completely worth the price of admission..

Check it:

Big bandwidth hogging games are now running 4k/60FPS - Horizon ZD now a true 60FPS flawlessly. Marvel Avengers doesn't have to be capped to 30FPS anymore.

The big dawgs though! Crysis: Remastered is ridiculous on "Can it Run Crysis" settings!! Holy shit! Capped that to 30FPS/4k and the lighting/shading is INSANE!!

FS2020 - full on ULTRA settings @4k capped to 30FPS looks beautiful!

VRAM usage!!

Crysis - 16G VRAM!!!!
FS2020 - 9G VRAM
Avengers - 11G VRAM
Horizon - 10G VRAM
Death Stranding - 8G VRAM
Control - 8G VRAM

How did you measure the VRAM usage?
 



So the rumored 3070 Ti has seemingly been scrapped/delayed until next year as a Super refresh. Instead, the 3070 8GB and 16GB will both have 5888 CUDA cores and use GDDR6(non-x). Allegedly the 3070 16GB and 3080 20GB launch in December. Notably, after Cyberpunk releases.

I'd prefer a 3070 16GB, but it'll be a bummer if Cyberpunk releases and I don't have a new card yet. I'll have to be patient, which of course sucks.
Patience really is a virtue. I'll gladly wait for more VRAM at non-3090 prices (I expect it to land somewhere in between the 3080 and 3090 price points).

In other news, I've finally changed out the juice in my expandable AIO after it started to sound like a fish tank. It was my first time doing it, and I cleaned out the loop and radiator as well. Interesting process. I took my time to really understand everything, and I think I've got a process down so that next time it'll be much faster. New fittings as well, and I also changed out the tubes and lengthened them to make room for a longer card so I don't have to change my radiator placement. Everything's ready to just slot in the new card. Can't wait.
 

Nydus

Member
Which extensions have you ordered?
Bitfenix Alchemy. 45cm long. I don't know if they are too long 😅 but I thought "why get shorter ones for the same price?"

Your post proves that 10GB isn't going to cut it for next-gen games at 4K with ultra quality textures. Texture quality will have to be dropped to attain 60 frames per second at 4K.
Dude, chill already. His post proves that if a game WANTS to fill your VRAM, it can. Just like a stress test can crash your non-overclocked card if it WANTS to. Trust Gamers Nexus and co. And even if this whole fearmongering turns out to be true: just use high textures? For the life of me, to this day I've never been able to see a difference between textures on high and ultra. If you want a 3090, just get one and be happy that you can afford one. I drive a BMW X5, total overkill for what it does, but I fucking love that diaperbomber rolling fortress. No need to run around telling people with smaller cars that they can never drive on vacation cause xyz.
 

BluRayHiDef

Banned
Bitfenix Alchemy. 45cm long. I don't know if they are too long 😅 but I thought "why get shorter ones for the same price?"


Dude, chill already. His post proves that if a game WANTS to fill your VRAM, it can. Just like a stress test can crash your non-overclocked card if it WANTS to. Trust Gamers Nexus and co. And even if this whole fearmongering turns out to be true: just use high textures? For the life of me, to this day I've never been able to see a difference between textures on high and ultra. If you want a 3090, just get one and be happy that you can afford one. I drive a BMW X5, total overkill for what it does, but I fucking love that diaperbomber rolling fortress. No need to run around telling people with smaller cars that they can never drive on vacation cause xyz.
I'm not telling anyone anything; I'm just expressing my opinion, which is in accordance with the purpose of a forum.
 

regawdless

Banned
OK. 3090 is completely worth the price of admission..

Check it:

Big bandwidth hogging games are now running 4k/60FPS - Horizon ZD now a true 60FPS flawlessly. Marvel Avengers doesn't have to be capped to 30FPS anymore.

The big dawgs though! Crysis: Remastered is ridiculous on "Can it Run Crysis" settings!! Holy shit! Capped that to 30FPS/4k and the lighting/shading is INSANE!!

FS2020 - full on ULTRA settings @4k capped to 30FPS looks beautiful!

VRAM usage!!

Crysis - 16G VRAM!!!!
FS2020 - 9G VRAM
Avengers - 11G VRAM
Horizon - 10G VRAM
Death Stranding - 8G VRAM
Control - 8G VRAM

This is VRAM allocation, not usage. Games always allocate a lot of VRAM, which is what all the monitoring software shows.
BUT the VRAM actually in use is more difficult to see. Special K did it, and MSI Afterburner has a new, rather hidden function to display it correctly.
Flight Simulator, for example, is really only using 4GB of VRAM.
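
If anyone wants to poke at the allocation side of this themselves, here's a minimal sketch using NVIDIA's NVML Python bindings (assuming `pip install nvidia-ml-py`; the device index is an assumption, adjust for your setup). Bear in mind that even the per-process figure is still that process's allocation, so Special K and the new Afterburner counter get closer to the true working set than this does:

Code:
# Minimal sketch: board-wide "used" VRAM vs per-process allocation via NVML.
# Assumes `pip install nvidia-ml-py` and an NVIDIA driver exposing NVML.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust if needed

# What most overlays report: total memory in use across the whole board.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Board-wide: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB used")

# Per-process breakdown (graphics clients, e.g. a running game).
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    gib = proc.usedGpuMemory / 2**30 if proc.usedGpuMemory else float("nan")
    print(f"PID {proc.pid}: {gib:.1f} GiB allocated")

pynvml.nvmlShutdown()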
 
Last edited:

nemiroff

Gold Member
This is VRAM allocation, not usage. Games always allocate a lot of VRAM, which is what all the monitoring software shows.
BUT the VRAM actually in use is more difficult to see. Special K did it, and MSI Afterburner has a new, rather hidden function to display it correctly.
Flight Simulator, for example, is really only using 4GB of VRAM.

Yes. It's almost like streaming data is an unimaginable concept, when that's exactly what these GPUs are built for. It's a good thing we don't need a 2PB VRAM version of the GPU to run MSFS. It's also really easy to spot in benchmarks, like in the Crysis example: if there were a bottleneck, it would show up in the numbers. There's absolutely no reason whatsoever to refer to usage if it's not contextual.

I'm not telling anyone anything; I'm just expressing my opinion, which is in accordance with the purpose of a forum.

The problem I have with your posts is that you, for some reason, have an absurdly one-sided approach. There's no logical reason why you completely disregard other sides of a topic, and objective reasoning in general. It's almost like you've set a narrative and are picking random information to support it by any means necessary.
 

regawdless

Banned
Yes. It's almost like streaming data is an unimaginable concept, when that's exactly what these GPUs are built for. It's a good thing we don't need a 2PB VRAM version of the GPU to run MSFS. It's also really easy to spot in benchmarks, like in the Crysis example: if there were a bottleneck, it would show up in the numbers. There's absolutely no reason whatsoever to refer to usage if it's not contextual.

Yeah. Some people are very concerned about the amount of VRAM in these GPUs, but I don't see any VRAM issues for the 3080, for example. If a huge world like the one in Flight Simulator only uses 4GB, I don't see 10GB of VRAM being an issue this gen. Not even with RTX added.
 
I decided I’m too old to actually try to fight for a video card on the Internet. I signed up for the EVGA auto-notify queue; when they have a 3090 for me, I’ll order it. Until then I’ll wait.
 

Rentahamster

Rodent Whores
I'm not telling anyone anything; I'm just expressing my opinion, which is in accordance with the purpose of a forum.
Just an opinion? Not with word choice like "proves".

Your post proves that 10GB isn't going to cut it for next-gen games at 4K with ultra quality textures. Texture quality will have to be dropped to attain 60 frames per second at 4K.

VRAM allocation vs actual usage is a thing. You don't really prove anything until you produce benchmarks that control for VRAM amount and demonstrate lower performance as a function of VRAM capacity.
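
One way to approximate that kind of control on a single card is a VRAM "ballast": pre-allocate a fixed buffer so the game sees less free memory, rerun the same benchmark, and compare frametimes. A rough sketch, assuming PyTorch with CUDA; the 6 GiB figure is an arbitrary example, and Windows' WDDM can still demote the ballast to system RAM under pressure, so treat the results as approximate:

Code:
# Hypothetical "VRAM ballast": shrink a card's effective capacity, then run
# the same in-game benchmark and compare frametimes. Assumes PyTorch + CUDA.
import torch

BALLAST_GIB = 6  # arbitrary example: makes a 10 GiB card behave like ~4 GiB

ballast = torch.empty(BALLAST_GIB * 2**30, dtype=torch.uint8, device="cuda")
print(f"Holding {BALLAST_GIB} GiB of VRAM. Run the benchmark now.")
input("Press Enter to release the ballast...")
del ballast
torch.cuda.empty_cache()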
 

Siri

Banned
Bitfenix Alchemy. 45cm long. I don't know if they are too long 😅 but I thought "why get shorter ones for the same price?"

Dude, chill already. His post proves that if a game WANTS to fill your VRAM, it can. Just like a stress test can crash your non-overclocked card if it WANTS to. Trust Gamers Nexus and co. And even if this whole fearmongering turns out to be true: just use high textures? For the life of me, to this day I've never been able to see a difference between textures on high and ultra. If you want a 3090, just get one and be happy that you can afford one. I drive a BMW X5, total overkill for what it does, but I fucking love that diaperbomber rolling fortress. No need to run around telling people with smaller cars that they can never drive on vacation cause xyz.

Man, everyone just ate up that remark by the tech-Jesus guy regarding the allotment of VRAM - it’s super trendy right now to shut people down by airily stating that there’s a big difference between VRAM usage and VRAM allotment.

Go ahead and buy the 10 gigabyte VRAM 3080 then. Better you than me. I have an RTX 2080 TI and at 4K I’m seeing stutter when I turn the camera in Crysis remake, Final Fantasy, Avengers and Horizon Zero Dawn. It’s not going to get better. It’ll just get worse.

I’d bet money there’ll soon be a 3080 with more VRAM - why, to swindle people out of their money? No, because going forward, at 4K, you’re going to want lots of VRAM for enthusiast settings.
 
Last edited:

regawdless

Banned
Man, everyone just ate up that remark by the tech-Jesus guy regarding the allotment of VRAM - it’s super trendy right now to shut people down by airily stating that there’s a big difference between VRAM usage and VRAM allotment.

Go ahead and buy the 10 gigabyte VRAM 3080 then. Better you than me. I have an RTX 2080 TI and at 4K I’m seeing stutter when I turn the camera in Crysis remake, Final Fantasy, Avengers and Horizon Zero Dawn. It’s not going to get better. It’ll just get worse.

I’d bet money there’ll soon be a 3080 with more VRAM - why, to swindle people out of their money? No, because going forward, at 4K, you’re going to want lots of VRAM for enthusiast settings.

So stutter while turning the camera is hard evidence for not enough available VRAM? Are you sure that nothing else could cause this?

I would suggest that you actually educate yourself about how VRAM works and how much is really used in games, before making such strong statements.
 

VFXVeteran

Banned
This is VRAM allocation, not usage. Games always allocate a lot of VRAM, which is what all the monitoring software shows.
BUT the VRAM actually in use is more difficult to see. Special K did it, and MSI Afterburner has a new, rather hidden function to display it correctly.
Flight Simulator, for example, is really only using 4GB of VRAM.

Good to know.

While that may be true, there is a distinct correlation between allocated VRAM and performance in these games. Most of them aren't that shader-heavy (the 2080Ti vs. 3090 shows around the same GPU usage even though we know the 3090 has more cores and faster 32-bit FPUs), which "slightly" indicates that the real bottleneck isn't the shader cores but the memory bandwidth. I can tell a big difference in frametimes when VRAM allocation increases going from the 2080Ti to the 3090.

All of these games' VRAM allocation increased when I swapped in the 3090 for the 2080Ti. I suspect that the more VRAM is available, the faster the performance (this is at native 4k res). Keep in mind, quite a few of these games could not lock 60FPS. Even Control (as simple as its shading is) requires a LOT of memory bandwidth for RT at MAX settings. DLSS quickly remedies that problem (by taking the resolution down a notch).
 
Last edited:

regawdless

Banned
Good to know.

While that may be true, there is a distinct correlation between allocated VRAM and performance in the game. Most of these games aren't that shader-heavy (the 2080Ti vs. 3090 shows around the same GPU usage), which indicates that the real bottleneck isn't the shader cores but the memory bandwidth. I can tell a big difference in frametimes when VRAM allocation increases going from the 2080Ti to the 3090.

In short, the more VRAM is available, it seems, the faster the performance (this is at native 4k res). All of these games' VRAM allocation increased when I swapped in the 3090 for the 2080Ti.

I don't think it's that straightforward. There are differences in the architectures, and the VRAM itself is different. The 2080Ti uses GDDR6, while the 3080 and 3090 use faster GDDR6X memory and have higher bandwidth.
Just because GPU usage isn't at max doesn't mean that it's only the memory.

At least as far as I understand it, please someone correct me if I'm wrong.
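
For reference, the headline bandwidth gap falls straight out of bus width times per-pin data rate; quick arithmetic using the public spec-sheet numbers:

Code:
# Peak memory bandwidth = (bus width in bytes) * (per-pin data rate in Gbps).
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(352, 14.0))   # 2080 Ti, GDDR6  -> 616 GB/s
print(bandwidth_gb_s(320, 19.0))   # 3080, GDDR6X    -> 760 GB/s
print(bandwidth_gb_s(384, 19.5))   # 3090, GDDR6X    -> 936 GB/s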
 

VFXVeteran

Banned
I don't think it's that straightforward. There are differences in the architectures, and the VRAM itself is different. The 2080Ti uses GDDR6, while the 3080 and 3090 use faster GDDR6X memory and have higher bandwidth.
Just because GPU usage isn't at max doesn't mean that it's only the memory.

At least as far as I understand it, please someone correct me if I'm wrong.

Read my revision.
 

VFXVeteran

Banned
I don't think it's that straightforward. There are differences in the architectures, and the VRAM itself is different. The 2080Ti uses GDDR6, while the 3080 and 3090 use faster GDDR6X memory and have higher bandwidth.
Just because GPU usage isn't at max doesn't mean that it's only the memory.

Right. There are several factors under the hood that I'm not seeing. BUT I do know from software development experience that more RAM = better performance (of course the type and speed of RAM matter as well).

I remember when I first put the 2080Ti through its paces on my last film project.

I intentionally wanted to exhaust the memory, so I filled it with 65 4k textures and loaded them all into memory uncompressed. The 2080Ti crashed due to memory allocation. So I lowered the count and reloaded the textures. I got my character to render, but at 5FPS. Then, with the same card, I compressed the textures down to DXTC 5 (max compression) and reloaded those images. The memory footprint was much lower and the FPS went to 60 at the same resolution.

This test had no lighting and no shading (other than applying the texture to the geometry); it was just to prove that, for even one character, today's GPUs (even the 3090) can't hang with the demands of film-quality assets. Memory (amount and speed) is important, especially when you want to use RT, higher resolutions, and/or higher-res textures. I would claim it's MORE important than the raw speed of the CUDA cores.
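
A quick back-of-the-envelope (my own illustrative numbers, assuming 4096x4096 textures with full mip chains, not the exact film assets) shows why the compression step swings the footprint so hard; DXT5 packs a 4x4 texel block into 16 bytes, i.e. 4:1 versus uncompressed RGBA8:

Code:
# Illustrative texture-memory math, assuming 4096x4096 textures + mip chains.
TEXELS = 4096 * 4096
MIP_FACTOR = 4 / 3        # a full mip chain adds roughly one third
COUNT = 65

rgba8 = COUNT * TEXELS * 4 * MIP_FACTOR / 2**30   # 4 bytes per texel
dxt5  = COUNT * TEXELS * 1 * MIP_FACTOR / 2**30   # 1 byte per texel (4:1)
print(f"Uncompressed RGBA8: {rgba8:.1f} GiB, DXT5: {dxt5:.1f} GiB")
# ~5.4 GiB vs ~1.4 GiB; 16-bit-per-channel film sources double the first figure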
 
Last edited:

Myths

Member
I think the “value of YouTubers” discussion that was taking place in here, while not on topic, was interesting. Should be its own thread.
It should definitely be its own thread. One of the worst rebuttals I've read so far is someone saying, "since X is so easy, why aren't you doing it?" Not everyone has the same passion for certain things in life, regardless of how "easy" or "simple" they are, much less the desire to make a living out of them.

That’s just what it comes down to.
 
Last edited:

Rickyiez

Member
Man, everyone just ate up that remark by the tech-Jesus guy regarding the allotment of VRAM - it’s super trendy right now to shut people down by airily stating that there’s a big difference between VRAM usage and VRAM allotment.

Go ahead and buy the 10 gigabyte VRAM 3080 then. Better you than me. I have an RTX 2080 TI and at 4K I’m seeing stutter when I turn the camera in Crysis remake, Final Fantasy, Avengers and Horizon Zero Dawn. It’s not going to get better. It’ll just get worse.

I’d bet money there’ll soon be a 3080 with more VRAM - why, to swindle people out of their money? No, because going forward, at 4K, you’re going to want lots of VRAM for enthusiast settings.

Having none of the stutter here with just 10GB, in comparison to your 11GB. Are you sure it's VRAM related? :messenger_winking:
 

Siri

Banned
So stutter while turning the camera is hard evidence for not enough available VRAM? Are you sure that nothing else could cause this?

I would suggest that you actually educate yourself about how VRAM works and how much is really used in games, before making such strong statements.

And I would suggest that you educate yourself about how VRAM works before making such strong statements.
 

regawdless

Banned
And I would suggest that you educate yourself about how VRAM works before making such strong statements.

Fair game :messenger_grinning:
But the stutter from turning the camera still isn't evidence for insufficient amounts of VRAM, so I cannot come to the same conclusion.
 

waylo

Banned
One thing I've realized while reading this thread is that a lot of people in here don't understand how VRAM works. Allocation =/= usage.

But hey, if it keeps some of you who think you need a terabyte of VRAM from buying a 3080, be my guest. More chances for me to get one.
 

BluRayHiDef

Banned
I compromised because I'm impatient. My local Microcenter received a shipment of RTX 3080s but no RTX 3090s... and I just couldn't stomach the idea of going home empty-handed again.

I'll just exchange this for an RTX 3090 in the next 30 days if they get a shipment in that time, or I'll buy an RTX 3090 and sell this card on eBay (for what I paid for it).

Don't mind my busted sneakers; they're for work.

[photo attachment]
 
Last edited:

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
I compromised because I'm impatient. My local Microcenter received a shipment of RTX 3080s but no RTX 3090s... and I just couldn't stomach the idea of going home empty-handed again.

I'll just exchange this for an RTX 3090 in the next 30 days if they get a shipment in that time, or I'll buy an RTX 3090 and sell this card on eBay (for what I paid for it).

Don't mind my busted sneakers; they're for work.
How long did you wait in line? Did you get a heads up they were getting a shipment?
 

Rbk_3

Member
I compromised because I'm impatient. My local Microcenter received a shipment of RTX 3080s but no RTX 3090s... and I just couldn't stomach the idea of going home empty-handed again.

I'll just exchange this for an RTX 3090 in the next 30 days if they get a shipment in that time, or I'll buy an RTX 3090 and sell this card on eBay (for what I paid for it).

Don't mind my busted sneakers; they're for work.

[photo attachment]
Why not just enter the EVGA Step-Up program?
 

BluRayHiDef

Banned
How long did you wait in line? Did you get a heads up they were getting a shipment?

I work from 12 AM to 8 AM and have been going there every morning after work this week. I've always managed to get there before they usually open, which is at 10 AM; however, on Tuesday they opened at 9 AM for some reason and I therefore arrived a few minutes after they opened.

They received 3080s on Tuesday afternoon, which was after I left; there was one left over yesterday morning, but it was claimed by the first guy on line (I was the second guy on line).

This morning I arrived at 9:25 AM and was the first guy on line; three other people showed up after me, which was convenient because there were exactly four cards (two Asus TUFs and two EVGA FTW3s).
 
Last edited:

Nydus

Member
I work from 12 AM to 8 AM and have been going there every morning after work this week. I've always managed to get there before they usually open, which is at 10 AM; however, on Tuesday they opened at 9 AM for some reason and I therefore arrived a few minutes after they opened.

They received 3080s on Tuesday afternoon, which was after I left; there was one left over yesterday morning, but it was claimed by the first guy on line (I was the second guy on line).

This morning I arrived at 9:25 AM and was the first guy on line; three other people showed up after me, which was convenient because there were exactly four cards (two Asus TUFs and two EVGA FTW3s).
What did you have to pay?
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
I work from 12 AM to 8 AM and have been going there every morning after work this week. I've always managed to get there before they usually open, which is at 10 AM; however, on Tuesday they opened at 9 AM for some reason and I therefore arrived a few minutes after they opened.

They received 3080s on Tuesday afternoon, which was after I left; there was one left over yesterday morning, but it was claimed by the first guy on line (I was the second guy on line).

This morning I arrived at 9:25 AM and was the first guy on line; three other people showed up after me, which was convenient because there were exactly four cards (two Asus TUFs and two EVGA FTW3s).
Wow, man. Thanks for this info. Happy you snagged one, although it isn’t the one you wanted. Enjoy for now!
 

Rbk_3

Member
I'll look into that. Thanks for the advice.

You just register your card, enter the queue, and when your turn comes, you pay the difference between the cards, send your card back, and they send you the 3090. It's the FTW3 Ultra as well.

I picked up a 2070S and entered myself to get the 3080 FTW3.

Unless you can grab another 3090 and flip this card for more
 