
Nvidia RTX 30XX |OT|

bryjo3

Neo Member

In 2012, around the time of the PS4's release, everyone said 2GB of VRAM was all you needed. Within about three years, more than that became a complete necessity. Then it was 8GB of RAM; now 16GB is almost a complete necessity. It goes on and on: if you want your hardware to run high/ultra settings for the next 3-4 years, I'd always go for more VRAM when that's an option.
 
Are you sure?

I'm thinking this wouldn't work as the poster is imagining. The reasons:

- PCI Express slots often share lanes: with two cards installed, each slot typically gets its lane count cut in half, or the slots can't run at full speed at the same time (the chipset has to queue data from one card or the other). There's a quick way to check the negotiated link width, sketched below.
- You wouldn't get the same performance from each card as if you had them in different computers, because the CPU becomes a bottleneck handling interrupts from two very fast devices.

It's not the same as using a GPU and an iGPU.

I haven't tried it; I'm just thinking it wouldn't work, at least not for comparison purposes.
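For anyone who does try it, there's a quick way to see whether the slots actually dropped to x8: ask the driver what link width each card negotiated. A rough sketch using the nvidia-ml-py bindings (import name pynvml), assuming they're installed; it only reports what NVML says and proves nothing about real-world performance:

```python
# Rough sketch: report the negotiated vs. maximum PCIe link width per GPU.
# Assumes the nvidia-ml-py package (import name: pynvml) is installed.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)
        curr = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
        max_width = pynvml.nvmlDeviceGetMaxPcieLinkWidth(handle)
        # If curr < max_width with two cards installed, the slots are sharing lanes.
        print(f"GPU {i} ({name}): PCIe Gen{gen} x{curr} (card supports up to x{max_width})")
finally:
    pynvml.nvmlShutdown()
```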

It was sarcasm. Sorry.
 
In 2012, around the time of the PS4's release, everyone said 2GB of VRAM was all you needed. Within about three years, more than that became a complete necessity. Then it was 8GB of RAM; now 16GB is almost a complete necessity. It goes on and on: if you want your hardware to run high/ultra settings for the next 3-4 years, I'd always go for more VRAM when that's an option.
By the time it becomes an issue there'll be new cards out anyways. An easy way to make it even less of an issue is to lower textures from Ultimate to High. The 3080 has such crazy high bandwidth that I really don't think it'll be a problem for at least 3 or 4 years. The reason AMD is putting so much memory in their cards is that they're using GDDR6, which has a lot less bandwidth than GDDR6X. We'll see soon enough, but I'd bet that 16GB of GDDR6 will equate to roughly the 10GB of GDDR6X in the 3080.
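For anyone who wants to sanity-check the bandwidth side of that: peak memory bandwidth is just the per-pin data rate times the bus width, divided by 8 to get bytes. Here's a quick sketch using the 3080's published 19 Gbps GDDR6X on a 320-bit bus and, purely as an illustrative assumption, a 16 Gbps GDDR6 card on a 256-bit bus:

```python
# Back-of-the-envelope peak memory bandwidth: data rate (Gbps per pin) * bus width (bits) / 8.
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 3080: 19 Gbps GDDR6X on a 320-bit bus (published spec).
print(peak_bandwidth_gb_s(19, 320))   # 760.0 GB/s

# Hypothetical 16 Gbps GDDR6 card on a 256-bit bus (illustrative assumption only).
print(peak_bandwidth_gb_s(16, 256))   # 512.0 GB/s
```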

Hypothetically speaking, if I wanted to connect both an RTX 3080 and an RTX 3090 to the same computer so that I could run games on each of them independently of the other and therefore compare the performance levels between them, could I do so?

Could I have Windows 10 display a separate desktop for each card and have each card connected to a separate HDMI port on my TV in order to do so?

Finally, would a 1000W PSU be enough to power both cards and a Ryzen 3950X?

Really not trying to be an asshole. But why are you asking such stupid questions? I'm starting to get why just about everyone is getting frustrated with you.
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead


Hurry up, Zen 3.
 

BluRayHiDef

Banned
By the time it becomes an issue there'll be new cards out anyways. An easy way to make it even less of an issue is to lower textures from Ultimate to High. The 3080 has such crazy high bandwidth that I really don't think it'll be a problem for at least 3 or 4 years. The reason AMD is putting so much memory in their cards is that they're using GDDR6, which has a lot less bandwidth than GDDR6X. We'll see soon enough, but I'd bet that 16GB of GDDR6 will equate to roughly the 10GB of GDDR6X in the 3080.



Really not trying to be an asshole. But why are you asking such stupid questions? I'm starting to get why just about everyone is getting frustrated with you.

It's not a stupid question. I asked the same question in the Nvidia subreddit and received multiple informative, respectful answers, one of which explained that I can use virtual machines to tie each GPU to a separate instance of Windows. The problem isn't me; it's the negative attitude of some of the users here.
 
It's not a stupid question. I asked the same question in the Nvidia subreddit and received multiple informative, respectful answers, one of which explained that I can use virtual machines to tie each GPU to a separate instance of Windows. The problem isn't me; it's the negative attitude of some of the users here.
So you really want to put a 3080 and a 3090 in one machine to run separate instances of games? Really? Heat management, power management, and, by the sound of it, in your case even cable management would all be issues. It doesn't even seem like something you should be attempting, as it has no practical purpose in any sense at all.

It really comes across as you trying to be braggadocious more than anything else. I'm not the only one who thinks that. And I even went as far as to try to defend you not too long ago.
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
Should've waited for the 2TB version with better endurance. :lollipop_confounded:
You’re right. Probably will get that one when it releases. There’s no date, though, other than end of year. Figured I’d grab this for now.
 

BluRayHiDef

Banned
So you really want to put a 3080 and a 3090 in one machine to run separate instances of games? Really? Heat management, power management, and, by the sound of it, in your case even cable management would all be issues. It doesn't even seem like something you should be attempting, as it has no practical purpose in any sense at all.

It really comes across as you trying to be braggadocious more than anything else. I'm not the only one who thinks that. And I even went as far as to try to defend you not too long ago.
Did you not see the words "Hypothetically speaking" at the beginning of the post? Seriously? Also, in the hypothetical setup, both cards wouldn't have to run simultaneously, hence manageable heat output.
 
Did you not see the words "Hypothetically speaking" at the beginning of the post? Seriously? Also, in the hypothetical setup, both cards wouldn't have to run simultaneously, hence manageable heat output.
I did. And hypothetically it's a stupid question. And practically it's even dumber.

Look, I'll just let you be. I don't need to even get involved with your shenanigans. So by all means carry on with your "hypotheticals"
 

nemiroff

Gold Member
In 2012, around the time of the PS4's release, everyone said 2GB of VRAM was all you needed. Within about three years, more than that became a complete necessity. Then it was 8GB of RAM; now 16GB is almost a complete necessity. It goes on and on: if you want your hardware to run high/ultra settings for the next 3-4 years, I'd always go for more VRAM when that's an option.

How exactly is 16GB now a complete necessity? I'm new at this, genuinely curious and want to learn how this works. I'm an engineer, so don't be shy with the technical explanations, demonstrations and benchmarks.
 
How exactly is 16GB now a complete necessity? I'm new at this, genuinely curious and want to learn how this works. I'm an engineer, so don't be shy with the technical explanations, demonstrations and benchmarks.
Simple answer: it's not. Longer answer: some people are confusing memory allocation with memory needed, completely disregarding how fast the VRAM in the new 3080 is and how much crazy bandwidth it has, and not understanding that data can be swapped in and out so quickly that you'd never notice it.

Years ago, when devs were releasing 4K texture packs, people were trying to run things that weren't really meant to run on 2GB cards and got stutters, not realizing that their card wasn't meant for 4K textures.
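If anyone wants numbers for their own setup: the figure most overlays and monitoring tools show is what the driver has allocated, not what the game actually touches each frame. A minimal sketch with the nvidia-ml-py bindings (import name pynvml), assuming they're installed:

```python
# Minimal sketch: read total vs. currently allocated VRAM as reported by the driver.
# Note: "used" here is allocation, not the working set a game actually needs each frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Total VRAM:     {mem.total / 2**30:.1f} GiB")
print(f"Allocated VRAM: {mem.used / 2**30:.1f} GiB")
print(f"Free VRAM:      {mem.free / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```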
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
Simple answer: it's not. Longer answer: some people are confusing memory allocation with memory needed, completely disregarding how fast the VRAM in the new 3080 is and how much crazy bandwidth it has, and not understanding that data can be swapped in and out so quickly that you'd never notice it.

Years ago, when devs were releasing 4K texture packs, people were trying to run things that weren't really meant to run on 2GB cards and got stutters, not realizing that their card wasn't meant for 4K textures.
This is exactly right. Everyone is comparing the GDDR6X VRAM in the 3080 with GDDR6 VRAM elsewhere. They're not the same. And you are again exactly right about memory allocation versus memory use. It's not at all as simple as 12 > 10, though you'd never know it from the fearmongering.
 

J3nga

Member
Right... The guy "hypothetically" wanting to put a 3080 and a 3090 in a PC, who doesn't know about PSUs and CPUs, is calling me a joke.

Don't forget that not too long ago he hypothetically wanted to scalp his 3080. Such a convenient excuse, ain't it?
 

YCoCg

Member
How exactly is 16GB now a complete necessity?
Long-term usage, but for me it's more work-related: image and video scaling can use a lot of VRAM the higher you go resolution-wise. So I'd just like a GPU that I can game on for a decent number of years AND also do image work on without having to worry about running out of VRAM like I currently do.
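To put rough numbers on that: an uncompressed working buffer is just width x height x channels x bytes per channel, and scaling/compositing pipelines usually keep several of them resident at once. A quick illustration; the float32 RGBA format and the buffer count are only assumptions for the example:

```python
# Size of one uncompressed image buffer: width * height * channels * bytes per channel.
def buffer_size_gib(width: int, height: int, channels: int = 4, bytes_per_channel: int = 4) -> float:
    """Size in GiB of one uncompressed RGBA float32 buffer."""
    return width * height * channels * bytes_per_channel / 2**30

for label, (w, h) in {"4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}.items():
    one = buffer_size_gib(w, h)
    # Assuming 6 resident buffers (source, destination, intermediates) purely for illustration.
    print(f"{label}: {one:.2f} GiB per buffer, ~{6 * one:.1f} GiB for 6 buffers")
```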
 
Long-term usage, but for me it's more work-related: image and video scaling can use a lot of VRAM the higher you go resolution-wise. So I'd just like a GPU that I can game on for a decent number of years AND also do image work on without having to worry about running out of VRAM like I currently do.
But in your case wouldn't a 3090 be a better fit?
 

Iorv3th

Member
Did you not see the words "Hypothetically speaking" at the beginning of the post? Seriously? Also, in the hypothetical setup, both cards wouldn't have to run simultaneously, hence manageable heat output.

It's still a stupid question and there is 0 reason to do it.
 

BluRayHiDef

Banned
I've finished rebuilding my PC. The specs are as follows:

01. Ryzen 9 3950X
02. Asus X570 ROG Crosshair VIII Hero (WI-FI)
03. G.Skill Trident Z RGB 64GB (2 x 32GB) DDR4 RAM @ 3600 MHz
04. PNY XLR8 Gaming REVEL EPIC-X RGB RTX 3090
05. Samsung 850 EVO 500GB SATA SSD (Windows OS)
06. Sabrent "Rocket" PCIe Gen 4.0 NVMe 2TB SSD (games)
07. Western Digital Caviar Black 1TB HDD
08. Western Digital Blue 4TB & 6TB HDDs
09. Corsair TX850M 850 Watt 80+ Gold PSU
10. Lian Li O11 Dynamic XL ROG

 

bryjo3

Neo Member
How exactly is 16GB now a complete necessity? I'm new at this, genuinely curious and want to learn how this works. I'm an engineer, so don't be shy with the technical explanations, demonstrations and benchmarks.

I was referring to system RAM, not VRAM; I know that's confusing. Yes, 16GB of system RAM: it would be very foolish to build a gaming PC without it right now. On the bandwidth point, yes, I know the 3080 has very fast VRAM, but has anyone tested how it fares when a game uses 12GB of VRAM or more?
 

BluRayHiDef

Banned
Now imagine wanting to have like 25-40 modern games installed simultaneously all on SSD. It gets expensive. I have 3 TB of SSD storage and it's simply not enough.

The games that I have via Steam and Origin are the reason I bought a 2TB Sabrent NVMe drive; my 500GB Samsung SATA SSD simply didn't have enough space, as I continually had to delete games to make space for others.

For now, the Sabrent seems more than sufficient because I mainly play single-player games with file sizes that range from 10GB to 40GB.
 

CrustyBritches

Gold Member
So we had the rumors for the 3070/3080 16GB variants, then recently those were said to be cancelled. There was a rumored "3070ti" model, too, the PG142 SKU 0. That was GA104-based (3070 chip), 6144 CUDA cores, and 16GB GDDR6, compared to the 3070's 5888 CUDA cores and 8GB GDDR6.

Now we have a new potential model emerging as more info concerning Big Navi becomes available... a GA102-based 3070ti, GA102-150. Rumored config: 7424 CUDA cores, 10GB GDDR6X, 320-bit memory bus.
 

BluRayHiDef

Banned
So we had the rumors for the 3070/3080 16GB variants, then recently those were said to be cancelled. There was a rumored "3070ti" model, too, the PG142 SKU 0. That was GA104-based (3070 chip), 6144 CUDA cores, and 16GB GDDR6, compared to the 3070's 5888 CUDA cores and 8GB GDDR6.

Now we have a new potential model emerging as more info concerning Big Navi becomes available... a GA102-based 3070ti, GA102-150. Rumored config: 7424 CUDA cores, 10GB GDDR6X, 320-bit memory bus.


This new SKU sounds believable because it seems to be nothing more than a stripped-down RTX 3080. It has the same amount and type of VRAM and the same bus width; it's probably built from defective RTX 3080 dies in terms of the number of functioning SMs.

From a business perspective, it's a sensible SKU because it minimizes manufacturing costs, since a new manufacturing process doesn't have to be created.
 
This new SKU sounds believable because it seems to be nothing more than a stripped-down RTX 3080. It has the same amount and type of VRAM and the same bus width; it's probably built from defective RTX 3080 dies in terms of the number of functioning SMs.

From a business perspective, it's a sensible SKU because it minimizes manufacturing costs, since a new manufacturing process doesn't have to be created.
How would you even price that? $500 for the 3070, $700 for the 3080. $600? I dunno. I guess it fits a spot, but it almost seems like if you're gonna make that card, you may as well go all the way to the 3080 or cut it down to the 3070. I just don't feel like there's enough performance leeway between the 3080 and 3070 for a card in between to make fiscal sense either way. I'm probably wrong; it just seems strange. I guess it'd probably be a reaction to whatever the 6800/6900 XT come priced at. If they're aggressively priced, it does force Nvidia to fill out its lineup at every price range, I guess.
 

CrustyBritches

Gold Member
With Navi 10 AMD was rumored to be going for 2060/2070 performance, but they ended up slotting above those cards, respectively. Hence the Super series was necessary. AMD reportedly has 3 Navi 21 models with 16GB GDDR6: XTX (full die), XT ("Big Navi"), and XL ("Big Navi Jr.").

As we're getting a clearer picture of RDNA 2, I'm now pretty sure that AMD showed the Navi 21 XT in their teaser slide at the Zen 3 reveal event. Navi 21 XT is to target or beat the 3080 at $699, not a penny less going by Zen 3 pricing.

The XTX is likely being saved for the RDNA 2 event on the 28th. This will probably target the 3090 and be priced at $999. The XL will target 3070 at $499. If AMD can show better general raster performance at every tier, then Nvidia could be forced into releasing 3070ti and 3080ti models at competitive price points.
 
With Navi 10 AMD was rumored to be going for 2060/2070 performance, but they ended up slotting above those cards, respectively. Hence the Super series was necessary. AMD reportedly has 3 Navi 21 models with 16GB GDDR6: XTX (full die), XT ("Big Navi"), and XL ("Big Navi Jr.").

As we're getting a clearer picture of RDNA 2, I'm now pretty sure that AMD showed the Navi 21 XT in their teaser slide at the Zen 3 reveal event. Navi 21 XT is to target or beat the 3080 at $699, not a penny less going by Zen 3 pricing.

The XTX is likely being saved for the RDNA 2 event on the 28th. This will probably target the 3090 and be priced at $999. The XL will target 3070 at $499. If AMD can show better general raster performance at every tier, then Nvidia could be forced into releasing 3070ti and 3080ti models at competitive price points.
Or just lowering the 3090 price drastically...
 
With Navi 10 AMD was rumored to be going for 2060/2070 performance, but they ended up slotting above those cards, respectively. Hence the Super series was necessary. AMD reportedly has 3 Navi 21 models with 16GB GDDR6: XTX (full die), XT ("Big Navi"), and XL ("Big Navi Jr.").

As we're getting a clearer picture of RDNA 2, I'm now pretty sure that AMD showed the Navi 21 XT in their teaser slide at the Zen 3 reveal event. Navi 21 XT is to target or beat the 3080 at $699, not a penny less going by Zen 3 pricing.

The XTX is likely being saved for the RDNA 2 event on the 28th. This will probably target the 3090 and be priced at $999. The XL will target 3070 at $499. If AMD can show better general raster performance at every tier, then Nvidia could be forced into releasing 3070ti and 3080ti models at competitive price points.
That'd be pretty huge if true. That'd be great for everyone really.
 

BluRayHiDef

Banned
How would you even price that? $500 for the 3070, $700 for the 3080. $600? I dunno. I guess it fits a spot, but it almost seems like if you're gonna make that card, you may as well go all the way to the 3080. I just don't feel like there's enough performance leeway between the 3080 and 3070 for a card in between to make fiscal sense either way. I'm probably wrong; it just seems strange. I guess it'd probably be a reaction to whatever the 6800/6900 XT come priced at. If they're aggressively priced, it does force Nvidia to fill out its lineup at every price range, I guess.

I can see this SKU having an MSRP of $629.99 USD. However, I think that this SKU - if it's real - isn't necessarily intentional but rather the result of a contingency plan for defective RTX 3080 yields, which is why it doesn't make sense from a consumer's perspective. Nvidia wouldn't want to just waste those defective dies.
 
I can see this SKU having an MSRP of $629.99 USD. However, I think that this SKU - if it's real - isn't necessarily intentional but rather the result of a contingency plan for defective RTX 3080 yields, which is why it doesn't make sense from a consumer's perspective. Nvidia wouldn't want to just waste those defective dies.
I dunno. I think those 3070s are gonna sell like hotcakes. It may well end up being the highest-adopted card for its price. It seems like it'd be easier to just fold the defective 3080 dies in with the rest of the 3070s. But you're right, it does fill a performance spot. But $630? I dunno, it seems like no one would buy that; it'd have to be $600 at most. Who wouldn't spend the cost of a game to get waaaayyyy better performance with the 3080?
 

JohnnyFootball

GerAlt-Right. Ciriously.
I'm trying to gauge whether it's worth all the pain you guys are going through trying to get a 30XX card... because I paid $900 more (3090) and it stung like the dickens... BUT I have the card and never have to worry about trying to buy it day after day... Hmm...
It must be nice to have that much spare money lying around to pay that much of a premium.
I see the argument of "it's work related," but I fail to see what the 30 series can do that the 20 series somehow can't that justifies that kind of premium.
 

BluRayHiDef

Banned
I dunno. I think those 3070s are gonna sell like hotcakes. It may well end up being the highest-adopted card for its price. It seems like it'd be easier to just fold the defective 3080 dies in with the rest of the 3070s. But you're right, it does fill a performance spot. But $630? I dunno, it seems like no one would buy that; it'd have to be $600 at most. Who wouldn't spend the cost of a game to get waaaayyyy better performance with the 3080?

I say $629.99 because $600 would make it only $100 more than the RTX 3070, which is a monetary difference that is less than the rough difference in the computational capabilities between the RTX 3070 and this SKU ([$599.99/$499.99 x 100] - 100% = 20%, [7424 CUDA cores/5888 CUDA cores x 100] - 100% = 26%); there's no way that Nvidia would provide 26% more performance for less than 26% more money. On the other hand, $629.99 is exactly 26% more money than $499.99. In regard to an MSRP of $629.99 for this new SKU relative to the MSRP of $699.99 for the RTX 3080, the latter's MSRP is only 11% more while its computational capability is roughly 17% more ([$699.99/$629.99 x 100] - 100% = 11%, [8704/7424 x 100] - 100% = 17%). Hence, $629.99 seems too low; however, most models of the RTX 3080 are priced above its MSRP.
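If anyone wants to double-check the arithmetic, the ratios above reduce to a few lines:

```python
# Sanity check of the ratios above (prices in USD, CUDA core counts as announced/rumored).
def pct_more(a: float, b: float) -> float:
    """How much more a is than b, in percent."""
    return (a / b - 1) * 100

print(round(pct_more(599.99, 499.99)))  # ~20: a $599.99 price vs. the RTX 3070's $499.99
print(round(pct_more(7424, 5888)))      # ~26: the new SKU's CUDA cores vs. the RTX 3070's
print(round(pct_more(629.99, 499.99)))  # ~26: a $629.99 price vs. $499.99
print(round(pct_more(699.99, 629.99)))  # ~11: the RTX 3080's $699.99 vs. $629.99
print(round(pct_more(8704, 7424)))      # ~17: the RTX 3080's CUDA cores vs. the new SKU's
```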
 