
Which RTX AIB is best? What setup has the least bottlenecks?

Velius

Banned
Prices are definitely coming down. At this rate I wouldn't be surprised if we start seeing RTX 30 series cards at retail in a few months.

And... I'm getting the fever. I WANT A RIG.

With that in mind, I want to build a rig that can run Cyberpunk maxed at 1440p with ray tracing on, while never dipping below 60FPS. Here's what I'm thinking. Any criticisms or advice welcome, I'm looking to reduce bottleneck wherever possible.

Below are my suggested specifications to accomplish such a feat. But I'm running into all kinds of interesting puzzles here. I want the best AIB possible, and lining things up, it's not really clear. So here are some contenders, maybe you guys can give me some feedback-- hell maybe even a couple of you actually have one of these and can let us know your experience.

First up, the THICASS four slot card, the Gigabyte Aorus Xtreme.


The 3090 has 10,496 CUDA cores and a boost clock of 1860 MHz, while the 3080 Ti has 10,240 CUDA cores and a boost clock of 1830 MHz. So it seems to me that these two are fairly close in their capabilities; if I could get them at retail, it looks like the 3080 Ti would be the much better value, since they're very close in specs but with an enormous difference in price tag. Also, they BOTH have 3x8-pin connectors, more power.

Then we have the not quite as thicass but still rather sexy ASUS ROG Strix OC. Again, 3090 and 3080 Ti.


The CUDA core counts for the 3080 Ti and 3090 are identical to their Gigabyte Aorus counterparts, but as far as clock speeds go there's some difference.

ROG Strix OC 3090 has 1890 MHz in "OC Mode," and 1860 in "Gaming Mode." I don't know why gaming mode would be less than OC mode but there it is.
The 3080 Ti is lower, with 1845 MHz in "OC" and 1815 in "Gaming." It should be noted that these two also have 3x8-pin connectors.

So which is better? I mean for sheer numbers, it looks like 3090 ROG Strix OC, right? But how much more oomph are we talking about? How much of a difference does 30 MHz make?
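If you just want to put rough numbers on it, here's a quick back-of-envelope sketch in Python using the spec-sheet figures above. It assumes the usual theoretical FP32 formula (2 ops per CUDA core per clock) and ignores memory bandwidth, power limits and real boost behaviour, so treat it as napkin math, not a benchmark:

# Rough paper-spec comparison using the boost clocks quoted above.
# Assumes theoretical FP32 throughput = 2 ops per core per clock (FMA) and
# ignores memory bandwidth, power limits and real boost behaviour.
def fp32_tflops(cuda_cores, boost_mhz):
    return 2 * cuda_cores * boost_mhz * 1e6 / 1e12

print(f"Aorus 3090 (1860 MHz):    {fp32_tflops(10496, 1860):.1f} TFLOPS")
print(f"Aorus 3080 Ti (1830 MHz): {fp32_tflops(10240, 1830):.1f} TFLOPS")
print(f"Strix 3090 OC (1890 MHz): {fp32_tflops(10496, 1890):.1f} TFLOPS")
print(f"That 30 MHz gap is only {30 / 1860 * 100:.1f}% on paper")

On paper that 30 MHz works out to about 1.6%, which is within run-to-run variance in most game benchmarks.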



Here is my proposed build for Cyberpunk 2077. Can any gurus tell me if it will run the game at:
1) No less than 60FPS
2) Max settings
3) 1440p
4) Ray Tracing ON



Ryzen 7 5800X
2x32GB Corsair Vengeance DDR4
Gigabyte Aorus X570 Elite WiFi
Gigabyte Aorus Xtreme RTX 3090

https://www.newegg.com/amd-ryzen-7-5800x/p/N82E16819113665
https://www.newegg.com/corsair-64gb-288-pin-ddr4-sdram/p/N82E16820236601?quicklink=true
https://www.newegg.com/gigabyte-x570-aorus-elite-wifi/p/N82E16813145165
https://www.newegg.com/gigabyte-geforce-rtx-3090-gv-n3090aorus-x-24gd/p/N82E16814932340
 
D

Deleted member 17706

Unconfirmed Member
I looked into the 3080 reviews quite a bit and there is very little meaningful difference if I remember correctly.

Considering the insane supply issues, just buy whatever you can manage to get at MSRP.

In terms of maxed out Cyberpunk 2077 with RT turned up all the way at 1440p? A 3080 will get you there, especially if you enable any kind of DLSS.
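For what it's worth, DLSS renders internally at a fraction of the output resolution and then upscales, which is where the headroom comes from. A quick sketch of roughly what each preset renders at for a 1440p output, using the per-axis scale factors commonly published for DLSS 2.x (exact values can vary by game and DLSS version):

# Approximate DLSS 2.x internal render resolutions at a 2560x1440 output.
# Per-axis scale factors are the commonly published ones; games/versions can differ.
presets = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 0.333}

out_w, out_h = 2560, 1440
for name, scale in presets.items():
    print(f"{name:>17}: {round(out_w * scale)} x {round(out_h * scale)}")

Even Quality mode is shading well under half the pixels of native 1440p, so the GPU load drops a lot before the upscaler gives most of the image quality back.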
 
Last edited by a moderator:

LOLCats

Banned
For Nvidia GPUs, EVGA and ASUS are the decent AIBs, MSI next, then Gigabyte, and finally Zotac (don't buy a Zotac).
 
Last edited:

Exentryk

Member
A 3090 is a waste of money for what you want. Cyberpunk 2077 has DLSS, so you can easily do what you need even on a 3080. I played at 4K with 40-50 fps using DLSS Balanced (all other graphics settings on and maxed) on a 3080. 40-50 fps isn't an issue with VRR, as it still feels smooth.

Also, there is barely any performance difference between the AIBs. Get whichever one you can find cheapest. Better to focus more on noise/temps.

So overall, I'd recommend just getting a 3080 and saving yourself some money. If you still want more power, get a 3080 Ti.

 
Last edited:
EVGA stuff is on the noisy side in my experience (video cards/PSUs); they're always noisier than the competition... I love their looks, but I hate noise more.
 

Buggy Loop

Member
Going from 64 to 32 GB of RAM I think would make more sense; I can't think of a game where more than 32 GB would matter, not even Microsoft Flight Simulator.

DirectStorage/RTX IO will also alleviate system RAM/CPU use by streaming straight to the GPU, so for future titles 64 GB would be completely overkill, I think.

The 5800X is a hot beast. It doesn't have two chiplets like the 5900X to dissipate heat over double the area, and it has more cores (the max for a single chiplet) than the 5600X, which puts it in a strange place with a difficult hotspot to control. Some CPU coolers handle this hotspot better than others, so keep that in mind if you pick that one. I would say that with the money saved going from 64 to 32 GB, just upgrade to a 5900X.
 

Buggy Loop

Member
Also, I hope you get a monitor with Freesync/Gsync at 144 Hz, because outside cyberpunk you’ll be way way over 60 fps for the majority of games out there.
 

kiphalfton

Member
I have an Nvidia Founders Edition and an EVGA FTW3 Ultra RTX 3080. The FTW3 Ultra has better memory junction temps and seems to run cooler overall. However, the FE is smaller and only requires 2x 8-pin connectors, versus 3x 8-pin for the FTW3 Ultra. I've also owned the Zotac Trinity OC. Not bad, but somewhat minimalistic compared to the other two.
 
Last edited:

CuNi

Gold Member
With how close the 3080 and 3090 already are, the price uplift you pay for a 3080 Ti is a joke.
You would be better off getting a 3080 and OC/UV it yourself, and have 98% of the 3080 Ti's performance for what feels like 70% of its price.
 

Armorian

Banned
Going from 64 to 32 GB of RAM I think would make more sense; I can't think of a game where more than 32 GB would matter, not even Microsoft Flight Simulator.

DirectStorage/RTX IO will also alleviate system RAM/CPU use by streaming straight to the GPU, so for future titles 64 GB would be completely overkill, I think.

The 5800X is a hot beast. It doesn't have two chiplets like the 5900X to dissipate heat over double the area, and it has more cores (the max for a single chiplet) than the 5600X, which puts it in a strange place with a difficult hotspot to control. Some CPU coolers handle this hotspot better than others, so keep that in mind if you pick that one. I would say that with the money saved going from 64 to 32 GB, just upgrade to a 5900X.

More than 16GB doesn't make any sense. People should invest in better-performing RAM (latency, speed), not capacity that they won't even use (aside from Chrome maybe lol).
 

benno

Member
The 3090 has 10,496 CUDA cores and a boost clock of 1860 MHz,
The 3090 just increases its clock until it throttles, either thermally or on power. All 3090s go higher than 1860. I have a Zotac Trinity, which is regarded as one of the worst, and it runs 1950 MHz stock. Others probably get another 150 MHz on top of that because of the power limits Zotac placed on the card.
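To put a rough number on what that spread means (same caveat as the napkin math earlier in the thread: clock alone isn't the whole story, since games are rarely purely core-clock bound), here's a quick sketch using the MHz figures above:

# Paper-spec scaling from core clock alone, using the MHz figures quoted above.
rated_boost  = 1860  # spec-sheet boost clock
zotac_stock  = 1950  # what the Zotac Trinity reportedly sustains stock
higher_power = 2100  # roughly +150 MHz that less power-limited cards might reach
print(f"Zotac stock vs rated boost: +{(zotac_stock / rated_boost - 1) * 100:.1f}%")
print(f"Less-limited AIB vs Zotac:  +{(higher_power / zotac_stock - 1) * 100:.1f}%")

So even the worst-case spread between AIBs is single-digit percent on paper, and in actual games it's usually less than that.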
 
Last edited:

Ulysses 31

Member
I'm familiar with the Aorus 3090 and ASUS Strix 3090; the Strix is better due to its smaller size, better temps and lower power draw. It only has one fewer HDMI port, which is a 2.0 port anyway.
 

Patrick S.

Banned
For Nvidia GPUs, EVGA and ASUS are the decent AIBs, MSI next, then Gigabyte, and finally Zotac (don't buy a Zotac).
I've had my Zotac 3080 since November and am super happy with it. Yes, it has a lower power target than some other cards, but that results in what, 2 fps less while gaming? Pffft. And the card has a five-year warranty.
 

Velius

Banned
The 3090 just increases its clock until it throttles, either thermally or on power. All 3090s go higher than 1860. I have a Zotac Trinity, which is regarded as one of the worst, and it runs 1950 MHz stock. Others probably get another 150 MHz on top of that because of the power limits Zotac placed on the card.
Whoa..
In terms of performance what kind of practical impact does this have?
 

Patrick S.

Banned
Whoa..
In terms of performance what kind of practical impact does this have?
I wouldn't sweat it. Shortly after I managed to get my Zotac card, a colleague got an EVGA FTW3, and we both benchmarked our cards in 3DMark; in some runs the Zotac managed to get higher scores than the EVGA. I might still have screenshots of the benchmarks.
 

Jigsaah

Gold Member
I'm running a 3080 with a Ryzen 5800X right now. In Cyberpunk you don't want to max out everything, as some graphical settings give very little visual difference for a considerable dip in your frame rate. Gamers Nexus broke this down pretty well when the game came out. I generally run the game at 1440p with the settings that matter maxed. I'm also using DLSS on the recommended Quality setting, which lets me push upwards of 90 fps in most parts of the game.
 

Spukc

always chasing the next thrill
Only tools max out settings..
I can’t even see the difference..

unless you are from DIGITAL FOUNDRY and have 400% zoom vision..

ultra settings are to make sure 30xx owners don’t get mad
 

TheKratos

Member
My 2080 Ti held up pretty well. Tweaked a few settings. RT on and DLSS Performance; I did use NVIDIA sharpening at 0.5 though, because DLSS Performance (I think it drops to native 1080p) made it hella blurry.
 

Buggy Loop

Member
Only tools max out settings..
I can’t even see the difference..

unless you are from DIGITAL FOUNDRY and have 400% zoom vision..

ultra settings are to make sure 30xx owners don’t get mad

Naw, even Digital Foundry's optimized settings weigh up which settings have an impact on visuals and which ones don't but still come with a performance penalty. Their optimized settings are pretty good.
 