
Would My PC Bottleneck an RTX 3090?

The display that I use as a monitor, the TCL 55R635, has a 120Hz panel but can accept 120Hz via HDMI only at 1440p and below, because it has HDMI 2.0b ports rather than HDMI 2.1. Hence, at 4K it can display only 60Hz via HDMI. I prefer 4K at 60Hz over 1440p at 120Hz, by the way.
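For anyone curious about the underlying math: a rough bandwidth check (raw pixel rate only; a real signal adds blanking overhead on top, so true requirements are higher) shows why 4K at 120Hz simply doesn't fit through HDMI 2.0b while 4K60 and 1440p120 do. A minimal sketch in Python:

```python
# Raw pixel data rate in Gbit/s for 8-bit RGB, ignoring blanking intervals
# and encoding overhead (both add to the real requirement).
def bandwidth_gbps(width, height, hz, bits_per_channel=8, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

print(f"4K @ 60Hz:     {bandwidth_gbps(3840, 2160, 60):.1f} Gbit/s")   # ~11.9
print(f"4K @ 120Hz:    {bandwidth_gbps(3840, 2160, 120):.1f} Gbit/s")  # ~23.9
print(f"1440p @ 120Hz: {bandwidth_gbps(2560, 1440, 120):.1f} Gbit/s")  # ~10.6
```

HDMI 2.0b carries 18 Gbit/s raw (about 14.4 Gbit/s of usable data after 8b/10b encoding), so 4K120's ~24 Gbit/s of pixel data needs HDMI 2.1's 48 Gbit/s link or chroma subsampling.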
A few months ago I made the mistake of swapping my 4K monitor for a 21:9 1440p monitor... I hate seeing pixels this big, especially in Lightroom, but even just for text, the lower pixel density doesn't feel right.

On the other hand, I lower the resolution to 1440p or even 1080p to play games on my TV in order to hold 60fps (I have only a 1660 Ti in the living room).
 

BluRayHiDef

Banned
A few months ago I made the mistake of swapping my 4K monitor for a 21:9 1440p monitor... I hate seeing pixels this big, especially in Lightroom, but even just for text, the lower pixel density doesn't feel right.

On the other hand, I lower the resolution to 1440p or even 1080p to play games on my TV in order to hold 60fps (I have only a 1660 Ti in the living room).
Yea, when I sample games at 1440p on my TV, I can tell the difference; they just don't look as sharp.
 

smbu2000

Member
DDR5's speed will be between 4800MT/s and 6400MT/s. So it'll be much faster than DDR4 and is therefore worth waiting for, since it'll be released relatively soon. I built my current computer back in 2016 (and upgraded to a 1080 Ti in 2017), so it wouldn't make sense for me not to wait merely one more year for DDR5. I'll run the Cinebench R15 benchmark and post the results.
It will most likely have higher latency as well, which cancels out some of the benefit, especially with the early kits. It was the same with high-end DDR3 vs. early DDR4.
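To put rough numbers on the latency point: first-word latency in nanoseconds is the CAS latency divided by the I/O clock, which runs at half the transfer rate. A quick sketch (the kit timings below are typical examples, not measurements of any specific module):

```python
# latency_ns = CAS cycles / I/O clock; the I/O clock in MHz is half the
# MT/s transfer rate, hence the factor of 2000.
def latency_ns(mt_per_s, cas_latency):
    return cas_latency * 2000 / mt_per_s

for name, rate, cl in [("DDR4-2400 CL17", 2400, 17),
                       ("DDR4-3200 CL16", 3200, 16),
                       ("DDR5-4800 CL40", 4800, 40)]:
    print(f"{name}: {latency_ns(rate, cl):.1f} ns")
# DDR4-2400 CL17: 14.2 ns
# DDR4-3200 CL16: 10.0 ns
# DDR5-4800 CL40: 16.7 ns
```

So a typical early DDR5-4800 kit would have notably worse absolute latency than a decent DDR4-3200 kit, even while offering far more bandwidth.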

So your Cinebench R15 score was 1109?
My stock 5960X score was 1328 and my overclocked score (4.5GHz all-core) was 1755, which is close to a Ryzen 2700X at stock speeds. (Sorry for the blurry photo.)


(I think my 3900X scored over 3000 (3100?) on R15 the last time I ran it, and I haven't really tweaked it at all like I did with the 5960X.)
 

BluRayHiDef

Banned
Sure, but making that sacrifice on the TV doesn't annoy me as much as on the monitor (maybe because I'm further away and my TV is smaller).
I sit directly in front of my TV, as if it were a monitor.

NkJaNJ5.jpg
 

teezzy

Banned
You'd be far better off just getting a 3080 and putting the saved cash toward updating the rest of your rig.

Putting a 3090 on old hardware like that is silly.
 

BluRayHiDef

Banned
teezzy, it's already been demonstrated via Marvel's Avengers that 10GB of VRAM is a bottleneck for future games.

DSO Gaming said:
Speaking of 4K, we did notice some VRAM limitations when using the game’s HD Texture Pack. On Ultra Settings/Ultra Textures, our performance went downhill in the following scene. As you can see, our RTX2080Ti was used to its fullest and pushed 17fps.

However, when we used Ultra Settings/High Textures, our performance skyrocketed to 42fps. This appears to be a VRAM limitation. As we can see, the game’s Ultra textures used 10.5GB of VRAM, whereas High textures used 8GB of VRAM. Thus, it will be interesting to see whether the NVIDIA RTX3080 will be able to handle this game in 4K with its 10GB VRAM.
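For a rough sense of why one texture tier can add gigabytes, here's a back-of-the-envelope estimate; the format, size, and count are purely illustrative assumptions, not the game's actual asset data:

```python
# BC7 block compression stores 1 byte per texel; a full mip chain adds
# roughly one third on top. All figures are illustrative assumptions.
def texture_mib(width, height, bytes_per_texel=1.0, mip_overhead=1.33):
    return width * height * bytes_per_texel * mip_overhead / 2**20

per_tex = texture_mib(4096, 4096)
print(f"One 4096x4096 BC7 texture: {per_tex:.1f} MiB")        # ~21.3 MiB
print(f"120 such textures: {120 * per_tex / 1024:.2f} GiB")   # ~2.49 GiB
```

Swapping a hundred-odd textures up to 4K versions is roughly the scale of the ~2.5GB gap DSO measured between High and Ultra.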

 

Calverz

Member
Firstly, I am sure that there was a 6TF GPU for 800 USD before, and secondly, the 3090 has a massive advantage with memory, so for future-proofing it's a great card. Also, it's really cheap when you take into account that it's basically a "Titan" model. I have that card pre-ordered and cannot wait.
Where have you pre-ordered that card?
 

Mister Wolf

Member
So everything that isn't a 3090 is a bottleneck? I don't think I buy it.

The Avengers with Ultra textures at 4K uses 10.5GB of VRAM, bottlenecking the 11GB 2080 Ti and causing the framerate to plummet. The 3080 has only 10GB. In the foreseeable future, do you think we will see other games with such high VRAM demands?
 

888

Member
The Avengers with Ultra textures at 4K uses 10.5GB of VRAM, bottlenecking the 11GB 2080 Ti and causing the framerate to plummet. The 3080 has only 10GB. In the foreseeable future, do you think we will see other games with such high VRAM demands?

Are we sure this isn't a one-off? From the article:

“In case you’re wondering, the game still drops below 60fps in 4K on Low settings. This shouldn’t really surprise you as this final build is a bit similar to the Open Beta build (performance-wise). Below you can find some screenshots that showcase the game in 4K/Low. So yeah, you still won’t be able to get a locked 60fps in 4K on an RTX2080Ti, even when using Low settings.”

The game doesn't seem very optimized. Considering most games don't come near 8GB, it makes me wonder if it's just this game.
 

Mister Wolf

Member
Are we sure this isn't a one-off? From the article:

“In case you’re wondering, the game still drops below 60fps in 4K on Low settings. This shouldn’t really surprise you as this final build is a bit similar to the Open Beta build (performance-wise). Below you can find some screenshots that showcase the game in 4K/Low. So yeah, you still won’t be able to get a locked 60fps in 4K on an RTX2080Ti, even when using Low settings.”

The game doesn't seem very optimized. Considering most games don't come near 8GB, it makes me wonder if it's just this game.

Both DF and DSO Gaming have reported the same thing: it's using a ton of VRAM. There are benchmark overlays that show you the VRAM usage.

"Speaking of 4K, we did notice some VRAM limitations when using the game's HD Texture Pack. On Ultra Settings/Ultra Textures, our performance went downhill in the following scene. As you can see, our RTX2080Ti was used to its fullest and pushed 17fps." - DSO Gaming

"However, when we used Ultra Settings/High Textures, our performance skyrocketed to 42fps. This appears to be a VRAM limitation. As we can see, the game’s Ultra textures used 10.5GB of VRAM, whereas High textures used 8GB of VRAM. Thus, it will be interesting to see whether the NVIDIA RTX3080 will be able to handle this game in 4K with its 10GB VRAM." - DSO Gaming

This has nothing to do with occasional drops. It's straight-up bottlenecking.
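If anyone wants to check this on their own system without installing an overlay, nvidia-smi (which ships with the NVIDIA driver) can poll memory usage from the command line. A minimal logger sketch; keep in mind tools like this report allocated VRAM, which can overstate what a game strictly needs:

```python
# Poll GPU memory once per second via nvidia-smi's CSV query mode.
# Assumes a single NVIDIA GPU; stop with Ctrl+C.
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True).strip()
    used, total = (int(x) for x in out.split(", "))
    print(f"VRAM: {used} / {total} MiB")
    time.sleep(1.0)
```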
 

888

Member
Both DF and DSO Gaming have reported the same thing: it's using a ton of VRAM. There are benchmark overlays that show you the VRAM usage.

I'm not doubting it's happening; I'm just wondering if it's due to poor optimization of this game in particular. I don't even think Flight Simulator maxes out my VRAM.
 

Mister Wolf

Member
I'm not doubting it's happening; I'm just wondering if it's due to poor optimization of this game in particular. I don't even think Flight Simulator maxes out my VRAM.

Just know that if you plan to buy a 3070 or 3080 with 8/10GB of VRAM, you are taking a gamble. If this arises two years down the line, then remember you've been warned.
 
If you want to run any next-gen games above console framerates, you will have to have at least an 8-core CPU, unless you reduce every CPU-taxing setting. It will also be pretty hard to run any next-gen game at 2x the framerate if the console version uses 100% of its CPU. That's why the RTX 3090 is marketed as an 8K/60fps card. So if you want to play next-gen games with max settings at 60fps, you have to get an 8-core Ryzen 4000.
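The arithmetic behind that claim is plain frame-time budgeting: the CPU's per-frame budget shrinks linearly as the target framerate rises, so doubling a console's framerate roughly halves the time its simulation work has to fit into. A trivial illustration:

```python
# CPU time available per frame at common framerate targets.
for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms of CPU time per frame")
#  30 fps -> 33.3 ms
#  60 fps -> 16.7 ms
# 120 fps -> 8.3 ms
```

If a console game fills its 16.7ms budget on eight Zen 2 cores, a PC needs roughly double the CPU throughput to hold 120fps, hence the 8-core recommendation.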
 

teezzy

Banned
The Avengers with Ultra textures at 4K uses 10.5GB of VRAM, bottlenecking the 11GB 2080 Ti and causing the framerate to plummet. The 3080 has only 10GB. In the foreseeable future, do you think we will see other games with such high VRAM demands?

Meh?

Count me in the 'poor optimization' camp then. The very idea that I need to buy the top-tier $1400 card to avoid any bottlenecking isn't one I'm going to adhere to. That's just dumb, and developers should know better.

I play at 1440p anyhow.
 

S0ULZB0URNE

Member
Since DDR5 will be released next year, I intend to build a new computer then, in order to include the new memory standard and a compatible CPU and motherboard. So even if my current build would bottleneck the RTX 3090, it wouldn't be an issue indefinitely; I'd just swap the RTX 3090 into that new build.

Despite this, I'm wondering whether or not my current PC would bottleneck the card, because I intend to buy Cyberpunk 2077 for PC this year and want to play it at 4K with ray tracing and all other settings maxed out without dropping below 60 frames per second.

My computer's specifications are as follows:

1. i7-5820K (boosts to 4GHz by default)
2. Asus X99 Deluxe
3. 8 x 4GB Crucial 2400MHz DDR4 RAM
4. EVGA GTX 1080 Ti
5. Samsung EVO 500GB SATA SSD
6. Western Digital Caviar Black 1TB HDD
7. Western Digital Blue 6TB and 4TB HDDs
8. Corsair 850M 850-watt modular PSU
9. NZXT H440 mid-tower (the hard-drive trays can be removed to accommodate cards that are up to 428mm long)

Picture:

GNOnN0S.png

EDIT:

Here's a more recent picture, which includes the 1080Ti.

XGcVTP3.jpg

Yes, I know that the inside of my case is dusty; I'll get a compressed-air blower to clean it out soon (Link).
Not at 4K.
Below 4K, to an extent, yes.
 

BluRayHiDef

Banned
If you want to run any next-gen games above console framerates, you will have to have at least an 8-core CPU, unless you reduce every CPU-taxing setting. It will also be pretty hard to run any next-gen game at 2x the framerate if the console version uses 100% of its CPU. That's why the RTX 3090 is marketed as an 8K/60fps card. So if you want to play next-gen games with max settings at 60fps, you have to get an 8-core Ryzen 4000.

I don't think that 8 cores will be necessary to attain framerates above those of the PS5 and XSX for a few years, because games aren't multi-threaded enough to saturate 8 cores/16 threads. Some of the 8 cores in the next-generation consoles will be used to handle the OS and other non-gaming functions or will just sit idle.
 
I don't think that 8 cores will be necessary to attain framerates above those of the PS5 and XSX for a few years, because games aren't multi-threaded enough to saturate 8 cores/16 threads. Some of the 8 cores in the next-generation consoles will be used to handle the OS and other non-gaming functions or will just sit idle.
I would be really surprised if that were the case, but we will see. PS4 games already used all of the Jaguar's cores (minus what the OS reserves), but those were only 8 cores/threads running at ~1.6GHz. Once next gen hits, developers will surely make use of multithreading on the Ryzen CPUs and put them to full use. I still stand by it that an 8-core Ryzen 3000 is the least you need for next gen. A console OS shouldn't take much more CPU power than Windows, if more at all. Anyway, the OP can just wait and see if his CPU is enough and then decide on a purchase.
 

BluRayHiDef

Banned
I would be really surprised if that were the case, but we will see. PS4 games already used all of the Jaguar's cores (minus what the OS reserves), but those were only 8 cores/threads running at ~1.6GHz. Once next gen hits, developers will surely make use of multithreading on the Ryzen CPUs and put them to full use. I still stand by it that an 8-core Ryzen 3000 is the least you need for next gen. A console OS shouldn't take much more CPU power than Windows, if more at all. Anyway, the OP can just wait and see if his CPU is enough and then decide on a purchase.
I guess you're right. Well, Ryzen 4000 is right around the corner.
 

Rbk_3

Member
Just know that if you plan to buy a 3070 or 3080 with 8/10GB of VRAM, you are taking a gamble. If this arises two years down the line, then remember you've been warned.

Yea, and you can then buy the 4080 in 2022, which will have more than 10GB of VRAM and be faster than the 3090, and the 3080/4080 combo will still be cheaper than what the 3090 cost you. Plus you can then still sell the 3080 to recoup more of the cost.


If I were American I would probably buy the 3090, but in Canada it will be pushing $2500 after tax. I can't justify that when I can get a 3080 for $1100-ish after tax.
 

INC

Member
Both DF and DSO Gaming have reported the same thing: it's using a ton of VRAM. There are benchmark overlays that show you the VRAM usage.

"Speaking of 4K, we did notice some VRAM limitations when using the game's HD Texture Pack. On Ultra Settings/Ultra Textures, our performance went downhill in the following scene. As you can see, our RTX2080Ti was used to its fullest and pushed 17fps." - DSO Gaming

"However, when we used Ultra Settings/High Textures, our performance skyrocketed to 42fps. This appears to be a VRAM limitation. As we can see, the game’s Ultra textures used 10.5GB of VRAM, whereas High textures used 8GB of VRAM. Thus, it will be interesting to see whether the NVIDIA RTX3080 will be able to handle this game in 4K with its 10GB VRAM." - DSO Gaming

This has nothing to do with occasional drops. It's straight-up bottlenecking.


More reason not to run at 4K and to use 1440p and DLSS instead; won't that reduce the VRAM use?

Also, I have a 3700X; I presume that'll be fine with a 3080, right?
 

Mister Wolf

Member
More reason not to run at 4K and to use 1440p and DLSS instead; won't that reduce the VRAM use?

Also, I have a 3700X; I presume that'll be fine with a 3080, right?

You should be good, but that's assuming every game you're looking forward to will have DLSS. You could always bump the textures down to a lower setting if that doesn't irk you.
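On the VRAM question: partly, yes. Render targets scale with the internal resolution, and DLSS Quality at a 4K output renders internally at 1440p, but the texture pool stays sized by the texture setting, and that's the biggest consumer here. A rough sketch with an assumed G-buffer layout (the target count and formats are illustrative, not any particular engine's):

```python
# Full-screen render-target footprint: N color targets at bytes_per_px
# each, plus a depth buffer. The layout is an illustrative assumption.
def rt_mib(width, height, targets=5, bytes_per_px=8, depth_bytes=4):
    return width * height * (targets * bytes_per_px + depth_bytes) / 2**20

print(f"Native 4K targets:    {rt_mib(3840, 2160):.0f} MiB")  # ~348 MiB
print(f"DLSS Quality (1440p): {rt_mib(2560, 1440):.0f} MiB")  # ~155 MiB
```

So DLSS claws back a couple hundred MiB of render-target memory, but an Ultra texture pack that wants 10.5GB will still want most of it.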
 

INC

Member
You should be good, but that's assuming every game you're looking forward to will have DLSS. You could always bump the textures down to a lower setting if that doesn't irk you.

Not really; I only play MP games on my PC, and most of the time you're bumping everything down to low anyway, to be a sweat lord.

VR is where I want the most support for DLSS 2.0; unfortunately most VR devs are indie devs, so I doubt that's gonna happen.
 

Vaelka

Member
There will ALWAYS be a bottleneck; it's generally more important to have a balanced system, but bottlenecks are unavoidable and software-dependent too.
Typically 4K is heavier on the GPU than on the CPU, though.
But a 3090 with that CPU sounds so unbalanced to me, and I also think that upgrading from a 1080 Ti is totally unnecessary at this point. I'd look at a CPU instead, especially if you like open-world games, since some games are really CPU-heavy.
A 1080 Ti is still really powerful.

A 3090 seems like overkill to me right now unless you're building from scratch. I'd just wait.
 

BluRayHiDef

Banned
There will ALWAYS be a bottleneck; it's generally more important to have a balanced system, but bottlenecks are unavoidable and software-dependent too.
Typically 4K is heavier on the GPU than on the CPU, though.
But a 3090 with that CPU sounds so unbalanced to me, and I also think that upgrading from a 1080 Ti is totally unnecessary at this point. I'd look at a CPU instead, especially if you like open-world games, since some games are really CPU-heavy.
A 1080 Ti is still really powerful.

A 3090 seems like overkill to me right now unless you're building from scratch. I'd just wait.

My GPU is two generations behind, doesn't support ray tracing via dedicated hardware, and doesn't support DLSS, and you're telling me to wait. No. I can always upgrade everything else afterwards, but I want the 3090 now because it's future-proof and will enable me to experience ray tracing and DLSS in upcoming games, such as Cyberpunk 2077.
 

Vaelka

Member
My GPU is two generations behind, doesn't support ray tracing via dedicated hardware, and doesn't support DLSS, and you're telling me to wait. No. I can always upgrade everything else afterwards, but I want the 3090 now because it's future-proof and will enable me to experience ray tracing and DLSS in upcoming games, such as Cyberpunk 2077.

There is no such thing as "future proof" in PC hardware.
People have said the same about 16GB of RAM for over a decade now, and yet games have barely been touching 8GB, if even that; people who bought 16GB to be "future proof" have had to replace it anyway, because new motherboards stopped supporting it or faster RAM was released.
If you want to spend that money for ray tracing in a handful of games, if even that, then sure, I guess. All I am saying is that a 1080 Ti is still very powerful, capable of handling all modern games and games going forward for quite some time, and your GPU is not the weak spot in your system.
That's not to say that your system is weak, because it isn't, but if anything I think that your GPU is more "future proof" than your CPU.

I just personally think that it's a bit silly to rush like this, especially when it comes to newer technology that isn't even fully implemented or optimized across the industry yet.
And saying that your GPU is "two generations behind" means nothing, especially not with how short generations are now.
I mean, the Xbox Series X GPU is basically the equivalent of a 2080, which is fairly close to a 1080 Ti. And that's going to last for a whole console generation...

If anything, you could just wait and upgrade your CPU instead, which generally lasts longer, and by then a 3090 will probably be cheaper anyway.
But you can seriously drop the notion of "future proof", because unless you're a seer you can't know that.
 

xPikYx

Member
Since DDR5 will be released next year, I intend to build a new computer then, in order to include the new memory standard and a compatible CPU and motherboard. So even if my current build would bottleneck the RTX 3090, it wouldn't be an issue indefinitely; I'd just swap the RTX 3090 into that new build.

Despite this, I'm wondering whether or not my current PC would bottleneck the card, because I intend to buy Cyberpunk 2077 for PC this year and want to play it at 4K with ray tracing and all other settings maxed out without dropping below 60 frames per second.

My computer's specifications are as follows:

1. i7-5820K (boosts to 4GHz by default)
2. Asus X99 Deluxe
3. 8 x 4GB Crucial 2400MHz DDR4 RAM
4. EVGA GTX 1080 Ti
5. Samsung EVO 500GB SATA SSD
6. Western Digital Caviar Black 1TB HDD
7. Western Digital Blue 6TB and 4TB HDDs
8. Corsair 850M 850-watt modular PSU
9. NZXT H440 mid-tower (the hard-drive trays can be removed to accommodate cards that are up to 428mm long)

Picture:

GNOnN0S.png

EDIT:

Here's a more recent picture, which includes the 1080Ti.

XGcVTP3.jpg

Yes, I know that the inside of my case is dusty; I'll get a compressed-air blower to clean it out soon (Link).
If you play at 4K/60fps it is unlikely, but remember: the more frames you want, the faster the CPU must be. If the GPU is working at its limit, the CPU doesn't come into play much and you are fine, but I would recommend a little upgrade to go with that GPU.
 

BluRayHiDef

Banned
There is no such thing as "future proof" in PC hardware.
People have said the same about 16GB of RAM for over a decade now, and yet games have barely been touching 8GB, if even that; people who bought 16GB to be "future proof" have had to replace it anyway, because new motherboards stopped supporting it or faster RAM was released.
If you want to spend that money for ray tracing in a handful of games, if even that, then sure, I guess. All I am saying is that a 1080 Ti is still very powerful, capable of handling all modern games and games going forward for quite some time, and your GPU is not the weak spot in your system.
That's not to say that your system is weak, because it isn't, but if anything I think that your GPU is more "future proof" than your CPU.

I just personally think that it's a bit silly to rush like this, especially when it comes to newer technology that isn't even fully implemented or optimized across the industry yet.
And saying that your GPU is "two generations behind" means nothing, especially not with how short generations are now.
I mean, the Xbox Series X GPU is basically the equivalent of a 2080, which is fairly close to a 1080 Ti. And that's going to last for a whole console generation...

If anything, you could just wait and upgrade your CPU instead, which generally lasts longer, and by then a 3090 will probably be cheaper anyway.
But you can seriously drop the notion of "future proof", because unless you're a seer you can't know that.

Ray tracing will be in Cyberpunk 2077, which will be released this November; that alone will make the upgrade worthwhile for me. However, there are other games, such as Control (which I bought for PC yesterday), that support ray tracing.

Also, my 1080 Ti can't even hit 60 frames per second when running Control at default settings in 4K (with ray tracing deactivated, of course); it hovers in the 40s. And the CPU isn't the bottleneck, because even the system in the video below can't hit 60 frames per second.

 
PC?

Given all the raving by fanboys here, I was beginning to believe that all I needed for next gen would be to plug my new 3090 into my TV.

It really demands an expensive CPU, memory, and a tower the size of a mini fridge.
 

Armorian

Banned
PC?

Given all the raving by fanboys here, I was beginning to believe that all I needed for next gen would be to plug my new 3090 into my TV.

It really demands an expensive CPU, memory, and a tower the size of a mini fridge.

It doesn't. A Ryzen 3600 will do just fine, with ~3200MHz RAM.
 

Armorian

Banned
He has 2 HDDs in his rig; I think 500GB is a little too small for him. He'd do better to get the 1TB, which is more future-proof for him 😉

I don't think he should invest in a SATA SSD if he already has one; for the system and a few crucial games it's enough. When he gets a PCIe 4/5 motherboard, then it makes sense to get an NVMe drive with better specs than what the PS5 will have; the DX12 storage solution will also be ready by then.
 
He has no SSD, only 2 HDDs in his rig. He can buy the 1TB Samsung SSD for €138 and then he has a good drive. Your suggestion will cost €285 in Europe.
 

Armorian

Banned
He has no SSD, only 2 HDDs in his rig. He can buy the 1TB Samsung SSD for €138 and then he has a good drive. Your suggestion will cost €285 in Europe.



1. i7-5820K (boosts to 4GHz by default)
2. Asus X99 Deluxe
3. 8 x 4GB Crucial 2400MHz DDR4 RAM
4. EVGA GTX 1080 Ti
5. Samsung EVO 500GB SATA SSD
6. Western Digital Caviar Black 1TB HDD
7. Western Digital Blue 6TB and 4TB HDDs
8. Corsair 850M 850-watt modular PSU
9. NZXT H440 mid-tower (the hard-drive trays can be removed to accommodate cards that are up to 428mm long)
 

BluRayHiDef

Banned
I bought a new case: the Lian Li O11 Dynamic XL ROG, and I've already transferred my system into it.

My other case, the NZXT H440, can accommodate the RTX 3090 via its removable hard-drive trays, two of which I subsequently removed in preparation for the card; however, it was still cluttered due to the remaining three hard-drive trays being occupied and being located at the front of the case. Also, it's old and worn; it's got a few scratches and areas of chipped paint, and it's quite dusty.

Anyhow, the Lian Li O11 Dynamic XL has a novel design that places the hard-drive trays (and the PSU) behind the rear panel, where they cannot be seen. This maximizes space in the central compartment of the case, allowing more room for the motherboard, CPU, RAM, and graphics card to breathe.

I've only got one fan in the new case at the moment, which is the CPU fan attached to the CPU's heatsink. However, I'm expecting six more to be delivered tomorrow (Cooler Master SickleFlow 120 V2 Blue LED 120mm Square Frame Fans).

Picture:

w1A0Usq.jpg
 