
"I Need a New PC!" 2022. The GPU drought continues...


RoboFu

One of the green rats
Okay, so I want to get a new GPU now that I can for under a thousand dollars. What should I get: a 6900 XT or a 3080 12GB?
 

RoboFu

One of the green rats
Do you care about raytracing?
Well, isn't that the question? lol. I've been looking at recent comparisons, and the 6900 beats the 3080 in a lot of non-RT games by a pretty good margin, but then there's ray tracing. The thing I see is that the 6900 is always about 20 fps behind in RT games, but the games that tank it below 60 usually run badly on the 3080 as well. There are only a few cases where the 3080 is right at 60 with the 6900 at 40. 🤷‍♂️
 

draliko

Member
Well, isn't that the question? lol. I've been looking at recent comparisons, and the 6900 beats the 3080 in a lot of non-RT games by a pretty good margin, but then there's ray tracing. The thing I see is that the 6900 is always about 20 fps behind in RT games, but the games that tank it below 60 usually run badly on the 3080 as well. There are only a few cases where the 3080 is right at 60 with the 6900 at 40. 🤷‍♂️
I don't think the current implementation of ray tracing is that meaningful, so I'd go with the 6900 (especially for the VRAM). I'm pretty sure that by the time ray tracing is a real game changer, our GPUs will be useless :) For now it's really just fancy reflections and nothing more, worth it in maybe 1 or 2 games... naturally IMHO. Other than that, do you need NVENC?
 

SlimySnake

Flashless at the Golden Globes
Okay, so I want to get a new GPU now that I can for under a thousand dollars. What should I get: a 6900 XT or a 3080 12GB?
3080 12 GB. The 6900 XT is a great fucking card, but ray tracing is going to be pretty much standard in next-gen games when they do arrive. The UE5 Matrix demo with ray tracing seems to be doing fine on AMD cards, but we don't know if Avatar and Star Wars Fallen Order 2 will behave the same way.

Then there's DLSS, which, let's face it, is something you will be using in every single game, and benchmarks without DLSS don't really show that.
 

manfestival

Member
Okay, so I want to get a new GPU now that I can for under a thousand dollars. What should I get: a 6900 XT or a 3080 12GB?
That is a tough question these days. I convinced a friend of mine to buy the 3080 12GB (which he did) because it was on sale for $770. At that price, it is hard to justify a 6900 XT, even if the 6900 XT does beat it in rasterization by quite a bit. Price per frame, I think the 3080 12GB is hands down the best value aside from maybe a 6600 XT at the moment. That is, of course, if you are dead set on buying a card right now. If this was a few months ago? 6900 XT, since it was the first card to hit MSRP and even go under it for the most part. Rasterization is still king, and the 6900 XT > 3080 12GB in that regard.
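(To make the price-per-frame point concrete, here's a rough sketch; the prices and FPS numbers below are made-up placeholders, not benchmark results, so plug in whatever current figures you trust.)

```python
# Rough price-per-frame comparison sketch.
# Prices and FPS figures are hypothetical placeholders, not benchmarks.
cards = {
    "RTX 3080 12GB": {"price_usd": 770, "avg_fps": 100},
    "RX 6900 XT":    {"price_usd": 950, "avg_fps": 110},
}

for name, c in cards.items():
    print(f"{name}: ${c['price_usd'] / c['avg_fps']:.2f} per average frame")
```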

This is even when considering resolution, etc. DLSS and FSR are both meh to me. I still prefer native over these temporal solutions, but I guess they are nice to have, especially for 4K gamers, since below 4K these cards should not really need temporal assistance to get high framerates outside of some titles.
Ray tracing? Also still not really at a place where, IMO (important), it matters. It is nice to have the RT cores. The only game that has made me care about ray tracing so far is Cyberpunk. Fortunately, devs are finally starting to care about ray tracing, but it might still be a while before it truly becomes the staple that makes RT cores a more meaningful consideration. However, I guess this is where temporal solutions like DLSS, XeSS, and FSR are going to matter the most and squeeze out that future performance.
 
Now there are rumors saying the 4070 will only have 10GB.

If true, it's out of the question for me. I'm not going from 12GB of VRAM to 10. Nvidia deserves to get rocked by AMD if true.

Not gonna lie, if there is no 12GB card from Nvidia in the 4070/60/60 Ti range, I am probably going to get a Sapphire RDNA3 GPU.
 

GreatnessRD

Member
I'm of the opinion: grab the 6900 XT. It's the same money as (or cheaper than) an RTX 3080 10/12 GB and better. Sure, Nvidia is winning the ray tracing race as of now, but the 6900 XT just brute forces it, and it appears FSR 2.0 can stand up to DLSS, so you'll have gains there (once there's more universal support, of course). You'll be in good hands with either card, but again, why not get the one that's more powerful AND usually cheaper?
 

Irobot82

Member
I'm of the opinion: grab the 6900 XT. It's the same money as (or cheaper than) an RTX 3080 10/12 GB and better. Sure, Nvidia is winning the ray tracing race as of now, but the 6900 XT just brute forces it, and it appears FSR 2.0 can stand up to DLSS, so you'll have gains there (once there's more universal support, of course). You'll be in good hands with either card, but again, why not get the one that's more powerful AND usually cheaper?
I would also imagine more FSR 2.0 implementations with RT in the future, and you'll get roughly the same performance or better.
 
I'm of the opinion: grab the 6900 XT. It's the same money as (or cheaper than) an RTX 3080 10/12 GB and better. Sure, Nvidia is winning the ray tracing race as of now, but the 6900 XT just brute forces it, and it appears FSR 2.0 can stand up to DLSS, so you'll have gains there (once there's more universal support, of course). You'll be in good hands with either card, but again, why not get the one that's more powerful AND usually cheaper?
This is exactly what I did, for all the reasons you mentioned.

Love my 6900 XT.
 

twilo99

Member
OK, since RDNA3 won't be out for a while anyway, I can just wait and see how my system does with the new CoD, and that will probably give an indication of how it would do with Warzone 2 as well... to an extent. If I can't get 200+ FPS @ 1440p on the smaller MW2 maps, there is no way I'm getting 165 FPS in Warzone 2. There is also VRR, which helps a lot.

I'd sit out RDNA3 and the next CPU gen. No point in upgrading every gen.

I agree... but if I want to run a specific game at high settings, I might have to. We will see.

Definitely wait. I have a 12600K, a 165Hz monitor, and a 6800 XT. I can USUALLY get that little bit extra from OCing with a slight undervolt. I haven't been playing Warzone, but I'm definitely gonna be playing MW2 2022. RDNA3 likely won't be out by then anyway. The 5800X3D would still require waiting to see if it is worth it.

Plus, AMD has been killing it with their drivers... albeit the situation is weird, since their best drivers are OPTIONAL and have to be manually downloaded.

They've improved their drivers a lot. I used to get stutters in Warzone, but they fixed it a few months ago. It's true, there is no way RDNA3 comes out before MW2 2022, but it'll probably arrive around the same time Warzone 2 comes out.

Both CPUs can get 180+ FPS in Warzone.
[embedded Warzone benchmark videos]
So I imagine any limitations you would have would be from your GPU and settings. I'd wait for it to come out, run your own benchmarks, and compare them to others.


No, the CoD engine is CPU-limited, for me anyway.

AMD GPUs are not great in Warzone. They can get higher highs and a higher average than an equivalent NVIDIA card, but the 1% lows are much worse and overall the frametimes are not as smooth, so despite a higher average it just doesn't feel as smooth. Just check the RTSS frametime graph in that second 6800 XT video; a lot of spikes. For comparison, with my 12900KS and 3090 the frametime graph is pretty much smooth. I can take a video tonight. Their CPUs also suffer from lower lows compared to their Intel equivalents, but not to the extent of their GPUs.
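(If anyone wants to check the "1% lows" claim on their own captures, here's a minimal sketch of how you could crunch a frametime log; the file name and one-value-per-line format are assumptions, since RTSS/CapFrameX exports differ.)

```python
# Minimal sketch: average FPS and an approximate 1% low from a frametime log.
# Assumes a text file with one frametime in milliseconds per line.
def summarize(frametimes_ms):
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    fps = sorted(1000.0 / ft for ft in frametimes_ms)  # ascending
    one_pct_low = fps[int(len(fps) * 0.01)]            # simple 1st-percentile approximation
    return avg_fps, one_pct_low

with open("frametimes.txt") as f:  # hypothetical log file
    times = [float(line) for line in f if line.strip()]

avg, low = summarize(times)
print(f"Average: {avg:.1f} FPS, 1% low: {low:.1f} FPS")
```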

Oh yeah, I know they are not, but at the time it was between the 6800 XT and a 3070 Ti, and the AMD card was much better, so I got that. Things might change with RDNA3.

VRR helps a lot.

The X3D is great, but since you already have a Zen 3 chip, maybe not. It would be a big jump in many games, but the 5600X is still very good.

If you had an older Ryzen and hadn't already jumped to the 5600X, I would say for sure get the X3D. But just wait for the next Intel socket and for AM5 to mature. I agree with Kenpachii.

I upgraded from a 1600 AF to an X3D, which is huuuuge.

The 5600X is an awesome chip, it's just that CoD doesn't like it all that much.

Stay with the 5600X unless you're certain that you're CPU-bound. As for the GPU, I'd wait for some real benchmarks, maybe preorder on Amazon or somewhere with an easy refund policy. I won't go with DDR5 for at least another 24 months... top DDR4 still beats DDR5, without even considering the price/performance advantage... You have a very high-end build; I'd wait for some real evidence that any upgrade would be useful.

CPU-bound indeed, at least in that one game.

Why would you need anything above a 5600X? Do you use it as a workstation? Because unless you are an absolute pro CS:GO or Valorant player, you don't need anything better than a 5600X.

Because Call of Duty...
 

manfestival

Member
Now there are rumors saying the 4070 will only have 10GB.

If true, it's out of the question for me. I'm not going from 12GB of VRAM to 10. Nvidia deserves to get rocked by AMD if true.

Not gonna lie, if there is no 12GB card from Nvidia in the 4070/60/60 Ti range, I am probably going to get a Sapphire RDNA3 GPU.
Saw that rumor yesterday. I imagine it is just a rumor, despite coming from a trusted leaker, especially since 10GB already isn't enough in a handful of titles, which is one too many for such a premium card. Far Cry 6 is an example of this with the HD texture pack and maxed-out graphics; the regular 3080 chugs along. Agreed, I would rather go with RDNA 3 over Lovelace (I think that is what it is called) if that were to be true.
I'm of the opinion: grab the 6900 XT. It's the same money as (or cheaper than) an RTX 3080 10/12 GB and better. Sure, Nvidia is winning the ray tracing race as of now, but the 6900 XT just brute forces it, and it appears FSR 2.0 can stand up to DLSS, so you'll have gains there (once there's more universal support, of course). You'll be in good hands with either card, but again, why not get the one that's more powerful AND usually cheaper?
If they are the same price as each other and you don't care about ray tracing (most people realistically don't; I am one of them), then the 6900 XT makes more sense than the 3080 12GB.
 

rofif

Can’t Git Gud
I would like to think Nvidia aren't THAT stupid, to make a 10GB 4070, but hey. They did make an 8GB 3070 :/
They might do that to milk people into buying higher-VRAM, higher-end models.
Although it's not a problem yet with the 3080. I had one crash with the Resident Evil 2 remake with RT when changing settings too much.
 
They might do that to milk people into buying higher-VRAM, higher-end models.
Although it's not a problem yet with the 3080. I had one crash with the Resident Evil 2 remake with RT when changing settings too much.
This time I think it's going to bite them. RDNA3 rumors are looking strong.

Crypto is going down again, so they won't be able to sell anything instantly no matter how much of a turd it is.

If it weren't for crypto/shortages, they never would have released the 3000 series with so little VRAM.
 

rofif

Can’t Git Gud
This time I think it's going to bite them. RDNA3 rumors are looking strong.

Crypto is going down again, so they won't be able to sell anything instantly no matter how much of a turd it is.

If it weren't for crypto/shortages, they never would have released the 3000 series with so little VRAM.
RDNA, as powerful and nice as it is, does not have DLSS and a few other Nvidia features.
Sadly, the AMD counterpart looks very bad in motion.
 

64bitmodels

Reverse groomer.
I got a 6650 XT. I would have gotten a regular 6600 XT, but they were at the same price, so I figured why not.
I wonder if 8GB is enough for future games at 1440p, though.
 
RDNA, as powerful and nice as it is, does not have DLSS and a few other Nvidia features.
Sadly, the AMD counterpart looks very bad in motion.
DLSS is overrated in my experience. I do need to fiddle with the different versions between 2.3 and 2.4, though. If I can't get rid of the ghosting, I will start shitting on DLSS all the time here, lol.

Especially if you want the crispest image, you don't want DLSS. Or TAA. For example, native 4K with MSAA/SMAA 1x is what you want for the sharpest image.
 

rofif

Can’t Git Gud
DLSS is overrated in my experience. I do need to fiddle with the different versions between 2.3 and 2.4, though. If I can't get rid of the ghosting, I will start shitting on DLSS all the time here, lol.

Especially if you want the crispest image, you don't want DLSS. Or TAA. For example, native 4K with MSAA/SMAA 1x is what you want for the sharpest image.
I can't do native IQ anymore. I must have something. TAA is usually pretty good, but it can shimmer weirdly.
For example, Death Stranding on PC has this bad foliage TAA shimmer. You rotate the camera and the bushes go crazy... and it does not happen on PS5.
But DLSS is amazing for the most part, especially when you put in the right version.
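(For anyone unsure what "putting in the right version" means in practice: it's usually just dropping a different nvngx_dlss.dll into the game's install folder. A minimal sketch below; the paths are hypothetical examples, and always back up the original first.)

```python
# Minimal sketch of swapping a game's DLSS DLL (nvngx_dlss.dll) for another version.
# Both paths are hypothetical examples; point them at your own folders.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\DeathStranding")               # hypothetical install path
new_dll = Path(r"C:\Downloads\dlss_swap\nvngx_dlss.dll")  # hypothetical replacement DLL

target = game_dir / "nvngx_dlss.dll"
backup = game_dir / "nvngx_dlss.dll.bak"

if not backup.exists():
    shutil.copy2(target, backup)  # keep the shipped DLL as a backup
shutil.copy2(new_dll, target)     # drop in the replacement version
print(f"Replaced {target} (original saved as {backup})")
```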
 
I can't do native IQ anymore. I must have something. TAA is usually pretty good, but it can shimmer weirdly.
For example, Death Stranding on PC has this bad foliage TAA shimmer. You rotate the camera and the bushes go crazy... and it does not happen on PS5.
But DLSS is amazing for the most part, especially when you put in the right version.
We are exact opposites here as well. I always prefer sharpness over a lack of jaggies.

Native 4K at the distance I sit (I know you sit closer based on your setup pics) with MSAA/SMAA 1x has extremely minimal aliasing. I only use TAA if it's forced, i.e. Metro Exodus. Otherwise I inject SMAA or use in-game MSAA if it works.
 

manfestival

Member
I can't do native IQ anymore. I must have something. TAA is usually pretty good, but it can shimmer weirdly.
For example, Death Stranding on PC has this bad foliage TAA shimmer. You rotate the camera and the bushes go crazy... and it does not happen on PS5.
But DLSS is amazing for the most part, especially when you put in the right version.
I have tried DLSS and FSR. I really don't like how either temporal solution looks. There is always something off about it. Native is the only way I can play games. MAYBE this experience changes at 4K, but I doubt it. Either way, powerful GPUs like the 6800 XT/3080 should not really require these solutions. They are better suited to less powerful GPUs, but they can surely be used to push frames if you so wish, of course.
 

64bitmodels

Reverse groomer.
Is it worth it to buy a 1080p 144Hz monitor in 2022, or would 1440p 144Hz be better? For an RTX 3060.
TBH, for a 3060 I think 1080p 144Hz would make more sense.
1440p 144Hz would only really work if you had a stronger card like a 3070 Ti or above.
I still think you should get a 1440p monitor, though, since it looks far better than 1080p.
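(The raw pixel math is the reason; quick sketch:)

```python
# Pixel-count comparison: 1440p vs 1080p.
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_1440p = 2560 * 1440  # 3,686,400

print(f"1440p is {pixels_1440p / pixels_1080p:.2f}x the pixels of 1080p")
# ~1.78x, so roughly 78% more work per frame at the same settings
```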
 

rofif

Can’t Git Gud
I have tried DLSS and FSR. I really don't like how either temporal solution looks. There is always something off about it. Native is the only way I can play games. MAYBE this experience changes at 4K, but I doubt it. Either way, powerful GPUs like the 6800 XT/3080 should not really require these solutions. They are better suited to less powerful GPUs, but they can surely be used to push frames if you so wish, of course.
I've not tried THAT many DLSS games, but Death Stranding is the best-looking I've seen.
4K with Quality DLSS looks better than native, except for the lack of motion vectors on the flying bugs, but that's fixable.
 

manfestival

Member
Is it worth it to buy a 1080p 144Hz monitor in 2022, or would 1440p 144Hz be better? For an RTX 3060.
Think of getting better specs on a monitor as a form of future-proofing yourself. That monitor will likely be fine, but the industry is trying really hard to move towards 4K. It is struggling because the consoles lack the power, which of course means squeezing more life out of 1080p, but you will likely get more value out of a 1440p 144Hz monitor, especially when upgrading down the road. Plus, 1440p looks much better than 1080p, IMO.
 

Rbk_3

Member
I have gotten some bad ghosting with DLSS. Version 2.4.

For me personally, DLSS Quality is amazing at 1440p in Warzone. It looks better than native to me due to the superior AA. The extra 10-15 FPS is a bonus; I don't use it for the FPS gains, I genuinely think it looks better, as I am over 200 most of the time anyway.

I can't stand the jaggies and shimmering with anything less than 2x AA, and 2x and Filmic make things too blurry for me. DLSS Quality is the perfect balance.
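(For context on why DLSS Quality at 1440p can still look clean: it isn't rendering 1440p internally. A quick sketch using the commonly cited per-axis scale factors, which may vary by game and DLSS version.)

```python
# Approximate internal render resolutions for DLSS at a 2560x1440 output,
# using commonly cited per-axis scale factors (may vary by game/DLSS version).
modes = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 2560, 1440
for mode, scale in modes.items():
    print(f"{mode}: {round(out_w * scale)}x{round(out_h * scale)}")
```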
 

MidGenRefresh

*Refreshes biennially
When can we expect some kind of Nvidia presentation of the new cards? I'm hyped for the 4000 series. I think I will look for a new power supply later today just to get my rig ready.

At this rate, the 5000 series will most likely come with its own external power supply and wall plug. :messenger_grinning_sweat:
 

Jayjayhd34

Member
When can we expect some kind of Nvidia presentation of the new cards? I'm hyped for the 4000 series. I think I will look for a new power supply later today just to get my rig ready.

At this rate, the 5000 series will most likely come with its own external power supply and wall plug. :messenger_grinning_sweat:
July/August, with a September release, hopefully.
 

GreatnessRD

Member



Because yes, please! I personally still want the 5800X3D, but man, that would be a nice little addition to the AM4 platform. It would also line up with what AMD was saying about AM4 continuing to live a fruitful life. Love to see it. If true, of course.
 

Jayjayhd34

Member
My RTX 2080 MSI Ventus V2 has just died playing A Plague Tale; I was really enjoying it. Now it's either wait for the RTX 4090 or just bite on an entirely new system with an RTX 3090.
 

GreatnessRD

Member
My RTX 2080 MSI Ventus V2 has just died playing A Plague Tale; I was really enjoying it. Now it's either wait for the RTX 4090 or just bite on an entirely new system with an RTX 3090.
Sorry to hear your 2080 died. Please don't spend over a thousand on a GPU that will be superseded in like 3 months. Wait for that 4090, and if you need a right-now GPU, grab one of those hold-me-overs.
 

Jayjayhd34

Member
Sorry to hear your 2080 died. Please don't spend over a thousand on a GPU that will be superseded in like 3 months. Wait for that 4090, and if you need a right-now GPU, grab one of those hold-me-overs.

Thankfully, I've managed to stop my system from restarting and ramping to full fan speed by updating my drivers before the system restarted again. I've just done a couple of hours on Plague Tale and it's been fine.
 
I intended to replace the paste and pads on my 2080 Ti yesterday, went to open it up, and noticed all the screws are badly stripped. Guess that's what I get for buying it second hand. It runs crazy hot at stock, like 90°C with a 110°C hotspot with the fans at 2500 RPM... I have the power limit at 71% and the voltage capped at around 750 mV to keep the hotspot under 90.

I'll have to figure something out. I want to sell it eventually, but I can't do that if it's going to melt.
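(If you want to keep an eye on it without Afterburner open, here's a minimal read-only sketch using the nvidia-ml-py bindings; it only reads the core temperature, power draw, and enforced power limit, and doesn't touch clocks or voltage.)

```python
# Minimal read-only GPU monitor using nvidia-ml-py (pip install nvidia-ml-py).
# Reads core temperature, power draw, and the enforced power limit; changes nothing.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0          # mW -> W
    limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0  # mW -> W
    print(f"GPU temp: {temp_c} C | power: {power_w:.0f} W / limit {limit_w:.0f} W")
finally:
    pynvml.nvmlShutdown()
```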

My RTX 2080 MSI Ventus V2 has just died playing A Plague Tale; I was really enjoying it. Now it's either wait for the RTX 4090 or just bite on an entirely new system with an RTX 3090.

Condolences, I live in fear of this.
 