
"I Need a New PC!" 2023. 6-24 Cores, Frame Generation, Enhanced Ray Tracing & Direct Storage.


OverHeat

« generous god »
[image]
 

Sleepwalker

Gold Member
So I'm looking to upgrade my 3600X. I was ready to press the buy button on a 5800X3D, and then someone sent me a bundle at a store here with a 5900X + 64GB of 3200MHz DDR4 RAM (32GB dual stick) + Company of Heroes 3 (I'd sell this, honestly) for only $10 more than the 5800X3D alone.

What to do?

I have 32GB of 3200MHz RAM currently, but more RAM is more RAM, innit?

For gaming I'll play at 4K most of the time, but I use my PC for productivity a hell of a lot more than gaming.
 

GreatnessRD

Member
So I'm looking to upgrade my 3600X. I was ready to press the buy button on a 5800X3D, and then someone sent me a bundle at a store here with a 5900X + 64GB of 3200MHz DDR4 RAM (32GB dual stick) + Company of Heroes 3 (I'd sell this, honestly) for only $10 more than the 5800X3D alone.

What to do?

I have 32GB of 3200MHz RAM currently, but more RAM is more RAM, innit?

For gaming I'll play at 4K most of the time, but I use my PC for productivity a hell of a lot more than gaming.
If you use your PC more for productivity, then that 5900X deal is a no-brainer. Hell, it appears to be a better deal than the 5800X3D anyway with the RAM and a free game.
 

OverHeat

« generous god »
The jump from my 7950X to the 7950X3D is a little disappointing. Good thing I didn't sell my regular X at a loss.
 

dave_d

Member
If you use your PC more for productivity, then that 5900X deal is a no-brainer. Hell, it appears to be a better deal than the 5800X3D anyway with the RAM and a free game.
Wait a sec, I got the free game when I bought the 5800X3D a few weeks ago from Amazon. Isn't it a deal where you get it with either of those CPUs?
 

GreatnessRD

Member
Wait a sec, I got the free game when I bought the 5800X3D a few weeks ago from Amazon. Isn't it a deal where you get it with either of those CPUs?
Not that I'm aware of. AMD changes promotions all the time. Even so, based on the quoted person's use case, the 5800X3D still wouldn't make sense, since they don't game that much and mostly do productivity.
 

dave_d

Member
Not that I'm aware of. AMD changes promotions all the time. Even so, based on the quoted person's use case, the 5800X3D still wouldn't make sense, since they don't game that much and mostly do productivity.
OK, found the link to the promotion here. Apparently you get the game with the following:

AMD Ryzen™ 9 5950X
AMD Ryzen™ 9 5900X
AMD Ryzen™ 7 5800X3D
AMD Ryzen™ 7 5800X
AMD Ryzen™ 7 5700X
AMD Ryzen™ 5 5600X
AMD Ryzen™ 5 5600
AMD Ryzen™ 5 5500

So, like you said: if you're doing productivity, get the 5900X; if you're doing gaming, the 5800X3D. But you get the game in either case.
 

hinch7

Member
Interesting discussion on Anand

https://forums.anandtech.com/threads/the-8gb-not-enough-thread.2595331/page-15?view=date


And I think they're right. Nvidia is skimping on VRAM and it will obsolete the fuck out of a lot of gaming PCs over the next few years.
They've been doing this for several generations, offering the bare minimum on the lower-end SKUs. Reminds me of Apple and their paltry RAM offerings and upgrades on their iPhones.

Granted, 16GB isn't going to be a limiting factor any time soon. 12GB, on the other hand, is going to be problematic at 1440p+ when more demanding games release, without the use of DLSS or other TAAU upscaling.
 
Last edited:
They've been doing this for several generations, offering the bare minimum on the lower-end SKUs. Reminds me of Apple and their paltry RAM offerings and upgrades on their iPhones.

Granted, 16GB isn't going to be a limiting factor any time soon. 12GB, on the other hand, is going to be problematic at 1440p+ when more demanding games release, without the use of DLSS or other TAAU upscaling.
But it's not just the 4060. Their higher-end cards are VRAM starved as well. I just don't get it; AMD doesn't have this issue.
 

LiquidMetal14

hide your water-based mammals
For those of you with an Asus ROG X670E motherboard, or in general any of those motherboards that support Zen 4 CPUs, there is a new BIOS update that just came out that apparently improves performance. I also know the new Nvidia graphics driver definitely fixes the high GPU/CPU usage issue being reported, because Modern Warfare 2 ran in the mid-to-high 80s and then wouldn't come down from that temperature or voltage until you restarted.

Now, with that fix coupled with this new BIOS, I've never had the hardware run so fast with overclocks while running so cool. The graphics card has always been solid (Gaming OC 4090), but now my temps don't even go above 60 anymore in Modern Warfare, which is semi-taxing. I do disable core isolation and certain other fixes in Windows 11, and this build is the most performant one I've ever had, given I'm pushing a PBO OC and tweaks in DIGI+ VRM with a good GPU OC on top of that. On the heaviest loads I get 71C at the most after an hour of gaming, and the CPU is as solid as I've ever had, and that's saying a lot.

Helps that I'm running an Arctic Liquid Freezer II (push/pull) in a Corsair 7000D.
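
If anyone wants to keep an eye on temps the same way during a session, here's a minimal logging sketch; it assumes an Nvidia card with nvidia-smi on the PATH and a single GPU, and just polls the standard query fields once a second:

```python
import subprocess
import time

# Poll GPU temperature, load, and power draw once per second via nvidia-smi.
# Assumes an Nvidia GPU, nvidia-smi on the PATH, and a single card (first line only).
QUERY = [
    "nvidia-smi",
    "--query-gpu=temperature.gpu,utilization.gpu,power.draw",
    "--format=csv,noheader,nounits",
]

try:
    while True:
        line = subprocess.check_output(QUERY, text=True).splitlines()[0]
        temp_c, util_pct, power_w = (v.strip() for v in line.split(","))
        print(f"GPU: {temp_c} C | {util_pct}% load | {power_w} W")
        time.sleep(1)
except KeyboardInterrupt:
    pass  # Ctrl+C stops the log
```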
 

DanEON

Member
For those of you with an Asus ROG X670E motherboard, or in general any of those motherboards that support Zen 4 CPUs, there is a new BIOS update that just came out that apparently improves performance. I also know the new Nvidia graphics driver definitely fixes the high GPU/CPU usage issue being reported, because Modern Warfare 2 ran in the mid-to-high 80s and then wouldn't come down from that temperature or voltage until you restarted.

Now, with that fix coupled with this new BIOS, I've never had the hardware run so fast with overclocks while running so cool. The graphics card has always been solid (Gaming OC 4090), but now my temps don't even go above 60 anymore in Modern Warfare, which is semi-taxing. I do disable core isolation and certain other fixes in Windows 11, and this build is the most performant one I've ever had, given I'm pushing a PBO OC and tweaks in DIGI+ VRM with a good GPU OC on top of that. On the heaviest loads I get 71C at the most after an hour of gaming, and the CPU is as solid as I've ever had, and that's saying a lot.

Helps that I'm running an Arctic Liquid Freezer II (push/pull) in a Corsair 7000D.
Yep, I saw it. But it's still a beta version for my mobo (X670E TUF Gaming). I will wait for the final version.
 

SmokedMeat

Gamer™
Looking to upgrade my GPU.

Which do you feel is more important to newer games?

Higher VRAM or DLSS 3?

I'm deciding between a 4070 Ti and an AMD 7900 XT. I'm not sure I really want a 12GB card in 2023, but frame generation seems to be decent, and Nvidia outperforms in ray tracing.
That 20GB of VRAM sure sounds nice though. Especially with Resident Evil 4, where I had to turn down settings because my card's only 8GB.
 
Last edited:

winjer

Gold Member
Looking to upgrade my GPU.

Which do you feel is more important to newer games?

Higher VRAM or DLSS 3?

I'm deciding between a 4070 Ti and an AMD 7900 XT. I'm not sure I really want a 12GB card in 2023, but frame generation seems to be decent, and Nvidia outperforms in ray tracing.
That 20GB of VRAM sure sounds nice though. Especially with Resident Evil 4, where I had to turn down settings because my card's only 8GB.

Depends on how much you value those things.

DLSS 3 is only good for single-player games, as it adds latency. For competitive games it's something to avoid.
But it can look very smooth. It also reduces CPU load and is less prone to being bottlenecked.
AMD says they are also developing FSR 3 with frame generation. But when it will be released, and whether it's any good, is anyone's guess at this point.

Do you value RT? If so, then the 4070 Ti is the best choice.
But RT also increases VRAM usage.

VRAM usage in games is constantly increasing. But the point at which a certain amount becomes an issue is difficult to predict.
12GB seems OK for most games today, but it's hard to tell for how long. 20GB is sure to last much longer.

The 4070 Ti is also more power efficient. And at a time when energy costs are high, that's something to take into account.

Another thing to consider is that AMD has lower driver overhead, which matters more if you're pairing it with a less powerful CPU.
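
To put a rough number on the efficiency point, here's a back-of-the-envelope sketch; the wattages, hours, and electricity price below are illustrative assumptions (not measured figures for either card), so swap in your own numbers:

```python
# Rough annual running-cost difference between two GPUs.
# All inputs are assumptions for illustration -- plug in your measured
# power draw while gaming, your hours per day, and your local tariff.
def annual_cost(avg_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    kwh_per_year = avg_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

PRICE_PER_KWH = 0.40   # assumed tariff in a high-cost region
HOURS_PER_DAY = 3.0    # assumed gaming time

card_a = annual_cost(280, HOURS_PER_DAY, PRICE_PER_KWH)   # assumed average draw, more efficient card
card_b = annual_cost(330, HOURS_PER_DAY, PRICE_PER_KWH)   # assumed average draw, hungrier card

print(f"Card A: ~{card_a:.0f}/yr, Card B: ~{card_b:.0f}/yr, difference: ~{card_b - card_a:.0f}/yr")
```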
 

SmokedMeat

Gamer™
Depends on how much you value those things.

DLSS 3 is only good for single-player games, as it adds latency. For competitive games it's something to avoid.
But it can look very smooth. It also reduces CPU load and is less prone to being bottlenecked.
AMD says they are also developing FSR 3 with frame generation. But when it will be released, and whether it's any good, is anyone's guess at this point.

Do you value RT? If so, then the 4070 Ti is the best choice.
But RT also increases VRAM usage.

VRAM usage in games is constantly increasing. But the point at which a certain amount becomes an issue is difficult to predict.
12GB seems OK for most games today, but it's hard to tell for how long. 20GB is sure to last much longer.

The 4070 Ti is also more power efficient. And at a time when energy costs are high, that's something to take into account.

Another thing to consider is that AMD has lower driver overhead, which matters more if you're pairing it with a less powerful CPU.

I hadn't looked at the difference in efficiency, but yeah, the 4070 Ti wins that one. It even uses less power than my 3070 Ti, so that's nice.

I'm looking for an all-around card. If a game makes use of ray tracing, then I want to make use of it.

I’m going to keep reading up on the two, but I think they really are pretty evenly matched in benchmarks.
 
Last edited:
Looking to upgrade my GPU.

Which do you feel is more important to newer games?

Higher VRAM or DLSS 3?

I'm deciding between a 4070 Ti and an AMD 7900 XT. I'm not sure I really want a 12GB card in 2023, but frame generation seems to be decent, and Nvidia outperforms in ray tracing.
That 20GB of VRAM sure sounds nice though. Especially with Resident Evil 4, where I had to turn down settings because my card's only 8GB.
DLSS 3.0 is nice to have, but not every game supports it or will. More VRAM can potentially benefit every game.

I wouldn't buy anything with less than 16GB of VRAM. That's why I passed on the 3080 with 10 or 12GB. I got a 4080, which has 16GB, DLSS 3.0, and awesome RTX performance. Now that I've been using the card for a couple of months, I wish I'd gotten the 4090 with 24GB, because I'm seeing games use up to 14GB of VRAM. I don't know if games use more just because it's there, or if they really need it. I'm not even playing at 4K.

If I had to pick between those two cards, I'd probably go for the 7900 XT, as long as you want more VRAM and aren't bothered about ray tracing. I mean, the ray tracing on the 7900 XT ain't bad, but if you're coming from a 3070 Ti it might not be a huge upgrade.

If you want DLSS 3 and the best ray tracing, then get the 4070 Ti. 12GB would probably be fine if you're playing at 1080p-1440p. I made the decision to move up to the 4080 for the improved performance and the extra 4GB of VRAM.
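
On the "does it use it because it's there, or does it really need it" question, one way to at least watch the number live is to poll nvidia-smi while playing; a minimal sketch (Nvidia cards only, nvidia-smi assumed on the PATH), keeping in mind this reports memory allocated, which can be higher than what the game strictly needs:

```python
import subprocess
import time

# Log VRAM usage every 5 seconds while a game runs (first GPU only).
# Note: this is memory *allocated* on the card, which games often pad
# out for caching, so it is an upper bound rather than a hard requirement.
QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

while True:
    line = subprocess.check_output(QUERY, text=True).splitlines()[0]
    used_mb, total_mb = (int(v) for v in line.split(","))
    print(f"VRAM: {used_mb} / {total_mb} MiB ({100 * used_mb / total_mb:.0f}%)")
    time.sleep(5)
```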
 

winjer

Gold Member
I hadn’t looked at the difference in efficiency, but yeah 4070ti wins that one. Even uses less power than my 3070ti, so that’s nice.

I’m looking for an all around card. If a game makes use of Ray tracing then I want to make use of it.

I’m going to keep reading up on the two, but I think they really are pretty evenly matched in benchmarks.

For today, the best choice is the 4070 Ti. It just wins more than it loses.
But don't be surprised if, a couple of years from now, you have to upgrade again just because of the VRAM.
 

hinch7

Member
DLSS 3.0 is nice to have, but not every game supports it or will. More VRAM can potentially benefit every game.

I wouldn't buy anything with less than 16GB of VRAM. That's why I passed on the 3080 with 10 or 12GB. I got a 4080, which has 16GB, DLSS 3.0, and awesome RTX performance. Now that I've been using the card for a couple of months, I wish I'd gotten the 4090 with 24GB, because I'm seeing games use up to 14GB of VRAM. I don't know if games use more just because it's there, or if they really need it. I'm not even playing at 4K.

If I had to pick between those two cards, I'd probably go for the 7900 XT, as long as you want more VRAM and aren't bothered about ray tracing. I mean, the ray tracing on the 7900 XT ain't bad, but if you're coming from a 3070 Ti it might not be a huge upgrade.

If you want DLSS 3 and the best ray tracing, then get the 4070 Ti. 12GB would probably be fine if you're playing at 1080p-1440p. I made the decision to move up to the 4080 for the improved performance and the extra 4GB of VRAM.
There's that, and AMD will be showing off FSR 3 at GDC tomorrow. Probably worth waiting to see what the deal is with that, if you're into fake frames.

Hardware Unboxed did a head-to-head comparison recently with those two GPUs.

From most benchmarks I've seen, if ray tracing is a high priority and you play at 1440p, the 4070 Ti is probably the one you want. It's also quite a bit more efficient a GPU. Want 4K? The 7900 XT and its 20GB will give you much more headroom and perhaps longevity. Nvidia cards do have better resale value, though.
 
Last edited:
There's that, and AMD will be showing off FSR 3 at GDC tomorrow. Probably worth waiting to see what the deal is with that, if you're into fake frames.

Hardware Unboxed did a head-to-head comparison recently with those two GPUs.

From most benchmarks I've seen, if ray tracing is a high priority and you play at 1440p, the 4070 Ti is probably the one you want. It's also quite a bit more efficient a GPU. Want 4K? The 7900 XT and its 20GB will give you much more headroom and perhaps longevity. Nvidia cards do have better resale value, though.

"fake frames"? lol

Between enabling and disabling frame generation in Hogwarts Legacy or Cyberpunk, let me tell you, there is a huge difference between playing at 50fps and 100-140fps (Hogwarts) or 55fps and 105-120fps (Cyberpunk). If they are fake, they are damn convincing. There is basically no difference between real and fake frames. Yeah, yeah, I know the "fake" frames add latency or some shit, but I can't feel any difference. Maybe if it was an esports game or something it'd matter, but every esports game I have can hit 200+ fps anyway.
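
For what it's worth, the smoothness-vs-latency split comes down to which interval actually changes; a simplified sketch of the frame-time math (a toy model that ignores Reflex, render queues, and the interpolator's own cost):

```python
# Toy model: frame generation doubles the *displayed* frame rate, but
# inputs are still sampled at the *rendered* rate, so the interval your
# clicks wait on doesn't shrink the way it does with a native fps boost.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

rendered_fps = 50      # base frame rate the GPU actually renders
native_target = 100    # what a genuine 2x performance uplift would give

print(f"Native {rendered_fps} fps : {frame_time_ms(rendered_fps):.1f} ms between rendered frames")
print(f"Native {native_target} fps: {frame_time_ms(native_target):.1f} ms between rendered frames")
print(f"{rendered_fps} fps + frame gen shown at {2 * rendered_fps} fps: "
      f"{frame_time_ms(2 * rendered_fps):.1f} ms between displayed frames, "
      f"but still {frame_time_ms(rendered_fps):.1f} ms between rendered ones")
```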
 

SmokedMeat

Gamer™
For today, the best choice is the 4070 Ti. It just wins more than it loses.
But don't be surprised if, a couple of years from now, you have to upgrade again just because of the VRAM.

This (and the price) is what initially drew me to AMD. For $800+ I'd like more than a couple of years.
 
Damn, I think I'm getting old now...
It's a 6-year-old GPU...
This (and the price) is what initially drew me to AMD. For $800+ I'd like more than a couple of years.
AMD does appear to be better value, but then you have to deal with the quality control of the hardware and the quality of the drivers.

I don't want to sound like an Nvidia fanboy, and as much as I hate to say it, I will always pick Nvidia. I seriously considered an AMD card, but the hardware issues put me off. And the better ray tracing and DLSS 3 convinced me to stay with Nvidia.

I spent £1,200 on my 4080, and people might laugh at me for that, but I'm happy to pay extra. The 4070 Ti is more of a 1080p-1440p card that you'd use for maybe 2 years at 1440p or 3-4 at 1080p. I want my card to last 4+ years, so I went with the 4080.
 
Damn, I think I'm getting old now...
I just don't like paying so much for a GPU. When I built my 2012-ish PC (whenever Diablo III came out), I got a used 560 Ti for maybe $150 or something, and it served my purposes well then. Then for the recent PC built in 2022 I coincidentally got a 1080 Ti from the same guy, used, for about $150 as well, give or take. It's doing great for me so far, but I don't play many new games. Those I've tried have run really well, or at least tolerably enough for my standards (Elden Ring, RE4 Remake, etc.). Mainly I just wanted a decent CPU and a decent amount of RAM.

There's just no way I could stomach paying as much for some top-end GPU as I spent on the entire machine! :messenger_loudly_crying: So pretty much I'm in a cycle where, whenever I build a new PC, it just so happens my pal is upgrading his card (he likes to stay pretty up to date) and I'll take the old one! :messenger_grinning_sweat: Then again, I'm the one guy still running 720p in 2023. I swear I'm buying a new monitor soon, lol.
 

Sleepwalker

Gold Member
If you use your PC more for productivity, then that 5900X deal is a no-brainer. Hell, it appears to be a better deal than the 5800X3D anyway with the RAM and a free game.
I ended up getting the 5900X. Lol, productivity performance went up A LOT with this, combined with the RAM upgrade.

Next up a 4090 somewhere in the near future.
 

Sleepwalker

Gold Member
And my husky just knocked my PC over; glass shattered all over the place. So much for the upgrade project going smoothly :messenger_tears_of_joy:
 

hinch7

Member
"fake frames"? lol

Between enabling and disabling frame generation in Hogwarts Legacy or Cyberpunk, let me tell you, there is a huge difference between playing at 50fps and 100-140fps (Hogwarts) or 55fps and 105-120fps (Cyberpunk). If they are fake, they are damn convincing. There is basically no difference between real and fake frames. Yeah, yeah, I know the "fake" frames add latency or some shit, but I can't feel any difference. Maybe if it was an esports game or something it'd matter, but every esports game I have can hit 200+ fps anyway.
That's what they are, though: interpolated frames. I didn't say there's anything inherently wrong with the technology. But it's not exactly new either. Now it's just used on GPUs, or going to be integrated into the driver/software stack.
 
Last edited:
That's what they are, though: interpolated frames. I didn't say there's anything inherently wrong with the technology. But it's not exactly new either. Now it's just used on GPUs, or going to be integrated into the driver/software stack.
I know what it is, but I've just seen a lot of people say it's fake, as if you shouldn't use it or see it as a selling point. Yeah, it's not being rendered by the GPU like the "real" frames, but the GPU is still generating them through AI. They may be "fake", but they give me better performance, so they're basically real frames to me.

Now, interpolated frames on a TV I wouldn't use, because they cause issues like artifacting. I know in games it adds latency, but it doesn't feel any different to me, so it's a win-win, and it's better implemented than on TVs.
 

How am I doing? How is the mobo? That's where my lack of knowledge is. I have a 3080, but will be getting a 4090 most likely.

Already have a case, an 800W PSU, fans, and whatnot.
Looks fine, but a Z690 board is for 12th gen CPUs. A 13th gen will work, but the BIOS may need to be updated to support it. I think Asus boards should let you update the BIOS without the CPU installed. If you want to avoid this, then for a 13th gen CPU you would get a Z790 board, as you shouldn't need to worry about the BIOS. Of course, if the Z690 is cheaper, then go with that. I'd recommend looking at the support site for that Z690 board to see if it supports BIOS updates without a CPU.

Why are you buying 1x 2TB SSD and 2x 1TB SSDs?

I'd recommend going with 32GB of RAM instead of 16GB, unless you plan on adding more later.

Edit: seems you can update the BIOS on the Z690 even with a 13th gen CPU, so you should be good :)
 
Last edited:

dorkimoe

Member
Looks fine, but a Z690 board is for 12th gen CPUs. A 13th gen will work, but the BIOS may need to be updated to support it. I think Asus boards should let you update the BIOS without the CPU installed. If you want to avoid this, then for a 13th gen CPU you would get a Z790 board, as you shouldn't need to worry about the BIOS. Of course, if the Z690 is cheaper, then go with that. I'd recommend looking at the support site for that Z690 board to see if it supports BIOS updates without a CPU.

Why are you buying 1x 2TB SSD and 2x 1TB SSDs?

I'd recommend going with 32GB of RAM instead of 16GB, unless you plan on adding more later.

Edit: seems you can update the BIOS on the Z690 even with a 13th gen CPU, so you should be good :)
The RAM is 2x 16GB. I have a lot of games installed, so I'm going to use a 1TB for Windows, the 2TB for Steam, and the other 1TB for Xbox games. I'll get the Z790 then. I hate fucking with the BIOS; it always bricks on me. Thanks!

Are those SSDs still good for gaming?
 
Last edited:
The RAM is 2x 16GB. I have a lot of games installed, so I'm going to use a 1TB for Windows, the 2TB for Steam, and the other 1TB for Xbox games. I'll get the Z790 then. I hate fucking with the BIOS; it always bricks on me. Thanks!
My bad! I should learn to read better :messenger_grinning_sweat:

Fair enough, I'd probably do the same. I have a boot drive for Windows and a drive for games. If I was fucking about with installing games from Microsoft's store, I'd want to put them on their own drive. I've had too many issues installing games from the MS Store, and it's resulted in me having to wipe the SSD.

The Z690 should be fine, but yeah, if you don't want to mess about with it then go for the Z790. I was looking at that exact same board, but the Z790 one is expensive here. Hopefully you find a board that isn't much more expensive than the Z690.
 

dorkimoe

Member
My bad! I should learn to read better :messenger_grinning_sweat:

Fair enough, I'd probably do the same. I have a boot drive for Windows and a drive for games. If I was fucking about with installing games from Microsoft's store, I'd want to put them on their own drive. I've had too many issues installing games from the MS Store, and it's resulted in me having to wipe the SSD.

The Z690 should be fine, but yeah, if you don't want to mess about with it then go for the Z790. I was looking at that exact same board, but the Z790 one is expensive here. Hopefully you find a board that isn't much more expensive than the Z690.
Might go with this https://pcpartpicker.com/product/T2...x-atx-lga1700-motherboard-z790-aorus-elite-ax

Or did I read that people don't like Gigabyte anymore?
 

dorkimoe

Member
Can anyone explain like I'm 5 what these lanes mean? I want to use 3x M.2 drives, but I see people talking about lanes and taking up processor lanes.
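
Roughly: "lanes" are the point-to-point PCIe links the CPU and chipset can hand out, and each slot or M.2 socket gets a fixed share of them. Below is a toy tally using assumed lane counts for a typical current consumer platform (an x16 GPU slot plus one CPU-attached M.2, with the remaining M.2 slots hanging off the chipset); the real routing for any given board is in its manual:

```python
# Toy PCIe lane budget. The counts here are assumptions for a typical
# consumer board (e.g. Z690/Z790-class): 16 CPU lanes for the GPU slot,
# 4 CPU lanes for one M.2 socket, extra M.2 sockets routed via the chipset.
CPU_LANES = 20

cpu_attached = {
    "GPU in the x16 slot": 16,
    "M.2 drive #1 (CPU socket)": 4,
}

chipset_attached = ["M.2 drive #2", "M.2 drive #3"]

used = sum(cpu_attached.values())
print(f"CPU lanes used: {used} / {CPU_LANES}")
for name in chipset_attached:
    print(f"{name}: x4 from the chipset (shares the chipset's uplink to the CPU, "
          f"doesn't take lanes away from the GPU)")
```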
 

.hacked

Member
7800 XT and a 5800X3D showing up tomorrow. Time to resurrect my gaming PC!

Small step down from the 4080 I had before, but I'm now using a 1440p monitor, so it should be more than enough to get the job done.
 

HeisenbergFX4

Gold Member
Snagged an open-box buy from Best Buy while doing a TV spot and have been playing the Diablo IV beta. Honestly, I'm super impressed with the 7700X + 4080 combo, and since I'm playing on this 1440p ultrawide, it's making me rethink whether I really need a 4090 build that's going to be like $1,500 more than this thing.

It runs a little loud when pushed, but I'm thinking a larger rad would fix that.

https://www.bestbuy.com/site/ibuypo...dd-1tb-nvme-ssd-black/6527262.p?skuId=6527262
 

Puscifer

Gold Member
Fellas, would a 30" desk be OK for a 48" OLED strictly for gaming? Reason being, lol, Best Buy has the 48" for the same price as a 42" if you're a Totaltech member (such a based membership), and it's just so damn tempting!
 

LiquidMetal14

hide your water-based mammals
Added a third front 140mm Corsair fan to the 7000D. The GPU resides in that space, and the extra air brings it down significantly at idle and, on average, around 4-7C during heavy gameplay.
 

Verchod

Member
I have a RAM-related query.
I currently have only 16GB in 2x 8GB sticks. I was going to add 2 more 8GB sticks, but found I could get 2x 16GB for only a little more. Obviously I'd get exactly the same speed and latency, but are there any weird performance effects from combining different capacity modules?
 

Sakura

Member
I have a RAM-related query.
I currently have only 16GB in 2x 8GB sticks. I was going to add 2 more 8GB sticks, but found I could get 2x 16GB for only a little more. Obviously I'd get exactly the same speed and latency, but are there any weird performance effects from combining different capacity modules?
So you'd have the two 8GB sticks paired together and the two 16GB sticks paired together, right? It should work fine, as long as you aren't trying to pair a 16GB stick with an 8GB one. I'm not an expert though.
 
I have a RAM-related query.
I currently have only 16GB in 2x 8GB sticks. I was going to add 2 more 8GB sticks, but found I could get 2x 16GB for only a little more. Obviously I'd get exactly the same speed and latency, but are there any weird performance effects from combining different capacity modules?

I would get the 2x 16GB. At least on AMD; not sure about Intel.
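
Once the new sticks are in, it's worth confirming what speed the mixed kit actually trained at. A quick Windows-only sketch that shells out to the built-in WMI memory class (wmic is deprecated on the newest Windows 11 builds, in which case `Get-CimInstance Win32_PhysicalMemory` in PowerShell gives the same data):

```python
import subprocess

# Dump per-DIMM slot, capacity, rated speed, and the speed the modules are
# actually configured to run at (Windows-only; uses the built-in wmic tool,
# which maps to the Win32_PhysicalMemory WMI class).
cmd = [
    "wmic", "memorychip", "get",
    "DeviceLocator,Capacity,Speed,ConfiguredClockSpeed",
]
print(subprocess.check_output(cmd, text=True))
```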
 

HeisenbergFX4

Gold Member
Fellas, would a 30" desk be OK for a 48" OLED strictly for gaming? Reason being, lol, Best Buy has the 48" for the same price as a 42" if you're a Totaltech member (such a based membership), and it's just so damn tempting!
Big enough width-wise? The 48" has a center stand, so you're fine there.

Depth-wise? I would just back up a little.
 

rofif

Can’t Git Gud
OK, it's time to consider upgrading the tired, thick IKEA Malm desk.
Since I am upgrading, why not get one of these fancy motorized desks? I am a tall 1.88m guy. The Secretlab chair is a tall guy. The Malm desk is a small guy; it's 72cm.
It is fine for the most part, but because the countertop is so thick (5cm), it is annoying to slide a chair under there or game with a controller sometimes.
Also, and this is more important: the IKEA Malm is 140cm x 65cm, so a bit more depth could be useful. Not sure if I need more width, since as you can see in the picture, it is perfect. I could go for 180cm width, but then I'd need to remove one shelving unit on the left and put the PS5 on the desk. Not sure if that's a good idea.

I am not working from home and I do not plan to use my PC standing, but I would like the ability to have different heights for console gaming, PC gaming, or movies... or just to be able to tune it without reassembling the whole desk.
Anyone used one of these? I like the sloped edges, and reviews say it's sturdy and doesn't wobble too much. I know it's overkill, but who cares. All I do is sit in front of the PC all the time. Might as well do it correctly.

The top line is my eye level when I sit correctly; the lower line is when I slouch a bit more for console gaming.
[image]
 
Last edited:

Yerd

Member
I have an old desk from IKEA that I would hate to give up. The front facade is falling off, but the rest of it is fine. I need to find double-sided glue strips to reattach it. They don't make it anymore, and I've never seen a desk like it anywhere else. It's nice and deep and has a curved front for a fat belly.

The problem I see with most desks is that they are not very sturdy. Mine is a metal frame with the desktop being one big piece of MDF. Full-MDF desks end up being flimsy piles of crap.
 