
Baldur's Gate 3 runs faster on 5800X3D than 13900K

hinch7

Member
People like me. Hoping Black Friday has some crazy deal on the 5800X3D to replace my 5600. Otherwise, I'll hold out until the 8000 series.
I think they'll drop by then... they were going for under 290 euros or 260 pounds from Amazon Germany a couple of months ago. If that gets closer to the 200 mark, that'll be a no-brainer upgrade. Sell off the 5600 and upgrade for not much more. It may not be the latest and greatest, but for the price and ease of upgrade it still kicks ass for gaming.

And true, that'll last you for some time. Heck, I'd say the 5600 will easily last you the entire console generation if gaming performance is your main priority on PC. I have an AM4 system with a 5800X3D and don't plan on upgrading until AM6 lol.
 
Last edited:

LiquidMetal14

hide your water-based mammals
Was talking about this on Reddit a few days back. It's impressive no matter how you cut it.

It also feels good to be on top with the 7800X3D.
 

Buggy Loop

Member
Wasn't expecting this game to like the 3D V-Cache so much.

I would imagine there are tons of calculations running in the background constantly: all the dice rolls for perception checks, the combat AI finding the best path and skills to attack with, etc. Calculation-heavy games like this, such as Civilization or DCS, purr on 3D V-Cache.
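Rough sketch of the kind of access pattern I mean (a toy pointer-chase in C++, nothing from BG3's actual code): once the working set spills out of L3, every dependent load pays a trip to DRAM, which is exactly where the X3D's big cache buys its wins.

Code:
// Toy pointer-chase: random hops through a working set.
// When the set fits in L3 each hop is roughly a cache hit; when it
// doesn't, every hop goes out to DRAM and gets several times slower.
#include <algorithm>
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

static double ns_per_hop(std::size_t elems, std::size_t hops) {
    // Build one big random cycle so every hop lands somewhere unpredictable.
    std::vector<std::size_t> order(elems), next(elems);
    std::iota(order.begin(), order.end(), std::size_t{0});
    std::shuffle(order.begin(), order.end(), std::mt19937_64{42});
    for (std::size_t k = 0; k + 1 < elems; ++k) next[order[k]] = order[k + 1];
    next[order[elems - 1]] = order[0];

    std::size_t i = order[0];
    auto t0 = std::chrono::steady_clock::now();
    for (std::size_t h = 0; h < hops; ++h) i = next[i];  // dependent loads
    auto t1 = std::chrono::steady_clock::now();
    volatile std::size_t sink = i; (void)sink;            // keep the loop alive
    return std::chrono::duration<double, std::nano>(t1 - t0).count() / hops;
}

int main() {
    // 8 MB working set (cache-friendly) vs 256 MB (DRAM-bound), 8-byte elements.
    std::printf("small set: %.1f ns/hop\n", ns_per_hop(1u << 20, 20'000'000));
    std::printf("large set: %.1f ns/hop\n", ns_per_hop(1u << 25, 20'000'000));
}

Not a real benchmark of the game, obviously, just the shape of the problem: lots of scattered, dependent lookups that either live in cache or don't.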
 

TMONSTER

Member
No doubt 3D V-Cache has its advantages when it comes to certain games, but why are they running that 13900K with budget DDR5-5600 RAM? It should be at least 7200.
 

LiquidMetal14

hide your water-based mammals
No doubt 3D V-Cache has its advantages when it comes to certain games, but why are they running that 13900K with budget DDR5-5600 RAM? It should be at least 7200.
It's even more impressive when you see the 5800X3D running DDR4-3200 vs DDR5-5200 for the high-end Intel part.
 
Last edited:

Zathalus

Member
That's a rather bad test though; both CPUs are using slow RAM, although it impacts the 13900K more, as that CPU can scale really well with something like DDR5-7200 to 8000.

Not sure if it would impact the end result, just pointing it out.
 

analog_future

Resident Crybaby
I kinda feel bad about building a new rig a month ago with a 7700X. Still a good CPU though.

Maybe I'll upgrade it to an 8800X3D next year when that comes out.
 

Bojji

Member
I kinda feel bad about building a new rig a month ago with a 7700X. Still a good CPU though.

Maybe I'll upgrade it to an 8800X3D next year when that comes out.

If you plan to use your rig for gaming there is no point to NOT go with the 3D chips; you don't even have to waste time/money on super-fast memory, because it's not that important with that large amount of cache.
 
Last edited:

OverHeat

« generous god »
The 7800X3D is a gamer's dream come true. I'm « stuck » with a 7950X3D because I needed the extra cores for productivity lol
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'm playing this game at mid-high settings with a 2500K, almost 60fps.

Hahahaha the 2500K is frikken bulletproof.
My old PC rig has been repurposed as a media center powered by a 2500K running at 4.5GHz... somehow it still actually manages to play most modern titles at 60fps.
 

Interfectum

Member
Hahahaha the 2500K is frikken bulletproof.
My old PC rig has been repurposed as a media center powered by a 2500K running at 4.5GHz... somehow it still actually manages to play most modern titles at 60fps.
I built a 2500K PC years ago; it's on its third owner now (I gave it to my brother, who gave it to a friend) and that CPU is still crushing it. So crazy to see it still in action after all these years. Best CPU ever.
 

MikeM

Member
I think they'll drop by then... they were going for under 290 euros or 260 pounds from Amazon Germany a couple of months ago. If that gets closer to the 200 mark, that'll be a no-brainer upgrade. Sell off the 5600 and upgrade for not much more. It may not be the latest and greatest, but for the price and ease of upgrade it still kicks ass for gaming.

And true, that'll last you for some time. Heck, I'd say the 5600 will easily last you the entire console generation if gaming performance is your main priority on PC. I have an AM4 system with a 5800X3D and don't plan on upgrading until AM6 lol.
Yeah, I'm perfectly happy with my 5600 (hell, even my PS5, which I tend to use far more). This PC is just insurance against 30fps, albeit a nuclear option. I'm only going 5800X3D if it gets dirt cheap; otherwise I'll run the 5600 for another two years minimum.
 

XesqueVara

Member
14th gen is just a refresh of the 13th gen.

Even Meteor/Arrow Lake isn't looking to be a sizeable upgrade except on the GPU side of things.
The true upgrades are gonna drop with Lunar Lake.

Lunar Lake is a mobile-focused arch based on Lion Cove; think Apple M1 power tier... Desktop is ARL and that looks mid as fuck.

 
Last edited:
somehow [the 2500k] still actually manages to play most modern titles at 60fps.
Its single-thread performance is still decent when OCed, and games still love single-thread performance.

I read that Intel is considering going hard on single-thread performance a few generations from now... like a couple of super-sized P-cores for single-thread performance (maybe even just one ultimate P-core), then the rest E-cores.
 

Tsaki

Member
If the game needs cache, X3D is a winner. The Zen4 chips that have it are even better. Great performance per watt as well.
 

rnlval

Member
Lunar Lake is a mobile-focused arch based on Lion Cove; think Apple M1 power tier... Desktop is ARL and that looks mid as fuck.

The main point of AM5's platform longevity is continuous CPU improvement, not just a design "refresh."

Intel has AVX10.1 and AVX10.2, which redefine the post-Ice Lake AVX-512 instruction set improvements into a 256-bit version. https://www.anandtech.com/show/1897...sas-unifying-avx512-for-hybrid-architectures-

Intel is kitbashing the post-AVX2 instruction set into a PowerPC-like mess.


Additional information from https://en.wikipedia.org/wiki/Advanced_Vector_Extensions
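To make the kitbash concrete, here's a hand-rolled sketch (not from any shipping codebase) of what the vector-width fragmentation costs developers: the same y += a*x loop written once for 256-bit AVX2/FMA and again for 512-bit AVX-512F, each with its own scalar tail, and then dispatched at runtime. AVX10's whole pitch is converging this back into one feature level.

Code:
// Same "y += a*x" loop twice: 256-bit AVX2/FMA vs 512-bit AVX-512F.
// Build (GCC/Clang): g++ -O2 -mavx2 -mfma -mavx512f saxpy.cpp
#include <immintrin.h>
#include <cstddef>

void saxpy_avx2(float a, const float* x, float* y, std::size_t n) {
    const __m256 va = _mm256_set1_ps(a);
    std::size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 vy = _mm256_fmadd_ps(va, _mm256_loadu_ps(x + i), _mm256_loadu_ps(y + i));
        _mm256_storeu_ps(y + i, vy);
    }
    for (; i < n; ++i) y[i] += a * x[i];  // scalar tail
}

void saxpy_avx512(float a, const float* x, float* y, std::size_t n) {
    const __m512 va = _mm512_set1_ps(a);
    std::size_t i = 0;
    for (; i + 16 <= n; i += 16) {
        __m512 vy = _mm512_fmadd_ps(va, _mm512_loadu_ps(x + i), _mm512_loadu_ps(y + i));
        _mm512_storeu_ps(y + i, vy);
    }
    for (; i < n; ++i) y[i] += a * x[i];  // scalar tail
}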
 
Last edited:

XesqueVara

Member
The main point of AM5's platform longevity is continuous CPU improvement, not just a design "refresh."

Intel has AVX10.1 and AVX10.2, which redefine the post-Ice Lake AVX-512 instruction set improvements into a 256-bit version. https://www.anandtech.com/show/1897...sas-unifying-avx512-for-hybrid-architectures-

Intel is kitbashing the post-AVX2 instruction set into a PowerPC-like mess.


Additional information from https://en.wikipedia.org/wiki/Advanced_Vector_Extensions
AVX10 was the way Intel found to implement AVX-512 on desktop, because the Atom cores didn't support 512-bit registers.
And about AM5 longevity, yeah, we're gonna get a new core with Zen 5, while for Intel you'll need a new socket for ARL.
 

rsouzadk

Member



Pretty impressive showing for AM4's swan song, the Zen 3-based 5800X3D, against Intel's latest and greatest. If you haven't been willing to overhaul your entire PC to move to AM5 or whatever socket Intel is on these days, which requires a new motherboard and also DDR5 memory, then you should have bought a 5800X3D ages ago now.
Holy smokes. That 7800X3D is a monster.
 

Reallink

Member
I'm really glad I went AMD this time..
My build started with a Ryzen 1700X; I needed more oomph for VR, so I upgraded to a 3950X..
I was thinking I was really lucky I could do that, but I figured it was time for a whole new build now..
Then I looked up my mobo support.. what? I can get a 5000 series now?

I think I'm going to get a 5800X3D at some point, and hopefully one day I'll be able to afford a GPU that actually feels like an upgrade to my GTX 1080 Ti

They've only committed to 2025 for AM5, which could potentially only include the Ryzen 7000 and 8000 series.
 
The main point of AM5's platform longevity is continuous CPU improvement, not just a design "refresh."

Intel has AVX10.1 and AVX10.2, which redefine the post-Ice Lake AVX-512 instruction set improvements into a 256-bit version. https://www.anandtech.com/show/1897...sas-unifying-avx512-for-hybrid-architectures-

Intel is kitbashing the post-AVX2 instruction set into a PowerPC-like mess.


Additional information from https://en.wikipedia.org/wiki/Advanced_Vector_Extensions
The strangest thing about Intel sabotaging their own AVX-512 instruction set by rushing to implement the E-cores without AVX-512 is that my gaming laptop with an i7-11800H (11th gen Tiger Lake) was the last Intel consumer part to support AVX-512. Starting with 12th gen, AVX-512 support was removed because Intel couldn't find a way to keep operating systems from crashing when the P-cores technically supported it but the E-cores didn't, so even though the actual P-cores in 12th/13th gen support it, it was disabled.

The story goes that Intel engineers who spent years working on AVX-512 were livid when they found out that all their hard work was being thrown in the garbage in the rush to implement x86 big.LITTLE using recycled Atom cores for the E-cores.

In general, no one knows what the E-cores actually do except allow Intel to inflate multi-core benchmark scores and make the 12th/13th gen CPUs run fucking insanely hot for no good reason compared to the 11th gen.
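To make the migration problem concrete, here's roughly how software gates the wide path today (a sketch using GCC/Clang builtins; the saxpy_* names are just placeholders for a real hot loop, like the pair in the earlier post). The catch is that the check runs once, on whatever core the thread happens to be on, so a "yes" from a P-core followed by a migration to a core without AVX-512 would mean the next 512-bit instruction raises an illegal-instruction fault:

Code:
// One-time runtime dispatch between an AVX-512 and an AVX2 code path.
// Assumes every core gives the same CPUID answer - exactly the assumption
// a mixed P-core/E-core design would have broken, hence the fuse-off.
#include <cstdio>

// Placeholder kernels (e.g. the saxpy pair sketched earlier in the thread).
void saxpy_avx2(float, const float*, float*, unsigned long);
void saxpy_avx512(float, const float*, float*, unsigned long);

using saxpy_fn = void (*)(float, const float*, float*, unsigned long);

saxpy_fn pick_saxpy() {
    __builtin_cpu_init();
    if (__builtin_cpu_supports("avx512f")) {
        std::puts("dispatch: AVX-512 path");
        return saxpy_avx512;  // only safe if *every* core agrees
    }
    std::puts("dispatch: AVX2 fallback");
    return saxpy_avx2;
}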
 
Last edited:

rnlval

Member
The strangest thing about Intel sabotaging their own AVX-512 instruction set by rushing to implement the E-cores without AVX-512 is that my gaming laptop with an i7-11800H (11th gen Tiger Lake) was the last Intel consumer part to support AVX-512. Starting with 12th gen, AVX-512 support was removed because Intel couldn't find a way to keep operating systems from crashing when the P-cores technically supported it but the E-cores didn't, so even though the actual P-cores in 12th/13th gen support it, it was disabled.

The story goes that Intel engineers who spent years working on AVX-512 were livid when they found out that all their hard work was being thrown in the garbage in the rush to implement x86 big.LITTLE using recycled Atom cores for the E-cores.

In general, no one knows what the E-cores actually do except allow Intel to inflate multi-core benchmark scores and make the 12th/13th gen CPUs run fucking insanely hot for no good reason compared to the 11th gen.
With a multitasking OS, register state needs to be saved before switching to another context's register state. Context switching can get complicated when cores support different CPU instruction sets.

E-cores inflate 128-bit SSE, the 128-bit AVX2 subset, and 256-bit AVX2 (with a double cycle on the E-core's 128-bit hardware) scores. E-cores are three-issue out-of-order x86-64 CPUs, i.e. effectively a Pentium M-style design (three-instruction-issue decoders) with x86-64 and AVX2 (on 128-bit SIMD hardware) support.
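If you want a feel for how much state that actually is, CPUID leaf 0xD reports the size of the XSAVE area the kernel has to save and restore on every context switch; enabling AVX-512 state (the ZMM and mask registers) makes that area substantially bigger than plain AVX2 state. Quick sketch (GCC/Clang on x86-64 only):

Code:
// Query the XSAVE save-area sizes via CPUID leaf 0xD, sub-leaf 0.
#include <cpuid.h>
#include <cstdio>

int main() {
    unsigned eax = 0, ebx = 0, ecx = 0, edx = 0;
    if (!__get_cpuid_count(0xD, 0, &eax, &ebx, &ecx, &edx)) {
        std::puts("CPUID leaf 0xD not supported");
        return 1;
    }
    // EBX: XSAVE area size for the features the OS has actually enabled (XCR0).
    // ECX: maximum size if every supported feature were enabled.
    std::printf("XSAVE area, enabled features:       %u bytes\n", ebx);
    std::printf("XSAVE area, all supported features: %u bytes\n", ecx);
}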
 
Last edited: