
AMD presents Zen 5

Leonidas

Member
I sincerely hope you are trolling at this point, because nobody can be this oblivious to their own bias.
Not sure what you are responding to.

The RAM point is moot as AMD is now up to 5600, just like Intel in spec.

The main BS is AMD using the 7900 XTX, sabotaging the 14900K in several of their benchmarks. Makes no sense when there are faster GPUs. The only explanation is they did it to make the 14900K look worse.

Very impressive wattage-wise; Intel can only dream of getting their CPUs back to this level. Hope it pushes them to do SOMETHING though.
Intel moved on from 10nm/Intel 7 to a much better node with Arrow Lake, so power efficiency should be greatly improved. AMD no longer has a 2-node advantage.
 

Zathalus

Member
Not sure what you are responding to.

The RAM point is moot as AMD is now up to 5600, just like Intel in spec.
Where do I start?

- Your ranting about AMD fanboys
- Your complaint about AMD having misleading marketing, and then claiming Intel marketing has held up under your investigation. A simple Google search would clear that right up; Intel makes misleading marketing and statements all the time.
- In a previous thread you claimed the new Intel changes to the default CPU behaviour were a good thing, as your CPU ran cooler and lost no performance. But suddenly it is now a bad thing.
- Posting aggregate benchmarks that include data in which AMD CPUs were hamstrung by slower RAM compared to the Intel offerings. You claim this is within spec, but have a problem when AMD pairs both the AMD and Intel CPU with the same RAM. Both Intel and AMD (with Zen 5) can take advantage of faster RAM, but the point was to level the playing field.
- Further nonsense about how the 7900 XTX runs worse on Intel CPUs; there is zero reliable proof that this is the case. It is 'known' the 14900K loses performance with that card? No, it doesn't! (Side note: of course AMD is going to use their flagship card for benchmarks.)

There is only one thing that benchmark does which is a bit problematic: the use of Intel Default settings. But for gaming benchmarks at that resolution, the performance difference is minimal.
 

Leonidas

Member
Where do I start?

- Your ranting about AMD fanboys
winjer, you and some other AMD fanboys come up with all kinds of excuses for AMD. Can't wait to see the excuses that come up when Zen5 barely (if at all) beats 14900K in gaming reviews.

- Your complaint about AMD having misleading marketing, and then claiming Intel marketing has held up under your investigation. A simple Google search would clear that right up; Intel makes misleading marketing and statements all the time.
I'm not going off ancient history, only 2022 and later. They were the closest to CPU performance, being off by a little with the Raptor estimate.

Meanwhile AMD lied about the 7600X being 5% faster than the 12900K, while it was actually slower.
And today I see a glaring example where they cut 14900K performance by ~20% in one of the games by using the 7900 XTX. Disgusting. If you don't figure out what game I'm talking about now, you will know when the reviews using a 4090 hit.
- In a previous thread you claimed the new Intel changes to the default CPU behaviour were a good thing, as your CPU ran cooler and lost no performance. But suddenly it is now a bad thing.
It was great for the 13600K, but it obviously was going to hurt the high-power-draw parts, since some of those used over 253 watts at full load.
- Posting aggregate benchmarks that include data in which AMD CPUs were hamstrung by slower RAM compared to the Intel offerings. You claim this is within spec, but have a problem when AMD pairs both the AMD and Intel CPU with the same RAM. Both Intel and AMD (with Zen 5) can take advantage of faster RAM, but the point was to level the playing field.
We're crying about aggregate reviews now? Really? You and winjer have lost it on the aggregate review thing.

Instead of aggregate reviews how about we take a look at Hardware Unboxed or TechPowerUp when they post their results? Or are they BIASed too?

- Further nonsense about how the 7900 XTX runs worse on Intel CPUs; there is zero reliable proof that this is the case. It is 'known' the 14900K loses performance with that card? No, it doesn't! (Side note: of course AMD is going to use their flagship card for benchmarks.)
Just like winjer, you haven't seen the data I've seen. If you had, you wouldn't make stupid claims like you have here; zero proof my ass, I'm looking right at it. If you don't find out what game it is by review time I'll point it out to you then, and you'll see the 14900K destroy Zen5 in at least one of these games when they use a 4090, and several others will be a lot closer with the 14900K winning.

Zen5 will be lucky to beat 14900K by 4% on average.

If you think I'm lying ban bet me!

(Side note, of course AMD is going to use their flagship card for benchmarks).
They absolutely will if it keeps AMD performance about the same but drops Intel by ~50 FPS in a game, vs. 4090. And you guys bought it, AMD is clever, they know what their fanboys will buy into.
 

winjer

Member
winjer, you and some other AMD fanboys come up with all kinds of excuses for AMD. Can't wait to see the excuses that come up when Zen5 barely (if at all) beats 14900K in the real world.


I'm not going off ancient history, only 2022 and later. They were the closest to CPU performance.

Meanwhile AMD lied about the 7600X being 5% faster than the 12900K, while it was actually slower.
And today I see a glaring example where they cut 14900K performance by ~20% in one of the games by using the 7900 XTX. Disgusting. If you don't figure out what game I'm talking about now, you will know when the reviews using a 4090 hit.

It was great for the 13600K, but it obviously was going to hurt the high-power-draw parts, since some of those used over 253 watts at full load.

We're crying about aggregate reviews now? Really? You and winjer have lost it on the aggregate review thing.

Instead of aggregate reviews how about we take a look at Hardware Unboxed when they post their results? Or are they BIASed too?


Just like winjer, you haven't seen the data I've seen. If you don't find out what game it is by review time I'll point it out to you then, and you'll see the 14900K destroy Zen5 in at least one of these games when they use a 4090, and several others will be a lot closer with the 14900K winning.

Zen5 will be lucky to beat 14900K by 4% on average.


They absolutely will if it keeps AMD performance the same but drops Intel by ~50 FPS in a game, vs. 4090.

Of course everyone that does not agree with you has to be a fanboy.
You continue to insult other people just for not subscribing to what you say.
 

Leonidas

Member
If AMD wasn't so egregious about using the 7900 XTX, and then me easily finding a 20% swing against the 14900K, I'd have nothing to say right now. AMD had to know that someone would find out their BS tactics to boost their cherry picks.

You guys will find out in due time.
 

Celcius

°Temp. member
yeah the TDP on that 9700X is seriously tempting if it can at least match the 7800X3D in performance
I'd love to just have a CPU that runs super cool, and 8c/16t is all you need these days. I have serious doubts that Intel will get anywhere near 65W, but it's annoying not having all the cards on the table yet...
 

Schmendrick

Member
If AMD wasn't so egregious about using the 7900 XTX, and then me easily finding a 20% swing against the 14900K, I'd have nothing to say right now. AMD had to know that someone would find out their BS tactics to boost their cherry picks.


You're a full-time clown.
 

recursive

Member
winjer, you and some other AMD fanboys come up with all kinds of excuses for AMD. Can't wait to see the excuses that come up when Zen5 barely (if at all) beats 14900K in gaming reviews.


I'm not going off ancient history, only 2022 and later. They were the closest to CPU performance, being off by a little with the Raptor estimate.

Meanwhile AMD lied about the 7600X being 5% faster than the 12900K, while it was actually slower.
And today I see a glaring example where they cut 14900K performance by ~20% in one of the games by using the 7900 XTX. Disgusting. If you don't figure out what game I'm talking about now, you will know when the reviews using a 4090 hit.

It was great for the 13600K, but it obviously was going to hurt the high-power-draw parts, since some of those used over 253 watts at full load.

We're crying about aggregate reviews now? Really? You and winjer have lost it on the aggregate review thing.

Instead of aggregate reviews how about we take a look at Hardware Unboxed or TechPowerUp when they post their results? Or are they BIASed too?


Just like winjer, you haven't seen the data I've seen. If you had, you wouldn't make stupid claims like you have here; zero proof my ass, I'm looking right at it. If you don't find out what game it is by review time I'll point it out to you then, and you'll see the 14900K destroy Zen5 in at least one of these games when they use a 4090, and several others will be a lot closer with the 14900K winning.

Zen5 will be lucky to beat 14900K by 4% on average.

If you think I'm lying ban bet me!


They absolutely will if it keeps AMD performance about the same but drops Intel by ~50 FPS in a game, vs. 4090. And you guys bought it, AMD is clever, they know what their fanboys will buy into.
Dude just post your sources or stop shitting up the thread. This is out of control.
 

SoloCamo

Member
LOL nice joke, but I'm far from being poor. The GPU will always be the limit if we are talking about today's PC gaming. My 12600K will last much longer than I thought.

The 12600K shows its age in plenty of areas even at 4K. My 11900K can be a bottleneck for my 6900 XT even at 4K... and no, despite the bad rep, an 11900K paired with CL14-14-14 dual-rank 3733MHz RAM in Gear 1 is not a slow CPU.

And now with the 9950X they used a 7900 XTX, which tanks Intel performance at 1080p in some of their selected games,
Uhhh, using a GPU from AMD for better performance in CPU-bound scenarios such as lower resolutions is exactly why I continue to use them and get more life out of my CPUs... It's the best-case scenario no matter the platform, which is why the 4090 can and does get beaten at 1080p in some titles.


Wouldn't a faster GPU show even more difference in CPU power?

I have never seen anything about Radeon cards somehow making AMD CPUs perform better. I know that AMD has lower CPU overhead (vs Nvidia) for DX12/Vulkan, but this is true for Intel CPUs as well.

Exactly.
 

OverHeat

« generous god »
I had a 10900K, after that an 11900K and a 12900K. My 7950X3D shits on a 13900K in the majority of games (so it must shit on a 14900K too). Stop being a freaking fanboy, dude.
 

Leonidas

Member
You're a full-time clown, seriously.
The only clowns are those who think I must be lying, with no proof.

I've verified the data; there is definitely a 20% swing against Intel in one of the games and another 5% swing against Intel in another game.

Dude just post your sources or stop shitting up the thread. This is out of control.
I'd rather wait for the ~55 or so days to elapse, or for one of you guys to figure it out.

I can't be the only person on the internet who knows that Intel loses 20% with the 7900 XTX in one of the games, where AMD loses nothing dropping from the 4090 to the 7900 XTX in an example.
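For readers following the "20% swing" back-and-forth: a swing like that is just a relative FPS difference between two GPU configurations. A minimal sketch of the arithmetic, with made-up FPS numbers purely for illustration (no actual benchmark data):

```python
# Hypothetical FPS results for one game on the same CPU with two GPUs.
# These numbers are invented to illustrate the arithmetic, not measured.
fps_rtx_4090 = 280.0
fps_rx_7900xtx = 224.0

# Relative loss when switching from the 4090 to the 7900 XTX.
swing_pct = (fps_rtx_4090 - fps_rx_7900xtx) / fps_rtx_4090 * 100
print(f"{swing_pct:.1f}% swing")  # 20.0% swing
```

Whether a swing like this exists in AMD's slide deck is exactly what the thread is arguing about; the math itself is the easy part.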
 

SoloCamo

Member
15th Gen will be beaten by the X3D variant of Zen 5. Like I said, it's been a freakin' ping-pong game since Zen 3.

All I know is this competition is great. AMD has essentially pulled a Socket 754/939 Athlon 64 moment all over again, and Intel desperately needs another Core 2 Duo moment. It took me years to see meaningful CPU improvements when I had a 4790K, and now I consider replacing my 11900K all the time (despite it being more than good enough for 4K/60).
 

Leonidas

Member
15th Gen will be beaten by the X3D variant of Zen 5. Like I said, it's been a freakin' ping-pong game since Zen 3.
Possible, but I'm not sure AMD continues to get a 15% uplift from V-Cache indefinitely. At what point will there be diminishing returns?

I also never liked the idea of paying a $100-$150 premium for only maybe 5-15% extra performance.

Difference between Zen3 and now is AMD no longer has a 2-node advantage. I don't see AMD easily taking the crown when they barely beat ancient optimized Intel 10+++/Intel 7 by a few percent.
 
The 12600K shows its age in plenty of areas even at 4K. My 11900K can be a bottleneck for my 6900 XT even at 4K... and no, despite the bad rep, an 11900K paired with CL14-14-14 dual-rank 3733MHz RAM in Gear 1 is not a slow CPU.
Still about the GPU. The higher the resolution, the more GPU power you need. And what's that about 'even at 4K'? Do you have an 8K display? Give me some concrete games/situations where the CPU/RAM is your bottleneck versus the GPU.
 

Zathalus

Member
winjer, you and some other AMD fanboys come up with all kinds of excuses for AMD. Can't wait to see the excuses that come up when Zen5 barely (if at all) beats 14900K in gaming reviews.


I'm not going off ancient history, only 2022 and later. They were the closest to CPU performance, being off by a little with the Raptor estimate.

Meanwhile AMD lied about the 7600X being 5% faster than the 12900K, while it was actually slower.
And today I see a glaring example where they cut 14900K performance by ~20% in one of the games by using the 7900 XTX. Disgusting. If you don't figure out what game I'm talking about now, you will know when the reviews using a 4090 hit.

It was great for the 13600K, but it obviously was going to hurt the high-power-draw parts, since some of those used over 253 watts at full load.

We're crying about aggregate reviews now? Really? You and winjer have lost it on the aggregate review thing.

Instead of aggregate reviews how about we take a look at Hardware Unboxed or TechPowerUp when they post their results? Or are they BIASed too?


Just like winjer, you haven't seen the data I've seen. If you had, you wouldn't make stupid claims like you have here; zero proof my ass, I'm looking right at it. If you don't find out what game it is by review time I'll point it out to you then, and you'll see the 14900K destroy Zen5 in at least one of these games when they use a 4090, and several others will be a lot closer with the 14900K winning.

Zen5 will be lucky to beat 14900K by 4% on average.

If you think I'm lying ban bet me!


They absolutely will if it keeps AMD performance about the same but drops Intel by ~50 FPS in a game, vs. 4090. And you guys bought it, AMD is clever, they know what their fanboys will buy into.
I'm not going to bother to respond to all of... that. But labeling me an AMD fanboy is hilarious when I use a 13900k. Before that a 12900k, 5950x, 9900k, 6700k, etc... The only AMD CPU I have owned other than the aforementioned 5950x was my old Athlon 64.

I hold no allegiance to any CPU manufacturer, as the very concept is absurd, and I fully agree Arrow Lake can likely beat Zen 5, but your behavior is basically a parody at this point. I'm half convinced you're the owner of UserBenchmark.
 

Leonidas

Member
Love me some Leonidas mental gymnastics breakdown. Always quality stuff.
The real quality stuff will be when winjer and the AMD fanboys defend AMD with mountains of excuses when the 4090 review benchmarks hit and the actual results fall well short of AMD's best-case cherry-picked scenario that some of those dudes fell for.
 

OverHeat

« generous god »
The real quality stuff will be when winjer and the AMD fanboys defend AMD with mountains of excuses when the 4090 review benchmarks hit and the actual results fall well short of AMD's best-case cherry-picked scenario that some of those dudes fell for.
Nobody that only plays games cares about the stock 9000 series
 

Leonidas

Member
Nobody that only plays games cares about the stock 9000 series
Not everyone who only plays games wants to spend up to 50% more for a possible 15% uplift that many of them will never see, because they are only on 120-165 Hz monitors, and don't game at 1080p with a 4090.
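The price/performance argument above boils down to simple FPS-per-dollar arithmetic. A rough sketch: the 50% premium and 15% uplift figures come from the post, but the dollar amounts and FPS numbers below are hypothetical, just to show the shape of the trade-off:

```python
# Hypothetical baseline: a $300 non-X3D part averaging 200 FPS.
base_price, base_fps = 300.0, 200.0
x3d_price = base_price * 1.50   # "up to 50% more"
x3d_fps = base_fps * 1.15       # "a possible 15% uplift"

print(base_fps / base_price)    # ~0.667 FPS per dollar
print(x3d_fps / x3d_price)      # ~0.511 FPS per dollar
```

And on a 120-165 Hz monitor the extra frames above the refresh cap are never even displayed, which is the point being made.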
 

Leonidas

Member
In case anyone was wondering, all 6 games they cherry-picked run at ~200 FPS or higher.

Can anyone tell me the point of this? I don't care about the 14900K getting 280 FPS and the 9950X getting 250 FPS (once the 4090 is correctly chosen, as an example) in these old titles.

Why wouldn't they test some of the more demanding games released of late, where the extra performance could matter?
 

Closer

Member
The real quality stuff will be when winjer winjer and AMD fanboys defend AMD with mountains of excuses when the 4090 review benchmarks and the actual results fall well short of AMDs best case cherry picked scenerio that some of those dudes fell for.
Nah, yours are always better. Keep fighting man. I always wait for some Nvidia/Intel/AMD thread so you can show up and do what you do. So entertaining.
 

SoloCamo

Member
Not everyone who only plays games wants to spend up to 50% more for a possible 15% uplift that many of them will never see, because they are only on 120-165 Hz monitors, and don't game at 1080p with a 4090.

As someone who has only used Intel since 2014 for my main PC, I'd also like to not have a space heater as a gaming CPU. I'm at the limits of air cooling with an 11900K in a full-size Fractal Torrent case using a Noctua NH-D15 (chromax.black); it's actually absurd that I would need water cooling to properly use a 12900K or newer i9 at this point.

I live in Florida; I don't need another reason for my central A/C to work harder, nor do I need to get swamp ass when gaming in my office. At this point, when I do upgrade, AMD's X3D chips are the only option that makes sense. I never cared about efficiency in the past (I had an AMD FX-9590 prior to going 4790K), but Intel is out of control at this point regarding it.
 

Chiggs

Gold Member
I live in Florida; I don't need another reason for my central A/C to work harder, nor do I need to get swamp ass when gaming in my office. At this point, when I do upgrade, AMD's X3D chips are the only option that makes sense. I never cared about efficiency in the past (I had an AMD FX-9590 prior to going 4790K), but Intel is out of control at this point regarding it.

I'm in South Florida, and I thoroughly welcome Arm's assault on Windows PCs.
 

Silver Wattle

Gold Member
Intel's 15th-gen P-core (Lion Cove) claims a 14% IPC improvement, so they should likely retain the edge over standard Zen 5 (though less than 14th gen has over Zen 4), but at what power and heat, who knows.
Once X3D launches, AMD will be the clear winner again.

Those wondering about the benefits of X3D for Zen 5 just need to compare Zen 4 standard cache vs Zen 5 standard cache; they are identical, so I expect a similar performance advantage in gaming from X3D.
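As a sanity check on IPC claims like the one above: single-thread performance scales roughly as IPC × clock, so a 14% IPC gain only becomes a 14% performance gain if clocks hold. A back-of-envelope sketch where only the 14% IPC figure comes from the post; the clock speeds are hypothetical placeholders:

```python
# performance ~ IPC x clock (a simplification that ignores memory effects).
def relative_perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

old = relative_perf(1.00, 6.0)   # current-gen core at a hypothetical 6.0 GHz
new = relative_perf(1.14, 5.5)   # +14% IPC at a hypothetical lower 5.5 GHz
print(new / old)                 # ~1.045: the IPC gain is partly eaten by clocks
```

This is why "at what power and heat" matters: if the new node can't sustain the old clocks, the delivered uplift shrinks well below the headline IPC number.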
 

Panajev2001a

GAF's Pleasant Genius
Intel's 15th-gen P-core (Lion Cove) claims a 14% IPC improvement, so they should likely retain the edge over standard Zen 5 (though less than 14th gen has over Zen 4), but at what power and heat, who knows.
Once X3D launches, AMD will be the clear winner again.

Those wondering about the benefits of X3D for Zen 5 just need to compare Zen 4 standard cache vs Zen 5 standard cache; they are identical, so I expect a similar performance advantage in gaming from X3D.
Possibly so, they did a nice set of changes for this core: https://chipsandcheese.com/2024/06/03/intels-lion-cove-architecture-preview/ (HT/SMT is not out of the picture, but it is now optional… the biggest reason seems to be that it complicates scheduling for the Thread Director component with the kind of very heterogeneous setup Intel can now have: P core, P-core thread, E core, and low-power E core… I do not blame them :)).

AMD’s cores are all P cores though, so the effective performance in any multithreading scenario that even decently uses enough P cores should still favour AMD, which has generous TDP headroom to keep investing in higher clocks.
 

Darkone

Member
So if I have a 5600X now, what is the better value-for-money option to upgrade to, considering I now have a 4080 and will upgrade to a 5080/5090?
 

KungFucius

King Snowflake
Should I go for AMD Zen 5 or wait for Intel Arrow Lake?

Itching to refresh my ancient PC that can barely run games.
Do whatever you want, but with 2 new platforms around the corner, it might be best to see what both have to offer before pulling the trigger. Timing is a factor too: July vs. October or later. What about the GPU?

With all this AI-on-CPU shit, what will happen with standards in games? Will game AI APIs require AI on the CPU, or can they use the GPU? How quickly will things converge?
 
I just bought a 7800X3D, but I want the 9800X3D to offer a notable improvement; that way I can get a cheap upgrade a few years from now. Who knows, maybe the AM5 platform will even support Zen 6 (10800X3D) in 2027.

As far as Intel CPUs are concerned, I always bought CPUs from them before, but now I have totally lost my trust in this company. Core i7 14th/13th gen power consumption is just too absurd, and with the instability issues (even with Intel limits) and fast degradation (some people reported degradation on a monthly basis), I would have to be an Intel fanboy to go with an Intel platform this time. I hope Intel Arrow Lake fixes these problems; otherwise Intel may not recover from another defeat, and that would be bad for all of us (less competition means more expensive CPUs).
 