
AMD Ryzen 7000: 5 nm desktop CPUs coming September 27th

Panajev2001a

GAF's Pleasant Genius
A Zen 4/RDNA 3 APU combo for entry-level gaming laptops will be unmatched with DDR5 and PCIe 5. I think they will be the only company to offer 1440p/60fps+ at the entry level. They really emphasized performance, power, and efficiency. Don't know too much about RDNA 3, but perhaps you can finally get ray tracing, along with machine learning/DLSS-style super sampling, on an entry-level laptop/PC at 60fps+.

The DirectStorage API with 1-4 second load times is icing on the cake.
Yep, they landed a strong punch with the Steam Deck APU, delivered an even stronger one with the Ryzen 7 6800U SoC (super popular for handheld PCs), and a next-generation APU could deliver an uppercut that gets them a very strong hold on the market.

If I were AMD I would be strongly courting Nintendo for a semi-custom solution based on this APU (the lowest-power variant possible, with some minor Nintendo customisations). It would be a big change and require emulation for Switch BC, but I think that would be doable… for older titles? Well, Nintendo could just drip feed them to customers again 😂.
 

Tams

Member
Yep, they landed a strong punch with the Steam Deck APU, delivered an even stronger one with the Ryzen 7 6800U SoC (super popular for handheld PCs), and a next-generation APU could deliver an uppercut that gets them a very strong hold on the market.

If I were AMD I would be strongly courting Nintendo for a semi-custom solution based on this APU (the lowest-power variant possible, with some minor Nintendo customisations). It would be a big change and require emulation for Switch BC, but I think that would be doable… for older titles? Well, Nintendo could just drip feed them to customers again 😂.

I'd love that. An AMD Nintendo console, that is. I doubt it though, unless it's some sort of ARM one, like a Samsung SoC with RDNA. One can dream though.

But if so: New Nintendo Swap! Buy your Switch games again, but for the Swap! Or! Pay for Nintendo Swap Online and get access to poorly emulated Switch games!
 

Rickyiez

Member
You’re going to be GPU bound at 1440p?

Are you new to this?

Please do some research before you post... you should know that even between the 5600X and the 5900X, when it comes to gaming, there is no performance difference worth the $300 price gap. Not even 10 frames at 4K.

The same goes for Intel. For example, between the i5 12600 and the i7 12700, gaming performance is about the same, yet there is a big price difference.

The difference between these CPUs is bigger in content creation. But that is not our topic here, since we are focused on gaming.

When you look at the slide in the OP about the gaming performance gap between the two high-end CPUs, the 12900K and the 7950X, then refer back to my previous post. It's not impressive or even close to being good in terms of gaming. They are not even close to 13th gen Intel at this rate.
I'm talking about the leap in performance from my perspective, probably my bad for not including it. Going from a 3700X to a 7700X would easily yield a 15 FPS increase or more, even at 4K, provided I'm not totally GPU bound.

There are already circumstances where I wished I could lock certain games at 4K120 on my C1 but couldn't because of the CPU, as I'm definitely not GPU bound with a 3080 Ti.
 

//DEVIL//

Member
On average, the 12900K was around 12% faster than the 11900K in 1080p gaming. The 11900K was around 1% (!) faster than the 10900K. If Raptor Lake manages even a 10% boost, it will be impressive given that it is on the same process with no major architectural changes.

Then the 7800X3D is going to arrive...
Which is why the 5800X3D is such a great product. Too bad it was released way too late. With DDR4 dying soon and the high asking price, I can't really recommend it to anyone. If the 7800X3D is released next year, then it would be a great product, since it should technically hold until 2025 when the new AMD Zen 5 or whatever they wanna call it is out.

But what they announced today? Nope, not worth it. Add the cost of the new motherboard and DDR5 RAM... you're spending $900 US, at least $1000 after tax. Just nope lol
I'm talking about the leap in performance from my perspective, probably my bad for not including it. Going from a 3700X to a 7700X would easily yield a 15 FPS increase or more, even at 4K, provided I'm not totally GPU bound.

There are already circumstances where I wished I could lock certain games at 4K120 on my C1 but couldn't because of the CPU, as I'm definitely not GPU bound with a 3080 Ti.
No worries.

This video (love this channel btw) talks about the same points I mentioned. This CPU as a general upgrade from, let's say, a 5800X3D is not even a positive upgrade. Probably negative in terms of frames. Plus you have to buy a new motherboard and DDR5. No, sticking with the 5800X3D is cheaper lol
 

FireFly

Member
Which is why the 5800X3D is such a great product. Too bad it was released way too late. With DDR4 dying soon and the high asking price, I can't really recommend it to anyone. If the 7800X3D is released next year, then it would be a great product, since it should technically hold until 2025 when the new AMD Zen 5 or whatever they wanna call it is out.

But what they announced today? Nope, not worth it. Add the cost of the new motherboard and DDR5 RAM... you're spending $900 US, at least $1000 after tax. Just nope lol
I was addressing the claim that AMD would get "destroyed" by the 13900K. After my post I found an AMD slide that claims the Zen 4 core can beat a 12900K (according to the footnotes, a 12900KS) by 11% in gaming.

https://www.techpowerup.com/298318/amd-announces-ryzen-7000-series-zen-4-desktop-processors

So if Raptor Lake is ~10% faster than its predecessor that should be an effective tie with Zen 4. Or if it's 15% faster, it will beat Zen 4 by <5%. Hardly a destruction.
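Putting rough numbers on that (the +11% Zen 4 figure is the one from the AMD slide above; the Raptor Lake uplifts are just hypotheticals, not leaks):

```python
# Hypothetical back-of-the-envelope math, normalised to the 12900K = 1.00.
zen4 = 1.11  # AMD slide: ~11% faster than the 12900K(S) in gaming

for rpl_uplift in (1.10, 1.15):          # assumed Raptor Lake gains over the 12900K
    delta = rpl_uplift / zen4 - 1
    print(f"Raptor Lake at +{rpl_uplift - 1:.0%}: {delta:+.1%} vs Zen 4")

# Raptor Lake at +10%: -0.9% vs Zen 4  (an effective tie)
# Raptor Lake at +15%: +3.6% vs Zen 4  (i.e. under 5%)
```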
 

//DEVIL//

Member
I was addressing the claim that AMD would get "destroyed" by the 13900K. After my post I found an AMD slide that claims the Zen 4 core can beat a 12900K (according to the footnotes, a 12900KS) by 11% in gaming.

https://www.techpowerup.com/298318/amd-announces-ryzen-7000-series-zen-4-desktop-processors

So if Raptor Lake is ~10% faster than its predecessor that should be an effective tie with Zen 4. Or if it's 15% faster, it will beat Zen 4 by <5%. Hardly a destruction.
We don't know. In theory, you are right, but I do not see Intel releasing something inferior to AMD at least 4 months after AMD's release.

We could be more disappointed with Intel than what AMD had to offer.

From what we know so far, AMD's next generational leap is a joke, and there is no real leap going to Zen 4. You would think that switching the socket and socket design, adding DDR5 RAM, and on top of that a much higher TDP would get you a huge boost in performance, but nope, that didn't happen.

To me, it sounds like it's the same processor under a different socket, and the DDR5 plus the higher TDP are the reason we are getting this uplift (I know that's not really the case, but honestly, it almost sounds like it, as if the engineering team was playing with their balls the last 2 years, I guess).
 

FireFly

Member
We don't know. In theory, you are right, but I do not see Intel releasing something inferior to AMD at least 4 months after AMD's release.

We could be more disappointed with Intel than what AMD had to offer.

From what we know so far, AMD's next generational leap is a joke, and there is no real leap going to Zen 4. You would think that switching the socket and socket design, adding DDR5 RAM, and on top of that a much higher TDP would get you a huge boost in performance, but nope, that didn't happen.

To me, it sounds like it's the same processor under a different socket, and the DDR5 plus the higher TDP are the reason we are getting this uplift (I know that's not really the case, but honestly, it almost sounds like it, as if the engineering team was playing with their balls the last 2 years, I guess).
If you think about the last 10 years of Intel and AMD CPUs, the only case I can think of where you saw a >25% performance improvement in gaming on average was with Zen 3. And that's really only at low resolutions. Zen 3 is a special case, because Zen 2 was being significantly held back by the dual-CCX design, which caused big latency penalties. By moving all 8 cores to one CCX and doubling the shared L3 cache, AMD gained a massive advantage over their old architecture.

But going from Zen 1 to Zen 2 (ignoring Zen+) only improved gaming performance by 15%-20% on average. And in the same time period, Intel's gaming performance stagnated completely and only moderately increased with Alder Lake. The performance gain AMD are predicting with Zen 4 would be in the range of 15%-20%, which would be in line with the Zen 1 to Zen 2 transition and completely in line with historical precedent. The fact that one architecture (Zen 3) managed a huge boost in gaming, doesn't mean we're going to see that with every new generation.

And looking outside of gaming for a moment, the 29% single threaded boost and 40%+ multithreaded boost *is* better than what AMD achieved with Zen 3, and a bigger improvement on a per core basis than anything since the Bulldozer to Zen 1 transition. That doesn't sound like AMD sitting on their hands.
 
Why can't AMD make efficiency cores just like Raptor Lake? Also, is Intel's tile-based architecture just another word salad for APUs?

With Meteor Lake, they are going to put some sort of ray tracing cores/processing embedded within the CPU. I have a feeling Zen 5 is going to be all about AI and ML.
 

Ironbunny

Member
Why can't AMD make efficiency cores just like Raptor Lake? Also, is Intel's tile-based architecture just another word salad for APUs?

They can, but why would one need those? Interestingly, I could actually buy a 7900X and downclock it to 5900X levels, and it would use 62% less power at the same performance.
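A toy sketch of why that kind of saving is plausible: dynamic power scales roughly with frequency × voltage², and voltage has to climb steeply near the top of the clock range. The clocks and voltages below are invented; only the shape of the trade-off matters.

```python
# Invented operating points; dynamic power ~ capacitance * frequency * voltage^2,
# so dropping both clock and voltage a little cuts power a lot.
def relative_power(freq_ghz: float, volts: float) -> float:
    return freq_ghz * volts ** 2  # capacitance cancels out when comparing ratios

stock  = relative_power(5.6, 1.35)   # assumed all-core boost point
capped = relative_power(4.5, 1.05)   # assumed downclocked/undervolted point

print(f"Clock kept: {4.5 / 5.6:.0%}, power kept: {capped / stock:.0%}")
# -> Clock kept: 80%, power kept: 49% -- roughly half the power for a modest
#    clock drop, the same shape of trade-off as the 62% claim above.
```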
 
They can, but why would one need those? Interestingly, I could actually buy a 7900X and downclock it to 5900X levels, and it would use 62% less power at the same performance.

This is just a guess but:

1) The efficiency cores of Raptor Lake are contributing to higher multi-threaded Cinebench scores
2) The efficiency cores would be useful for laptops
 

Tams

Member
Why can't AMD make efficiency cores just like Raptor Lake? Also, is Intel's tile-based architecture just another word salad for APUs?

With Meteor Lake, they are going to put some sort of ray tracing cores/processing embedded within the CPU. I have a feeling Zen 5 is going to be all about AI and ML.
Because they don't seem to need them? Zen seems to scale very well, only really falling off a bit at the very high end (high clocks and power consumption). Impressive for a single architecture.

Meanwhile, Intel are winning the performance crown by making space heaters. So they need the efficiency cores.

In theory, efficiency/little cores should help reduce power draw a lot, provided you only do lightish tasks. But that requires proper OS support, and even then, they take up space on the die that could be used for something else.
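For a sense of what that OS support looks like at the user level, here's a minimal Linux-only sketch that pins a process onto a set of assumed E-core IDs; the actual IDs differ per chip (check lscpu), so treat them as placeholders.

```python
import os

# Assumed logical-CPU IDs for the E-cores -- purely illustrative; hybrid chips
# usually enumerate the P-cores (and their SMT siblings) first, but verify on
# your own machine before relying on this.
ASSUMED_E_CORES = {16, 17, 18, 19, 20, 21, 22, 23}

def pin_to_e_cores(pid: int = 0) -> None:
    """Restrict a process (0 = the current one) to the assumed E-core IDs."""
    allowed = ASSUMED_E_CORES & os.sched_getaffinity(pid)
    if allowed:  # only apply if those IDs actually exist on this system
        os.sched_setaffinity(pid, allowed)

if __name__ == "__main__":
    pin_to_e_cores()
    print("Now limited to logical CPUs:", sorted(os.sched_getaffinity(0)))
```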
 

Tams

Member
This is just a guess but:

1) The efficiency cores of Raptor Lake are contributing to higher multi-threaded Cinebench scores
2) The efficiency cores would be useful for laptops
Intel already have efficiency cores in laptops. They seem to do jack shit for now though.
 

PeteBull

Member
3 things:
1) Great progress from AMD, hoping it puts pressure on Intel. It doesn't matter which company has the strongest CPU; what matters is that good products from both companies result in better prices for us, the consumers. As a current owner of an OC'd 8700K, I'm fully aware it was only because Ryzen put pressure on Intel that they finally released something decent back then (after releasing the 7700K with only 4c/8t half a year earlier, which was such a turd).

2) Some people on this forum, and I guess they have to be PC players, have no clue how to test/interpret CPU benchmarks. Good thing there is at least another group that is sane and explains how it works (no, you don't test CPUs at 4K, you test them at 1080p paired with the strongest possible GPU to avoid a GPU bottleneck, otherwise you get even or close to even results, which is probably what happened in AMD's CP2077 benchmark, since that title is very GPU-heavy; that's my suspicion at least). There's a rough sketch of this after the post.

3) Let's wait for independent benchmarks to get the final results, but so far, as an 8700K owner, I'm looking very optimistically into the future for a replacement, if not this year then next year, whenever I get a product I can't refuse. Doesn't matter if it's from AMD or Intel; again guys, please remember early 2017, where you only had the choice between low-clock/low-IPC Ryzen and 4-core CPUs from Intel. We don't want stagnation, because only us consumers would suffer then :)
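Rough sketch of point 2 (all the frame rates below are made up; the only real logic is the min()):

```python
# Toy model: the frame rate you see is roughly capped by the slower of the two
# sides, so a 4K GPU cap hides CPU differences that 1080p exposes.
def effective_fps(cpu_limit: float, gpu_limit: float) -> float:
    return min(cpu_limit, gpu_limit)

cpus = {"CPU A": 180.0, "CPU B": 140.0}    # hypothetical CPU-limited rates
gpu_caps = {"1080p": 250.0, "4K": 70.0}    # hypothetical GPU-limited rates

for res, gpu_limit in gpu_caps.items():
    print(res, {name: effective_fps(fps, gpu_limit) for name, fps in cpus.items()})
# 1080p {'CPU A': 180.0, 'CPU B': 140.0}  <- the CPU gap is visible
# 4K    {'CPU A': 70.0, 'CPU B': 70.0}    <- both GPU bound, the gap vanishes
```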
 

b0uncyfr0

Member
Intel's Raptor Lake will probably beat the regular Zen 4 chips though. The chips that have a chance to take the throne are the Zen 4 V-Cache models.
 

blastprocessor

The Amiga Brotherhood
Do we think AMD will do an 8-core 65W part (possibly lower cache/frequency)?

I want a part with a lower TDP than what I own today.
 

Cryio

Member
People being shocked about 1080p is so weird to me.

You don't like 120+ fps gameplay? Add RT on top and getting 120+ is getting really hard, really fast. If you have GPU horsepower to spare, you downsample from higher rez.

If you get a 4K monitor, you have to deal with lower performance or poor image scaling because your GPU is too slow for 60, 120 fps or more. I'm not dropping $2000+ on GPUs every 12-24 months.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Do we think AMD will do an 8-core 65W part (possibly lower cache/frequency)?

I want a part with a lower TDP than what I own today.
Undervolt the 7700X to ~65W and it'll likely still outperform the 5800X.
Otherwise, wait for a 7700 non-X, if that ever comes along. (Probably going to be an OEM-only part; cutting ~$100 off the 7700X would have it cannibalizing the 7600X.)
 
People being shocked about 1080p is so weird to me.

You don't like 120+ fps gameplay? Add RT on top and getting 120+ is getting really hard, really fast. If you have GPU horsepower to spare, you downsample from higher rez.

If you get a 4K monitor, you have to deal with lower performance or poor image scaling because your GPU is too slow for 60, 120 fps or more. I'm not dropping $2000+ on GPUs every 12-24 months.
Just lower settings to get 60fps @ 4K.
You don't really need ultra extreme shadows.

1080p should only be for the Switch.
 
LOL, according to the Steam survey, 67% still play at 1080p. 1080p will still be here for a long time, especially since not everyone has the money to get 1440p monitors
I have money, I just think their current pricing, less than 10% cheaper than a non-gaming 4K monitor, is nuts.

There's differentiation at high refresh rates: 4K becomes super expensive there, where 1440p is acceptable, but damn, the entry point is shit.

1080p monitors, on the other hand, are cheap, so a lot of people get by just fine with them, especially during a GPU drought.
Do we think AMD will do an 8-core 65W part (possibly lower cache/frequency)?

I want a part with a lower TDP than what I own today.
Instead of doing 65W and 35W processor variations of the same high-performance part, it's time they added official motherboard presets for the CPUs to conform to those power draws.
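For reference, such a preset would mostly come down to capping package power; on AMD the socket power limit (PPT) has conventionally sat around 1.35× the rated TDP (88 W for a 65 W part, 142 W for a 105 W part). A rough sketch of what a preset would pin down — the helper name here is made up, not a real tool:

```python
# Rule-of-thumb only: AMD's PPT has typically been ~1.35x TDP (65 W -> 88 W,
# 105 W -> 142 W); real "Eco Mode" presets also set current limits (TDC/EDC).
PPT_FACTOR = 1.35

def eco_preset(tdp_watts: float) -> dict:
    """Hypothetical helper: derive the package power cap for a given TDP class."""
    return {"TDP (W)": tdp_watts, "PPT (W)": round(tdp_watts * PPT_FACTOR)}

for tdp in (35, 65, 105, 170):
    print(eco_preset(tdp))
# {'TDP (W)': 35, 'PPT (W)': 47} ... {'TDP (W)': 170, 'PPT (W)': 230}
```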
 

Tams

Member
Bunch of snobs in here. Sure, we're enthusiasts, but that's no excuse.

1080p is perfectly fine up to about 24", 27" at a push. This is not only realistic, but as far as we know, it's how most PC gamers play.

480p --> 720p was a big and very noticeable jump.

720p (HD) --> 1080p (Full HD) was/has been a less noticeable jump, though it's quite easy to notice if you try.

1080p --> 1440p is only really noticeable on bigger displays. On smaller displays you have to be looking for something to moan about.

1440p --> '4k' only really matters for large displays and proper '4k' is out of reach of most people.

'8k' is currently pointless for pretty much any consumer. Looks incredible though.
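Since "noticeable" mostly comes down to pixel density, here's the standard PPI arithmetic for a few common sizes (numbers rounded):

```python
# Pixel density = diagonal resolution / diagonal size; nothing model-specific here.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    return hypot(width_px, height_px) / diagonal_inches

for label, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    row = ", ".join(f'{size}" = {ppi(w, h, size):.0f} PPI' for size in (24, 27, 32))
    print(f"{label}: {row}")
# 1080p: 24" = 92 PPI, 27" = 82 PPI, 32" = 69 PPI
# 1440p: 24" = 122 PPI, 27" = 109 PPI, 32" = 92 PPI
# 4K:    24" = 184 PPI, 27" = 163 PPI, 32" = 138 PPI
```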
 

Leonidas

Member
I can't even go back to 1440p, let alone 1080p.

I've been on 4K on PC since 2017.

As such I rarely need to upgrade CPU, but unfortunately my rig has been malfunctioning recently so I'll probably be forced to upgrade to Raptor Lake or Zen 4 this fall.
 
I hope the Zen 4/RDNA 3 APU combo will establish entry-level gaming at a stable 1440p/60fps. I think 1080p needs to die.
 

Ironbunny

Member
Bunch of snobs in here. Sure, we're enthusiasts, but that's no excuse.

1080p --> 1440p is only really noticeable on bigger displays. On smaller displays you have to be looking for something to moan about.

22" and below the difference is negligible but at 27" 1080p is already soft. 1440p is somewhat of a sweetspot for that size. Then again 1440p over 27" starts to become soft quite fast.

1440p --> '4k' only really matters for large displays and proper '4k' is out of reach of most people.

Anything over 27" this matters. For 32" 4k imo its already a must.
 

Tams

Member
22" and below the difference is negligible but at 27" 1080p is already soft. 1440p is somewhat of a sweetspot for that size. Then again 1440p over 27" starts to become soft quite fast.



Anything over 27" this matters. For 32" 4k imo its already a must.
Perfectly passable for gaming and watching videos though.

Specific cutoff points weren't my point either. Quite a few members here are detached from the reality of most other people. There's nothing wrong with enjoying '4k', etc., but stuff like '1080p needs to die' is just ignorant and lacking in empathy.
 

Ironbunny

Member
Perfectly passable for gaming and watching videos though.

Specific cutoff points weren't my point either. Quite a few members here are detached from the reality of most other people.

I guess so, but we are talking about the 7xxx series CPUs here. The pinnacle of CPUs today. These will run anything at least at 1440p just fine with an OK GPU.


There's nothing wrong with enjoying '4k', etc., but stuff like '1080p needs to die' is just ignorant and lacking in empathy.

I feel like 1080p really needs to die, and I say that with the highest empathy I have. :messenger_squinting_tongue:
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
AMD have no chill




At least let Intel actually announce their chips before planning the 1-2 punch.

V95 = 7950X 3D V-Cache
V9 = 7900X 3D V-Cache
V8 = 7800X 3D V-Cache

If they are following last year's prototype for their 3D V-Cache, then it might really be game over.
Cuz that would mean something like 200 MB of cache for the 7900X3D and 7950X3D.
If the 7950X is already heavily binned, and the X3D version comes from an even better bin, then sweet jesus... what have AMD done.
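The rough math behind the ~200 MB guess (the 64 MB per stacked die matches the 5800X3D; whether the 12- and 16-core parts would get a stack on both CCDs is pure speculation here):

```python
# Known: a Zen 4 CCD carries 32 MB of L3, and the 5800X3D's stacked die added 64 MB.
# Assumed: dual-CCD X3D parts get a stack on each CCD (speculation, not announced).
BASE_L3_PER_CCD = 32   # MB
STACK_PER_CCD   = 64   # MB

def total_l3(ccds: int, stacked: int) -> int:
    return ccds * BASE_L3_PER_CCD + stacked * STACK_PER_CCD

print("7800X3D-style (1 CCD, 1 stack):   ", total_l3(1, 1), "MB")   # 96 MB
print("7950X3D-style (2 CCDs, 2 stacks): ", total_l3(2, 2), "MB")   # 192 MB, i.e. ~200 MB
```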

This was their prototype from last year for dual-CCD 3D V-Cache:
AMD-3D-VCACHE.jpg
 

Mr1999

Member
The 7950/7900 will be tempting. Currently got a 12700K which I delidded; I want something new lol. The 7950 would be just out of my range though, maybe the 7900. Yeah, I know I'm a consoomer, or maybe I should just wait for Raptor Lake. The last AMD CPU I had was the Athlon XP 2400+; it's been Intel both before that and ever since. Can someone say they actually "feel" the difference at this point? That's the only reason I would dump my 12700K, to see if I can "feel" it, that and I want to try something new, cause it's not all just games anymore. If I do, it's going to be a costly upgrade from DDR4 to DDR5 plus a new motherboard, but I guess I can put the old one in the living room and replace the 2500K (can't believe it still runs). I hear AMD memory tuning is all about spreadsheets to get it right, though; not sure if that's overblown or not. I like how it looks aesthetically too.
 

bbeach123

Member
Bunch of snobs in here. Sure, we're enthusiasts, but that's no excuse.

1080p is perfectly fine up to about 24", 27" at a push. This is not only realistic, but as far as we know, it's how most PC gamers play.

480p --> 720p was a big and very noticeable jump.

720p (HD) --> 1080p (Full HD) was/has been a less noticeable jump, though it's quite easy to notice if you try.

1080p --> 1440p is only really noticeable on bigger displays. On smaller displays you have to be looking for something to moan about.

1440p --> '4k' only really matters for large displays and proper '4k' is out of reach of most people.

'8k' is currently pointless for pretty much any consumer. Looks incredible though.
Tbh I'm perfectly fine with 1080p (24 inch) unless the game has a really shitty AA solution.

Some games I'm "fine" with at 1080p: Cyberpunk, Dying Light 2, GOW, most DLSS 2.5 games.

Games that are really shitty at 1080p: Witcher 3 (shitty aliasing, the tree branches were terrible), Total War: Warhammer 2 (shitty aliasing), X4: Foundations (shitty aliasing), RDR2 (the trees look really bad at 1080p).

witcher 3 1080p vs 4k




 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not

At "stock" clocks I think the 13900K and the 7950X are actually gonna be pretty close in MT.
In gaming I think the 13900K has this on lock DDR5 vs DDR5.

I don't know how much more juice the 13900K is gonna have in the tank, but AMD's ES with liquid cooling is pulling some insane numbers.
Almost 40,000 in Cinebench?

 