> How does it compare to the 14900K? I was planning to get it by the end of the year.
14900K is a piece of crap.
> 14900K is a piece of crap.
That’s bad… lol
> That’s bad… lol
Same performance as the 13900K at higher power consumption. It’s trash.
> I’m in a quandary.
> Back in November I bought a prebuilt OEM Lenovo Legion with a 13700F, and I’ve found out it’s on a dead-end, end-of-life motherboard.
> All I have to do is stick an RTX 5000-series card in it to last me several more years. By then AM5 boards will be end-of-life too.
> I guess I’m out of shopping around till whatever is next on the horizon.
> Would be nice to have more oomph in my PC right now, though.
The 13700F should still be a good CPU that lasts you a while. If you aren’t planning on upgrading anytime soon, you should be okay with the whole system. (The mobo can probably take 14th gen, but that is basically the same as 13th gen.)
Great news for the 3 guys playing at 1080p on a 4090.
0.01% over a 13700K at 4K (but 60% more expensive), or equal to a 13900K with PBO max and an undervolt (and 6% more expensive).
I hear you but (respectfully) disagree with pretty much all points.
It doesn’t tell you which CPU is best for games, because 1080p is unlikely to ever be a resolution gamers will use with a state-of-the-art, latest-gen CPU. I’d actually challenge anyone to find a gamer playing on this CPU at 1080p. I fully understand that at lower resolutions performance is CPU-dependent and at higher resolutions it is GPU-dependent. But here I am with a 13700K and a 4090, wondering if this CPU is going to do much for me at 4K in terms of CPU bottlenecking and… well, I don’t know. It’s a review of high-end equipment at preposterous gaming resolutions that tells you pretty much nothing.
It seems like these reviewers are trying to sell their product (reviews) to gamers, when really it’s productivity workloads they ought to be focused on. But that won’t get gamer clicks.
> Most people don't have 2 monitors.
I have two monitors. I need them for my digital art.
> If you have a newer GPU then your frametimes will be taking a hard beating in modern games, even if you’re only targeting 60 fps. Just grab a 5700X3D if you can.
I reach high FPS in CP77 even in the Dogtown market, with a 4080 at 4K/max settings.
I’m in a quandary.
Back in November I bought a prebuilt OEM Lenovo Legion with a 13700F, and I’ve found out it’s on a dead-end, end-of-life motherboard.
All I have to do is stick an RTX 5000-series card in it to last me several more years. By then AM5 boards will be end-of-life too.
I guess I’m out of shopping around till whatever is next on the horizon.
Would be nice to have more oomph in my PC right now, though.
> I reach high FPS in CP77 even in the Dogtown market, with a 4080 at 4K/max settings. I don’t see games becoming more CPU-demanding this gen.
“High” is relative. 100+ fps?
> And I just bought a 7800X3D for my new PC...
The money you saved getting a 7800X3D is worth a whole GPU-tier upgrade in performance, versus gaining 1 or 2 fps at 200+ fps, which no one would ever notice.
Yes, it does tell you which CPU is best for games, because you can take the 1080p data and extrapolate it to other situations. You can’t do that with 4K data, since the differences there are less significant, which makes 4K data much less useful. You’re using benchmark data wrong. You’re expecting to look at a review and see your exact CPU/GPU configuration and the data for it. Sometimes you can do that, because the reviewer happened to benchmark the exact configuration you have. Lucky you! Many other times you cannot, because there are a shit ton of possible configurations out there in the wild, and it would be unreasonable for a reviewer to benchmark every single one in existence.
How do we compromise while still retaining meaningful data? We have the reviewer run tests that actually allow people to make meaningful comparisons for the specific product category they’re looking at. 1080p tests let consumers see meaningful differences between CPUs. 4K tests don’t, because the results are mostly the same and don’t show you what differences there are (see the toy sketch below).
Check out this line at 11:44
"Then we have the 4K data, which in my opinion is completely worthless for CPU testing, but it's often quite heavily requested, so here we are. When benchmarking the GPU, the CPU doesn't really matter in almost all games because you're GPU limited."
If you need a more in-depth explanation, here are two more videos where they covered this topic.
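To make the extrapolation point concrete, here’s a toy Python sketch. All numbers are invented for illustration (real games aren’t this clean); it just treats the frame rate you actually see as roughly min(CPU cap, GPU cap). 1080p testing pushes the GPU cap high enough that the chart shows you the CPU caps themselves:

```python
# Toy model of CPU benchmarking: observed frame rate is roughly
# min(CPU cap, GPU cap). All numbers below are invented for illustration.

def observed_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Whichever side prepares frames slower sets the frame rate."""
    return min(cpu_cap, gpu_cap)

# Hypothetical CPU caps, i.e. what 1080p/low testing reveals:
cpu_a, cpu_b = 220.0, 160.0

# Hypothetical GPU caps for one high-end GPU at each resolution:
gpu_caps = {"1080p": 400.0, "1440p": 260.0, "4K": 120.0}

for res, gpu_cap in gpu_caps.items():
    print(res, observed_fps(cpu_a, gpu_cap), observed_fps(cpu_b, gpu_cap))

# 1080p: 220 vs 160 -> the CPU gap is fully visible
# 1440p: 220 vs 160 -> still visible
# 4K:    120 vs 120 -> both GPU-bound; the chart flattens and hides the gap
```

Once you know the CPU caps from the 1080p data, you can estimate how any CPU/GPU pairing behaves at your resolution, which is exactly the extrapolation the 4K charts can’t give you.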
What a long-winded response. Stopped reading after the straw man “there’s lots of different configurations”. No shit.
It is two additional charts (one at 1440p and one at 2160p) per game. Not asking much.
They gave benchmarks at 1080p, and only at 1080p, because at 1080p the difference between CPUs is most pronounced.
> I had 2, went back to 1.
Same. I got a 49" ultrawide monitor with a 5120 × 1440 resolution. No need for the second monitor now. I can split windows however I want on each desktop with Microsoft PowerToys, or a similar tool on Ubuntu.
> I reach high FPS in CP77 even in the Dogtown market, with a 4080 at 4K/max settings. I don’t see games becoming more CPU-demanding this gen.
> “High” is relative. 100+ fps?
I don’t think you can reach 100 fps in the Dogtown market even with a 4090 and a top CPU.
Even if the 7800X3D and 7950X3D are faster and more efficient, the 13700F is still a very good CPU. Pretty much any 6+ core, 12th-14th gen Intel or Zen 4 AMD CPU is going to do a great job with modern games.
All these reviews need to say, as far as games with high-end CPUs/GPUs go, is: they’re all about the same, save your money, nobody should be buying these. Running these at 1080p is just marketing, and I’ve been a PC enthusiast for a good 35 years now.
Hah, gramps. That's funny.The future is now gramps People aren't stuck to just 60hz anymore and there are plenty of people, including myself, that want the longest life possible out of their system. If I listened to bad advice such as you are implying, I would have ended up swapping platforms multiple times in the same window that I was able to keep my 4790k (2014-2020). If you've really been PC gaming for 35 years you should understand this... I guess I'm a young gun having PC gamed for only 25+ years but this testing was pretty obvious even in the early 2000's.
> But… but… who on earth is buying a CPU like this and a 4090 GPU just to play 1080p? At 4K I’d guess all these CPUs perform similarly.
I imagine that’s less true than it once was because of DLSS, since you’re technically rendering the game at a lower internal resolution and then upscaling it to 4K.
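For reference, here’s a quick sketch of the internal render resolutions behind that point. The per-axis scale factors below are the commonly cited ones for DLSS’s presets, so treat them as approximations rather than gospel:

```python
# Internal render resolution for a 4K output under DLSS upscaling presets.
# Scale factors are the commonly cited per-axis ratios for each preset.

OUT_W, OUT_H = 3840, 2160  # 4K output

DLSS_SCALE = {
    "Quality": 2 / 3,            # ~2560x1440 internal
    "Balanced": 0.58,            # ~2227x1253 internal
    "Performance": 0.5,          # 1920x1080 internal
    "Ultra Performance": 1 / 3,  # 1280x720 internal
}

for mode, s in DLSS_SCALE.items():
    print(f"{mode}: {round(OUT_W * s)}x{round(OUT_H * s)}")
```

So 4K with DLSS Performance is literally a 1080p internal render, which is why low-resolution CPU data is more relevant to a 4090 owner than it first appears.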
If you have a newer GPU then your frametimes will be taking a hard beating in modern games, even if you’re only targeting 60 fps.
Just grab a 5700X3D if you can.
> I imagine that’s less true than it once was because of DLSS, since you’re technically rendering the game at a lower internal resolution and then upscaling it to 4K.
To be fair, PC gaming is a bit of a quagmire these days. Shader-compilation stutters etc. are so commonplace. I miss the days of smooth frame rates, whenever those days were...
I believe ray tracing is also quite heavy on the CPU, which is probably worth considering if you have a 4090.
Depending on the region you're in and your budget, I think the 5800X3D might be a better bet? In my region the price difference between the 5700X3D and the 5800X3D is virtually nothing so it seems worth it to spend a tiny bit extra. I have a 3900X myself and I'm finding it is bottlenecking my 3090 somewhat even in 4K and I'm getting a few more stutters than I'd like.
Planning to upgrade next month.
> Hah, gramps. That’s funny.
Damnit. Trying not to get too old, and now you made me feel old. I like high-refresh stuff, too. I mean, that's why I have a 4090, too! But 1080p ain't never, ever happenin' again.
> How many times is it necessary to have videos showing gaming on high-end hardware at low resolutions as a determiner of a CPU’s worth, when just about nobody games like that? What is the point?
Nearly 60% of users still use 1080p according to the most recent Steam Hardware Survey, making it by far the most common resolution, and nothing else really comes close, so I don’t think we can claim “nobody games like that.”
> Nearly 60% of users still use 1080p according to the most recent Steam Hardware Survey, making it by far the most common resolution, and nothing else really comes close, so I don’t think we can claim “nobody games like that.”
I think you didn’t read my post: I said “high-end hardware at low resolutions.” I also said “just about nobody games like that.” I absolutely understand that this shows CPU performance, in a meaningless way. I mean, take this thread, for example. How many people here are shopping for a top CPU and a top GPU to play games at 1080p? Raise your hands. Nobody? Color me surprised.
> I think you didn’t read my post: I said “high-end hardware at low resolutions.” I also said “just about nobody games like that.”
I read your post, I simply disagree. As silly as you and I might think it is, and believe me I think it’s silly, there are plenty of people out there with high-end GPUs and CPUs and 64 unnecessary-ass GB of DDR5 on a 1080p non-HDR VA or IPS monitor. I was in the Army a while back, and my wife is still in the Air Force and in college, and I can tell you for sure that quite a lot of young people just buy “the best CPUs and GPUs” (read: whatever the Best Buy guy recommends) without any attention paid to the monitor at all. I don’t really get it, but it’s definitely a thing. Also, “just about nobody”... my apologies, but that definitely cannot be applied to 60% of a demographic. I could go with it if you were at least talking about a minority, but applying “just about nobody” to the majority just seems silly. I get what you’re saying though, just my two cents.
> But here I am with a 13700K and a 4090, wondering if this CPU is going to do much for me at 4K in terms of CPU bottlenecking and… well, I don’t know.
It’ll do, relatively speaking, more than CPUs that perform worse at 1080p, and less than CPUs that perform better.
> I read your post, I simply disagree. As silly as you and I might think it is, and believe me I think it’s silly, there are plenty of people out there with high-end GPUs and CPUs and 64 unnecessary-ass GB of DDR5 on a 1080p non-HDR VA or IPS monitor. [...]
If you look at the 60% running 1080p, they’re statistically not using high-end rigs of any sort. Look at the GPU charts. Most people are using dogsh*t for a GPU.
> But… but… who on earth is buying a CPU like this and a 4090 GPU just to play 1080p? At 4K I’d guess all these CPUs perform similarly.
They do tests like this to measure a CPU’s maximum theoretical framerate output, diminishing the GPU’s influence on the result as much as possible (I would even prefer they test at 720p lowest). If a CPU can’t do 120/240 fps in a specific game at the lowest graphics settings, then chances are it won’t ever be able to, even with a next-gen GPU.
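As a worked example of that reasoning (same toy min() model as earlier in the thread, numbers invented): a faster GPU only raises the GPU cap, so the CPU cap measured at low resolution is the ceiling you carry into the next GPU generation.

```python
# CPU cap estimated from 720p/1080p lowest-settings testing (hypothetical):
cpu_cap = 110.0

# Hypothetical 4K GPU caps, current card vs next-gen upgrade:
gpu_now, gpu_next = 90.0, 150.0

print(min(cpu_cap, gpu_now))   # 90  -> GPU-bound today
print(min(cpu_cap, gpu_next))  # 110 -> the upgrade helps, but stalls at the
                               #        CPU cap; a 120 fps target stays out of reach
```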