
Nvidia is basically a giant advertisement for AMD at this point.

AMD needs a serious answer to CUDA; you just can't build a real workstation without an Nvidia card, and I'm guessing a very large portion of their sales comes from non-gaming sectors like creative work or anything requiring serious compute. Even at these mad prices, I'd spec a PC with a 3090 if work needed a new workstation. The extra $$$ is just worth it then. For gaming? Feck no lol.
This is actually an issue for any potential new consoles; AMD are just not on the same level as Nvidia, but Nvidia are money-grabbing parasites.
 
Whoever said prices were going down when mining stops is a fucking liar.
They will once they've exhausted the ultra-enthusiast market. Either that or hold onto stock indefinitely.

And we're looking at founders card prices only... look at AIBs, especially in Europe :messenger_tears_of_joy: :messenger_tears_of_joy: :messenger_tears_of_joy: :messenger_tears_of_joy: :messenger_tears_of_joy: :messenger_tears_of_joy:

Good luck this gen nvidia, you'll need it
 

yamaci17

Member
Nobody knows how AMD's cards will perform. The bigger difference between then and now is that DX12/Vulkan have taken off in a big way since 2020. AMD often had performance issues on DX11/OGL but I don't really see nvidia holding the same driver advantage moving forward.

It otherwise just depends on how badly you need to upgrade. I'm waiting myself because there's zero chance that I'm getting a 12GB card for a 4k monitor.
At this point, NVIDIA has gone so far beyond that it won't matter how AMD's cards perform in "specific" CPU-bound cases.

Let's take Flight Simulator for example. In heavy CPU-bound photogrammetry regions, a 5950X or a 12900K usually becomes CPU bound near 65-70 FPS. Now, you can throw GPU power at it and get 4K/60 with high settings. That's no problem. You can double that (4090/7800 XT) and get 4K/120. But the problem starts because there is no CPU, and won't be for at least 5 years, that can push 120 FPS CPU bound in Flight Sim. NVIDIA's trickery allows them to circumvent and go around the CPU bottleneck. All of a sudden an RTX 4080 running DLSS 3 at 4K will render something like 150+ frames, a framerate that no CPU can achieve natively as of today. AMD can produce a GPU that has the ability to render 300 frames at 4K in Flight Sim, but if current CPUs bog it down near 70 FPS in CPU-bound locations, then that power is meaningless for that use case.

Same for Spiderman. Just look at their marketing videos: they're literally targeting CPU-bound games like mad. Spiderman with ray tracing at 200 FPS? A 12900K in Times Square bogs down near 100 FPS CPU bound. AMD can potentially make a GPU that renders 200+ FPS with RT enabled at native 1080p, but it will never get past 100 FPS with something like a 12900K, which is the problematic part, whereas with NVIDIA's trickery you will get north of 160 frames even with a mediocre CPU, because their tech is now inserting "frames".

The implications of interpolation, if it really works great, are pretty damning for the competition and AMD. If it works well, that is. Because then there will be no alternative to it, unless AMD engineers its own similar solution. Which they will most likely have with RDNA4. Because this is not a pish-posh technology either; it is a hard thing to get right, and it definitely needs huge R&D, serious hardware acceleration and software to back it up. It is literally real-time video game frame interpolation, which is crazy.

So yeah. It goes beyond pure GPU power. They've literally hacked their way through CPU bottlenecks now. If AMD does not answer back somehow, and if interpolation ends up being a success overall, and if RDNA3 doesn't have the capability to run it, it will be another generation where AMD hopelessly watches NVIDIA dominate. And by the time RDNA4 comes and they put similar tech into their hardware, I'm sure the jacket guy will have invented something new.

They've become so big that they can invest huge amounts of money into R&D and discover and invent all kinds of crazy stuff. DLSS and interpolation seem like just the beginning.
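
A rough back-of-the-envelope sketch of the argument above; the ~70 FPS CPU limit and the "one generated frame per rendered frame" factor are assumptions taken from the post, not measurements:

```python
# Illustrative only: what ends up on screen when rendering is CPU bound,
# with and without a frame-generation step that inserts one interpolated
# frame per rendered frame. All numbers are assumed, not benchmarks.

def displayed_fps(gpu_fps: float, cpu_fps: float, frame_generation: bool) -> float:
    """Rendered FPS is capped by the slower of GPU and CPU;
    frame generation roughly doubles what is shown on screen."""
    rendered = min(gpu_fps, cpu_fps)
    return rendered * 2 if frame_generation else rendered

# Assumed: Flight Sim CPU bound around ~70 FPS, a hypothetical 300 FPS GPU.
print(displayed_fps(300, 70, frame_generation=False))  # 70  -> extra GPU power goes unused
print(displayed_fps(300, 70, frame_generation=True))   # 140 -> bottleneck bypassed on screen
```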
 
Last edited:

GymWolf

Member
More like 3090ti, yah. I'm looking at a hardware unboxed video right now and it looks like they trade blows at 1440, rt aside. The 3090ti takes the lead in 4k, tho.
I watched some tests and the GPU is still capable of 4K60+ at ultra details in any game except Cyberpunk, so it is enough for me (you can probably reach a locked 60 by turning down 1-2 settings a notch). I saw the versus test with the 3090 Ti and yeah, in 4K it's like 5% faster but at roughly double the power consumption, so I'm not sure that doesn't make the 6950 look even better in perspective.


Any chance of amd releasing something that goes noticeably faster than a 6950xt for under 900 dollars?

The only negative thing is that it has hdmi 2.1, not even 2.2...dafuq amd??
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
That power draw is just abhorrent, too. 320W for the 4080 alone. 320W!!!!! Can nvidia not optimize their cards for shit??? The 1080 was 150 watts! Three gens later and it's more than doubled. Isn't tech supposed to get smaller, less power hungry and better as time goes on? What happened here???
In this I agree with their CEO: Moore's Law is essentially dead or dying, under a certain interpretation of it.

It takes longer and longer to deliver meaningful technology improvements in semiconductors, and it is also getting more and more expensive: designing and manufacturing a chip on the latest process costs more and more, while the gains from each node improvement are smaller and smaller and take longer to deliver.

We are also in an era where diminishing returns call for bigger and bigger performance increases to deliver noticeable results (you can chase higher quality and now higher framerates with DLSS 3.0 and similar solutions, but only up to a point; the funny thing about DLSS 3.0 is that framerate generated that way does not improve latency the way native high framerate does)…

So how do you add more and more TFLOPS? You brute force it by making bigger and bigger chips and clocking them higher and higher (and bumping up voltage to make it work, or not lowering it enough, if you look at it that way). The result? Expensive chips and power consumption going through the roof.
Sure, there are ways to get better results and lower power consumption, but they are still risky investments and fewer and fewer companies can afford them.
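
To make the parenthetical point about latency concrete, here is a very simplified sketch under assumed numbers; the model (latency tracks rendered frame time, plus roughly one extra rendered frame of buffering for interpolation) is an illustration, not the actual DLSS 3.0 pipeline:

```python
# Assumed, simplified latency model: input-to-photon latency roughly tracks the
# *rendered* frame time, and interpolation buffers about one extra rendered
# frame (it needs the next real frame before it can generate the in-between one).

def approx_latency_ms(rendered_fps: float, frame_generation: bool) -> float:
    frame_time_ms = 1000.0 / rendered_fps
    return frame_time_ms * (2.0 if frame_generation else 1.0)

print(approx_latency_ms(120, frame_generation=False))  # native 120 FPS: ~8 ms
print(approx_latency_ms(60, frame_generation=True))    # 60 FPS shown as "120": ~33 ms
```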

————

Now, when people say they want console generations to be shorter and Pro consoles… well, prepare to pay a lot more, and the more you support that, the slower software ecosystem innovation will be. See PC vs consoles here: https://www.neogaf.com/threads/2-ye...of-ssd-in-new-consoles.1641701/post-266600648 (I added some thoughts there too; fundamentally, the more variety you have, the more complexity there will be in the software stack, and innovation there will get slower and slower as legacy builds up).
 
Last edited:

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
yet I’m still getting a 4090 next month 😎
Payment during checkout :

 

Admerer

Member
I'm actually impressed at how well the 6800 XT matches the 3080 in performance, whilst also drawing less peak power. Similarly 6900 XT to the 3090.

If they're able to knock it out with RDNA3 with a card that outperforms the 3090 Ti at a good price, I'll be very impressed.
I'd just like to point out that Nvidia's 3000 series was using an inferior Samsung node versus the TSMC node of AMD's 6000 series.

Had Nvidia used TSMC for the 3000 series, we would have gotten less power draw and most likely higher clocks, meaning more performance from Nvidia's current-gen cards (it also explains why the 4000 series is so much faster than the 3000 series).

Now that both companies are using TSMC, that's one less advantage in AMD's favor.
 

OZ9000

Banned
I'd just like to point out that Nvidia's 3000 series was using an inferior Samsung node versus the TSMC node of AMD's 6000 series.

Had Nvidia used TSMC for the 3000 series, we would have gotten less power draw and most likely higher clocks, meaning more performance from Nvidia's current-gen cards (it also explains why the 4000 series is so much faster than the 3000 series).

Now that both companies are using TSMC, that's one less advantage in AMD's favor.
The 4000 series doesn't appear to be "so much faster" at a given wattage.

The power requirements have gone up, and the performance of the 12GB 4080 is only better than the 3090 Ti thanks to DLSS 3.0.

Full benchmarks will be telling. But at the given price point and wattage, the 4000 series seems like crap value.

I would expect RDNA3 to improve upon RDNA2. Hopefully they maintain similarly close parity with the 4000 series.
 
Nvidia makes some amazing demos but they are not representative of real-world game performance. They're just there to look really good. And I still don't see what features nvidia has added to, again, warrant spending a grand on a graphics card.
It's not needed. Only if you have dual 4k monitors or something. If that is the case, then you are rich and money isn't an issue and will probably flex about it like an asshole too...

Really, there is no game that isn't playable at max settings with a 2000 series card. How many people have expensive 4K monitors (they are way more pricey than TVs)? A 3060/Ti - 3070 is more than adequate for 1440p, and runs 1080p with room to spare.

Until we get Crysis-level exclusives, there is no point. Which won't happen. PC gaming is held back graphically by consoles. So why are people buying these cards at all?

As for the people praising AMD, ha. I'm not seeing cheaper prices there. Their last cards were just as expensive and lacked features. What makes you think they aren't going to charge a premium too?

Mining has ruined GPU prices. The average gamer like me who plays on a 60 series card is priced out. The last time I was able to afford an 80 card was when they were $400 or under, meaning the 9800 GT, 8800 GTS, 6800, GeForce 4, 3, 2... basically low-end 80 cards that were still affordable. Hell, the high-end 80 cards topped out at $600, until the 2000 series.
If the 60 series cards are $500+ then I'm out. I'll buy used old cards. No way am I blowing half a grand on a graphics card.
 
Last edited:

Ellery

Member
The play they are currently making is of their own doing and the result of the bullwhip effect. Too much 30 series stock left on the shelves for a proper 40 series launch and naming convention.
They would compete with their own product otherwise. They basically have to create new segments.

Too many chips and other components were ordered and manufactured while demand decreased due to economic uncertainty, the crypto decline and, hello, an entire tech world that operated on the Federal Reserve and other central banks printing actual trillions of fiat money into circulation, propelling inflation from the 70s back into our timeline and making us suffer on all fronts.

Well played, and I can't fathom how anyone (especially in Europe at that ghastly conversion rate) would buy those 4080s. They are both absolutely awful products. It is astounding, because the 4090 is exactly what I expected it to be, but the rest is absolute garbage, and if people pay 1100€+ for a 4080 12GB or 1500€ for a 4080 16GB then ouch.

I am voting with my wallet and not buying. I am impressed by how much I genuinely hate the abominations of 4080 cards that Nvidia presented. At that price and relative hierarchy compared to the 4090, it is an insult to gamers' intelligence. The 4090 is simply too expensive for me personally at the moment, for a product that increases graphical fidelity in games that don't really exist, because the PC gaming landscape is not really known for its graphical prowess in recent years.
 
Last edited:

Rickyiez

Member
If history were to repeat itself, this is the moment the Radeon 5870 story plays out again, whooping Fermi's ass while being cheaper at MSRP. Knowing the AMD Radeon team of today, I'm quite skeptical.

Don't get me wrong, RDNA2 was quite decent, but they failed in the pricing war and feature-wise.
 
Last edited:

MarkyG

Member
Have Nvidia ever been customer focused? Seems they've gone downhill since the GTX days. I really hope AMD deliver something killer come Nov 3rd. It's sorely needed.
 

Ellery

Member
If history were to repeat itself, this is the moment the Radeon 5870 story plays out again, whooping Fermi's ass while being cheaper at MSRP. Knowing the AMD Radeon team of today, I'm quite skeptical.

Don't get me wrong, RDNA2 was quite decent, but they failed in the pricing war and feature-wise.

Agree.

Deep down I hope AMD smashes them, but Nvidia has been operating on a significantly higher R&D budget for many years now and they are well above AMD in engineering.

The jump from Samsung 8nm to TSMC 4N further cements my belief that AMD has no chance of closing that gap. The 30 series was a bigger chance for AMD, because they were already on TSMC 7nm.

It would be awesome if AMD gets the architectural side right and releases a competitive product, and I am certain they will release good price/perf products, but I doubt they have a chance of coming close to the 4090.

Have Nvidia ever been customer focused? Seems they've gone downhill since the GTX days. I really hope AMD deliver something killer come Nov 3rd. It's sorely needed.

Companies are only "customer focused" when they are the underdog. AMD also changed their pricing structure significantly after they made a huge wave in the CPU market.
If AMD were to overtake Nvidia (which I doubt), then eventually AMD becomes the son of a bitch and Nvidia the lovely customer-focused underdog.

It is always like that. Same with PlayStation and Xbox. Look at Jim Ryan vs Game Pass compared to Shawn Layden vs the always-online Xbox One sports shitbox.

Basically cyclical.
 
Last edited:

Rickyiez

Member
Strange logic

I too find these prices to be insane, but a 3060 Ti would still cover most of your needs (especially compared to consoles) for the gen. Why create a nonexistent problem by aiming for the newest generation's top-tier cards, but go back to consoles if prices are "hundreds" away from what's there?

It doesn't work like that all the time. Some of us have a 4K 120Hz TV that needs to be fed with unlimited frames. A 3060 Ti just wouldn't cut it.
 
Last edited:

yamaci17

Member
Have Nvidia ever been customer focused? Seems they've gone downhill since the GTX days. I really hope AMD deliver something killer come Nov 3rd. It's sorely needed.
I'd say Maxwell/Pascal were customer focused products.

After 6-8 years, they still run the latest ports fine, without issues, and with good enough performance. These cards also do not appear to be degrading in terms of performance. Compared to Fermi/Kepler, which were made obsolete in a mere couple of years, these cards aged very well. At this point, they're still performing consistently, whereas older GCN cards such as the RX 580 are having trouble in certain games. For example, GCN cards have terrible performance in Halo Infinite, graphical bugs in Spiderman, and so on, whereas Pascal GPUs perform like they should, without graphical errors, in both titles. The number of problematic titles on GCN has increased in the last 2 years, whereas Pascal, also thanks to its huge userbase, probably gets a decent amount of "optimization" focus from developers, since they can't realistically ignore it. RTX cards are one thing, but Pascal GPUs, especially ranging from the 1050 Ti to the 1080 Ti, have a huge presence in gaming PCs, even today.

Also, Pascal had an ample amount of VRAM, midrange ones having 6 GB and high-end ones having 8 GB; even the 1070 had 8 GB of VRAM, which helped the card endure all those years at ultra textures / high settings. Imagine a 1070 with 6 GB, or a 1060 with 4 GB; you would have to sacrifice more than your card is otherwise capable of.

The GTX 970 is an outlier, but the fact that the defective 500 MB is already used by background apps is another reason the GTX 970 also survived all those years without huge performance drops. The only thing you have to do with that card is keep your expectations in check, use optimized settings and, where need be, drop the texture resolution setting a tad. The 980/980 Ti are still decent, and perform similarly to their Pascal counterparts.

So yeah, they did become customer friendly with those. Gamers also rewarded them; I mean, look at how much of a hold Pascal has over the Steam surveys. The hold is so strong that developers still cannot leave those GPU lines behind. The other reason they aged well is that Nvidia did not design them to be dependent on "game ready" drivers, unlike Kepler and Fermi. You can run RDR2 on a Pascal GPU with a 2017 driver and you would most likely get the same performance you get with an updated post-RDR2 2020 driver. This alone proves that Pascal and above did not need GRD drivers at all. Kepler/Fermi actually needed drivers to reach their advertised performance. You literally had to update your drivers. They were amazingly reliant on NVIDIA's engineers tweaking things game by game. Their design was made in a way that only NV knew how to maximize its performance potential. That design alone is anti-consumer, considering they have the power to scrap entire architectures every 2 years. So leaving that design behind in favor of more generalized compute cards with Maxwell and Pascal was also a huge step towards being somewhat "customer" friendly.

Those are all I can remember. From what I'm seeing, Turing and Ampere also feel like they're not dependent on driver optimizations, thankfully, so as long as they have a hold on the gaming community, these cards should last very long into the generation.

The only thing I'm suspicious about is ray tracing. Ray tracing in Spiderman only works with their latest GRD driver, which could mean they had to optimize ray tracing specifically to perform well with Turing/Ampere's way of doing ray tracing. Maybe in that respect they might become obsolete in terms of performance, if ray tracing implementations in the future require special care from developers or from NVIDIA's side. But that's just a theoretical guess at this point. There are also newer ray tracing games that worked well on older drivers, so I'm not quite sure on that front either.
 

Cryio

Member
Y'all are asking/complaining about a DLSS2 alternative from AMD.

It's out now. It's FSR 2.1.1 and you can mod it into most DLSS2 games. The performance boost is identical to DLSS2 and the visual quality is 95% there.

Just go try it already.
 
It's not needed. Only if you have dual 4k monitors or something. If that is the case, then you are rich and money isn't an issue and will probably flex about it like an asshole too...

Really, there is no game that isn't playable at max settings with a 2000 series card. How many people have expensive 4K monitors (they are way more pricey than TVs)? A 3060/Ti - 3070 is more than adequate for 1440p, and runs 1080p with room to spare.

Until we get Crysis-level exclusives, there is no point. Which won't happen. PC gaming is held back graphically by consoles. So why are people buying these cards at all?

As for the people praising AMD, ha. I'm not seeing cheaper prices there. Their last cards were just as expensive and lacked features. What makes you think they aren't going to charge a premium too?

Mining has ruined GPU prices. The average gamer like me who plays on a 60 series card is priced out. The last time I was able to afford an 80 card was when they were $400 or under, meaning the 9800 GT, 8800 GTS, 6800, GeForce 4, 3, 2... basically low-end 80 cards that were still affordable. Hell, the high-end 80 cards topped out at $600, until the 2000 series.
If the 60 series cards are $500+ then I'm out. I'll buy used old cards. No way am I blowing half a grand on a graphics card.
It's true that consoles are holding back PC, but for a relatively poor person like me it means I only have to upgrade every 5 years.
 

Ev1L AuRoN

Member
I would love to jump on the AMD boat.

But I need them to be competitive with technologies like DLSS, RT and NVENC. When I bought my RTX 2060 I paid the same price as a 5700 XT; I took the hit in rasterization performance and didn't regret it a bit. I'll wait for the xx60 card and for RDNA3 before deciding which one I'll get.
 
After that joke of an announcement, I'm done with Nvidia products for the foreseeable future. I'm lucky enough to have a 3080 so I won't be needing a card for a long time, but I'll be going AMD in the future.
 

Crayon

Member
I watched some tests and the GPU is still capable of 4K60+ at ultra details in any game except Cyberpunk, so it is enough for me (you can probably reach a locked 60 by turning down 1-2 settings a notch). I saw the versus test with the 3090 Ti and yeah, in 4K it's like 5% faster but at roughly double the power consumption, so I'm not sure that doesn't make the 6950 look even better in perspective.


Any chance of amd releasing something that goes noticeably faster than a 6950xt for under 900 dollars?

The only negative thing is that it has hdmi 2.1, not even 2.2...dafuq amd??

I have no idea. The only other thing I picked up from that video is that the 6950 used significantly more power than the 6900 so you may want to take a look at that one, too. The prices dropped across the board, too. I think I read the 6900XT is like $700 or something like that.
 
Last edited:

GymWolf

Member
I like how they made the new


I have no idea. The only other thing I picked up from that video is that the 6950 used significantly more power than the 6900 so you may want to take a look at that one, too. The prices dropped across the board, too. I think I read the 6900XT is like $700 or something like that.
In America maybe; here in Europe prices are still 100-300 euros higher than they should be.

I've heard of people finding 3090 Tis for 850 dollars...
 

DonkeyPunchJr

World’s Biggest Weeb
I watched some tests and the GPU is still capable of 4K60+ at ultra details in any game except Cyberpunk, so it is enough for me (you can probably reach a locked 60 by turning down 1-2 settings a notch). I saw the versus test with the 3090 Ti and yeah, in 4K it's like 5% faster but at roughly double the power consumption, so I'm not sure that doesn't make the 6950 look even better in perspective.


Any chance of amd releasing something that goes noticeably faster than a 6950xt for under 900 dollars?

The only negative thing is that it has hdmi 2.1, not even 2.2...dafuq amd??
6950XT is a terrible value. It’s $250 more than the 6900XT while offering only a tiny performance boost and consuming more power. And yeah I think there’s a good chance that there’ll be a 7000 series that beats it for under $900

And what’s HDMI 2.2? I wasn’t even aware that existed
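
To put the "terrible value" point in concrete terms, a trivial price-per-frame sketch; the $250 gap and the ~$700 price come from the thread, while the FPS figures and the ~5% uplift are assumptions for illustration, not benchmarks:

```python
# Hypothetical numbers: the price gap is from the discussion above, the average
# FPS values (and the ~5% uplift) are assumed purely to illustrate price-per-frame.

def dollars_per_frame(price_usd: float, avg_fps: float) -> float:
    return price_usd / avg_fps

rx_6900_xt = dollars_per_frame(700.0, 100.0)   # ~$700 street price mentioned earlier
rx_6950_xt = dollars_per_frame(950.0, 105.0)   # $250 more for an assumed ~5% uplift
print(f"6900 XT: ${rx_6900_xt:.2f}/frame vs 6950 XT: ${rx_6950_xt:.2f}/frame")
```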
 

RayHell

Member
It's opening up for AMD so they can take a good chunk of the mid-range market share. If the 7800 XT is priced at the same $650 as last gen and performs better than the $900 4080 12GB, which is probably the case, Nvidia won't be able to sell their 4080 12GB, and also won't be able to sell their 3090/3090 Ti/3080/3080 Ti stockpile that is rumored to be enormous, unless they apply a heavy discount and sell them at a loss. Realistically AMD will probably raise the 7800 XT price, but even if it's an $800 card, it will still be a great value over Nvidia and it comes with 4GB of extra VRAM. The only thing that will save Nvidia is all those prebuilt partners that don't give a shit about AMD, because Nvidia is probably threatening them.
 

I Master l

Banned
Does AMD even have the technology/Talent to match Nvidia hardware?
In pure rasterization they are on par with Nvidia, if not better. They just need RT/tensor cores, which is doable, since Intel managed to make AI hardware in their first generation of dedicated GPUs.
 
Last edited:

GymWolf

Member
6950XT is a terrible value. It’s $250 more than the 6900XT while offering only a tiny performance boost and consuming more power. And yeah I think there’s a good chance that there’ll be a 7000 series that beats it for under $900

And what’s HDMI 2.2? I wasn’t even aware that existed
I made a mistake with the hdmi port, we are still at 2.1, my bad :lollipop_grinning_sweat:

On amazon italy the difference in price between a 6950 and 6900 is around 30 euros for now, it's the only reason why i even asked for this one.
 
Last edited:

Crayon

Member
I made a mistake with the hdmi port, we are still at 2.1, my bad :lollipop_grinning_sweat:

On amazon italy the difference in price between a 6950 and 6900 is around 30 euros for now, it's the only reason why i even asked for this one.

Sounds like you should definitely wait a month to see what AMD comes up with and whether eBay finally floods with used high-end cards. (I've been waiting for that... it seems inevitable, but I was under the impression it would have happened by now.)
 

GymWolf

Member
Sounds like you should definitely wait a month to see what AMD comes up with and whether eBay finally floods with used high-end cards. (I've been waiting for that... it seems inevitable, but I was under the impression it would have happened by now.)
Never bought a used GPU and I'm not gonna start now; I'm already super paranoid about picking a new GPU and I don't even overclock :lollipop_grinning_sweat:

I want to wait, but I'm afraid I won't find anything in a month or two...
 
Last edited:

DonkeyPunchJr

World’s Biggest Weeb
I made a mistake with the hdmi port, we are still at 2.1, my bad :lollipop_grinning_sweat:

On amazon italy the difference in price between a 6950 and 6900 is around 30 euros for now, it's the only reason why i even asked for this one.
Ahh okay, fair enough, I have no idea what the market is like over there. As for the question of whether the 7000 series will have a GPU that outperforms the 6950 for cheaper, I'd be very disappointed if that didn't happen. It makes sense to wait IMO.

And I highly doubt there will be shortages in a couple of months. Worst case, you'll be able to buy an RTX 3000 or Radeon 6000 card for cheaper than it is today.
 
Last edited:
Have they confirmed that Racer is actually a playable thing and not just a real-time movie?
Pretty much this.
Why the hell would AMD leave money on the table? People have blindly bought Nvidia for the last 10 years and have directly contributed to the present market conditions.
AMD won't come to gamers' rescue, because they've previously sold GPUs that offered the same or similar performance as Nvidia at half the price (HD 4870, HD 5870, R9 290X) and nobody bought them. Which meant they had no market share and no profit.
Times are different now. They're pivoting to becoming a premium brand. The datacentre is where they make most of their money, and their presence in consumer markets is nominal. They'll make a bunch of very nice performing cards at low volume, and charge a hefty premium for them.

We made our choice way back when. We chose to pay more, even when better value was available. And so it shall be.
 
Strange logic

I too find these prices to be insane, but a 3060 Ti would still cover most of your needs (especially compared to consoles) for the gen. Why create a nonexistent problem by aiming for the newest generation's top-tier cards, but go back to consoles if prices are "hundreds" away from what's there?

You don’t buy a pc to just equal a console…
 
AMD just needs a card that beats the 4080 (16GB version) in rasterization, especially at 1440p and 4K. People who game at 1080p don't usually spend this kind of money.

Something new with FSR would help, and ray tracing has to be close to, if not match, 30 series performance.

All that for $700, or $750 max, and it would sell very, very well. Now, what are the chances they actually do this? I'd say about 30%.
I just hope the 7900 XT stays at the same price as the 6900 XT.
 
At this point, NVIDIA has gone so far beyond that it won't matter how AMD's cards perform in "specific" CPU-bound cases.

Let's take Flight Simulator for example. In heavy CPU-bound photogrammetry regions, a 5950X or a 12900K usually becomes CPU bound near 65-70 FPS. Now, you can throw GPU power at it and get 4K/60 with high settings. That's no problem. You can double that (4090/7800 XT) and get 4K/120. But the problem starts because there is no CPU, and won't be for at least 5 years, that can push 120 FPS CPU bound in Flight Sim. NVIDIA's trickery allows them to circumvent and go around the CPU bottleneck. All of a sudden an RTX 4080 running DLSS 3 at 4K will render something like 150+ frames, a framerate that no CPU can achieve natively as of today. AMD can produce a GPU that has the ability to render 300 frames at 4K in Flight Sim, but if current CPUs bog it down near 70 FPS in CPU-bound locations, then that power is meaningless for that use case.

Same for Spiderman. Just look at their marketing videos: they're literally targeting CPU-bound games like mad. Spiderman with ray tracing at 200 FPS? A 12900K in Times Square bogs down near 100 FPS CPU bound. AMD can potentially make a GPU that renders 200+ FPS with RT enabled at native 1080p, but it will never get past 100 FPS with something like a 12900K, which is the problematic part, whereas with NVIDIA's trickery you will get north of 160 frames even with a mediocre CPU, because their tech is now inserting "frames".

The implications of interpolation, if it really works great, are pretty damning for the competition and AMD. If it works well, that is. Because then there will be no alternative to it, unless AMD engineers its own similar solution. Which they will most likely have with RDNA4. Because this is not a pish-posh technology either; it is a hard thing to get right, and it definitely needs huge R&D, serious hardware acceleration and software to back it up. It is literally real-time video game frame interpolation, which is crazy.

So yeah. It goes beyond pure GPU power. They've literally hacked their way through CPU bottlenecks now. If AMD does not answer back somehow, and if interpolation ends up being a success overall, and if RDNA3 doesn't have the capability to run it, it will be another generation where AMD hopelessly watches NVIDIA dominate. And by the time RDNA4 comes and they put similar tech into their hardware, I'm sure the jacket guy will have invented something new.

They've become so big that they can invest huge amounts of money into R&D and discover and invent all kinds of crazy stuff. DLSS and interpolation seem like just the beginning.
Not every game will have DLSS 3 though?
 