
NVIDIA GeForce RTX 3090 to have 24GB GDDR6X VRAM, RTX 3080 10GB VRAM

thelastword

Banned
Not remotely correct. LOD is higher. Geometry, shadows, lighting, effects. Games are more than just resolution and framerate.
Of course it's more than rez and framerate, but geometry, shadow, lighting built from a 1080p core machine is not going to make the game look a generation apart from the 1080p version in geometry, shadows, lighting. We've all seen it before.......A boost to AF will give you more clarity on textures and resolution will give you some better IQ, but essentially the asset base will not be transformative...

If anything, higher PC component prices ends up giving them a reason to price the consoles higher. They can say 'hey look you're still technically getting a much better deal than PC!'
Not really, that's why there are APUs which are getting more powerful every day at ridiculously low power draw..... A typical target market for a 3090 is very small compared to a $500 console that will sell 100+ million units.....It's why PC right now is pretty much just pushing some higher rez shadows, better rez, better AF+AA and of course better framerate over the consoles......But to say that COD or any game really is a different visual and aesthetic experience from console to PC is not at all truthful...

The divide in hardware parts vs the uptick in visual fidelity is very much compromised.......I played Horizon Zero Dawn on my PRO at 4K CB 30fps, amazing visuals. The pluses you can slide on the PC version to the max do not make it look that much different from what I played on the PRO version. Still, the very best GPU right now with the very best CPU only plays HZD at around 40fps at 4K; that's around $500+ for an i9 and $1200 for a 2080ti, and that's assuming you didn't do other upgrades during the last 7 years before you got to such a rig, which means your spending would have been much more, just to keep up with ultra sliders...I've had my PRO since launch, only $400....and the OG PS4 still renders the game beautifully and keeps it playable, giving the same experience of Horizon when the credits roll...

This is not to deny that the better hardware has its perks, just that the gain isn't substantial vs the amount of money placed on the table...
 

OmegaSupreme

advanced basic bitch
Of course it's more than rez and framerate, but geometry, shadow, lighting built from a 1080p core machine is not going to make the game look a generation apart from the 1080p version in geometry, shadows, lighting. We've all seen it before.......A boost to AF will give you more clarity on textures and resolution will give you some better IQ, but essentially the asset base will not be transformative...


Not really, that's why there are APUs which are getting more powerful every day at ridiculously low power draw..... A typical target market for a 3090 is very small compared to a $500 console that will sell 100+ million units.....It's why PC right now is pretty much just pushing some higher rez shadows, better rez, better AF+AA and of course better framerate over the consoles......But to say that COD or any game really is a different visual and aesthetic experience from console to PC is not at all truthful...

The divide in hardware parts vs the uptick in visual fidelity is very much compromised.......I played Horizon Zero Dawn on my PRO at 4K CB 30fps, amazing visuals. The pluses you can slide on the PC version to the max do not make it look that much different from what I played on the PRO version. Still, the very best GPU right now with the very best CPU only plays HZD at around 40fps at 4K; that's around $500+ for an i9 and $1200 for a 2080ti, and that's assuming you didn't do other upgrades during the last 7 years before you got to such a rig, which means your spending would have been much more, just to keep up with ultra sliders...I've had my PRO since launch, only $400....and the OG PS4 still renders the game beautifully and keeps it playable, giving the same experience of Horizon when the credits roll...

This is not to deny that the better hardware has its perks, just that the gain isn't substantial vs the amount of money placed on the table...
What's 500 dollars to you might be 100 dollars for someone else. Money is not equal.
I would argue that a PC game can be hugely different from its console counterpart, especially when you count modding. For some, the jump from 30 fps to 60 is enough.
 

thelastword

Banned
What's 500 dollars to you might be 100 dollars for someone else. Money is not equal.
I would argue that a PC game can be hugely different from its console counterpart, especially when you count modding. For some, the jump from 30 fps to 60 is enough.
To me modding is just another way to justify expensive rigs, because the ultra sliders don't offer enough of a visual upgrade over the base console or base dev platform.......So a guy is willing to wait for a game to release on PC, then wait a few months for someone to patch in some higher rez textures, better lighting, better foliage and vegetation. At that point it gravitates more towards justifying how you can exploit the unused power of the GPU you paid so much money for, because the released game is not using that power to its max, so you will never get that.....Spending months just to make a game look better, as opposed to playing the game for the experience, seems like a hobby that's not exactly gaming to me....

Look at that guy still working on the RE4 texture and asset upgrade, and if Capcom decides to remake RE4, it may very well look much better on their new engine with RT in tow. It's the same for HZD. It plays at 4K 40fps with lows of 29fps on an i9 and 2080ti, but I'm sure Guerrilla can probably make this look much better at 4K 60fps on PS5. The price of the PS5 vs just the i9 should be about the same, so we won't even include the 2080ti...
 

OmegaSupreme

advanced basic bitch
To me modding is just another way to justify expensive rigs, because the ultra sliders don't offer enough of a visual upgrade over the base console or base dev platform.......So a guy is willing to wait for a game to release on PC, then wait a few months for someone to patch in some higher rez textures, better lighting, better foliage and vegetation. At that point it gravitates more towards justifying how you can exploit the unused power of the GPU you paid so much money for, because the released game is not using that power to its max, so you will never get that.....Spending months just to make a game look better, as opposed to playing the game for the experience, seems like a hobby that's not exactly gaming to me....

Look at that guy still working on the RE4 texture and asset upgrade, and if Capcom decides to remake RE4, it may very well look much better on their new engine with RT in tow. It's the same for HZD. It plays at 4K 40fps with lows of 29fps on an i9 and 2080ti, but I'm sure Guerrilla can probably make this look much better at 4K 60fps on PS5. The price of the PS5 vs just the i9 should be about the same, so we won't even include the 2080ti...
You don't need a high end card for many mods though. Most, even. Do you buy everything day one? I don't. So if I miss out on a game at launch but buy it six months later, odds are it has some mods to go with it or at the very least performance patches.
 
Last edited:

Korranator

Member
I hope these get released soon.

My current card, a lowly 970, is starting to act up. Slight grinding noise every now and then.
 

HarryKS

Member
What's 500 dollars to you might be 100 dollars for someone else. Money is not equal.
I would argue that a PC game can be hugely different from its console counterpart, especially when you count modding. For some, the jump from 30 fps to 60 is enough.
Broski, that's not how wealth distribution works. The mass market is always in the middle of the distribution, never at the tails.
 

ZZZZ

Member
To me modding is just another way to justify expensive rigs, because the ultra sliders don't offer enough of a visual upgrade over the base console or base dev platform.......So a guy is willing to wait for a game to release on PC, then wait a few months for someone to patch in some higher rez textures, better lighting, better foliage and vegetation. At that point it gravitates more towards justifying how you can exploit the unused power of the GPU you paid so much money for, because the released game is not using that power to its max, so you will never get that.....Spending months just to make a game look better, as opposed to playing the game for the experience, seems like a hobby that's not exactly gaming to me....

Look at that guy still working on the RE4 texture and asset upgrade, and if Capcom decides to remake RE4, it may very well look much better on their new engine with RT in tow. It's the same for HZD. It plays at 4K 40fps with lows of 29fps on an i9 and 2080ti, but I'm sure Guerrilla can probably make this look much better at 4K 60fps on PS5. The price of the PS5 vs just the i9 should be about the same, so we won't even include the 2080ti...
We will see the difference in Cyberpunk between PS5 and 3090 soon, don't worry.
 
Last edited:

supernova8

Banned
The divide in hardware parts vs the uptick in visual fidelity is very much compromised.......I played Horizon Zero Dawn on my PRO at 4K CB 30fps, amazing visuals. The pluses you can slide on the PC version to the max do not make it look that much different from what I played on the PRO version. Still, the very best GPU right now with the very best CPU only plays HZD at around 40fps at 4K; that's around $500+ for an i9 and $1200 for a 2080ti, and that's assuming you didn't do other upgrades during the last 7 years before you got to such a rig, which means your spending would have been much more, just to keep up with ultra sliders...I've had my PRO since launch, only $400....and the OG PS4 still renders the game beautifully and keeps it playable, giving the same experience of Horizon when the credits roll...

Oh sure I'm not claiming PC is 'better value'. It's not if all you want to do is play games.

If we take your example of HZD at 4K30 on PS4 Pro, I would assume people (on here at least) probably also bought a PS4 early on (maybe at launch?) before getting a PS4 Pro, so combined you'll have spent $800 ($399 x 2 machines). Of course you could trade in your PS4 and get a deal on the PS4 Pro and actually pay less than $399 for it, but people can also sell their PC parts, so we'll leave that to one side for now (although I do concede it's a talking point).

So the question would be, is there a system that could be had for a total of $800 back in November 2013 (PS4 launch) that would have run HZD at 4K30, and without doing any system upgrades at all up to this point? The answer is an emphatic NO.

Even if we went overboard on the budget (GTX 780 was available November 2013 for $649, for the GPU alone) that system still wouldn't even be able to maintain 30fps at 1440p on low settings, and that's using a 9700K that wasn't available back in 2013.



Even the 980 Ti, released a year later, can only get to 40fps+ at 4K if you use the very lowest settings.
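For anyone who wants the rough tally behind that comparison, here's a quick back-of-the-envelope sketch (using only the launch prices quoted above, and deliberately leaving out every PC part except the GPU, which is an assumption that if anything flatters the PC side):

```python
# Rough cost tally for the console vs. 2013-PC comparison above.
# Assumptions: launch prices quoted in this thread; the PC side only
# counts the GPU, so the real gap is even wider.
ps4_launch = 399   # PS4, November 2013
ps4_pro    = 399   # PS4 Pro, November 2016
gtx_780    = 649   # GTX 780, November 2013 (GPU alone)

console_total = ps4_launch + ps4_pro
print(f"PS4 + PS4 Pro:            ${console_total}")            # $798
print(f"GTX 780 by itself:        ${gtx_780}")                  # $649
print(f"Left over for rest of PC: ${console_total - gtx_780}")  # $149
```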
 
Last edited:

Jonsoncao

Banned
Just curious, what is a "researcher" like you doing on a gaming forum? Important research?
I came here 15 years ago while at college. I used to play Halo 3, Demon's Souls, and Ace Combat online a lot.

Doing machine learning research and teaching math/stats/coding is my day job. I play games less and less nowadays, since I shifted my gaming platform from PS3/360 to PC and spend more time on family. However, watching console fanboys fight over dated hardware on NeoGAF, as if Sony and M$ are their daddies, is always enjoyable.
 
Oh sure I'm not claiming PC is 'better value'. It's not if all you want to do is play games.

If we take your example of HZD at 4K30 on PS4 Pro, I would assume people (on here at least) probably also bought a PS4 early on (maybe at launch?) before getting a PS4 Pro, so combined you'll have spent $800 ($399 x 2 machines). Of course you could trade in your PS4 and get a deal on the PS4 Pro and actually pay less than $399 for it, but people can also sell their PC parts, so we'll leave that to one side for now (although I do concede it's a talking point).

So the question would be, is there a system that could be had for a total of $800 back in November 2013 (PS4 launch) that would have run HZD at 4K30, and without doing any system upgrades at all up to this point? The answer is an emphatic NO.

Even if we went overboard on the budget (GTX 780 was available November 2013 for $649, for the GPU alone) that system still wouldn't even be able to maintain 30fps at 1440p on low settings, and that's using a 9700K that wasn't available back in 2013.



Even the 980 Ti, released a year later, can only get to 40fps+ at 4K if you use the very lowest settings.

I would be surprised if a GTX 1080 couldn't get 30 fps on medium settings at half 4K in HZD, like the PS4 Pro, which also launched in 2016.

Unless HZD is an exceptionally terrible port. As you know, in Death Stranding the GTX 1080, which launched the same year as the PS4 Pro, offers 3 times the performance [~90 fps at 2560x1440; the PS4 Pro runs that game at half-4K CB and an unstable 30fps with a ton of nasty frame drops when driving].
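As a quick sanity check on that "3 times the performance" figure, here's a crude pixels-per-second comparison using the rough numbers quoted above (it ignores settings differences, so treat it as ballpark only):

```python
# Crude throughput check: rendered pixels per second.
# Figures are the rough ones quoted above; 4K checkerboard is treated
# as ~1920x2160 rendered samples per frame.
gtx_1080_pps = 2560 * 1440 * 90   # ~90 fps at 1440p
ps4_pro_pps  = 1920 * 2160 * 30   # ~30 fps at half-4K CB

print(f"GTX 1080: {gtx_1080_pps / 1e6:.0f} Mpix/s")   # ~332
print(f"PS4 Pro:  {ps4_pro_pps / 1e6:.0f} Mpix/s")    # ~124
print(f"Ratio:    {gtx_1080_pps / ps4_pro_pps:.1f}x") # ~2.7x, close to 3
```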
 
Last edited:

supernova8

Banned
I would be surprised if a GTX 1080 couldn't get 30 fps on medium settings at half 4K in HZD, like the PS4 Pro, which also launched in 2016.

GTX 1080 can squeeze out a 30fps average at 4K Ultra (at least in the benchmark), but that's another $600 on top of what you'd already have paid for your system.
Plus, in 2013 the best CPU you'd have would be something like a 4770K, but if you wanted to upgrade you'd have to buy an entirely new system (new CPU socket), so there's at least $100 for another motherboard. Possibly more costs if your new motherboard doesn't support your existing RAM.
 

Mister Wolf

Member
Oh sure I'm not claiming PC is 'better value'. It's not if all you want to do is play games.

If we take your example of HZD at 4K30 on PS4 Pro, I would assume people (on here at least) probably also bought a PS4 early on (maybe at launch?) before getting a PS4 Pro, so combined you'll have spent $800 ($399 x 2 machines). Of course you could trade in your PS4 and get a deal on the PS4 Pro and actually pay less than $399 for it, but people can also sell their PC parts, so we'll leave that to one side for now (although I do concede it's a talking point).

So the question would be, is there a system that could be had for a total of $800 back in November 2013 (PS4 launch) that would have run HZD at 4K30, and without doing any system upgrades at all up to this point? The answer is an emphatic NO.

Even if we went overboard on the budget (GTX 780 was available November 2013 for $649, for the GPU alone) that system still wouldn't even be able to maintain 30fps at 1440p on low settings, and that's using a 9700K that wasn't available back in 2013.



Even the 980 Ti, released a year later, can only get to 40fps+ at 4K if you use the very lowest settings.


Horizon Zero Dawn does not run at 4K 30 on the PS4 Pro. It uses half the pixels of 4K and checkerboard rendering.

 
Last edited:

I_D

Member
Oh sure I'm not claiming PC is 'better value'. It's not if all you want to do is play games.

If we take your example of HZD at 4K30 on PS4 Pro, I would assume people (on here at least) probably also bought a PS4 early on (maybe at launch?) before getting a PS4 Pro, so combined you'll have spent $800 ($399 x 2 machines). Of course you could trade in your PS4 and get a deal on the PS4 Pro and actually pay less than $399 for it, but people can also sell their PC parts, so we'll leave that to one side for now (although I do concede it's a talking point).

So the question would be, is there a system that could be had for a total of $800 back in November 2013 (PS4 launch) that would have run HZD at 4K30, and without doing any system upgrades at all up to this point? The answer is an emphatic NO.

Even if we went overboard on the budget (GTX 780 was available November 2013 for $649, for the GPU alone) that system still wouldn't even be able to maintain 30fps at 1440p on low settings, and that's using a 9700K that wasn't available back in 2013.



Even the 980 Ti, released a year later, can only get to 40fps+ at 4K if you use the very lowest settings.

I'm not disagreeing with your point. It's pretty clear that consoles have a better value-per-dollar than PC parts do. And designing for a closed system is obviously going to produce better results than designing for a million different possible configurations.


But using the launch of the PS4 Pro would be a better comparison though, right? Or maybe comparing the 780 to the regular-PS4's settings?
The conclusion will still be basically the same - consoles are great values - but at least the comparison would be a bit more accurate.




My 980ti just died out of nowhere, so now I'm on my backup 1060, and it's hurting me. This 3000 series can't come soon enough as far as I'm concerned.
 

supernova8

Banned
I'm not disagreeing with your point. It's pretty clear that consoles have a better value-per-dollar than PC parts do. And designing for a closed system is obviously going to produce better results than designing for a million different possible configurations.

But using the launch of the PS4 Pro would be a better comparison though, right? Or maybe comparing the 780 to the regular-PS4's settings?
The conclusion will still be basically the same - consoles are great values - but at least the comparison would be a bit more accurate.

My 980ti just died out of nowhere, so now I'm on my backup 1060, and it's hurting me. This 3000 series can't come soon enough as far as I'm concerned.


Yeah sure, the GTX 780 can get about 40fps at 1080p low settings, but again that GPU alone costs much more than a PS4, before you've bought anything else, so as you rightly point out, it doesn't make a huge difference.

I personally prefer my PC either way because it's just faster at general stuff. I hate how much lag there is in the PS4 menus, and that's without going into performance on some games - Witcher 3 is practically unplayable for me on PS4, it gives me a headache.
 

supernova8

Banned
Horizon Zero Dawn does not run at 4K 30 on the PS4 Pro. It uses half the pixels of 4K and checkerboard rendering.


Yeah absolutely, but I challenge most people to notice the difference between 4K and 4K CB when sitting at a normal distance from the TV (i.e. TV to sofa) and not having both running side by side. In other words, it's just a smart trick Sony uses and I applaud them for it.
 

I_D

Member
Yeah sure GTX 780 can get about 40fps on 1080p low settings but again that's much more than the price of a PS4 before you buy anything other than the GPU, so as you rightly point out, it doesn't make a huge difference.

I personally prefer my PC either way because it's just faster at general stuff. I hate how much lag there is in the PS4 menus, and that's without going into performance on some games - Witcher 3 is practically unplayable for me on PS4, it gives me a headache.
Oh, I'm with you on that one. The only time I've touched a console in at least two years was to play Bloodborne.
I highly doubt this will be the case, but hopefully devs will be able to target at least 60fps as a standard in the coming generation. As it currently stands, playing console games is horrendous compared to PC performance.
 

supernova8

Banned
Oh, I'm with you on that one. The only time I've touched a console in at least two years was to play Bloodborne.
I highly doubt this will be the case, but hopefully devs will be able to target at least 60fps as a standard in the coming generation. As it currently stands, playing console games is horrendous compared to PC performance.

Ah yeah, I've hardly touched my PS4. I played Yakuza and Judgment. Tried (and failed) to get into GT Sport. Got sick of Spiderman, although it is a pretty game.

Think I'd rather restart my WoW subscription or play Squad on PC.
 

Mister Wolf

Member
Yeah absolutely, but I challenge most people to notice the difference between 4K and 4K CB when sitting at a normal distance from the TV (i.e. TV to sofa) and not having both running side by side. In other words, it's just a smart trick Sony uses and I applaud them for it.



Checkerboard 4K is directly comparable to 3200 x 1800.
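For anyone who wants the raw numbers behind these comparisons, here's a quick sketch of the rendered sample counts per frame (raw counts only - checkerboarding reconstructs a 2160p image from roughly half the samples, so its perceived quality lands above its raw count, which is where comparisons like 3200x1800 come from):

```python
# Rendered samples per frame for the resolutions being argued about.
resolutions = {
    "native 4K":       3840 * 2160,         # 8,294,400
    "3200 x 1800":     3200 * 1800,         # 5,760,000
    "4K checkerboard": (3840 * 2160) // 2,  # 4,147,200 rendered samples
    "1440p":           2560 * 1440,         # 3,686,400
}

native = resolutions["native 4K"]
for name, px in resolutions.items():
    print(f"{name:>15}: {px:>9,} px ({px / native:.0%} of native 4K)")
```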
 

jigglet

Banned
To the people saying 10GB is not enough relative to the 11GB of the 1080 Ti from 3 years ago, I have a question: has any game even used 11GB of VRAM? Do you even foresee any game in the next few years coming close to 11GB?

And I don't really buy into this future proofing stuff; my GPU is headed straight for the bin after about 3-5 years.

(Ok, not the bin but the second hand market, but let's be real here, what enthusiast really expects "future proofed" performance beyond 5 years? I find the notion of "future proofing" a little absurd; high end PCs are for older people with more disposable income, and if you've spent everything you have as a youngin on a card then that's just irresponsible.)

tl;dr: I'm happy with 10GB.
 
Last edited:

Garibaldi

Member
I'm going from a first gen i7 with a GTX580 to a top end Zen3 and RTX3090. Bring it on! I can't wait!

Full disclosure: I've got a PS4Pro too but it doesn't sound as dramatic hah
 
Last edited:
Yeah absolutely, but I challenge most people to notice the difference between 4K and 4K CB when sitting at a normal distance from the TV (i.e. TV to sofa) and not having both running side by side. In other words, it's just a smart trick Sony uses and I applaud them for it.
It's very blatant. CB 4K doesn't look like 4K. The image is softer on PS4 Pro with a shitload of post-processing and there's heavy artifacting when Aloy's hair moves around.
 
Last edited:

Hudo

Member
That sounds like neither the RTX 3090 nor the RTX 3080 will be available for less than $1000, lol.
 

GymWolf

Member
Why do you say that? Legitimately curious.
Because the last top tier GPU (2080 Ti) was not capable of doing 4K60 at ultra/high details just a couple of months after release with heavy/broken games, and not every game uses DLSS (it's a small minority for now).

So how are you supposed to get top tier settings for 7-8 years out of a 3090??

Of course you can play at 1440p with mixed settings when the GPU gets old and games get heavier, but expecting to play with the best possible settings for 7-8 years straight is absurd to even think about.

GPUs with that kind of longevity simply don't exist, especially in these grim times where devs don't give a fuck about optimization on PC.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
It's very blatant. CB 4K doesn't look like 4K. The image is softer on PS4 Pro with a shitload of post-processing and there's heavy artifacting when Aloy's hair moves around.

It's only very blatant in a direct comparison video with the problematic sections pointed out to you, magnified by at least 200%, while you're watching that video on a computer monitor that's right in front of you. But those visual defects will be unnoticeable when you're playing the game on a PS4 Pro while sitting 2 meters or more away from the TV.
 
Near infinite backwards compatibility and free online are also great reasons. Far cheaper games. 60fps or higher support. edit: what @BacklashWave534 said as well.
You don't need a $500+ card to play 10+ year old games. Free online functionality is quite a good point IF the anti-hacking support is also reliable. This whole "cheaper games" mentality is not only hurting game devs in the short run, but also the gaming culture as a whole. As I said, if 4K/60FPS becomes the new console standard. I don't want to derail this topic to PC vs consoles, so I think I'm good now.
 
It's only very blatant in a direct comparison video with the problematic sections pointed out to you, magnified by at least 200%, while you're watching that video on a computer monitor that's right in front of you. But those visual defects will be unnoticeable when you're playing the game on a PS4 Pro while sitting 2 meters or more away from the TV.
No, it's very blatant on my KS8000 sitting at a normal distance from the TV. And those visual defects, such as the bad AF, are pretty obvious too. So is the post-processing with its copious amounts of motion blur.

The menus, if I remember correctly, are rendered in 4K and are crisp compared to the rest of the game.

The image quality isn't bad, but it's obvious it isn't 4K.
 

Knightime_X

Member
Could the 3080 surprise us all and be significantly cheaper given the massive difference in RAM?

Also, if games aren't really using all that RAM in the 3090 today, by the time anywhere near 24GB is being utilised won't the card be outdated in other ways? 🤔
Yeah it'll kinda be like paying $50,000,000 for a 4k TV in 1995.
All that TV, all that money, all wasted. Because you can't fully utilize it.
 
Last edited:

OmegaSupreme

advanced basic bitch
Broski, that's not how wealth distribution works. The mass market is always in the middle of the distribution, never at the tails.
We aren't talking about the mass market though. We are talking about enthusiasts. The exact kind of people that post on forums.
 
Last edited:

GHG

Member
No, it's very blatant on my KS8000 sitting at a normal distance from the TV. And those visual defects, such as the bad AF, are pretty obvious too. So is the post-processing with its copious amounts of motion blur.

The menus, if I remember correctly, are rendered in 4K and are crisp compared to the rest of the game.

The image quality isn't bad, but it's obvious it isn't 4K.

The bad AF has nothing to do with CB rendering though.

Personally I can't see a huge difference when sat ~7ft from a 55" TV. I also can't see a huge difference between 1440p and 4K at that distance.

I'm all about DLSS/CB/whatever, because unless you're sat up close the difference isn't big enough considering the trade-off in performance.
 

ZywyPL

Banned
Because the last top tier GPU (2080 Ti) was not capable of doing 4K60 at ultra/high details just a couple of months after release with heavy/broken games, and not every game uses DLSS (it's a small minority for now).

So how are you supposed to get top tier settings for 7-8 years out of a 3090??

Of course you can play at 1440p with mixed settings when the GPU gets old and games get heavier, but expecting to play with the best possible settings for 7-8 years straight is absurd to even think about.

GPUs with that kind of longevity simply don't exist, especially in these grim times where devs don't give a fuck about optimization on PC.

Ultra settings are almost always invisible, while tanking the performance by half. Secondly, and most importantly, let's not lie to ourselves: 4K on PC is a myth, it exists almost entirely in benchmarks on the internet and marketing slogans/ads, and the vast majority of people who aim for those cards use 1440p 144Hz screens. So taking everything into consideration, the card will most likely do more than fine in the next 5-10 years. Especially given that the consoles which set the baseline are now made from PC parts, so if a card is good on day one it'll be almost just as good on the last day of the generation. And if the rumors about DLSS3 turn out to be true and you'll be able to enable it in every single title that uses TAA, on top of the 20-22TF or so of computing power, then yeah, the card should do really fine in the upcoming decade or so.
 

supernova8

Banned
It's very blatant. CB 4K doesn't look like 4K. The image is softer on PS4 Pro with a shitload of post-processing and there's heavy artifacting when Aloy's hair moves around.

Blatant is a pretty strong word. The only reason the vast majority of people who claim to be able to tell the difference can do so is because they were shown the difference in a side-by-side YouTube video.
 

supernova8

Banned
Ultra settings are almost always invisible, while tanking the performance by half. Secondly, and most importantly, let's not lie to ourselves: 4K on PC is a myth, it exists almost entirely in benchmarks on the internet and marketing slogans/ads, and the vast majority of people who aim for those cards use 1440p 144Hz screens. So taking everything into consideration, the card will most likely do more than fine in the next 5-10 years. Especially given that the consoles which set the baseline are now made from PC parts, so if a card is good on day one it'll be almost just as good on the last day of the generation. And if the rumors about DLSS3 turn out to be true and you'll be able to enable it in every single title that uses TAA, on top of the 20-22TF or so of computing power, then yeah, the card should do really fine in the upcoming decade or so.

Yeah personally I'd rather go for 120 Hz and above as opposed to going from 1440p to 4K. There is a difference but it's not significant enough in motion for me to give more than 1 shit about it. Unless you have a gigantic TV, the difference from 1440p to 4K isn't even that great. Even more true when you have a really small 4K gaming monitor (which is pointless IMO).

I'm glad I stuck with a 1080p 165Hz monitor. Absolutely no interest in tanking my framerates for no reason.
 

GymWolf

Member
Ultra settings are almost always invisible, while tanking the performance by half. Secondly, and most importantly, let's not lie to ourselves: 4K on PC is a myth, it exists almost entirely in benchmarks on the internet and marketing slogans/ads, and the vast majority of people who aim for those cards use 1440p 144Hz screens. So taking everything into consideration, the card will most likely do more than fine in the next 5-10 years. Especially given that the consoles which set the baseline are now made from PC parts, so if a card is good on day one it'll be almost just as good on the last day of the generation. And if the rumors about DLSS3 turn out to be true and you'll be able to enable it in every single title that uses TAA, on top of the 20-22TF or so of computing power, then yeah, the card should do really fine in the upcoming decade or so.
That's why I said ultra/high, I know that a lot of ultra settings are bullshit.
Agree to disagree on the 4K stuff; a lot of people play PC on their 4K OLED TV or 4K monitor and buy those GPUs to get 4K resolution, like me.

DLSS on every game would be wonderful and it would surely prolong the life of every GPU, but not for 10 years (again, if you wanna play at high resolution/fps/details).

We will talk again when the gen ends and see how people are playing with a 3090...
 