FromDuskTillGameOver
Member
Can it run Crysis Cyberpunk 2077?
Full PT lighting as standard? PS7 maybe.
Looks fantastic. Need to see a full-blown gameplay video in 4K before I say yes, though. OP didn't provide enough cold hard evidence.
Sigh... what I find beyond dumb is how shallow this argument is... Then again, it doesn't surprise me; it's always the case with you PC types.
A 7900XTX is a $1000 GPU. That Nvidia decided to gouge its customers is not my business.
In relative performance, a 7900XTX is only 18% behind the 4090. Where it really suffers, though, is in RT performance, where AMD's RT tech is still at least a generation behind Nvidia's.
A 7900XTX is a 60TF+ GPU.
If what you are saying is that in 2028 the new consoles would not have a GPU with 60TF+ raster performance and RT performance on par with or better than what the 4090 currently is, and that AMD would not have improved their RT and FSR tech between now and then... 5 years from now... then continue.
In which case, I would just say, let's agree to disagree.
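For what it's worth, "relative performance" figures like that 18% are usually an aggregate of per-game fps ratios across a benchmark suite, typically a geometric mean. A toy sketch of how such a number is built (the fps figures below are invented for illustration, not real benchmark data):

```python
from math import prod

# Hypothetical average fps per title -- NOT real benchmark data,
# just illustrating how an aggregate "relative performance" figure works.
fps_4090    = [120, 98, 145]
fps_7900xtx = [100, 80, 118]

# Per-game ratio of the 7900 XTX to the 4090.
ratios = [xtx / nv for xtx, nv in zip(fps_7900xtx, fps_4090)]

# Geometric mean, so no single outlier title dominates the average.
rel = prod(ratios) ** (1 / len(ratios))
print(f"7900 XTX at {rel:.0%} of the 4090")
```

With these made-up numbers the card lands in the low 80s percent, i.e. roughly the "18% behind" being cited.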
Yah that shot there needs automatic exposure control turned on.
This scene was fine before. Now they added a nuclear bomb blast outside the room.
This kind of extreme exposure makes games look worse for me, not better.
UE5 has a built-in path tracer as well.
They have the most advanced game right now (3 years after release) and they are changing the engine to UE5, I don't get it...
When do you think consoles will get Path Tracing in games like Cyberpunk?
The PS6/Xbox should have something relatively close to, if not faster than, the 4090 when they release. They will also benefit from a more advanced architecture.
Exactly... I think we have all been expecting the PS6 to have 60-65 TFs of GPU power, just going by current trends. Throw in 64 GB of stacked memory and a Zen5+ CPU with maybe some Zen6 features. My question is how much SSD space the PS6 will come with. Because if we are moving more digital, it'll need at least 4 TBs of space.
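That 60-65 TF ballpark is roughly what you get just extrapolating the PS4-to-PS5 growth rate. A back-of-the-envelope sketch (the constant-growth assumption and the 2028 launch year are guesses; the PS4/PS5 figures are the published FP32 TFLOPS numbers):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two data points."""
    return (end / start) ** (1 / years)

# Published FP32 compute of the PlayStation GPUs.
ps4_tf, ps4_year = 1.84, 2013
ps5_tf, ps5_year = 10.28, 2020

rate = cagr(ps4_tf, ps5_tf, ps5_year - ps4_year)

# Naive projection to a hypothetical 2028 launch; real gains will likely
# be lower as node shrinks slow down, so treat this as an upper bound.
ps6_tf = ps5_tf * rate ** (2028 - ps5_year)
print(f"~{rate:.2f}x per year -> roughly {ps6_tf:.0f} TF in 2028")
```

The naive projection lands in the 70s TF, so 60-65 TF is plausible even with slower-than-historical gains.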
I don't see how I am doing that... My exact argument, from my first post... was
Since that was somehow hard to understand... I meant that in 2028, when considering the 90xx series GPUs, the low-end/entry-model GPU then would more than likely be more powerful than a 4090 is today.
Nope... by PS6 we will have that.
PS6 would be here in, like, what? 2028... that's 5 years from now. Make no mistake, by then a 4090 GPU wouldn't even be considered a low-end GPU.
The kicker though, is that if consoles in 2028 are coming with GPUs more powerful than a 4090 today... what would be the high-end PC GPU then?
Another important question is, even if the hardware to do it is there... should they?
Maybe that's some hyperbole or maybe you took what I said outta context. What I mean is that the 4090 that we know today likely would not even be as powerful as the entry/low-end GPU of the 90xx series cards, or whatever they are called, in 2028.
My disagreement is with your statement that the 4090 will be obsolete by 2028, which history indicates is false. A 5-year-old flagship won't be below entry-level unless we make massive leaps in RT. It should be mid-range, and with how beastly the 4090 is in terms of gen-on-gen improvement, I wouldn't be the least bit surprised to see it on par with the 6060. After all, the soon-to-be 6-year-old 2080 Ti is still faster than even the 4060 Ti (which admittedly is a piece of crap).
CP2077 with PT looks better technically, but I still like how HFW looks over it.
Since your very first post you've been comparing cards from vastly different price and performance brackets for your nonsense comparisons. And you just continue like that. Suddenly the 7900XTX is the same as a 4090 for your argument's sake... As if a 20% performance and $500+ price difference are tooootally comparable.....
It's a pretty good experience on a 4070 on up.
It's impressive but useless for anyone without a RTX 4090.
Because RT scales so brutally at higher resolutions, the lack of DLSS is also a huge factor for AMD cards.
I really don't know what to say to you....
If you don't like getting called out by us "pc types" about hardware then maybe just stop talking nonsense that everyone with a tiny bit of insight can immediately identify as such....
lol, read the bullshit you've written and then read the initial OP..... gosh.....
I really don't know what to say to you....
and I don't know how to continue this discourse without outright being insulting...
It does not matter what cards I compare or what the price/performance brackets are if the point I am making is: by 2028, I expect that the next-gen consoles will be more powerful than a 4090 is today. What the fuck does it matter if come 2028 we can somehow buy a console for under $600 that has a GPU more powerful than a $1500 GPU today? But more importantly, what does it matter if it's my opinion? Or are you just insecure or something?
Please, take a step back and look at that statement. You either agree or disagree. There is no nonsense to be had there. My problem with you PC types is that all your fucking sense just goes out the window the second you see someone saying a console will be as or more powerful than `insert GPU here`. And you go on this warpath without even stopping to see that there is technically, or even historically, nothing wrong with what I have said.
I am curious though... what PC GPU do you have?
Ok.. instead of repeatedly insulting me.. could you please point out what I have said that is bullshit and nonsense... etc.? And how I have made everyone here stupid for even listening to me?
As me and others have repeatedly pointed out, your card-tier timeline comparisons are bullshit, nothing more, nothing less. You're comparing modern Titan-bracket, crypto-era-priced cards with 2018's normal higher end, then claiming that "we know from experience"
legit question...
The OG OP asked when we can expect tech like that in consoles. I said by the next gen, and, using the 4090 as an example, went on to say I expect consoles to be as powerful as that by then, and also went on to say that I expect AMD tech to improve on things like reconstruction and RT.
EVERYTHING else I have said since then is in support or validation of that same OG point or opinion. I am curious to know what about any of that has set you off on this warpath.
Or is your idea of having a debate to misconstrue what a person says and throw around insults?
Easily. My 3080 gets 40-50 fps at 1440p with DLSS balanced.
Can a 4080 with DLSS 3 handle this?
Ok...
It's nonsense from start to finish. Then suddenly it's about PC vs. console, which no one here ever addressed except you.
Get those insecurities out of your system and learn to read tech specs.....
That is the question posed in the very first post of this thread.
Which was in response to you saying PS7.
Nope... by PS6 we will have that.
To which you said this...
We have an optimist here
The 4090 is only barely able to do it with DLSS2 + DLSS3.
The hardware level where something like this could actually become standard is 5090 territory or even beyond that, and I don't see that kind of power in a $300, <300 W SoC in 2027 at the latest, when the base tech for the next gen should be finalized.
really hoping to eat crow on that one, but with how things are going.......
you gave an opinion, I gave mine.
It's not optimism... it's common sense and precedent.
Considering that you're throwing $600 and $1600 cards in the same basket for your nonsense argumentation, which is somehow not an opinion because it's based on "common sense and precedent," I agree that that's probably for the best.
But tell you what, I bow out of this conversation.
Ok... are you retarded or something?
Couldn't agree more.
If all you care about is the technical aspect, sure, it's the most advanced game out there.
If you care about art direction, consistency, cohesion, and everything else, it's very subjective.
I don't find it's the most pleasing game to look at but it's definitely a cut above everything else from a technical perspective. At least in several aspects.
Funny mixture of sheer stupidity and short-term memory loss you are displaying here.
How exactly am I throwing $600 and $1000 cards in the same basket?
2080ti launched at $1000
And yet, you are getting a relatively equivalently specced GPU in a console, along with a CPU, RAM, and storage, all for $499... 26 months after said GPU was released. Even if it's not 2080ti spec... they are not that far off.
That is precedent... and it's common sense to say that what we may consider high-end GPUs today, would be console spec GPUs in x number of years in the future. That is also precedent. And history has shown this to be the case.
It doesn't matter if Nvidia is price gouging the industry; thankfully, consoles do not use Nvidia GPUs. But it does not change the fact that in 2028, over 5 years or 60 months from now, whatever we get from the AMD console-spec GPU that goes into the PS6/XSX2 will be more powerful than the $1500 4090 GPU of today.
This is like the 6th time I have said this exact thing. This is the same thing I have been saying, but your moronic mind has somehow managed to make this about everything else but the actual thing I am actually talking about.
what's your degree in, wasting money? build your own, surely you can figure it out with a masters
Just ordered a 4090 Rig last night… Treated myself after finishing my Masters degree.
From 980TI > 4090
My body is absolutely F’n ready
Poor behavior j/k
GeForce GTX 1080 Ti. The highest of the high-end GPUs back in 2017. A genuine beast of a card and one of the greatest pieces of computer hardware ever engineered, Nvidia's true magnum opus.
Do you really think that in 5 years a GPU like the 4090 would even be considered low-end?
all good brother was just messing with ya. congrats
Cybersecurity actually lol
I used to build my machines back in the day. I just don’t have the itch to do so anymore.
Anyway, I view it as money well spent.
Ok.. so, 3 years after the 1080ti... we have console GPUs that are more powerful than it.
It's a midrange card today. Six years on and it's still a relevant card for gaming today. In 5 years' time the 4090 will be a midrange card.
If you can't max out or play the current AAA titles at 1080p high with at least 60 fps, you are low end. A 6650xt can do that. A 6600xt can do that. A 3060 can do that. The 1080ti, also, can do that.
And a small question: what would you call the Intel Arc A770 GPU? A $320 GPU. Low end? Or mid-range? If you think it's a low-end card, then surely the 1080ti can't be considered mid-range, being that the A770 is more powerful than a 1080ti.
For me: low-end GPUs are $400 and under; mid-range, $401-$800; high-end, $800+. Based on MSRP, not looking at the used market.
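Those brackets, written out as a quick helper (the dollar cutoffs are just this poster's MSRP thresholds, not any official market segmentation):

```python
def gpu_tier(msrp: float) -> str:
    """Bucket a GPU by launch MSRP using the thresholds above."""
    if msrp <= 400:
        return "low end"
    if msrp <= 800:
        return "mid-range"
    return "high end"

# By this scheme the ~$320 A770 mentioned above is low end,
# and the ~$1000 2080 Ti is high end.
print(gpu_tier(320), gpu_tier(1000))
```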
Ok.. I tend to use pricing to determine their tier because I take into consideration the cost of the silicon and other components. Basically, TF/$. I mean, in 2014, $80 got you 8GB of GDDR5. In 2022, that same $80 can get you 8GB of GDDR6+... same 8GB, but a world of difference for the GPU.
I can understand that; the new next-gen titles are far more demanding to run at 1080p than our last-gen stuff. That being said, 1080p medium is rarely needed unless you're running a portable PC. I've been able to play plenty of 2023 titles at 1440p/1080p high settings on my 6650xt.
Because to me... not all 1080p is made equal.