
Cyberpunk 2077 With Path Tracing: The best-looking game available now?

Alex11

Member
Artistically, probably not, though it does very well in that regard. But in terms of realistic lighting and all that, yes, there's no two ways about it.
And if you're not set on 4K, you don't need a 4090 to play with Overdrive. My 4070 runs it well at 1440p: 60-70 fps most of the time, dipping into the low 50s. That's with DLSS Quality and frame generation, of course, which may bother some people, but I really don't feel the input lag.
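For anyone wondering what "DLSS Quality" actually means for the render load at 1440p, here's a rough Python sketch. The per-axis scale factors are the commonly cited ones, not pulled from any official NVIDIA API, so treat the numbers as ballpark:

# Commonly cited per-axis DLSS scale factors (assumed, not an official API).
SCALES = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5, "ultra_performance": 1 / 3}

def internal_res(width, height, mode):
    """Return the internal render resolution DLSS upscales from."""
    s = SCALES[mode]
    return round(width * s), round(height * s)

print(internal_res(2560, 1440, "quality"))      # (1707, 960): what a 4070 actually renders here
print(internal_res(3840, 2160, "performance"))  # (1920, 1080): 4K output from a 1080p base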

 

mckmas8808

Banned
Sigh... what I find beyond dumb is how shallow this argument is. Then again, it doesn't surprise me; it's always the case with you PC types.

A 7900XTX is a $1000 GPU. That Nvidia decided to gouge its customers is not my business.

In relative performance, a 7900XTX is only 18% behind the 4090. Where it really suffers, though, is in RT performance, where AMD's RT tech is still at least a generation behind Nvidia's.

A 7900XTX is a 60TF+ GPU.

If what you are saying is that in 2028 the new consoles will not have a GPU with 60TF+ raster performance and RT performance on par with or better than what the 4090 currently offers, and that AMD will not have improved their RT and FSR tech between now and then... 5 years from now... then continue.

In which case, I would just say, let's agree to disagree.

I think we've all been expecting the PS6 to have 60-65 TFs of GPU power, just going by current trends. Throw in 64 GB of stacked memory and a Zen 5+ CPU with maybe some Zen 6 features. My question is, how much SSD space will the PS6 come with? Because if we are moving more digital, it'll need at least 4 TB of space.
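A quick sanity check on that 4 TB figure, assuming ballpark AAA install sizes (the sizes below are illustrative, not from any spec):

# How many installs fit on a 4 TB drive at various (assumed) AAA install sizes.
CAPACITY_GB = 4 * 1000  # drives are marketed in decimal TB

for install_gb in (50, 100, 150):
    print(f"{install_gb} GB installs -> about {CAPACITY_GB // install_gb} games")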
 

Schmendrick

Member
Sigh... what I find beyond dumb is how shallow this argument is. Then again, it doesn't surprise me; it's always the case with you PC types.

A 7900XTX is a $1000 GPU. That Nvidia decided to gouge its customers is not my business.

In relative performance, a 7900XTX is only 18% behind the 4090. Where it really suffers, though, is in RT performance, where AMD's RT tech is still at least a generation behind Nvidia's.

A 7900XTX is a 60TF+ GPU.

If what you are saying is that in 2028 the new consoles will not have a GPU with 60TF+ raster performance and RT performance on par with or better than what the 4090 currently offers, and that AMD will not have improved their RT and FSR tech between now and then... 5 years from now... then continue.

In which case, I would just say, let's agree to disagree.

 

nemiroff

Gold Member
Yeah, there's nothing like it out there. And no wonder, it's catering to the highest end.

BTW, I like how the "low quality" asset NPCs for the city simulation look a LOT more grounded just from introducing PT. The weird "eye glow", for example, is mostly gone.

The best thing is, it's not even in its finished form. They're working on improvements to the gfx/PT, some of which will probably be introduced together with the coming DLC.
 

Jigsaah

Gold Member
When do you think consoles will get Path Tracing in games like Cyberpunk?

[Screenshots attached]

I'd say it looks pretty good from a graphical standpoint. However, there are some things about Cyberpunk where I guess I just don't agree with the art style. It's strictly personal preference and not meant to take away from what they've achieved. In first person and without UI (aka Photo Mode) the game has some incredible vistas, and you can find beauty in the smallest details as well. But then the character models just come in and fuck everything up, IMO.
 

Gaiff

SBI’s Resident Gaslighter
Sigh... what I find beyond dumb is how shallow this argument is. Then again, it doesn't surprise me; it's always the case with you PC types.

A 7900XTX is a $1000 GPU. That Nvidia decided to gouge its customers is not my business.

In relative performance, a 7900XTX is only 18% behind the 4090. Where it really suffers, though, is in RT performance, where AMD's RT tech is still at least a generation behind Nvidia's.

A 7900XTX is a 60TF+ GPU.

If what you are saying is that in 2028 the new consoles will not have a GPU with 60TF+ raster performance and RT performance on par with or better than what the 4090 currently offers, and that AMD will not have improved their RT and FSR tech between now and then... 5 years from now... then continue.

In which case, I would just say, let's agree to disagree.
The PS6/Xbox should have something relatively close to, if not faster than, the 4090 when they release. They will also benefit from a more advanced architecture.

My disagreement is with your statement that the 4090 will be obsolete by 2028, which history indicates is false. A 5-year-old flagship won't be below entry-level unless we make massive leaps in RT. It should be mid-range, and with how beastly the 4090 is in terms of gen-on-gen improvement, I wouldn't be the least bit surprised to see it on par with the 6060. After all, the soon-to-be 6-year-old 2080 Ti is still faster than even the 4060 Ti (which admittedly is a piece of crap).
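Side note on that "18% behind" figure, since it keeps coming up: the same gap reads differently depending on which card you use as the baseline. A tiny sketch with made-up normalized numbers:

# "18% behind" vs "~22% faster": the same gap, two baselines.
# The 82/100 split is illustrative, not a benchmark result.
xtx, rtx4090 = 82.0, 100.0              # normalized average fps
print((rtx4090 - xtx) / rtx4090)        # 0.18 -> the XTX trails the 4090 by 18%
print(round((rtx4090 - xtx) / xtx, 3))  # 0.22 -> the 4090 leads the XTX by ~22%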
 
This is basically like buttering up a shell with shit inside.

Hopefully Phantom Liberty makes the game a lot closer to what people thought it was going to be. Then we can talk.

The game is like Hogwarts Legacy. Nice to look at, immersive in its aesthetic, but shallow in its gameplay.
 

Mr.Phoenix

Member
I think we've all been expecting the PS6 to have 60-65 TFs of GPU power, just going by current trends. Throw in 64 GB of stacked memory and a Zen 5+ CPU with maybe some Zen 6 features. My question is, how much SSD space will the PS6 come with? Because if we are moving more digital, it'll need at least 4 TB of space.
Exactly...
I don't see how I am doing that... My exact argument, from my first post... was


Nope... by PS6 we will have that.

The PS6 would be here in, like, what? 2028... that's 5 years from now. Make no mistake, by then a 4090 GPU wouldn't even be considered a low-end GPU.

The kicker, though, is that if consoles in 2028 are coming with GPUs more powerful than a 4090 today... what would be the high-end PC GPU then?

Another important question is, even if the hardware to do it is there... should they?
Since that was somehow hard to understand... I meant that in 2028, when considering the 90xx series GPUs, the low-end/entry-level GPU then would more than likely be more powerful than a 4090 is today.

I meant that in 2028, the consoles we have then would be more powerful than a 4090 is today. That is what I said, and what I am still saying.

But hey, if winning is so important to you? Take the win.
The PS6/Xbox should have something relatively close to, if not faster than, the 4090 when they release. They will also benefit from a more advanced architecture.

My disagreement is with your statement that the 4090 will be obsolete by 2028, which history indicates is false. A 5-year-old flagship won't be below entry-level unless we make massive leaps in RT. It should be mid-range, and with how beastly the 4090 is in terms of gen-on-gen improvement, I wouldn't be the least bit surprised to see it on par with the 6060. After all, the soon-to-be 6-year-old 2080 Ti is still faster than even the 4060 Ti (which admittedly is a piece of crap).
Maybe that's some hyperbole, or maybe you took what I said outta context. What I mean is that the 4090 we know today likely would not even be as powerful as the entry-level/low-end GPU of the 90xx series cards, or whatever they are called, in 2028.

I am not saying that the 4090 becomes obsolete... I mean, how would I be saying it's obsolete if I am saying the next-gen consoles would have power on par with or better than it? Wouldn't that be me saying next-gen consoles are obsolete on arrival?
 

Umbasaborne

Banned
It can look amazing, but sometimes it can look a little gnarly, with ghosting and weird glittering on some meshes due to the amount of light it's trying to bounce. But yeah, when it looks good, it looks super fucking good.
 

Haint

Member
They have the most advanced game out right now (3 years after release) and they are switching the engine to UE5. I don't get it...

"Pro" consoles won't be able to run this; even next gen is doubtful. A 4090 barely runs it at a 1080p internal resolution, and that's a genuine 45-50TF card in the old single-instruction-per-clock sense (90-100 in the new double-count sense). Lumen is a good-enough approximation, and Nanite is a huge advantage over REDengine. At least consoles will be able to run some version of it.

CP2077 with PT looks better technically, but I still like how HFW looks over it.

Horizon dumps all its compute power into assets/geometry, while Cyberpunk's assets are decidedly middling, last-gen quality. Horizon definitely has a more obvious and immediate wow factor thanks to that, compounded by the fact that devs and modern techniques are very good at faking lighting.
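For readers puzzled by the "old count vs. double count" distinction above: FP32 TFLOPS is just ALUs x 2 ops per FMA x clock, and both Ada's dual FP32 pipes and RDNA 3's dual-issue double the marketing figure. A rough sketch using the commonly quoted public specs (exact boost clocks vary, so take these as ballpark):

# Back-of-the-envelope FP32 TFLOPS = ALUs * 2 (FMA counts as 2 ops) * clock in GHz.
def tflops(alus, boost_ghz):
    return alus * 2 * boost_ghz / 1000

print(tflops(16384, 2.52))    # RTX 4090: ~82.6 "double count" (~41 in the old single-issue sense)
print(tflops(6144 * 2, 2.5))  # RX 7900 XTX with dual-issue: ~61.4 -> the thread's "60TF+"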
 

Schmendrick

Member
I don't see how I am doing that... My exact argument, from my first post... was
Since your very first post you've been comparing cards from vastly different price and performance brackets for your nonsense comparisons. And you just continue like that. Suddenly the 7900 XTX is the same as a 4090 for your argument's sake... as if a 20% performance and $500+ price difference were tooootally comparable...

If you don't like getting called out by us "PC types" about hardware, then maybe just stop talking nonsense that everyone with a tiny bit of insight can immediately identify as such...
 

Crayon

Member
I think so, but the excitement is blunted if you don't buy $1000+ cards. It's more like a peek at the future. Possibly the near future, tho. One can hope.
 

SF Kosmo

Al Jazeera Special Reporter
I've spent a couple hours with it, and yeah it kind of is. There might be games that beat it on art direction or asset quality, but in terms of rendering, it's 2023's Crysis.

It's impressive but useless for anyone without an RTX 4090.
It's a pretty good experience on a 4070 and up.
 

SF Kosmo

Al Jazeera Special Reporter
Sigh... what I find beyond dumb is how shallow this argument is. Then again, it doesn't surprise me; it's always the case with you PC types.

A 7900XTX is a $1000 GPU. That Nvidia decided to gouge its customers is not my business.

In relative performance, a 7900XTX is only 18% behind the 4090. Where it really suffers, though, is in RT performance, where AMD's RT tech is still at least a generation behind Nvidia's.

A 7900XTX is a 60TF+ GPU.

If what you are saying is that in 2028 the new consoles will not have a GPU with 60TF+ raster performance and RT performance on par with or better than what the 4090 currently offers, and that AMD will not have improved their RT and FSR tech between now and then... 5 years from now... then continue.

In which case, I would just say, let's agree to disagree.
Because RT scales so brutally at higher resolutions, the lack of DLSS is also a huge factor for AMD cards.

FSR is not a DLSS competitor and never will be. Hopefully AMD adds some better tensor support, and maybe XeSS can emerge as an open standard.
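The "RT scales brutally with resolution" point is easy to see in numbers: primary rays grow linearly with pixel count, and every bounce multiplies the work, which is exactly why upscalers matter so much here. A purely illustrative sketch (1 sample per pixel and 2 bounces are assumed values, not Cyberpunk's actual settings):

# Ray counts grow linearly with pixels; bounces multiply the traversal work.
def rays_per_frame(width, height, samples_per_pixel=1, bounces=2):
    return width * height * samples_per_pixel * (bounces + 1)

print(rays_per_frame(1920, 1080))  # ~6.2 million rays per frame
print(rays_per_frame(3840, 2160))  # ~24.9 million: 4x the work at native 4K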
 
Based on world complexity, these games are the best-looking games rn, imo: Cyberpunk 2077, The Witcher 3 (RTX), Red Dead Redemption 2, Horizon Forbidden West.

Rockstar still has a big problem with their engine (on the physics-based side). Distant scenery still generates strange pixelated artefacts. It's the same issue with GTA V. Imagine RDR2 with path tracing :messenger_confused:

[Screenshots attached]
 

Mr.Phoenix

Member
Since your very first post you've been comparing cards from vastly different price and performance brackets for your nonsense comparisons. And you just continue like that. Suddenly the 7900 XTX is the same as a 4090 for your argument's sake... as if a 20% performance and $500+ price difference were tooootally comparable...

If you don't like getting called out by us "PC types" about hardware, then maybe just stop talking nonsense that everyone with a tiny bit of insight can immediately identify as such...
I really don't know what to say to you...

And I don't know how to continue this discourse without being outright insulting...

It does not matter what cards I compare or what the price/performance brackets are. The point I am making is: by 2028, I expect the next-gen consoles to be more powerful than a 4090 is today. What the fuck does it matter if, come 2028, we can somehow buy a console for under $600 that has a GPU more powerful than a $1500 GPU today? But more importantly, what does it matter if it's my opinion? Or are you just insecure or something?

Please, take a step back and look at that statement. You either agree or disagree. There is no nonsense to be had there. My problem with you PC types is that all your fucking sense just goes out the window the second you see someone saying a console will be as powerful as or more powerful than `insert GPU here`. And you go on this warpath without even stopping to see that there is, technically or even historically, nothing wrong with what I have said.

I am curious though... what PC GPU do you have?
 

Schmendrick

Member
I really don't know what to say to you...

And I don't know how to continue this discourse without being outright insulting...

It does not matter what cards I compare or what the price/performance brackets are. The point I am making is: by 2028, I expect the next-gen consoles to be more powerful than a 4090 is today. What the fuck does it matter if, come 2028, we can somehow buy a console for under $600 that has a GPU more powerful than a $1500 GPU today? But more importantly, what does it matter if it's my opinion? Or are you just insecure or something?

Please, take a step back and look at that statement. You either agree or disagree. There is no nonsense to be had there. My problem with you PC types is that all your fucking sense just goes out the window the second you see someone saying a console will be as powerful as or more powerful than `insert GPU here`. And you go on this warpath without even stopping to see that there is, technically or even historically, nothing wrong with what I have said.

I am curious though... what PC GPU do you have?
lol, read the bullshit you've written and then read the initial OP :messenger_grinning_smiling::messenger_grinning_smiling::messenger_grinning_smiling:
 

Mr.Phoenix

Member
lol, read the bullshit you've written and then read the initial OP..... gosh.....
OK... instead of repeatedly insulting me, could you please point out what I have said that is bullshit and nonsense, etc.? And how I have made everyone here stupid for even listening to me?

Legit question...

The OG OP asked when we can expect tech like that in consoles. I said by next gen and, using the 4090 as an example, went on to say that I expect consoles to be as powerful as that by then, and also that I expect AMD tech to improve on things like reconstruction and RT.

EVERYTHING else I have said since then is in support or validation of that same OG point or opinion. I am curious to know what about any of that has set you off on this warpath.

Or is your idea of having a debate to misconstrue what a person says and throw around insults?
 

Schmendrick

Member
OK... instead of repeatedly insulting me, could you please point out what I have said that is bullshit and nonsense, etc.? And how I have made everyone here stupid for even listening to me?

Legit question...

The OG OP asked when we can expect tech like that in consoles. I said by next gen and, using the 4090 as an example, went on to say that I expect consoles to be as powerful as that by then, and also that I expect AMD tech to improve on things like reconstruction and RT.

EVERYTHING else I have said since then is in support or validation of that same OG point or opinion. I am curious to know what about any of that has set you off on this warpath.

Or is your idea of having a debate to misconstrue what a person says and throw around insults?
As me and others have repeatedly pointed out, your card-tier timeline comparisons are bullshit, nothing more, nothing less. You're comparing a modern Titan-bracket, post-crypto-pricing card with 2018's normal high end, then claiming that "we know from experience".
It's nonsense from start to finish. Then suddenly it's about PC vs console, which no one here ever addressed except you.
Get those insecurities out of your system and learn to read tech specs...
 

Mr.Phoenix

Member
As me and others have repeatedly pointed out, your card-tier timeline comparisons are bullshit, nothing more, nothing less. You're comparing a modern Titan-bracket, post-crypto-pricing card with 2018's normal high end, then claiming that "we know from experience".
It's nonsense from start to finish. Then suddenly it's about PC vs console, which no one here ever addressed except you.
Get those insecurities out of your system and learn to read tech specs...
Ok...

When do you think consoles will get Path Tracing in games like Cyberpunk?
That is the question posed in the very first post of this thread.

To which I said
Nope... by PS6 we will have that.

The PS6 would be here in, like, what? 2028... that's 5 years from now. Make no mistake, by then a 4090 GPU wouldn't even be considered a low-end GPU.

The kicker, though, is that if consoles in 2028 are coming with GPUs more powerful than a 4090 today... what would be the high-end PC GPU then?

Another important question is, even if the hardware to do it is there... should they?
Which was in response to you saying PS7.

This entire thing should have ended there. You gave an opinion, I gave mine. And I singled out the 4090 because it's where I expect next-gen consoles to sit power-wise.

We have an optimist here :messenger_grinning_sweat:
The 4090 is only barely able to do it with DLSS 2 + DLSS 3.
The hardware level where something like this could actually become standard is 5090 terrain or even beyond that, and I don't see that kind of power in a $300, <300W SoC by 2027 at the latest, when the base tech for the next gen should be finalized.

Really hoping to eat crow on that one, but with how things are going...
To which you said this...

I don't know what the issue is here: whether it's that we have a different idea of what "standard" means, or something else.

But tell you what, I bow out of this conversation. I see it's going nowhere.
 

Schmendrick

Member
I have a 4090, and while it looks absolutely gorgeous, it falls apart in motion. You get so much blurring and ghosting from all the temporal algorithms meant to make it run on mere mortals' PCs. Maybe someday in the next 5 years or so there will be enough performance overhead to do it without relying on AI-generated frames to get decent performance, but that day is not today.
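On the latency side of that trade-off: interpolation-based frame generation has to hold back the newest rendered frame, so input is still sampled at the render rate and you pay roughly one extra render-frame of delay. A crude sketch with assumed numbers (real pipelines add Reflex and other offsets, so this is intuition, not a measurement):

# Crude frame-generation latency intuition; numbers are assumptions, not measurements.
render_fps = 45
frame_time_ms = 1000 / render_fps  # ~22.2 ms per rendered frame
presented_fps = render_fps * 2     # ~90 fps shown once interpolated frames are inserted
extra_latency_ms = frame_time_ms   # roughly one held-back frame of added delay
print(presented_fps, round(frame_time_ms, 1), round(extra_latency_ms, 1))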
 

Mr.Phoenix

Member
Considering that you're throwing $600 and $1600 cards in the same basket for your nonsense argumentation, which is somehow not an opinion because it's based on "common sense and precedent", I agree that that's probably for the best.
Ok... are you retarded or something?

How exactly am I throwing $600 and $1000 cards in the same basket?

The 2080 Ti launched at $1000.

And yet, you got a relatively equivalently specced GPU in a console, along with a CPU, RAM, and storage, all for $499... 26 months after said GPU was released. Even if it's not 2080 Ti spec... they are not that far off.

That is precedent... and it's common sense to say that what we may consider high-end GPUs today would be console-spec GPUs in x number of years in the future. That is also precedent. And history has shown this to be the case.

It doesn't matter if Nvidia is price gouging the industry; thankfully, consoles do not use Nvidia GPUs... but it does not change the fact that in 2028, over 5 years or 60 months from now, whatever AMD console-spec GPU goes into the PS6/XSX2 will be more powerful than the $1500 4090 GPU of today.

This is like the 6th time I have said this exact thing. It is the same thing I have been saying, but your moronic mind has somehow managed to make this about everything else but the actual thing I am actually talking about.
 

KXVXII9X

Member
If all you care about is the technical aspect, sure, it's the most advanced game out there.

If you care about art direction, consistency, cohesion, and everything else, it's very subjective.

I don't find it the most pleasing game to look at, but it's definitely a cut above everything else from a technical perspective. At least in several aspects.
Couldn't agree more.
 

Schmendrick

Member
Ok... are you retarded or something?

How exactly am I throwing $600 and $1000 cards in the same basket?

The 2080 Ti launched at $1000.

And yet, you got a relatively equivalently specced GPU in a console, along with a CPU, RAM, and storage, all for $499... 26 months after said GPU was released. Even if it's not 2080 Ti spec... they are not that far off.

That is precedent... and it's common sense to say that what we may consider high-end GPUs today would be console-spec GPUs in x number of years in the future. That is also precedent. And history has shown this to be the case.

It doesn't matter if Nvidia is price gouging the industry; thankfully, consoles do not use Nvidia GPUs... but it does not change the fact that in 2028, over 5 years or 60 months from now, whatever AMD console-spec GPU goes into the PS6/XSX2 will be more powerful than the $1500 4090 GPU of today.

This is like the 6th time I have said this exact thing. It is the same thing I have been saying, but your moronic mind has somehow managed to make this about everything else but the actual thing I am actually talking about.
Funny mixture of sheer stupidity and short-term memory loss you are displaying here.
"I have an opinion, but it is not an opinion, because it's a fact, because there are precedents... not anywhere near the same hardware or price tier or anything, but anyways, ugahbugah." :D :D
The 4090 is a (Titan) tier above the 2080 Ti, and even that level of performance still costs you $400+ to this day, 5 years after launch, but sure, it's all just price gouging that will toooootally vanish soon. /s
And hey, 2080, 2080 Ti, or 4090... all the same, right? After all, what's a few... dozen... percent of performance or a thousand-buck price-tier difference, as long as the hardware has the same logo, riiight? :D

Anyways, as you said, AMD will probably make up for all of this. Have faith, ye believers, AMD can do it. They haven't done it this gen, or the gen before, or the one before that, but but but... 2027 will be the year, guaranteed. :D

Your posts get dumber and funnier the longer this goes on. And that is before the fact that we're not even talking about 4090 performance when it comes to broad use of path tracing, but more about Titan 5xxx performance.
But sure, the next console gen will have that, just like the current one "has" RT now.
Can you also predict the lottery numbers? Because you sure pretend to know more than the GPU manufacturers themselves.
 

Exentryk

Member
It's great that CDPR is pushing the boundaries. The tech will eventually become more accessible. Looking forward to when the majority can run this.
 

M1chl

Currently Gif and Meme Champion
Sigh... what I find beyond dumb is how shallow this argument is. Then again, it doesn't surprise me; it's always the case with you PC types.

A 7900XTX is a $1000 GPU. That Nvidia decided to gouge its customers is not my business.

In relative performance, a 7900XTX is only 18% behind the 4090. Where it really suffers, though, is in RT performance, where AMD's RT tech is still at least a generation behind Nvidia's.

A 7900XTX is a 60TF+ GPU.

If what you are saying is that in 2028 the new consoles will not have a GPU with 60TF+ raster performance and RT performance on par with or better than what the 4090 currently offers, and that AMD will not have improved their RT and FSR tech between now and then... 5 years from now... then continue.

In which case, I would just say, let's agree to disagree.
Poor behavior j/k
 

Elbereth

Member
What's your degree in, wasting money? Build your own, surely you can figure it out with a master's.

Cybersecurity actually lol

I used to build my machines back in the day. I just don’t have the itch to do so anymore. 🤷‍♂️

Anyway, I view it as money well spent.
 

64bitmodels

Reverse groomer.
Do you really think that in 5 years a GPU like the 4090 would even be considered low-end?
GeForce GTX 1080 Ti. The highest of the high-end GPUs back in 2017. A genuine beast of a card and one of the greatest pieces of computer hardware ever engineered, Nvidia's true magnum opus.

It's a midrange card today; 6 years on, it's still relevant for gaming. In 5 years' time the 4090 will be a midrange card.
 

Gametrek

Banned
When I was a kid, Bill and Ted was okay. Now it is pointless. As long as I know any cash from this game goes into that person's hand? Nope.
What was this game even about? Fat people walking outside and everybody looks Moe pugly??
 

Mr.Phoenix

Member
GeForce GTX 1080 Ti. The highest of the high-end GPUs back in 2017. A genuine beast of a card and one of the greatest pieces of computer hardware ever engineered, Nvidia's true magnum opus.

It's a midrange card today; 6 years on, it's still relevant for gaming. In 5 years' time the 4090 will be a midrange card.
Ok... so, 3 years after the 1080 Ti... we have console GPUs that are more powerful than it.

So my point still stands? That in 5 years we will have console GPUs that are as powerful as or more powerful than the 4090.

And maybe my own measure of what counts as a low-end or midrange GPU just differs... but

And a small question: what would you call the Intel Arc A770 GPU? A $320 GPU. Low-end? Or midrange? If you think it's a low-end card, then surely the 1080 Ti can't be considered midrange, given that the A770 is more powerful than a 1080 Ti.

For me: low-end GPUs are $400 and under, midrange $401-$800, high-end $800+. Based on MSRP and not looking at the used market.
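Here's what that TF/$ yardstick looks like in practice; the TF and MSRP figures below are the commonly quoted launch numbers, so take them as rough inputs rather than gospel:

# TF per dollar at launch MSRP (commonly quoted figures; illustrative only).
cards = {
    "GTX 1080 Ti": (11.3, 699),
    "RTX 2080 Ti": (13.4, 999),
    "RX 7900 XTX": (61.4, 999),
    "RTX 4090":    (82.6, 1599),
}
for name, (tf, usd) in cards.items():
    print(f"{name}: {tf / usd * 1000:.1f} GFLOPS per dollar")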
 

64bitmodels

Reverse groomer.
And a small question: what would you call the Intel Arc A770 GPU? A $320 GPU. Low-end? Or midrange? If you think it's a low-end card, then surely the 1080 Ti can't be considered midrange, given that the A770 is more powerful than a 1080 Ti.

For me: low-end GPUs are $400 and under, midrange $401-$800, high-end $800+. Based on MSRP and not looking at the used market.
If you can't max out or play current AAA titles at 1080p high with at least 60 fps, you are low-end. A 6650 XT can do that. A 6600 XT can do that. A 3060 can do that. The 1080 Ti, also, can do that.
 

Mr.Phoenix

Member
If you can't max out or play current AAA titles at 1080p high with at least 60 fps, you are low-end. A 6650 XT can do that. A 6600 XT can do that. A 3060 can do that. The 1080 Ti, also, can do that.
Ok... I tend to use pricing to determine tier because I take into consideration the cost of the silicon and other components. Basically, TF/$. I mean, in 2014, $80 got you 8GB of GDDR5. In 2022, that same $80 can get you 8GB of GDDR6+... same 8GB, but a world of difference for the GPU.

I find it's also a good way to predict what to expect in the next-gen consoles.

Because to me... not all 1080p is made equal.
 

64bitmodels

Reverse groomer.
Because to me... not all 1080p is made equal.
I can understand that; the new next-gen titles are far more demanding to run at 1080p than our last-gen stuff. That being said, 1080p medium is rarely needed unless you're running a portable PC. I've been able to play plenty of 2023 titles at 1440p/1080p high settings on my 6650 XT.

Low-end is getting to the point where you actively have to think about the graphics settings during play; that's what I consider low-end, IMO. Stuff like the 1070, 1650, 1660, 580, etc.

Our definitions vary wildly here.
 