
Lamen here: Why do PC games simply not blow out console games graphically?

Gamer79

Predicts the worst decade for Sony starting 2022
I know a 4090 has like 80-90 teraflops of processing power vs the PS5's 8-10 teraflops. A 4090 costs about 3x what a PS5 does alone, so it should be much more powerful. With that said, why do we not see the difference? Yes, I know a 4090 can run similar-looking games at increased framerates, but why do we not see eye-melting graphics?

Throwing all the numbers out of the window, I am going to use the tools I was born with: the eye test. I look at PlayStation games like the Ratchet and Clank remake, Horizon Forbidden West, the Demon's Souls remake, even the Resident Evil 4 remake, and others that visually can go toe to toe with what the PC has to offer. Why? The PC is clearly superior and its price point reflects that, but why do I not see vastly improved visuals? I get it that on PC one has to develop for the lowest common denominator, but there are always some studios who are going to push the envelope.
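For what it's worth, the raw numbers can be sanity-checked with some quick back-of-envelope arithmetic; the figures below are the commonly quoted spec-sheet values, used purely as assumptions for illustration, not benchmarks:

```python
# Back-of-envelope check of the numbers above (commonly quoted specs,
# assumed for illustration, not measurements).
rtx_4090_tflops = 82.6   # FP32 figure usually quoted for the RTX 4090
ps5_tflops = 10.3        # FP32 figure Sony quotes for the PS5
cost_ratio = 3.0         # the "about 3x the price" claim, taken at face value

compute_ratio = rtx_4090_tflops / ps5_tflops
print(f"Raw compute ratio: ~{compute_ratio:.1f}x")                       # ~8.0x
print(f"Compute per dollar vs PS5: ~{compute_ratio / cost_ratio:.1f}x")  # ~2.7x

# An ~8x compute gap sounds enormous, but most of it gets spent on higher
# output resolution and framerate rather than on fundamentally better assets.
```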
 

T4keD0wN

Member
Why waste too many resources on developing something that only the 1% of people who have a 4090 will experience?

They want to sell it to the largest audience possible; the priority is on the majority. Path tracing is nice and all, but I see why it's only in 4 games or so (2 of those being RTX Remix titles).
 
Are you trying to say layman?

 
I get it that on PC one has to develop for the lowest common denominator, but there are always some studios who are going to push the envelope.

Not any more, unfortunately. Dev costs are too high these days to make AAA games targeted at just cutting-edge PCs (which is a smaller segment of the gaming audience than some people realise), especially since that crowd has made it clear that they're just as happy with the same games everyone else is getting, only with higher framerates.
 

StueyDuck

Member
Diminishing returns. Technology keeps getting stronger at a faster rate, but with every leap the visible gains get smaller.

I don't think for a fucking second that PC and consoles are close currently, but the gap between everything will lessen as time goes on.

This is why we have reconstruction techniques, ML supersampling, and so on and so forth.
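As a rough illustration of what those reconstruction techniques buy, here's a quick sketch of internal render resolutions at a 4K output; the per-axis scale factors are the commonly cited DLSS/FSR presets, used here purely as assumptions:

```python
# Sketch of how much rendering work reconstruction/ML supersampling saves
# at a 4K output. Per-axis scale factors are the commonly cited DLSS/FSR
# presets, treated as assumptions for illustration.
output_w, output_h = 3840, 2160
presets = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

for name, scale in presets.items():
    render_w, render_h = int(output_w * scale), int(output_h * scale)
    fraction = (render_w * render_h) / (output_w * output_h)
    print(f"{name:12s} renders {render_w}x{render_h} "
          f"(~{fraction:.0%} of native 4K pixels)")
```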
 

th4tguy

Member
Such a small percentage of PC users actually have the hardware being talked about to push tech far enough to show the differences you are wanting to see. No dev is going to require such high specs; it limits sales too much.
Pushing graphics is going to cost more money in dev time.
Pushing graphics is going to limit the user base and limit sales.
Devs have to find a balance, which is why you don't see devs pushing PC hardware as much.

On consoles, you get what you get and everyone has the same hardware. Devs can push it as hard as they want, knowing everyone who plays it on that console is going to have the same experience, and they can potentially have a sale on every console unit sold.
 

CrustyBritches

Gold Member
It's production costs and diminishing returns. Games have looked good enough to me since last gen, so I'm more interested in frame rate and responsiveness. I'd rather have a 120fps PS4-looking game than a 30fps PS5 game.
 
Because there are not really any games that truly utilize and push hard on a 4090's horsepower.
In most games, when you max out the graphics settings and turn off ray tracing, the 4090's GPU load is almost never full. Ray tracing is taxing and fancy, of course, but it doesn't always drastically improve graphics.
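If anyone wants to check the "GPU load is almost never full" claim on their own machine, one low-effort way on an Nvidia card is to poll nvidia-smi while the game runs; a minimal sketch, assuming the nvidia-smi CLI is installed and on PATH:

```python
# Minimal sketch: poll GPU utilization once per second with nvidia-smi
# (assumes an Nvidia driver install with the nvidia-smi CLI on PATH).
import subprocess
import time

while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu,power.draw",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # e.g. "63 %, 287.41 W"
    time.sleep(1)
```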
 

Vick

Member
Because games aren't made to fully take advantage of the highest-end architectures, and what the power is used for nowadays is usually more advanced ray tracing support, which most of the time goes almost unnoticed by untrained eyes.

But there is a difference, and a pretty massive one in certain cases.
Granted, it's not the quantum leap there was in some previous gens, because resolution and framerate are high enough on consoles now, and in terms of temporal coverage most console games are pristine. But if you know what to look for in ray-traced games, the difference in some multiplatform titles is still massive.
 

64bitmodels

Reverse groomer.
To me, AAA titles like TLOU, GOW and Horizon might not always push graphics/performance as much as PC
The other 2 are questionable, but Horizon is undoubtedly a marquee graphics showcase and has probably done as much as Cyberpunk for advancing visual fidelity in games. Those graphics are ridiculously good.
 

64bitmodels

Reverse groomer.
You should've been around in the early days of 3D accelerators. Now that was a leap from consoles that today's gamer couldn't even begin to comprehend.
Texture filtering, dynamic lighting, no texture shimmer and 90fps. That must've been a crazy time for gamers.
 

Gaiff

SBI’s Resident Gaslighter
Depends on the games. A handful of games are made to scale dramatically with powerful PC hardware. These are the ones that usually blow every console game out of the water.

Right now there are: Cyberpunk 2077, Avatar: Frontiers of Pandora, and Alan Wake 2.

Not many, though, simply because most studios don't have the budget for top-tier animations, texture work, art, technology, etc. Those tend to be mostly first-party developers, and a large portion of them belong to Sony, who scales their games to its hardware and then ports them to PC without many improvements.

Hellblade 2 on PC might be a showcase too depending on how well they push the visuals on better hardware.
 

RickSanchez

Gold Member
Senua said it best, OP. You need to play a graphically intensive game on a 4090 on a high-quality 4K monitor with at least a 60Hz (ideally 120Hz) refresh rate, with all settings cranked up to maximum, to truly grasp the graphical power.

You seem to be making comparisons using second-hand info such as YouTube videos, which ruin the image quality, as people have pointed out.

Cyberpunk 2077, Avatar: Frontiers of Pandora, Alan Wake 2, Red Dead Redemption 2, Jedi: Survivor, etc. look basically lifelike on a 4090. I can tell you from first-hand experience. Can't wait for Horizon Forbidden West.
 

Elysium44

Banned
The textures are designed around the latest console hardware; the PC upgrade is mostly a resolution upgrade and some other tweaks.

This is it, and in truth the differences on PC are minor, and most people probably don't care. I've played Resident Evil 4 on PC at the highest settings, and then I played it on Xbox Series S where it runs at a lower resolution with reduced details. If you go all "Digital Foundry" you can pick holes in it if you want, but if you just play it rather than pausing it and pixel peeping, you essentially can't tell the difference.
 

Topher

Gold Member
Senua said it best, OP. You need to play a graphically intensive game on a 4090 on a high-quality 4K monitor with at least a 60Hz (ideally 120Hz) refresh rate to truly grasp the graphical power.

You seem to be making comparisons using second-hand info such as YouTube videos, which ruin the image quality, as people have pointed out.

Cyberpunk 2077, Avatar: Frontiers of Pandora, Alan Wake 2, Red Dead Redemption 2, Jedi: Survivor, etc. look basically lifelike on a 4090. I can tell you from first-hand experience. Can't wait for Horizon Forbidden West.

And a 4090 is really overkill in that comparison, since the PS5 will typically max out at 60fps. A 4080, maybe a 4070 (not sure), will still get those higher visuals, just at lower framerates than the 4090.
 

Senua

Member
This is it, and in truth the differences on PC are minor, and most people probably don't care. I've played Resident Evil 4 on PC at the highest settings, and then I played it on Xbox Series S where it runs at a lower resolution with reduced details. If you go all "Digital Foundry" you can pick holes in it if you want, but if you just play it rather than pausing it and pixel peeping, you essentially can't tell the difference.
Sure, if you played it on PC at a super low resolution. 4K alone would provide an insane boost in image quality. Doesn't the Series S run RE4 at like 1080p checkerboarded? That's insanely soft.
 

Vick

Member
The other 2 are questionable, but Horizon is undoubtedly a marquee graphics showcase and has probably done as much as Cyberpunk for advancing visual fidelity in games. Those graphics are ridiculously good.
I agree on GoW being somewhat questionable, but not TLOU, having just replayed Part II.
That game looked like this on a PS4 four years ago... these are all four-year-old Pro gameplay screenshots, not even from the Remaster.

[Sixteen PS4 Pro gameplay screenshots of The Last of Us Part II]
 

Guilty_AI

Member
It's worth noting that budget and talent are a much more constraining factor than hardware. Many games go for simpler graphics because it is cheaper that way.

Also, gameplay is king. Developing a graphical monster like Alan Wake 2 does not necessarily mean you'll get your money's worth back.
 

Elysium44

Banned
Sure, if you played it on PC at a super low resolution. 4K alone would provide an insane boost in image quality. Doesn't the Series S run RE4 at like 1080p checkerboarded? That's insanely soft.

Other than a few PC elitists and snobs, nobody cares about 4K. Most people game at 1080p, which is still fine. Until a game can make the visuals look better than the best Blu-ray movie (which is 1080p), and at 60fps rather than 24, bumping the resolution doesn't fundamentally improve the graphics. 4K was a mistake, a gimmick, and is why graphics progression has been stuck since around 2010.
 
This is it, and in truth the differences on PC are minor, and most people probably don't care. I've played Resident Evil 4 on PC at the highest settings, and then I played it on Xbox Series S where it runs at a lower resolution with reduced details. If you go all "Digital Foundry" you can pick holes in it if you want, but if you just play it rather than pausing it and pixel peeping, you essentially can't tell the difference.
These graphics comparison videos are cringe when they look virtually the same, with only a very subtle difference in lighting or slightly longer grass that isn't even noticeable until you pause and zoom in. People use them as fuel for console wars and to decide which version they are buying, like, why? You don't even notice it.
 

Vick

Member
Also, gameplay is king. Developing a graphical monster like Alan Wake 2 does not necessarily mean you'll get your money's worth back.
But I don't think that took all that much effort to begin with; the game is basically automated ray-tracing brute force.

Console version sucks.
 

Killjoy-NL

Member
The other 2 are questionable, but Horizon is undoubtedly a marquee graphics showcase and has probably done as much as Cyberpunk for advancing visual fidelity in games. Those graphics are ridiculously good.
The other two are polished for sure.

Pushing tech is arguable.
 

Senua

Member
Other than a few PC elitists and snobs, nobody cares about 4K. Most people game at 1080p, which is still fine. Until a game can make the visuals look better than the best Blu-ray movie (which is 1080p), and at 60fps rather than 24, bumping the resolution doesn't fundamentally improve the graphics. 4K was a mistake, a gimmick, and is why graphics progression has been stuck since around 2010.
I recently upgraded from a 2006 Sony Bravia 1080p TV to an LG C3 and the upgrade was fucking insane. The jump in clarity was genuinely eye-popping, and I'm not even talking about HDR or anything either, just resolution. I'm not going to shit on 1080p like a lot of people do, but to say 4K was a mistake is just ridiculous; it's a HUGE upgrade.

4K with DLAA is just heaven right now.

I do think 4K should be the focus for a long time going forward though; fuck 8K for a LONG time.

But I don't think that took all that much effort to begin with; the game is basically automated ray-tracing brute force.

Console version sucks.
Huh? The game looks fucking insane without ray tracing. The only reason the console versions suck is because the internal resolution is way too low, and that plus FSR equals an artifact-ridden mess. The asset quality, characters and shaders, among many other things in Alan Wake 2, are fucking nuts tbh.
 

64bitmodels

Reverse groomer.
Other than a few PC elitists and snobs, nobody cares about 4K.
Judging by how the PS5 and XSX marketed themselves as 4K beast machines and got 80 million sales off of that, it's safe to say that people do somewhat care.

Though I do agree that 4K is pretty mid and that AI upscaling solutions are the way to go from here... a hardware solution that is also best experienced on PC through Nvidia graphics cards.
 

Hohenheim

Member
When you experience modern games like Cyberpunk on a 4090 rig, in 4K with path tracing and all the good stuff that GPU gives you, well... you WILL see the difference. A huge difference.
In general, going back to console after playing a few games on PC... the difference is quite huge.
I have a 4090 and a 4K monitor, and it's a huge visual upgrade from my PS5.
Huge!
 

Elysium44

Banned
I agree on GoW being somewhat questionable, but not TLOU, having just replayed Part II.
That game looked like this on a PS4 four years ago... these are all four-year-old Pro gameplay screenshots, not even from the Remaster.

Artistic and programming excellence go a long way, don't they? Still nothing has surpassed those graphics, which were done on the relative potato hardware of the PS4. If a game comes out for PS5 now, it has to run at 400% of the resolution, so all the extra horsepower that could have gone into producing four times more detail and realism is instead wasted on a resolution bump, and/or on the other gimmick that is very heavy on resources but makes little to no difference: ray tracing.
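For what it's worth, the "400%" figure holds up on pixel count alone, though frame cost doesn't scale perfectly linearly with pixels; a quick check:

```python
# Quick check of the "400% the resolution" point: pixel counts only.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_4k = 3840 * 2160      # 8,294,400
print(pixels_4k / pixels_1080p)  # 4.0 -> native 4K pushes exactly 4x the pixels
# Caveat: real frame cost doesn't scale perfectly linearly with pixel count,
# since geometry, simulation and other per-frame work don't grow with resolution.
```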
 

Spyxos

Gold Member
I know a 4090 has like 80-90 teraflops of processing power vs the PS5's 8-10 teraflops. A 4090 costs about 3x what a PS5 does alone, so it should be much more powerful. With that said, why do we not see the difference? Yes, I know a 4090 can run similar-looking games at increased framerates, but why do we not see eye-melting graphics?

Throwing all the numbers out of the window, I am going to use the tools I was born with: the eye test. I look at PlayStation games like the Ratchet and Clank remake, Horizon Forbidden West, the Demon's Souls remake, even the Resident Evil 4 remake, and others that visually can go toe to toe with what the PC has to offer. Why? The PC is clearly superior and its price point reflects that, but why do I not see vastly improved visuals? I get it that on PC one has to develop for the lowest common denominator, but there are always some studios who are going to push the envelope.
Cyberpunk or Alan Wake 2 on the PC are, assuming you have the hardware, a very different experience.
 

Mr.Phoenix

Member
Anyone that tells you they do is talking out of their ass. And when you take the mountain of differences between specced-out hardware into consideration, YouTube or DF comparisons shouldn't have the final say on quality.

But there is an excellent reason why they don't, though. No one is making games specifically for that upper limit of PC hardware. People seem to forget that when a PC game is made, it covers the entire spectrum of PC hardware. The weakest target platform is not a console, it's a PC... which also just happens to be the strongest.

If you want to see a game that would blow everything else away, then someone needs to make a game that you can only run with an overclocked 13900K and a 4090. And at best at 30fps. With basic RT.
 

Elysium44

Banned
Not really. 4K is just a resolution lol. It's not like anyone has to use it.

The point is that if that resolution is the target for a console game, then it puts a ceiling on how good the graphics can be. If games capped out at 1080p (and let people who have 4K TVs simply upscale to 4K), then graphics could be a LOT better.
 

Senua

Member
Artistic and programming excellence go a long way, don't they? Still nothing has surpassed those graphics, which were done on the relative potato hardware of the PS4. If a game comes out for PS5 now, it has to run at 400% of the resolution, so all the extra horsepower that could have gone into producing four times more detail and realism is instead wasted on a resolution bump, and/or on the other gimmick that is very heavy on resources but makes little to no difference: ray tracing.
Nothing has surpassed TLoU2? You clearly just haven't experienced enough games, mate. Alan Wake 2 on a high-end PC absolutely crushes it. It's not 2020 anymore.
 