
Graphical Fidelity I Expect This Gen

My argument still stands. Regardless of the nm size, they will have to develop TODAY - as in, right now - in order to put something out for the PS6. And the PS6 isn't the next generation of the latest tech anyway; AMD's PC GPUs are. So we will know what the future console will look like when AMD's next GPU comes out, and it will more than likely be more powerful than the PS6. The PS6 will probably fall somewhere at the low end of that new generation of cards - which will probably be less powerful than a 3090 (regardless of efficiency advances).
The PS5 came with features that had only appeared on PC GPUs two years prior. It also benefited from AMD's latest efficiency and feature set, as sendit said.

Nvidia was recently showing real-time path tracing on complex scenes, IIRC. If we're lucky, GPUs able to handle that in games may arrive soon.

If Nanite can be made to work with deformable geometry, such solutions may even get hardware accelerated and apply to all objects. Certain parts of the workload could see a 10x+ performance improvement going from a software implementation to a hardware one.

Also, some believe Intel's ray tracing solution may be even faster than Nvidia's. If true, it could be reverse engineered and applied to future consoles.
 
Last edited:


yes, that 1080p, super aliased demo... really CGI like...
This is disingenuous. At the end of the day the demo has reached offline-rendered quality visuals, and it's a real playable demo. It's also an early demo, running in real time on $500 hardware a year after the consoles' launch. Two to three years from now this demo - which isn't super aliased, by the way - will be improved upon. Bottom line: the image quality wasn't super aliased and it reached CGI-like visuals.

 
This is why I waste my time talking to these guys. They make these claims EVERY single generation. It's laughable. And then they predict supercomputer-like hardware for the next generation ignoring any advances in the PC space. It's as if the consoles are always the lead platform with new advances despite the consoles adopting old tech developed years ago. They somehow believe that the next hardware will be better than the latest hardware that comes from Nvidia/AMD. Instead they should be paying attention to what the PC leads with and scaling DOWN from that.
You do know consoles sometimes outdo PC in certain aspects (PS5 SSD, PS3 Cell processor, Xbox 360 Xenos, etc.), and the best-looking games are on consoles (ignoring graphics settings). No real-time playable games look better than The Matrix Awakens, TLOU II, HFW, Ratchet and Clank: Rift Apart, or Spider-Man MM on PC. PC is pretty much 3rd-party games at max settings; no real exclusives take advantage of the RTX 3090, or we would have the best visuals available on PC. Consoles are more efficient with a closed architecture and API.

This is 1.8 TFLOPS… (running on PS5 60 fps patch)
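As a rough sanity check on the "1.8 TFLOPS" figure, the peak number falls out of the commonly cited PS4 GPU configuration (18 GCN compute units at 800 MHz). A back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the "1.8 TFLOPS" PS4 figure.
# Commonly cited PS4 GPU specs: 18 compute units, 64 shader lanes each,
# 800 MHz clock, 2 FLOPs per lane per clock (a fused multiply-add).
compute_units = 18
lanes_per_cu = 64
clock_hz = 800e6
flops_per_lane_per_clock = 2  # one FMA counts as 2 FLOPs

peak_flops = compute_units * lanes_per_cu * clock_hz * flops_per_lane_per_clock
print(f"{peak_flops / 1e12:.2f} TFLOPS")  # → 1.84 TFLOPS
```

That theoretical peak rounds to the "1.8 TFLOPS" quoted in the post; attainable throughput in real workloads is of course lower.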

 
Last edited:
Those games are still rendering clothing that looks and moves like cardboard cutouts, man. Even in the cutscenes of those newer games, every character looks straight-up muscleless in the jaw with animated lips; shit looks crazy.

This is disingenuous. At the end of the day the demo has reached offline-rendered quality visuals, and it's a real playable demo. It's also an early demo, running in real time on $500 hardware a year after the consoles' launch. Two to three years from now this demo - which isn't super aliased, by the way - will be improved upon. Bottom line: the image quality wasn't super aliased and it reached CGI-like visuals.


Same as the people who harp on Returnal being 1080p.
 

rofif

Can’t Git Gud
You do know consoles sometimes outdo PC in certain aspects (PS5 SSD, PS3 Cell processor, Xbox 360 Xenos, etc.), and the best-looking games are on consoles (ignoring graphics settings). No real-time playable games look better than The Matrix Awakens, TLOU II, HFW, Ratchet and Clank: Rift Apart, or Spider-Man MM on PC. PC is pretty much 3rd-party games at max settings; no real exclusives take advantage of the RTX 3090, or we would have the best visuals available on PC. Consoles are more efficient with a closed architecture and API.

This is 1.8 TFLOPS… (running on PS5 60 fps patch)


This is incredible. It trades punches with Uncharted 4 for best graphics.
And the hidden champion is Death Stranding. People forget that the cutscenes in it are also real time.
Some of it matches or exceeds TLOU's characters. The terrain looks great too.

Some spoilers, but skip to the 3-minute mark for the graphics.
 

SeraphJan

Member
I know various game devs personally, and I'm told by many of them that working with a unified architecture is way easier and preferable to working with the PC platform. They've also told me that a high-end PC might seem more powerful in the eyes of the consumer, but from a developer's perspective, developing for console is much preferred, since they can make sure every gamer experiences the same thing as their vision.

When they are developing a game for PC, they need extra headroom as well; they can't squeeze every last bit of memory allocation to make a certain scene happen, because there is always a lower-end PC to accommodate. Not to mention that high-end PC owners are the minority, not the majority: the most popular GPU on Steam today is still the GTX 1060, so when devs make a game for PC they are actually aiming for the most popular GPU. When you look at it from this perspective, your perception of the current-gen consoles' hardware capability might begin to change. If you want proof, just look at God of War 2018 on Steam (which, according to Digital Foundry, is basically a port with minimal enhancement): it still looks better than 99% of the games on Steam even by 2022 standards, while most multiplatform games released on Steam in 2018 look a bit outdated by 2022 standards, even counting the remasters.

People who worry about this gen's potential are overthinking it. If a Jaguar CPU plus a 1.8 TFLOPS GPU could run something as good as The Last of Us Part II, Red Dead Redemption 2, and Detroit: Become Human, there is no reason a Ryzen 3700X and RTX 2080 equivalent could not produce something way better.
 
Last edited:

VFXVeteran

Banned
Then why are you predicting the future performance/capabilities of the PS6?
I'm only trying to be realistic about performance. Surely you have to admit that this kind of talk happens every generation, and the hardware is always down on power at the start of a generation. Don't you agree that the PS4/X1 were underpowered and required a mid-gen refresh? There had never been a mid-gen refresh in the history of consoles until the PS4 days.

And no it isn't. However, aside from missing dedicated hardware for AI-accelerated functions, the PS5 and XSX support pretty much all of the latest technologies that surround the RDNA 2 architecture.
I disagree. There isn't dedicated hardware for RT either. They are using shader cores for RT, which eats up bandwidth on the rasterization side.
 
Last edited:

VFXVeteran

Banned
This is disingenuous. At the end of the day the demo has reached offline-rendered quality visuals, and it's a real playable demo. It's also an early demo, running in real time on $500 hardware a year after the consoles' launch. Two to three years from now this demo - which isn't super aliased, by the way - will be improved upon. Bottom line: the image quality wasn't super aliased and it reached CGI-like visuals.


That is completely ignorant of the CG side of things. That Matrix demo isn't even close to CG-like visuals. I could break down every aspect that isn't up to the task, but it would be a waste of time and energy, as nothing will change your outlook on it - just like the early PS4-generation claims.
 

VFXVeteran

Banned
You do know consoles sometimes outdo PC in certain aspects (PS5 SSD, PS3 Cell processor, Xbox 360 Xenos, etc.), and the best-looking games are on consoles (ignoring graphics settings)
This is, and will always be, the subjective opinion of Sony gamers. It's simply objectively false with regard to tech. The high-end GPU cards are doing way more in the rendering pipeline than any of the consoles.

No real-time playable games look better than The Matrix Awakens,
The Matrix Awakens is a demo and can be run on a PC at better fidelity.

TLOU II, HFW, Ratchet and Clank: Rift Apart, Spider-Man MM on PC
Of course, Sony warrior. All of those games are better looking than PC games because you "see what you want to see". I've given up on trying to change that; it's beating a dead horse.

… PC is pretty much 3rd-party games at max settings; no real exclusives take advantage of the RTX 3090, or we would have the best visuals available on PC. Consoles are more efficient with a closed architecture and API…
The bandwidth of the 3090 is used to 100% capacity when rendering at 4K/60 with all RT features enabled, and even that's not enough - it requires DLSS to hit that mark. The consoles are nowhere near that kind of power, and I doubt the PS6 will be either unless it uses dedicated AI hardware. 4K isn't just the final framebuffer; it must be carried throughout the entire rendering pipeline for 1:1 pixel-to-shading quality. As far as bandwidth goes, the consoles have already been tapped out this generation. Rendering a 4K/60 FPS framebuffer with full RT features on a console is out of the question this generation.
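To put rough numbers on the 4K-versus-1080p argument, here is a back-of-the-envelope sketch. The bytes-per-pixel figure is an assumption for illustration, not a measured value from any engine:

```python
# Rough illustration of why 4K/60 strains bandwidth vs 1080p.
w4k, h4k = 3840, 2160
w1080, h1080 = 1920, 1080
fps = 60
bytes_per_pixel = 16  # assumed: HDR color + depth + a couple of G-buffer targets

pixels_4k = w4k * h4k            # 8,294,400 pixels per frame
pixels_1080 = w1080 * h1080      # 2,073,600 pixels per frame
print(pixels_4k / pixels_1080)   # → 4.0: four times the per-frame shading work

traffic_gb_s = pixels_4k * bytes_per_pixel * fps / 1e9
print(f"~{traffic_gb_s:.1f} GB/s just writing those targets once per frame")
```

Real pipelines read and write each target several times per frame, so the actual traffic is a multiple of this - which is the point being made about carrying 4K through the whole pipeline.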
 
Last edited:

sendit

Member
I'm only trying to be realistic about performance. Surely you have to admit that this kind of talk happens every generation, and the hardware is always down on power at the start of a generation. Don't you agree that the PS4/X1 were underpowered and required a mid-gen refresh? There had never been a mid-gen refresh in the history of consoles until the PS4 days.


I disagree. There isn't dedicated hardware for RT either. They are using shader cores for RT, which eats up bandwidth on the rasterization side.

Consoles will always be behind the performance curve if you compare them to PC hardware that follows a tick-tock schedule instead of a refresh every 6-7 years. Additionally, the consoles lack dedicated RT hardware for the same reason AMD's PC cards do - they share the same RDNA 2 design.

If I had it my way, I'd want a refresh every 1-2 years in the console space.
 
This is why I waste my time talking to these guys. They make these claims EVERY single generation. It's laughable. And then they predict supercomputer-like hardware for the next generation ignoring any advances in the PC space. It's as if the consoles are always the lead platform with new advances despite the consoles adopting old tech developed years ago. They somehow believe that the next hardware will be better than the latest hardware that comes from Nvidia/AMD. Instead they should be paying attention to what the PC leads with and scaling DOWN from that.

The idea that PC is the lead platform is laughable, and it marks someone who is misinformed about the true nature of game development.
You think that because you get LOD, shadow, ambient occlusion, and volumetric resolution sliders, your PC is the lead platform? That's laughable.
None of that stuff meaningfully contributes to the actual visual fidelity of a game scene.

Consoles ARE the target platform. The characters on the ~1 TFLOP console have the same geometry density (triangles) as on your 10x-more-powerful GPU and 10x+-more-powerful CPU. The same goes for general object geometry density. A water bottle on the 10x-more-powerful PC isn't 1,000 triangles while on the 1 TFLOP console it's 15 triangles... they are both 15 triangles. We have access to the FBX game files through hackers; they are IDENTICAL.

An object's triangles (geometric density) and lighting are the most important parts of visual fidelity. Heck, even the textures are completely identical, and only about 1% of games offer 4K textures on PC - and that only started gaining traction because the XOX supported 4K textures. Material quality is also important, and it is identical across the board.

For example, just like HFW, current-gen games are about to get a huge upgrade in character triangles, with better material and skin shading, better eye and hair shading, and peach fuzz.

Yet why didn't PC games uniquely get these improvements as the gen progressed, with PCs getting more powerful while the XBO and PS4 stayed static? Why didn't they have peach fuzz? For years PCs had GPUs around the power level of the XSX and PS5... There is a simple logical answer if PC WAS the lead platform. Yes, you had one-offs like TressFX in Tomb Raider on PC vs 360, but those are the usual 1% exceptions.

Yet this gen, a huge amount of games will have significantly better character fidelity WITH peach fuzz.

If you want me to use a game as an example, then look at the 2012 Watch Dogs demo. When the game finally released in 2014, it had been downgraded significantly from that first reveal. The original demo ran on an NVIDIA GeForce GTX 680: 3.0 TFLOPS (~2.0 attainable) with only 2GB of VRAM. If PC were the lead platform, the PC version would have looked like the 2012 demo. Yet it's not even in the same stratosphere. Why did they downgrade object density (triangles), lighting, particles, building interiors, and textures, making it look identical to what we got on consoles? WTF, I thought PC was the lead platform?

Even then, by 2016 when Watch Dogs 2 released, PC GPUs had gotten far more powerful - we had the GTX 1080 at 9 TFLOPS with 8GB of VRAM. Yet Watch Dogs 2 on PC looked worse than the heavily downgraded Watch Dogs 1. What gives? I thought PC was the lead platform? Wouldn't that mean tools and techniques would be developed to take advantage of it? Why did Watch Dogs 2 look virtually identical to its console counterpart and worse than Watch Dogs 1?

Then in 2020 we had Watch Dogs: Legion, eight years after the original Watch Dogs E3 demo. PC GPUs are orders of magnitude more powerful than the NVIDIA GeForce GTX 680 that ran the original demo. Yet Legion, with all the ray-tracing sliders on earth, STILL doesn't hold a candle to the original demo. It doesn't even come close.
Yet when you compare, for example, a ~1 TFLOP box with a ~10 TFLOP box, it looks almost identical. Why is that? I thought PC was the lead platform?

IT'S BECAUSE THE THINGS THAT MATTER ARE COMPLETELY IDENTICAL!!!

PC settings Ultra Sliders are worthless.




Again, this is an NVIDIA GeForce GTX 680 demo: 3.0 TFLOPS (~2.0 attainable) with only 2GB of VRAM.

(GIFs of the 2012 Watch Dogs demo)
 
Last edited:

SlimySnake

Flashless at the Golden Globes
The idea that PC is the lead platform is laughable, and it marks someone who is misinformed about the true nature of game development.
You think that because you get LOD, shadow, ambient occlusion, and volumetric resolution sliders, your PC is the lead platform? That's laughable.
None of that stuff meaningfully contributes to the actual visual fidelity of a game scene. […]

Don't waste your time. This has been explained to him over the last five years across two different forums.

For fun, you can look at his predictions of next-gen visuals. He even created a thread showing PC versions of last-gen games like Star Wars Jedi: Fallen Order, saying that's how good next-gen games will look, lol.
 

VFXVeteran

Banned
The idea that PC is the lead platform is laughable, and it marks someone who is misinformed about the true nature of game development.
You think that because you get LOD, shadow, ambient occlusion, and volumetric resolution sliders, your PC is the lead platform? That's laughable.
None of that stuff meaningfully contributes to the actual visual fidelity of a game scene.
Which GPU implemented hardware RT first? The PS5 or the RTX 20-series boards?

Your comment about the increased samples is blatantly false: a slider that provides more samples does indeed increase visual fidelity. If the sliders didn't do anything at all, why would they be there? There have been far more accurate ambient occlusion algorithms than the typical SSAO. The most modern AO is done using RT, but before that there was world-space AO, which is much more accurate since the computations are done in world space. Again, this is a feature that has never been available on the consoles due to lack of bandwidth.

Consoles ARE the target platform.
Target platform and lead platform are two different things. New graphics features are implemented on the high-end GPUs first, not the other way around.
 
Last edited:

VFXVeteran

Banned
Dont waste your time. This has been explained to him over the last five years across two different forums.

For fun, you can look at his predictions of next gen visuals. He even created thread showing PC versions of last gen games like Star Wars Jedi Fallen order saying how thats how good next gen games will look lol
And yet next-gen games haven't implemented anything far beyond Jedi: Fallen Order. We are three years into the new console generation and I'm still waiting for that game. And you are right, I'm wasting my time. You guys have been bragging about the graphics of every single new PS-exclusive release since this gen started, and yet, after a couple of weeks, once the hype has faded, it's back to wishing for the next hardware upgrade.
 
Last edited:

SeraphJan

Member
The most popular GPU in the PC gaming space is the GTX 1060, according to the statistics.

If you were a dev, would you aim for the PS5/Series X or the 1060 to meet your vision?

This notion that "consoles are holding gaming back" is nonsense. If anything is holding gaming back, it's those outdated PCs that refuse to die, which make up the majority of the PC gaming space.

If you look at the history of video games, every time gaming technology leaps forward, it was the console space that pushed it. The logic is pretty simple: there is less incentive for the majority of outdated-PC owners to get a new PC, compared to consoles, where you have to get the new hardware or you won't be able to play any new releases at all. In the PC space, they can still lower all the settings to potato and make the game at least playable.

Looking only at high-end PC users who change GPUs annually is like judging a gaming community by looking only at how the speedrunners play the game.

As someone who personally owns at least 10 gaming PCs concurrently, I'm still looking forward to how the console space evolves.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Is this guy living in the real world? 3 years into what?!
Time moves faster when you are bullshitting.

And yet - next-gen games haven't implemented anything far and away from Jedi Fallen.
That has nothing to do with the PS5 or XSX specs. Every game is held back by its last-gen versions. You know this, but you can't help yourself. Matrix, Avatar, and other UE5 demos show what you can accomplish by targeting PS5 and XSX specs.
This notion that "consoles are holding gaming back" is nonsense. If anything is holding gaming back, it's those outdated PCs that refuse to die, which make up the majority of the PC gaming space.
It's definitely not the 1060. No dev gives a shit about PC gaming; consoles decide when the leap is made. If it were PCs, then the Pascal-era 1080 Ti would've allowed devs to push Matrix-quality visuals years ago.

What's happened this gen is an aberration. Third parties were always a year late, but they did show up - with Mass Effect, BioShock, and CoD4 in the PS3 era. In the PS4 era, AC Unity, Watch Dogs (despite the downgrade), and CoD: Advanced Warfare all came a year in but had the generational leap we've come to expect.

This time around they all missed the first year and are literally nowhere to be found. To make matters worse, Sony's and MS's first parties have also phoned it in with cross-gen games and underwhelming next-gen-only games. If it weren't for Epic, the Matrix demo, and the first UE5 demo, we would have no idea how impressive these new consoles really are.

This industry is struggling because of publisher greed and a lack of ambition on the devs' part, not because of the hardware specs of the new consoles or the GTX 1060. I hope we see some next-gen games at this E3.
 
The hardware RT is basically using shader cores (which, of course, is hardware acceleration). They don't have separate RT cores like Nvidia does, whose dedicated cores free up bandwidth for rasterization.
It is shader cores, but I believe they have additional hardware to accelerate part of the ray tracing. It is not like the 1080, which has to do it all in software.
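For context on the split being described: RDNA 2 provides intersection instructions that accelerate the ray/box and ray/triangle tests inside BVH traversal, while the traversal loop itself runs on the shader cores. A minimal software version of the per-node "slab test" illustrates the work those instructions replace (an illustrative sketch, not any engine's actual code):

```python
# Minimal software ray-vs-AABB "slab test" -- the per-node intersection work
# that RDNA 2's ray accelerators handle in hardware, while the traversal
# loop itself still runs on the shader cores.
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv  # distance to the near slab plane on this axis
        t2 = (hi - o) * inv  # distance to the far slab plane on this axis
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax  # the slabs overlap, so the ray enters the box

# Ray from the origin along +x against a box straddling the x axis
# (very large inverse components stand in for a zero direction component):
print(ray_hits_aabb((0, 0, 0), (1.0, 1e30, 1e30), (1, -1, -1), (2, 1, 1)))  # → True
```

Doing this test in fixed-function hardware per BVH node, rather than in shader ALUs, is where the "dedicated RT core" advantage comes from.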
And yet next-gen games haven't implemented anything far beyond Jedi: Fallen Order. We are three years into the new console generation and I'm still waiting for that game. And you are right, I'm wasting my time. You guys have been bragging about the graphics of every single new PS-exclusive release since this gen started, and yet, after a couple of weeks, once the hype has faded, it's back to wishing for the next hardware upgrade.
The hair shading on the main characters in Ratchet is a step up from any prior creature hair shading.

Also, Spider-Man's ray-traced reflections have a long draw distance, unlike some cities in prior games where the ray tracing had a very limited draw distance.

And if we include the Matrix demo, the wire fences have geometric wires with perfect roundness, unlike the 2D-texture wire fences found in past games.
 
Last edited:

anothertech

Member
The idea that PC is the lead platform is laughable, and it marks someone who is misinformed about the true nature of game development. You think that because you get LOD, shadow, ambient occlusion, and volumetric resolution sliders, your PC is the lead platform? That's laughable. None of that stuff meaningfully contributes to the actual visual fidelity of a game scene. […]

If you want me to use a game as an example, then look at the 2012 Watch Dogs demo. When the game finally released in 2014, it had been downgraded significantly from that first reveal. […]

Honestly, the closest thing to that reveal is Cyberpunk 2077 on next-gen hardware after the latest patch - over a decade later. And not the buggy mess we got on PC.

It's nice we finally got there, but damn did we have to wait.
 

VFXVeteran

Banned
Time moves faster when you are bullshitting.


That has nothing to do with the PS5 or XSX specs. Every game is held back by last gen versions. You know this but you cant help yourself. Matrix, Avatar and other UE5 engine demos show what you can accomplish by targeting PS5 and XSX specs.
I'm tired of arguing over and over again with people who have no contacts in the industry and pose as expert graphics programmers without having written a single line of code implementing anything remotely graphics-oriented. Continue your talk about technology being pushed forward by consoles, therefore yielding the best rendering results, on par with CG movies. I'm done with this thread.
 
Last edited:

winjer

Gold Member
The most popular GPU in the PC gaming space is the GTX 1060, according to the statistics. […]

This notion that "consoles are holding gaming back" is nonsense. If anything is holding gaming back, it's those outdated PCs that refuse to die, which make up the majority of the PC gaming space. […]

Consider that Steam had 132 million monthly active players in 2021.
Then consider that about a third of them have an RTX card, plus those with a GTX 1080/1080 Ti, RX 5700 XT, RX 6600 XT, or better - cards with performance close to the current-gen consoles.
That gets us close to 40-50% of Steam users with decent GPUs: more than 50 million users with good GPUs. The PS5 and Series X combined probably won't reach that number in 2022.
Also consider that the GTX 1060 only accounts for 7.95% of users, and remember that the GTX 1060 6GB is more powerful than the Series S.
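Taking the post's own numbers at face value (they are the poster's estimates, not verified figures), the arithmetic does work out as claimed:

```python
# Sanity check of the post's estimate: 132M monthly active Steam users,
# of whom 40-50% have "decent" GPUs (using the low-end 40% figure).
monthly_active = 132e6
share_capable = 0.40  # the post's low-end estimate

users_with_good_gpus = monthly_active * share_capable
print(f"{users_with_good_gpus / 1e6:.1f} million")  # → 52.8 million
```

So even the low end of the estimate clears the "more than 50 million" figure quoted in the post.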

So this idea that PCs are slowing down game development this generation is complete nonsense.
Truth be told, it's the opposite, because the consoles have much weaker RT and AI capabilities.

And mind you, this is at a time when it's near impossible to get a GPU at MSRP.
If it weren't for that, the average GPU would be even better.
 
Last edited:
Target platform and lead platform are two different things. New graphics features are implemented on the high-end GPUs first. Not the other way around.
I literally used the word "lead" seven times. This is your response to my post?
I'm tired of arguing over and over and over again with people who have no contacts in the industry and are expert programmers in the graphics field without a single line of code written to implement anything remotely graphics oriented. Continue your talk about technology being pushed forward by consoles and therefore yield the best rendering results and are on par with CG-movies. I'm done with this thread.
Throughout this thread you've proven with each statement that you have no idea what you are talking about.
You display ridiculous elitism. You have provided exactly 0 arguments, 0 evidence, 0 points, 0 facts. Nothing. Zip. Nada. Nil. Null.

And this is coming from someone who has pointed out incorrect statements made by PS5 fans on a regular basis. But your whole "it's Sony fans" crusade is so tiring. It's also hilarious that you talk about the industry, because dozens of top VFX industry personnel met up with the Unity team at GDC and told them they were blown away.

Here is me, for example, correcting something that a Sony fan says. Proving yet again it's about truth and facts, not emotions and vendettas.

You do know consoles sometimes outdo PCs in certain aspects (PS5 SSD, PS3 Cell processor, Xbox 360 Xenos, etc.), and the best-looking games are on consoles (ignoring graphics settings). No real-time playable games look better than The Matrix Awakens, TLOU II, HFW, Ratchet and Clank: Rift Apart, or Spider-Man MM on PC… PC is pretty much third-party games at max settings; no real exclusives take advantage of an RTX 3090, or we would have the best visuals available on PC… consoles are more efficient with a closed architecture and API…

This is 1.8 TFLOPS… (running on PS5 60 fps patch)


I have to push back on this a little, because the Matrix Awakens demo will literally be released in 2 days (on April 5th) on PC, just like the Valley of the Ancient, which ran at a higher native resolution and fps than on the PS5 and XSX (1080p and 30 fps), and did so without a super-fast SSD (god-like IO), DirectStorage, or compression/decompression tech (RTXIO).

The Matrix Awakens will follow suit and have higher performance and resolution than the consoles.

But that's completely moot here. What VFXVeteran doesn't understand, can't understand, or won't understand is that The Matrix Awakens and the Nanite tech powering it would not exist without the PS5 and XS. Full-time development of Nanite was started in 2017 by Brian Karis after Epic got word that the next consoles would have enough power. Later, in 2019, he was joined by other engineers to help out.

Nanite had been in active research by Brian Karis since 2009; he actually created demos back then and showed them to Epic, but GPUs weren't powerful enough at the time. If PC were the lead platform, as VFX claims, why did Epic and Brian Karis wait until 2017 to commission Nanite? It should have been introduced on PC as a unique PC feature way back.

But that didn't happen. It was when the new consoles came along that Nanite was introduced.
And? If you're implying that the Samaritan model uses fewer polygons than these games, that's even worse, man; it's disappointing.

My video games are not doing this. Why aren't retail games moving like this in motion with even more polygons?
You claimed that Arkham Knight surpassed the Samaritan demo, but I disagree. The Arkham Knight I've played on PC (maxed settings) looked like ass compared to this:

This is what you consider superior? Am I being punked here? Literally every single game has better hair, skin shaders, and eye shaders.
The eye is literally just a texture. Like, what's wrong with you?
This is 2015-era hair, skin and eye rendering, and it's still orders of magnitude better. This has to be trolling!

How many polygons does the Infiltrator demo use, from characters to environment assets? If the poly-count budget is lower than current games' too, then where are the video games that look like the Infiltrator demo?

Unreal Engine 4 Infiltrator Demo Released | Geeks3D

Not sure what you are trying to say here. That games don't have the horrible skin, eye and hair shaders the Infiltrator demo has? Or that its material and texture work hasn't been matched and superseded? Or that the same environment design hasn't been replicated? If you played Control at all, you would know that the power plant's maintenance section (if you look up) has a similar look to the Infiltrator demo. There's not one thing you can point to in the Infiltrator demo that looks better than games coming out today.
 
You do know consoles sometimes outdo PCs in certain aspects (PS5 SSD, PS3 Cell processor, Xbox 360 Xenos, etc.), and the best-looking games are on consoles (ignoring graphics settings). No real-time playable games look better than The Matrix Awakens, TLOU II, HFW, Ratchet and Clank: Rift Apart, or Spider-Man MM on PC… PC is pretty much third-party games at max settings; no real exclusives take advantage of an RTX 3090, or we would have the best visuals available on PC… consoles are more efficient with a closed architecture and API…

This is 1.8 TFLOPS… (running on PS5 60 fps patch)


Don't bother. He cannot grasp the facts you just stated. PCs with 50-TFLOP RTX 5090 Ti GPUs will still just be running games built around the constraints of the 4-TFLOP Xbox Series S. To most people at normal TV viewing distances, that $3000-4000 PC experience versus a 12-TFLOP Series X will not look all that different.
 

rofif

Can’t Git Gud
Consider that Steam in 2021 had 132 million monthly active players.
Then consider that a third of them have an RTX card. Then there are those with a GTX 1080/1080 Ti, RX 5700 XT, RX 6600 XT or better. These are cards with performance close to the current-gen consoles.
So we are getting close to 40-50% of Steam users having decent GPUs. That's more than 50 million users with good GPUs. The PS5 and Series X combined probably won't reach this figure in 2022.
Also consider that the GTX 1060 only accounts for 7.95% of users, and remember that the 6GB GTX 1060 is more powerful than the Series S.

So this idea that PCs are slowing down game development this generation is complete nonsense.
Truth be told, it's the opposite, because consoles have much weaker RT and AI capabilities.

And mind you, this is at a time when it's near impossible to get a GPU at MSRP.
If it wasn't for this, the average GPU would be even better.
PCs are not easier to develop for just because of more power...
It's still much more difficult to develop properly for PC than for a console, I would imagine.
 

Haggard

Banned
This notion of "consoles are holding gaming back" is nonsense; if anything is holding gaming back, it's those outdated PCs that refuse to die, which constitute the majority of the PC gaming space.
Horrendous nonsense... anything with a noteworthy budget always targets the standardized console specs first and foremost and then scales from there.
Lower-end PCs aren't holding gaming back any more than a common calculator, because neither is taken into consideration when a game's scope is decided upon.
The lowest common denominator a game has to run on is always the weakest console it releases on, not some office PC.
 

SeraphJan

Member
Horrendous nonsense... anything with a noteworthy budget always targets the standardized console specs first and foremost and then scales from there.
Lower-end PCs aren't holding gaming back any more than a common calculator, because neither is taken into consideration when a game's scope is decided upon.
The lowest common denominator a game has to run on is always the weakest console it releases on, not some office PC.
How are consoles holding gaming back when the weakest console is the PS5/XSX, which is way stronger than the most popular GPU, the 1060? The notion that "consoles are holding gaming back" comes from people who claim PC is the preferable platform. If there is no comparison, how does a lowest common denominator even exist? Since the current-gen XSX and PS5 are identical in spec (and neither is the lowest common denominator), then by your theory (since you hypocritically claim you are not comparing consoles to PC) a lowest common denominator should not exist at all, which just invalidates your entire argument. If you are talking about cross-gen, that's another story; cross-gen won't continue forever (and it's called cross-gen for a reason, because it's not current-gen yet).

So in the end, how is my statement "consoles holding gaming back is nonsense" horrendous nonsense? If you are not comparing PC to console, there is no lowest common denominator to begin with, so nothing is holding anything back; if you are comparing, however, then if anything is holding things back it would be outdated PCs. Mind you, the word "if" is the key here.

If you want to read, read my entire statement, including the link to my previous thread (where I clearly stated devs prefer consoles, so I don't know what you are even arguing; are you saying devs should aim for high-end PCs only? If so, you might have a hard time trying to convince them, good luck). Nitpicking a single sentence out of context and calling it nonsense is meaningless; it does not represent what I'm saying at all.
 
Don't waste your time. This has been explained to him over the last five years across two different forums.

For fun, you can look at his predictions of next-gen visuals. He even created a thread showing PC versions of last-gen games like Star Wars Jedi: Fallen Order, saying that's how good next-gen games will look lol
Lol. The worst is just how much he contradicts himself. It's simply amazing how someone can contradict themselves so much. It's not normal.
Sure.

Think of all these last-gen games (CoD, Control, Metro, Jedi Fallen, etc.) on PC. I think most will look like that but target 4K/30. The little bit of ray tracing that we'll see will be small. Think of BF5's reflections or Metro's ambient occlusion.
 

Haggard

Banned
How are consoles holding gaming back when the weakest console is the PS5/XSX, which is way stronger than the most popular GPU, the 1060? The notion that "consoles are holding gaming back" comes from people who claim PC is the preferable platform. If there is no comparison, how does a lowest common denominator even exist? Since the current-gen XSX and PS5 are identical in spec (and neither is the lowest common denominator), then by your theory (since you hypocritically claim you are not comparing consoles to PC) a lowest common denominator should not exist at all, which just invalidates your entire argument. If you are talking about cross-gen, that's another story; cross-gen won't continue forever (and it's called cross-gen for a reason, because it's not current-gen yet).

So in the end, how is my statement "consoles holding gaming back is nonsense" horrendous nonsense? If you are not comparing PC to console, there is no lowest common denominator to begin with, so nothing is holding anything back; if you are comparing, however, then if anything is holding things back it would be outdated PCs. Mind you, the word "if" is the key here.

If you want to read, read my entire statement, including the link to my previous thread (where I clearly stated devs prefer consoles, so I don't know what you are even arguing; are you saying devs should aim for high-end PCs only? If so, you might have a hard time trying to convince them, good luck). Nitpicking a single sentence out of context and calling it nonsense is meaningless; it does not represent what I'm saying at all.
You go on about others not reading thoroughly enough, while you didn't even bother to check which part I quoted or what I answered; then you babble about stuff that other people (probably imaginary) seem to have said to you, and you also forgot that the XSS exists.

Stay off the drugs.
 
I'm tired of arguing over and over again with people who have no contacts in the industry yet pose as expert graphics programmers without having written a single line of code to implement anything remotely graphics-oriented. Keep talking about how technology is pushed forward by consoles and therefore yields the best rendering results, on par with CG movies. I'm done with this thread.
But he’s right…common sense says so. When you target next gen hardware only you get The Matrix Awakens, Ratchet and Clank: Rift Apart, Spiderman MM etc…even TLOU II was done with ONLY 1.84 TFLOPS in mind…it STILL looks better than most newer titles.
 

SeraphJan

Member
But he’s right…common sense says so. When you target next gen hardware only you get The Matrix Awakens, Ratchet and Clank: Rift Apart, Spiderman MM etc…even TLOU II was done with ONLY 1.84 TFLOPS in mind…it STILL looks better than most newer titles.
Not to mention God of War 2018 on Steam looks better than most triple-A games released on Steam in 2022. According to Digital Foundry, GoW 2018 barely had any enhancements (aside from some minor ones that are extremely difficult to spot with the naked eye, such as slightly improved shadows); they basically just straight-ported the game to PC.
 
Horrendous nonsense... anything with a noteworthy budget always targets the standardized console specs first and foremost and then scales from there.
Lower-end PCs aren't holding gaming back any more than a common calculator, because neither is taken into consideration when a game's scope is decided upon.
The lowest common denominator a game has to run on is always the weakest console it releases on, not some office PC.
Well, tbh, a lot of games have to be scalable to lower-end PCs too… but yes, office PCs aren't usually taken into account.
 

SeraphJan

Member
You go on about others not reading thoroughly enough, while you didn't even bother to check which part I quoted or what I answered; then you babble about stuff that other people (probably imaginary) seem to have said to you, and you also forgot that the XSS exists.

Stay off the drugs.
This just proves you've read nothing. Stop wasting my time by quoting me if you are not even going to read in the first place.
 
You go on about others not reading thoroughly enough, while you didn't even bother to check which part I quoted or what I answered; then you babble about stuff that other people (probably imaginary) seem to have said to you, and you also forgot that the XSS exists.

Stay off the drugs.
This just proves you've read nothing. Stop wasting my time by quoting me if you are not even going to read in the first place.

 

Haggard

Banned
This just proves you've read nothing. Stop wasting my time by quoting me if you are not even going to read in the first place.
If you're too dumb to understand more than 2 sentences at once, don't go on internet boards. You haven't replied to what I wrote, but to some mumbo jumbo in your fantasy.
 

SeraphJan

Member
If you're too dumb to understand more than 2 sentences at once, don't go on internet boards. You haven't replied to what I wrote, but to some mumbo jumbo in your fantasy.
Stop wasting my time; go troll someone else. I hope at least you understand what this sentence means.
 

Haggard

Banned
Stop wasting my time, go troll someone else
You waste your own time by writing paragraphs to the voices in your head instead of actually reading what you quote first…
Well, one more entry for my ignore list.

Well, tbh, a lot of games have to be scalable to lower-end PCs too… but yes, office PCs aren't usually taken into account.
Scalability usually isn't an issue when it comes to graphics. The base on which any game has to hit a good balance between graphics and performance, however, is always the consoles.
In a purely technological sense, consoles will always hold back game development as their fixed hardware ages, but for mass-market products like games, targeting high-end PC specs would be economic suicide, which makes it a non-argument.
Developers have to earn money, after all.
 

SeraphJan

Member
You waste your own time by writing paragraphs to the voices in your head instead of actually reading what you quote first…
Well, one more entry for my ignore list.
You are the one who initially quoted me out of nowhere, ignored everything I've said here and here, cherry-picked one sentence, and made a self-conflicting statement. When I fully responded to your nonsense, you pretended you didn't see it and went full troll mode.

Get a life and stop bothering me; we are done.
 

Haggard

Banned
You are the one who initially quoted me out of nowhere, ignored everything I've said here and here, cherry-picked one sentence, and made a self-conflicting statement. When I fully responded to your nonsense, you pretended you didn't see it and went full troll mode.

Get a life and stop bothering me; we are done.
Your extra chromosomes seem to get in the way of your reading capabilities. Don't worry, I think there are special jobs people like you can do... news anchor at Fox, maybe.
Gee, you seriously made me go through the hassle of actually clicking on your nickname to get to the ignore button. Shame on you.
 

SeraphJan

Member
Your extra chromosomes seem to get in the way of your reading capabilities.
Gee, you seriously made me go through the hassle of actually clicking on your nickname to get to the ignore button. Shame on you.
Didn't you just claim you already did that a while ago? I thought you couldn't read, but it seems your memory is falling apart too. How many times do I have to tell you to stop quoting me, or can you just not help yourself resorting to insults one more time? You clearly had no intent for a meaningful discussion to begin with; I don't like repeating myself to someone who just debates to win.
 