
Graphical Fidelity I Expect This Gen

Edit: if you're literally just talking about having VFX hardware prebake the lighting in game scenes, they already do spend hours/days prebaking the lighting in video games using the same kind of hardware. The difference is that all the decisions I outlined above have already been made, and that's why it doesn't look as good.
In some instances it looks photoreal, so I assume it should be possible in general for static scenes with static lighting. The reflective surfaces can be handled with less precision by ray tracing.
 
Never ever say that better hardware isn't needed. That's what Bill Gates thought before extended kernel RAM in DOS was required for bigger programs. There will ALWAYS be a need for more clarity and a better approximation to the real world. To keep yourself from saying such nonsense, always envision a diagonal line on a whiteboard in the real world with not a single stairstep. Now superimpose that analog straight line onto a digital plane with pixels. We want to get as close to the real world as we can. We don't stop just because people think a 1440p approximation is "good enough". We will need precision smaller than even that pixel on the monitor if we hope to get better clarity of the real world. All those interactions with light add up to accumulated "error" when the final pixel color needs to make that line. It's the same with texturing. We should never say 4K texture maps are a good enough approximation.
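A tiny, purely illustrative sketch of that stairstep idea (just throwaway Python printing an 8x8 grid, not engine code):

# The ideal analog diagonal y = x, forced onto a coarse pixel grid,
# can only ever be a staircase of discrete pixels.
width = 8
for y in range(width):
    print("".join("#" if x == y else "." for x in range(width)))
# More samples per pixel (higher resolution, supersampling) shrink the steps,
# but they never disappear entirely.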
Better is always going to be better… but after a while it's unnecessary…
 

PUNKem733

Member
So how long until Hollywood levels of CGI? PS8-PS9? The 2040s or 2050s? I think the CEO of Epic was all like it takes 40 TF for photorealism, but I always thought that was BS. GPUs are coming out with 100 TF and they won't be close.
 

VFXVeteran

Banned
True, some more complex scenes can take over 100 hours. But I've heard that in many cases a few hours suffice.

So are you saying that with their likely hundreds of $20K+ workstations and probable access to a render farm, they can't achieve good results even if they devoted a few hours to prebaking the lighting of a scene?
To be honest, you lost me. I thought we were talking about realtime pre-baked lighting results vs. correct results.
 

VFXVeteran

Banned
Doesn't matter what's used… it's the results.
No. That's what's wrong with that argument. You can't bring an objectively false rendering result to a subjective "it looks better to me" argument. No one can argue that at all. I could say Quake 1 looks better than any game today and people would laugh at me because the rendering results are so far off compared to what is correct today (i.e. rendering resolutions, techniques, etc. all now have better approximations to the rendering equation)… yet I should still feel my opinion has merit?
 

Neilg

Member
In some instances it looks photoreal, so I assume it should be possible in general for static scenes with static lighting. The reflective surfaces can be handled with less precision by ray tracing.

It absolutely is possible; the peak of the technique still had some way to go IMO. The challenge comes when blending dynamic lights and objects, and the man-hours required in setting up objects to be baked. It's a huge, huge pain in the ass on the backend to work with baked light, and the main benefit of switching to a fully dynamic, automated lighting system like Lumen is to simply not have to do all that tedious work and wait for calculations. I'm sure you read that article about Destiny 2 forcing the lighting to be rebaked after every change to a level, taking all night.

Even if the new tech doesn't look as good as baking, being forced into a creative situation where midway through development you can't do specific things with lights is best avoided. These tools are not being introduced only to make your games look better; they're being introduced to make developers' lives easier, and developers are the customers. They obviously like to introduce tools that will also make things look better, but any tool that made it harder to work would not be used in any meaningful way.
 

VFXVeteran

Banned
Naughty Dog's next game will make everyone shit their pants. That's my bold prediction. We're fooled by shiny things, but it's the meeting of all of the technical specs and attention to animation, environmental, and artistic details that make a game look good.
I doubt it for rendering. ND has basically followed most large AAA studios due to not having the manpower to do R&D. My friend who worked there was constantly in crunch mode for years. They managed to get a better approximation to SSS, good clothing shaders, and used their last-gen PBR materials, but not much else was pushed on the rendering side of things. Not talking about the animation or facial rigs, as they are always stellar and can be migrated pretty easily from film assets and tools to realtime gameplay.
 

VFXVeteran

Banned
It absolutely is possible; the peak of the technique still had some way to go IMO. The challenge comes when blending dynamic lights and objects, and the man-hours required in setting up objects to be baked. It's a huge, huge pain in the ass on the backend to work with baked light, and the main benefit of switching to a fully dynamic, automated lighting system like Lumen is to simply not have to do all that tedious work and wait for calculations. I'm sure you read that article about Destiny 2 forcing the lighting to be rebaked after every change to a level, taking all night.

Even if the new tech doesn't look as good as baking, being forced into a creative situation where midway through development you can't do specific things with lights is best avoided. These tools are not being introduced only to make your games look better; they're being introduced to make developers' lives easier, and developers are the customers. They obviously like to introduce tools that will also make things look better, but any tool that made it harder to work would not be used in any meaningful way.
Baked lighting doesn't necessarily have actual solutions for anything dynamic, and if it did, it would look wrong. We grew tired of constantly creating multiple lighting passes that didn't work right with what we wanted to do with the scene. RT is definitely the best way to go and will look just as good as baked lighting, be dynamic, and work with any sort of production light rig (i.e. combined with FX, water, etc.) as GPUs grow more powerful.
 

Neilg

Member
Baked lighting doesn't necessarily have actual solutions for anything dynamic, and if it did, it would look wrong. We grew tired of constantly creating multiple lighting passes that didn't work right with what we wanted to do with the scene. RT is definitely the best way to go and will look just as good as baked lighting as GPUs grow more powerful.

A good comparison is the big shift from photon-mapped GI or the irradiance map in V-Ray. Inferior techniques, but they were the only approach most could use to get things done in time. They required a lot of fucking around if objects were moving, but did produce a very smooth/clean result in certain situations.
Then, as CPUs got faster, more and more people started switching to fully brute-force GI, even though at first they ended up with slower, noisier images. Another 5-10 years later and it's the only way anyone does anything.

Before photon mapping even, I started my career right on the threshold of having to manually place fake bounce lights based on what we thought GI might look like, and negative-value lights to suck light in and create dark areas. That fucking sucked but was fortunately very brief.
 

VFXVeteran

Banned
A good comparison is the big shift from photon-mapped GI or the irradiance map in V-Ray. Inferior techniques, but they were the only approach most could use to get things done in time. They required a lot of fucking around if objects were moving, but did produce a very smooth/clean result in certain situations.
Then, as CPUs got faster, more and more people started switching to fully brute-force GI, even though at first they ended up with slower, noisier images. Another 5-10 years later and it's the only way anyone does anything.

Before photon mapping even, I started my career right on the threshold of having to manually place fake bounce lights based on what we thought GI might look like, and negative-value lights to suck light in and create dark areas. That fucking sucked but was fortunately very brief.
LOL! Yes, I remember the bounced cards to get diffuse contributions… switching them visible or not, ignoring them in certain instances (which would require a pre-pass to determine whether the object was 'marked' as a card or not)… ugly times back then!
 
No. That's what's wrong with that argument. You can't bring an objectively false rendering result to a subjective "it looks better to me" argument. No one can argue that at all. I could say Quake 1 looks better than any game today and people would laugh at me because the rendering results are so far off compared to what is correct today (i.e. rendering resolutions, techniques, etc. all now have better approximations to the rendering equation)… yet I should still feel my opinion has merit?
Nothing subjective about what looks better… you can have 10000000 TRILLION TRIANGLES AND PATH TRACING on an RTX 6090!!! and the game can look worse than TLOU II if not used correctly…
 

VFXVeteran

Banned
Disagree… graphics are nowhere close to being perfected. There are lots more levels of fidelity to go before we get there. If you care about graphics now, I don't see why you would take that 'good enough' stance. Someone like that wouldn't be posting in this thread.
Exactly. People who say such things are just satisfied with their box of choice and its capabilities. It's only when their box gets upgraded or replaced that their requirements go up.
 
Disagree… graphics are nowhere close to being perfected. There are lots more levels of fidelity to go before we get there. If you care about graphics now, I don't see why you would take that 'good enough' stance. Someone like that wouldn't be posting in this thread.
I'm not talking about graphics as a whole… I'm referring to comparing two methods of doing a task where one is "technically" superior but both give the "same" results… no visible difference… for example DLSS: native 4K vs. DLSS 4K, where DLSS 4K looks better sometimes.
 

Shmunter

Member
I'm not talking about graphics as a whole… I'm referring to comparing two methods of doing a task where one is "technically" superior but both give the "same" results… no visible difference… for example DLSS: native 4K vs. DLSS 4K, where DLSS 4K looks better sometimes.
Just to add, a technically superior method achieves equal or better results with more efficiency. Brute force is a technically inferior method that relies on faster hardware to carry it.
 
Exactly. People who say such things are just satisfied with their box of choice and its capabilities. It's only when their box gets upgraded or replaced that their requirements go up.
No, this isn't about a "box". This isn't a fanboy thread; leave that out of my thread, plz… This is for a professional and mature discussion about the future of visual fidelity… Also, I'm a multi-platform gamer, and as it stands, IMO PlayStation games push visuals beyond any other platform as far as RESULTS. Then you have games like Cyberpunk 2077 and Star Citizen on PC, which are mind-blowing…
 

SlimySnake

Flashless at the Golden Globes
Hey Naughty Dog, I want this for TLOU III
Isn't it depressing that a motherfucking solo dev is able to do this while entire AAA studios that cost tens of millions to run every year have yet to produce anything that looks even remotely that good?

If it wasn't for Hellblade and the Matrix demo, we wouldn't even know if these visuals were actually possible. How can the entire industry be asleep at the wheel? Wtf has happened to Sony studios? We are 19 months in and we don't have a single next-gen game that looks remotely that good from these so-called best devs in the industry.

Absolutely pathetic. I would be ashamed if I were a dev working in one of those studios making cross-gen games or jerking around doing whatever the fuck they have been doing since April 2019, when Cerny first revealed devkits were being sent out. I would quit in disgust. Do these devs even take pride in what they do anymore?

No offense to Insomniac, but Ratchet looks last-gen compared to this and the Matrix. And why? VRR shows they can hit 60 fucking FPS in Ratchet and average 50 at native 4K. WHY are you wasting so much of the GPU on rendering pixels and hitting 60 fps? Target 1440p 30 fps and push the visuals like you always have. You can get HALF of the GPU back by reducing resolution to 1440p and another half by targeting 30 instead of 60. That's 4x more power in the palm of your hands, and they are wasting it on more pixels.
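A rough back-of-envelope of that claim, assuming GPU cost scales roughly with pixel count and framerate (which is only approximately true in practice):

# Per-frame GPU budget: native 4K at 60 fps vs. 1440p at 30 fps
pixels_4k    = 3840 * 2160   # ~8.29 million pixels
pixels_1440p = 2560 * 1440   # ~3.69 million pixels

res_gain = pixels_4k / pixels_1440p   # ~2.25x fewer pixels to shade per frame
fps_gain = 60 / 30                    # 2x more time per frame (33.3 ms vs. 16.7 ms)

print(res_gain * fps_gain)            # ~4.5x, roughly the "4x more power" figure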

Spiderman PS5 doesn't look anything special either, and it's aiming for native 4K. People actually think it might also be cross-gen. What the fuck is going on at these studios where they are being shown up by a dude who made this in less than a month?
 

Shmunter

Member
Isn't it depressing that a motherfucking solo dev is able to do this while entire AAA studios that cost tens of millions to run every year have yet to produce anything that looks even remotely that good?

If it wasn't for Hellblade and the Matrix demo, we wouldn't even know if these visuals were actually possible. How can the entire industry be asleep at the wheel? Wtf has happened to Sony studios? We are 19 months in and we don't have a single next-gen game that looks remotely that good from these so-called best devs in the industry.

Absolutely pathetic. I would be ashamed if I were a dev working in one of those studios making cross-gen games or jerking around doing whatever the fuck they have been doing since April 2019, when Cerny first revealed devkits were being sent out. I would quit in disgust. Do these devs even take pride in what they do anymore?

No offense to Insomniac, but Ratchet looks last-gen compared to this and the Matrix. And why? VRR shows they can hit 60 fucking FPS in Ratchet and average 50 at native 4K. WHY are you wasting so much of the GPU on rendering pixels and hitting 60 fps? Target 1440p 30 fps and push the visuals like you always have. Spiderman PS5 doesn't look anything special either, and it's aiming for native 4K. People actually think it might also be cross-gen. What the fuck is going on at these studios where they are being shown up by a dude who made this in less than a month?
I was bruhahaing right along with you till the 30 fps.

Apart from that, Sony has disappointed this gen beyond what's reasonable. Nothing has impressed like the PS4 reveal with KZ Shadow Fall. The cross-gen approach by Sony is deplorable; it's up to the platform holder to lead the way, not be part of the pack.

Day-one Sony games are a thing of the past for me. Still haven't picked up HZ Forbidden West, and likely won't pick up God of Cross-gen either. One day, when a sale reflecting the practice comes about, I'll reconsider.
 

SlimySnake

Flashless at the Golden Globes
I was bruhahaing right along with you till the 30 fps.

Apart from that, Sony has disappointed this gen beyond what's reasonable. Nothing has impressed like the PS4 reveal with KZ Shadow Fall. The cross-gen approach by Sony is deplorable; it's up to the platform holder to lead the way, not be part of the pack.

Day-one Sony games are a thing of the past for me. Still haven't picked up HZ Forbidden West, and likely won't pick up God of Cross-gen either. One day, when a sale reflecting the practice comes about, I'll reconsider.
Oddly enough, Horizon FW looks better than Ratchet, Demon's Souls, and pretty much every other so-called next-gen-only game out there. At least in the 30 fps mode, which just goes to show how talented these devs are and how other next-gen-only games have basically phoned it in so far this gen.

And I'm just saying they should target 1440p 30 fps for the main fidelity graphics mode, not native 4K 50 fps like they do today. There is nothing stopping them from going for 1080p 60 fps as long as they target 1440p for the 30 fps mode. Especially now with FSR 2.0 and TSR producing absolutely incredible visuals, and the CPU no longer being the bottleneck it was last gen. I mean, I thought the Matrix demo was 1440p until DF and NX Gamer pointed out it was just 1080p upscaled to 4K using TSR. And FSR 2.0 is even better.
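For context, a quick sketch of why a 1080p internal resolution with a temporal upscaler like TSR or FSR 2.0 is so much cheaper than rendering natively (again assuming cost tracks the number of shaded pixels):

# Shaded (internal) pixels vs. presented (output) pixels with temporal upscaling
internal_1080p = 1920 * 1080   # ~2.07M pixels actually shaded each frame
native_4k      = 3840 * 2160   # ~8.29M pixels if rendered natively
native_1440p   = 2560 * 1440   # ~3.69M pixels if rendered natively

print(native_4k / internal_1080p)      # 4.0x fewer shaded pixels than native 4K
print(native_1440p / internal_1080p)   # ~1.8x fewer than native 1440p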

People who are willing to play games at 60 fps don't give a shit about 1440p or some small downgrades to shadows and RT.
 

01011001

Banned
Oddly enough, Horizon FW looks better than Ratchet, Demon's Souls, and pretty much every other so-called next-gen-only game out there. At least in the 30 fps mode, which just goes to show how talented these devs are and how other next-gen-only games have basically phoned it in so far this gen.

And then you set it to 60 and it looks like utter shit :) Meanwhile, Ratchet looks good no matter the graphics mode.
Same with Demon's Souls.
 

SlimySnake

Flashless at the Golden Globes
Continuing from above: when it comes to performance modes, even a poorly optimized title like the Matrix UE5 demo can easily run at higher framerates if you just disable hardware-accelerated Lumen. Software Lumen has a far lower performance hit. The traffic and pedestrian density also has a massive hit, so you can tone that down for the performance mode as well.

Here is what Alex found. With software Lumen and traffic turned off, he was able to take his framerate from 38 to 95, over a 2.5x increase, without ever dropping resolution. If you drop resolution from 1440p to 1080p, use FSR, use software Lumen, and turn down some CPU-intensive settings like AI density, then you can easily have a 1440p 30 fps game running at 1080p 60 fps with very little visual downgrade.
If anything, I'm surprised Epic didn't choose software Lumen for the Series S version. It wouldn't have needed to dip significantly below 533p if it was using software Lumen, since it is 38% more efficient.

And this is a very poorly optimized, single-threaded demo which devs will surely optimize for release.
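Converting those framerates into frame-time budgets makes the headroom clearer (same figures as above, just expressed in milliseconds):

# Frame-time budgets for the numbers quoted above
def ms(fps):
    return 1000 / fps

print(ms(38))   # ~26.3 ms per frame with hardware Lumen and full traffic
print(ms(95))   # ~10.5 ms with software Lumen and traffic off
print(ms(60))   # ~16.7 ms target for a 60 fps mode, comfortably above 10.5 ms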

[Screenshots: hardware vs. software Lumen comparison]
 

VFXVeteran

Banned
No, this isn't about a "box". This isn't a fanboy thread; leave that out of my thread, plz… This is for a professional and mature discussion about the future of visual fidelity… Also, I'm a multi-platform gamer, and as it stands, IMO PlayStation games push visuals beyond any other platform as far as RESULTS. Then you have games like Cyberpunk 2077 and Star Citizen on PC, which are mind-blowing…
PS games don't push visuals with tech, though. It's the artistic skill of the PS games that people are impressed with, not the tech at all. I can't disagree with your opinion on PS games, but neither can you disagree with mine when I don't agree with your claims. Artistically, there are many games (even multiplats) that dwarf the PS exclusives' graphics presentations IMO, Elden Ring being a perfect example.
 

VFXVeteran

Banned
Isn't it depressing that a motherfucking solo dev is able to do this while entire AAA studios that cost tens of millions to run every year have yet to produce anything that looks even remotely that good?

If it wasn't for Hellblade and the Matrix demo, we wouldn't even know if these visuals were actually possible. How can the entire industry be asleep at the wheel? Wtf has happened to Sony studios? We are 19 months in and we don't have a single next-gen game that looks remotely that good from these so-called best devs in the industry.

Absolutely pathetic. I would be ashamed if I were a dev working in one of those studios making cross-gen games or jerking around doing whatever the fuck they have been doing since April 2019, when Cerny first revealed devkits were being sent out. I would quit in disgust. Do these devs even take pride in what they do anymore?

No offense to Insomniac, but Ratchet looks last-gen compared to this and the Matrix. And why? VRR shows they can hit 60 fucking FPS in Ratchet and average 50 at native 4K. WHY are you wasting so much of the GPU on rendering pixels and hitting 60 fps? Target 1440p 30 fps and push the visuals like you always have. You can get HALF of the GPU back by reducing resolution to 1440p and another half by targeting 30 instead of 60. That's 4x more power in the palm of your hands, and they are wasting it on more pixels.

Spiderman PS5 doesn't look anything special either, and it's aiming for native 4K. People actually think it might also be cross-gen. What the fuck is going on at these studios where they are being shown up by a dude who made this in less than a month?
You are probably the only person on these boards that can downplay these Sony exclusives without getting any kind of backlash for saying it. I don't know why that is but I agree with you nevertheless.

Splitting up the graphics rendering by just lowering resolution and FPS and putting that extra performance into implementing specific new graphics algorithms isn't that easy to do. If you want better lighting that's dynamic and looks superior, you have to use RT. Period. The material shaders would also have to use RT to get a better solution than what we have now. The big issue is the lack of power and bandwidth in the AMD GPUs for RT. They completely missed the boat this generation. Freeing up that bandwidth only to be limited to rasterization techniques won't achieve the push in visuals we are all expecting this generation. It is what it is.
 

VFXVeteran

Banned
Continuing from above: when it comes to performance modes, even a poorly optimized title like the Matrix UE5 demo can easily run at higher framerates if you just disable hardware-accelerated Lumen. Software Lumen has a far lower performance hit. The traffic and pedestrian density also has a massive hit, so you can tone that down for the performance mode as well.

Here is what Alex found. With software Lumen and traffic turned off, he was able to take his framerate from 38 to 95, over a 2.5x increase, without ever dropping resolution. If you drop resolution from 1440p to 1080p, use FSR, use software Lumen, and turn down some CPU-intensive settings like AI density, then you can easily have a 1440p 30 fps game running at 1080p 60 fps with very little visual downgrade.
If anything, I'm surprised Epic didn't choose software Lumen for the Series S version. It wouldn't have needed to dip significantly below 533p if it was using software Lumen, since it is 38% more efficient.

And this is a very poorly optimized, single-threaded demo which devs will surely optimize for release.

[Screenshots: hardware vs. software Lumen comparison]
The difference is significant, actually. Looking at the bench on the right from the character's perspective, you can see how RT ambient occlusion in Lumen is much more accurate to the real world and looks much better than a world with no self-shadowing like in the left image. All that shadow behind the bench makes a big difference in the quality of the lighting. When you play the entire game in other scenarios, I bet it's even more noticeable.
 

SlimySnake

Flashless at the Golden Globes
The difference is significant, actually. Looking at the bench on the right from the character's perspective, you can see how RT ambient occlusion in Lumen is much more accurate to the real world and looks much better than a world with no self-shadowing like in the left image. All that shadow behind the bench makes a big difference in the quality of the lighting. When you play the entire game in other scenarios, I bet it's even more noticeable.
I honestly can't tell even after Alex pointed it out.
 
PS games don't push visuals with tech, though. It's the artistic skill of the PS games that people are impressed with, not the tech at all. I can't disagree with your opinion on PS games, but neither can you disagree with mine when I don't agree with your claims. Artistically, there are many games (even multiplats) that dwarf the PS exclusives' graphics presentations IMO, Elden Ring being a perfect example.
Elden Ring? Yes, it looks great, but I'm not talking art only… I'm talking milestones AND pushing the limits… nothing beats TLOU II's motion matching for animation, HFW's graphical fidelity, or Ratchet's geometric density and SSD use.
 
I saw the new Avatar trailer and it reminded me of that Avatar game Ubisoft is working on.
Now, with advancements like Nanite, I wonder if it's possible to take all of those movie assets and put them in a game engine.
I wonder if that is going to be the future: the movie and game industries merging.

Like if there were an Avengers game, instead of Crystal Dynamics tasking all their artists to recreate the world, characters, and particles, Disney would just give it to them.

This could also work both ways, like Naughty Dog giving their Last of Us assets to HBO. Although that might change the performance-capture workflow: instead of casting a voice actor, they would now have to make sure the actor's likeness will be used from the jump.
 
I agree with VFX that Sony and MS essentially failed to provide the RT capabilities needed to keep pace with next-gen visuals, perhaps for the first time in history. My question, though: was this ever feasible? Looking at prices for PCs that have "what it takes" tells me no, it wasn't, unless MAYBE they had released an $800 SKU.

Last gen it was all about having a physically based renderer to achieve that "next-gen look", right? So while consoles got massively outclassed in performance, they were still able to feel suitably next-gen. However, RT has changed the game. It's just so costly, but it is necessary for impressive-looking games going forward. We needed hardware powerful enough to do RT GI in any game while simultaneously being able to use high-level effects like tessellation, so as not to get embarrassed by PC as the gen continues.

PS: I agree that developers often don't even take pride in their work. I know this because of how many games release broken or with missing features (bad HDR, bad tearing, no attempt at pushing visuals) and don't get fixed for months or years at a time.

Example: Hitman 3 has broken HDR that actively makes the game look worse when turned on. Hitman 3 is part of a collection of games that are all playable from the same menu, the "World of Assassination" trilogy, where Hitman 1 and 2 have good working HDR yet 3's is totally broken (on all 3 platforms). You would THINK they would want all 3 games to have continuity and look their best, right? You would also think they'd especially want their NEW Hitman game looking great, right? Nope! It's been a year and 4 months now without a fix.
 

rofif

Banned
I agree with VFX that Sony and MS essentially failed to provide the RT capabilities needed to keep pace with next-gen visuals, perhaps for the first time in history. My question, though: was this ever feasible? Looking at prices for PCs that have "what it takes" tells me no, it wasn't, unless MAYBE they had released an $800 SKU.

Last gen it was all about having a physically based renderer to achieve that "next-gen look", right? So while consoles got massively outclassed in performance, they were still able to feel suitably next-gen. However, RT has changed the game. It's just so costly, but it is necessary for impressive-looking games going forward. We needed hardware powerful enough to do RT GI in any game while simultaneously being able to use high-level effects like tessellation, so as not to get embarrassed by PC as the gen continues.

PS: I agree that developers often don't even take pride in their work. I know this because of how many games release broken or with missing features (bad HDR, bad tearing, no attempt at pushing visuals) and don't get fixed for months or years at a time.

Example: Hitman 3 has broken HDR that actively makes the game look worse when turned on. Hitman 3 is part of a collection of games that are all playable from the same menu, the "World of Assassination" trilogy, where Hitman 1 and 2 have good working HDR yet 3's is totally broken (on all 3 platforms). You would THINK they would want all 3 games to have continuity and look their best, right? You would also think they'd especially want their NEW Hitman game looking great, right? Nope! It's been a year and 4 months now without a fix.
RT mostly only matters when you use dynamic time of day and lights.
I recently replayed Uncharted 4, a 2016 game... and it looks incredible. It totally looks as if it had ray-traced global illumination, especially in the flashlight sections.
The game's lighting is probably baked for the most part, and it looks better than any dynamic solution could achieve today in real time from what I've seen. Maybe Metro is better when it comes to that.
 

Tqaulity

Member
I agree with VFX in that Sony and MS essentially failed to provide the RT capabilities needed to keep pace with next gen visuals, perhaps for the first time in history. My question though, was this ever feasible? Looking at prices for PC's that have "what it takes" tells me no it wasn't, unless MAYBE they released an $800 sku.
First, the issue is more rooted in AMD falling behind on RT performance. Microsoft and Sony got the best parts possible given the time of release, power, and cost restrictions. RDNA2 itself is a bit below state of the art when it comes to RT. Second, given all of that, I believe what we have is the best we could have gotten for a 2020 release. So "was this ever feasible"? If you think it's lacking now, then the answer is no. Here we are in 2022 and neither AMD nor Nvidia has released any new GPUs since the consoles launched. Their next-gen cards are due out this year, and I have a feeling that when you look at the size, form factor, cost, and thermals of those next-gen GPUs, you'll see why they too would not have been feasible in any kind of console form factor currently.

That said, there is still a ton these consoles can do that we haven't seen yet, including visually. I don't think we need state-of-the-art RT features to deliver "next-gen" wow visuals. RT in its current form of hybrid rendering is generally not worth the cost, as the impact is subtle on average while playing (not paused in photo mode doing side-by-sides).
 
I agree with VFX that Sony and MS essentially failed to provide the RT capabilities needed to keep pace with next-gen visuals, perhaps for the first time in history. My question, though: was this ever feasible? Looking at prices for PCs that have "what it takes" tells me no, it wasn't, unless MAYBE they had released an $800 SKU.

Last gen it was all about having a physically based renderer to achieve that "next-gen look", right? So while consoles got massively outclassed in performance, they were still able to feel suitably next-gen. However, RT has changed the game. It's just so costly, but it is necessary for impressive-looking games going forward. We needed hardware powerful enough to do RT GI in any game while simultaneously being able to use high-level effects like tessellation, so as not to get embarrassed by PC as the gen continues.

PS: I agree that developers often don't even take pride in their work. I know this because of how many games release broken or with missing features (bad HDR, bad tearing, no attempt at pushing visuals) and don't get fixed for months or years at a time.

Example: Hitman 3 has broken HDR that actively makes the game look worse when turned on. Hitman 3 is part of a collection of games that are all playable from the same menu, the "World of Assassination" trilogy, where Hitman 1 and 2 have good working HDR yet 3's is totally broken (on all 3 platforms). You would THINK they would want all 3 games to have continuity and look their best, right? You would also think they'd especially want their NEW Hitman game looking great, right? Nope! It's been a year and 4 months now without a fix.
IMO UE5 should ease the pain… it's about tools and optimizations this generation… Quixel Megascans, VRS, geometry culling, the SSD, etc.
 
I honestly can't tell even after Alex pointed it out.
If you were shown both without knowing which was hardware-based and which was software, and asked to decide which looked better, I bet you would choose the hardware one every time. Just because you're not as experienced in quickly pointing out specific differences in shading doesn't mean you can't tell there is a difference. You're also probably watching it on your phone, right?
 

SlimySnake

Flashless at the Golden Globes
If you were shown both without knowing which was hardware-based and which was software, and asked to decide which looked better, I bet you would choose the hardware one every time. Just because you're not as experienced in quickly pointing out specific differences in shading doesn't mean you can't tell there is a difference. You're also probably watching it on your phone, right?
Nah, I ran it myself on my 3080 on a big 65-inch LG CX. You can change the settings and enable software Lumen. The difference is very minimal. Perhaps noticeable, but very minor.

Besides, I would personally always choose better graphics, which is where this discussion started in the first place. I am OK with higher fidelity at 1440p, but people who want 60 fps don't care too much about IQ, so in theory they should be willing to take a minor hit in visual fidelity, and software Lumen should give them a good boost in performance.

If 60 fps gamers don't want to reduce visual fidelity, then they can stick with hardware Lumen but use FSR 2.0's performance mode to get up to 70% more performance, and maybe reduce resolution below 1440p to get the extra 30% needed to double the framerate from 30.
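A small sketch of the FSR 2.0 Performance-mode math; it renders at roughly half the output resolution per axis, so only about a quarter of the output pixels are shaded (the actual speedup depends on how resolution-bound the frame is):

# FSR 2.0 Performance mode: ~0.5x scale per axis, so ~1/4 of the output pixels shaded
out_w, out_h = 2560, 1440                # 1440p output target
in_w, in_h   = out_w // 2, out_h // 2    # 1280x720 internal render

print((in_w * in_h) / (out_w * out_h))   # 0.25: the upscaler reconstructs the rest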
 

VFXVeteran

Banned
Elden Ring? Yes, it looks great, but I'm not talking art only… I'm talking milestones AND pushing the limits… nothing beats TLOU II's motion matching for animation, HFW's graphical fidelity, or Ratchet's geometric density and SSD use.
Animation is cool and all, but even Elden Ring has an incredible animation system with physics that makes the combat feel realistic. The motion matching is cool, though; I'll give props for that. R&C, though, is as basic as they come: 100% pure art direction. The SSD use in that game is only teleporting. Most game companies don't even have that as something they need in their own games.

Rendering is what everyone wants milestones in. And unfortunately, the consoles don't drive that initiative. Now that Sony games are coming to PC, perhaps they can innovate beyond what the third-party AAA studios do on their own PCs with high-end GPUs.
 