
Ray tracing is not meaningful, and the demand for it is dumbness galore

TrueLegend

Member
Ok, so the Elden Ring RT update is out, and you know what, RT may be good, but it is simply not meaningful in any game, and I mean any game, including Control. In a DF Direct, all the DF members themselves played the game on PS5 without ray tracing except Alex Battaglia. And you know what I just realized: if not for the dumb demands for ray tracing, the developers could have focused on optimizing video games for the three platforms, and we wouldn't get this many broken games on every platform.

Major issues of All Platforms:

PS5: Weakest of the three in image quality, but you get the most performant version relative to its power and hardware; it still gets frame drops, though.

Xbox Series X: Nearly always looks better than the PS5 output, but it's always missing one or two critical things that just make the release dead. RE4: the controller sucks. Callisto Protocol: some fucking effect is missing. Hogwarts Legacy: frame-pacing issues that clearly show the devs spent most of their optimization time on their biggest player base.

PC: Absolutely demolishes the consoles in image quality and performance, but it suffers severely from fucking stutters, and then you add the dumb RT and nearly every game stutters with RT on regardless of hardware. It's there on a 4090 too, so fuck that.

And seriously so many fucking dumb modes on Consoles too.

Just make one 60 fps mode for consoles with FSR2 and 4K60 DRS. Look at RE4: it's just mode dumbness galore on consoles, and the 30 fps cinematic crowd can't even argue on that title, because it targets 60 all the time but never fucking hits it in two of its modes, so it's all just big dumbo dumba dumdum.

And then we fucking add an RT mode, and guess what, all we needed was slightly better reflections and improved AO, because however close to a real light simulation RT gets, the results matter nada when you are playing the game and are immersed in it. It does improve things; things definitely look better, more realistic one would say, but it only feels like slightly altered art direction.

DLSS2 and FSR2 are the real deal, and we need them incorporated into one mode that locks to 60 on all three fucking platforms, and that's it. Ray tracing sucks, and the addition of and demand for it is hurting game performance at launch on all platforms.
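
(If it helps to picture what I mean by one locked-60 mode: conceptually, a DRS + FSR2 setup is just a feedback loop that watches GPU frame time and nudges the internal render resolution up or down before the upscaler reconstructs the 4K output. Here's a minimal sketch of that loop; the 16.67 ms target, the 50-100% scale bounds, and the damping exponent are made-up illustration numbers, not how any particular engine or the real FSR2 integration does it.)

```python
# Minimal sketch of a dynamic-resolution controller aiming for a locked 60 fps.
# Illustrative only: real engines use smarter frame-time prediction and per-pass timing.

TARGET_MS = 1000.0 / 60.0        # 16.67 ms frame budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # internal render resolution as a fraction of the 4K output

def update_render_scale(scale: float, gpu_frame_ms: float) -> float:
    """Nudge the internal render scale based on how the last frame's GPU time compared to budget."""
    headroom = TARGET_MS / gpu_frame_ms   # > 1.0 means spare budget, < 1.0 means over budget
    scale *= headroom ** 0.25             # damped adjustment so the scale doesn't oscillate
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Heavy frames (over 16.67 ms) pull the scale down; light frames push it back toward native.
scale = 1.0
for frame_ms in [22.0, 21.0, 18.0, 16.0, 12.0]:
    scale = update_render_scale(scale, frame_ms)
    print(f"{frame_ms:5.1f} ms frame -> render scale {scale:.3f}")
```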
 
You know what I just realized: if not for the dumb demands for ray tracing, the developers could have focused on optimizing video games for the three platforms, and we wouldn't get this many broken games on every platform.
This requires knowing who's good at optimization, who is just okay at it, and who is bad at it.

From Software lands somewhere between just okay and bad. I don't think the RT implementation is to blame for their own shortcomings.
 

Umbasaborne

Banned
No way, RT is amazing, and when it's implemented well it's a sublime experience: Minecraft with RT and Cyberpunk in particular. There are some cases where it's phoned in. Hogwarts Legacy has a horrible RT implementation, with blurry low-res reflections that don't add anything to the scene. But in a game where RTGI and high-quality reflections abound, it transforms the experience.
 

RoboFu

One of the green rats
It's true, but you have to start somewhere.

The real thing that will push RT isn't just "MEH GRAPHIX!", it's that it vastly simplifies game development. If you have a full RT renderer, that means everything is taken care of: no more hacked-together "trick" implementations for shadows, lighting, and reflections.
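
(Rough illustration of "everything is taken care of": in a ray tracer, the same intersection routine answers what the camera sees and whether a point is in shadow, and a reflection would just be one more call with a mirrored direction, instead of separate shadow-map and screen-space-reflection systems. The toy scene below is made up purely for illustration; a real renderer adds acceleration structures, materials, and denoising.)

```python
# Toy sketch: one ray/sphere intersection test reused for visibility and shadowing.
from dataclasses import dataclass
import math

@dataclass
class Sphere:
    center: tuple
    radius: float

def hit(origin, direction, sphere):
    """Distance to the nearest intersection along a normalized ray, or None if it misses."""
    ox, oy, oz = (origin[i] - sphere.center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - sphere.radius ** 2
    disc = b * b - 4 * c              # direction is normalized, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-4 else None    # small epsilon avoids self-intersection

scene = [Sphere((0, 0, 5), 1.0), Sphere((0, -101, 5), 100.0)]  # a ball and a huge "floor" sphere
light = (5, 5, 0)

# Primary ray: what does the camera at the origin, looking down +z, actually see?
t = min((d for s in scene if (d := hit((0, 0, 0), (0, 0, 1), s)) is not None), default=None)

# Shadow ray: from the hit point toward the light; any occluder means the point is in shadow.
if t is not None:
    p = (0, 0, t)
    to_light = tuple(light[i] - p[i] for i in range(3))
    length = math.sqrt(sum(x * x for x in to_light))
    ldir = tuple(x / length for x in to_light)
    in_shadow = any(hit(p, ldir, s) is not None for s in scene)
    print(f"hit at t={t:.2f}, in shadow: {in_shadow}")
```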
 

Xdrive05

Member
Shit take.

Ray tracing owns when used right. See: Metro Exodus EE, Minecraft RTX, Half-Life RTX, Quake RTX, Quake 2 RTX, Doom Eternal (ish), Fortnite Chapter 4 BR w/Lumen, and many more. RTGI especially is the real deal, and does the most to add immersion.

I will say that rasterized fakery has gotten REALLY good in some games. But RTGI is a big effing deal, and it's the future, when devs actually use it correctly.
 

diffusionx

Gold Member
It's total shit now, let's just be honest. Even on a high-end $1600 GPU, the manufacturer tells you to pair it with fake frames AND fake pixels to make the performance acceptable. On lower end hardware, it just destroys performance. And the jump in IQ is very subtle for the most part, with the most obvious benefits (probably reflections) usually severely cut down to not totally kill FPS.

I think we are a good 5 years away from it being where it needs to be.
 

Thebonehead

Banned
Lighting artists have to spend time considering each model and scene, deciding where to place lights and screen-space reflections.

Imagine doing that for every internal and external location at Cyberpunk's scale. Those are resources that could be freed up for other work.
 

Crayon

Member
Progress has been pretty sad, but you can't ever get there if nobody tries. Eventually it will be worthwhile. I love the effects, but look at the price you have to pay to get the same performance with ray tracing on.
 

Robb

Gold Member
I agree it’s a bit of a waste right now, but it’ll get better for sure.

In a few years we likely won’t be talking about it as a feature at all, it’ll just be the standard in all games.
 

Umbasaborne

Banned
AMD hardware is just bad at ray tracing, so on current consoles I agree. Cyberpunk, Metro Exodus, Doom Eternal, and Witcher 3 on PC with ray tracing look amazing.
How do I stop the lighting and shadow pop-in in Witcher 3 on PC with RT turned on? I have a PC with a 4090 in it, but the lighting and shadow pop-in in Witcher 3 has made me not want to use RTGI.
 

lukilladog

Member
It's the future, and in a few years' time it will be the default rendering technique.

I think even CGI movies aren't fully ray traced, so that's wishful thinking. I get the OP's point; ever since I got a high-refresh HDR monitor I've been turning RT off and I don't miss it a bit. Everybody on consoles prefers performance modes, so it's only a matter of time before devs stop wasting time injecting stuff that no one uses and continue the evolution of graphics down the raster path. Graphics evolution is just like natural evolution: having higher precision doesn't automatically make you the fittest or the selected one. If one of the console manufacturers sacrifices raster for RT next generation, it's gonna get trumped by the other one, mark my words.
 

Krappadizzle

Gold Member
Yeah, THAT'S the reason all of From Software's games have run like shit and have to be brute forced to play properly... It was RT all along!

 

JimboJones

Member
I think even CGI movies aren't fully ray traced, so that's wishful thinking. I get the OP's point; ever since I got a high-refresh HDR monitor I've been turning RT off and I don't miss it a bit. Everybody on consoles prefers performance modes, so it's only a matter of time before devs stop wasting time injecting stuff that no one uses and continue the evolution of graphics down the raster path. Graphics evolution is just like natural evolution: having higher precision doesn't automatically make you the fittest or the selected one. If one of the console manufacturers sacrifices raster for RT next generation, it's gonna get trumped by the other one, mark my words.
You think so?
I would say that by the next wave of consoles every engine will probably have some ray tracing aspect baked into it; it won't be an afterthought on the next set of consoles, and devs will want to use it even more. If a console comes out with gimped ray tracing compared to its competitor, it's going to routinely suffer performance drops or make sacrifices in other areas, like resolution, to keep up when RT is used, or it will end up getting Switch 2-style up-ports because it can't handle the proper version.
 

Kataploom

Gold Member
Have you seen Avengers: Endgame, Avatar 2, etc.? RT is one of the reasons those movies look so good, but devs are still using old lighting pipelines and just putting some RT on top... Once they ditch rasterized lighting models in favor of full path tracing, games will not only be done quicker (or they should be, at least) but will also look much better.

The problem is that current hardware was never prepared for RT; Nvidia came out of nowhere with it, and now people believe it's their "right" to have it in games somehow lol
 

SeraphJan

Member
The visual difference is not worth such a performance hit at the moment. However, for games with terrible SSR and AO, ray tracing makes a huge difference; turning RT on in the Resident Evil 2 remake, for example, is a night-and-day difference compared to its primitive SSR and AO implementation.
 

diffusionx

Gold Member
You think so?
I would say that by the next wave of consoles every engine will probably have some ray tracing aspect baked into it; it won't be an afterthought on the next set of consoles, and devs will want to use it even more. If a console comes out with gimped ray tracing compared to its competitor, it's going to routinely suffer performance drops or make sacrifices in other areas, like resolution, to keep up when RT is used, or it will end up getting Switch 2-style up-ports because it can't handle the proper version.
I am not even sure the next round of consoles will have acceptable-quality RT, because it's still so primitive. It could be 50x better (a totally arbitrary metric) and still not be good enough. We are going to have an RT/raster mix for a long time, for the next decade at least.
 

A.Romero

Member
Doesn't really matter how it performs today. Just don't use it and be done with it.

However, it is important that games do have it. The reason is that development pipelines need to be optimized for it (so it's standard by the time hardware that can run it is commonplace) and so developers can mature their skills by using it.

There will be a point where it will be a common feature that any hardware can run, like tessellation or any other graphics feature that would have brought a computer to its knees 10 years ago.

RT can be used for lots of things, and the main benefit will be that professional-looking implementations will be within reach of even low-budget indie games. But it needs to mature, and the only way for it to advance is to keep implementing it, failing, learning from the failure, and trying again.
 

MiguelItUp

Member
Can't say that I agree, OP.

It's still fairly new tech, and devs are trying to get the hang of it, and so is the hardware. It's only going to get better and better.

Until then, it's an option for people to use or not. Some may really appreciate the outcome more than others, but, you know, your mileage may vary. I would never say it's stupid, shit, or essentially meaningless, though. It's just different strokes for different folks.
 

flying_sq

Member
Ray tracing is just the beginning; full path tracing is the real good stuff. Path tracing is what ray tracing was before RTX cards. From the ray tracing I have seen, only fully ray-traced lighting is a huge upgrade (a la Metro 2033). Everything else looks nice, but it's not that much better. I'm curious to see how they do path tracing in the Cyberpunk 2077 update coming soon. Also, consoles have it, I'm sure, because of some console-parity clause or stupid crap that holds back graphical leaps.
 

lukilladog

Member
You think so?
I would say that by the next wave of consoles every engine will probably have some ray tracing aspect baked into it; it won't be an afterthought on the next set of consoles, and devs will want to use it even more. If a console comes out with gimped ray tracing compared to its competitor, it's going to routinely suffer performance drops or make sacrifices in other areas, like resolution, to keep up when RT is used, or it will end up getting Switch 2-style up-ports because it can't handle the proper version.

Engines generally have features "baked into them" that never make it into games for performance reasons; UE is the perfect example of that. So that won't make RT appear in games all of a sudden... particularly now that publishers are seeing that people favor performance modes and 120 Hz displays are getting more common... If I were a dev, I would not want to spend time and resources on something most people aren't gonna use and that is gonna give me a "lazy dev" reputation anyway.
 

Joramun

Member
Ray tracing is most likely the future, but if I were building a new PC for gaming I wouldn't focus my build around RT.
 

Loomy

Member
And you know what I just realized: if not for the dumb demands for ray tracing, the developers could have focused on optimizing video games for the three platforms, and we wouldn't get this many broken games on every platform.
This is not the reason FromSoftware games are some of the most unoptimized messes out there.
 

nemiroff

Gold Member
The dilemma with RT is that it makes stuff look natural, so some ignoramuses think it must suck because they "can't see it".

RT is the only way forward; there is no alternative. We'll have an adjustment period while hardware and software catch up, but after that it's gonna be all RT.
 

darrylgorn

Member
So the answer to this question really depends on how a game is developed.

A game can actually look better without ray tracing if the developer has a specific intention for how the game's lighting and reflections should appear. If it doesn't matter that shadows or lighting are affected dynamically in the game world, or if realism isn't as important as the artistic direction, then ray tracing won't matter at all.

The added benefit of ray tracing, though, is that in the future developers will make it the default option with no alternative. That will let them flip the switch automatically and save a lot of development time, which can be used for other parts of the game, including optimization.

For the time being, you could argue that RT's greatest strength is taking old games with very minimal lighting and injecting ray tracing into them. Half-Life 1 and Quake 1 RTX, for example, are pretty impressive-looking alternatives to those games. Morrowind also looks like it could get an incredible update.
 