
30 fps killing my eyes! Devs should abandon it!

You tried to make a claim about how the 3080 will apparently struggle soon. Then you continued to beat around the bush with claims that jumped from 2013's 780 Ti, to the 1080 Ti, to now saying you don't know.

How things will play out is like this: the 3080 is 2x faster than a PS5. This is a hard fact that will not change next year, or 10 years from now. It will never stop being two times faster than a PS5. Therefore, it will for all time run any game that the PS5 also runs substantially better. Of course the card will age and you will have to adjust details, but that will happen at quality levels far, far beyond what any current console will do. 7 years from now, when we retire the PS5, a 3080 will still run games better and faster, because 7 years from now this GPU will still be 2x faster than a PS5.
The 3080 10GB will hit VRAM limits eventually. Maybe some game will eventually need lower textures than the PS5 version.
 

Otre

Banned
The 3080 10GB will hit VRAM limits eventually. Maybe some game will eventually need lower textures than the PS5 version.
Didn't it hit that with Far Cry 6 @ 4K ultra textures? The game is unplayable with it, though I did not notice much when changing the textures to high. It's an outlier, but it's the start.
 
Didn't it hit that with Far Cry 6 @ 4K ultra textures? The game is unplayable with it, though I did not notice much when changing the textures to high. It's an outlier, but it's the start.
Yes. Not due to a hardware limit but due to unoptimized software, BUT such is the reality of PC software; you need to have overhead to brute force good performance sometimes.

The 3080 should never have been shipped with 10GB, but they got away with it due to the market. It's funny that my 3060 has 12GB xD
 
You tried to make a claim about how the 3080 will apparently struggle soon. Then you continued to beat around the bush with claims that jumped from 2013's 780 Ti, to the 1080 Ti, to now saying you don't know.

How things will play out is like this: the 3080 is 2x faster than a PS5. This is a hard fact that will not change next year, or 10 years from now. It will never stop being two times faster than a PS5. Therefore, it will for all time run any game that the PS5 also runs substantially better. Of course the card will age and you will have to adjust details, but that will happen at quality levels far, far beyond what any current console will do. 7 years from now, when we retire the PS5, a 3080 will still run games better and faster, because 7 years from now this GPU will still be 2x faster than a PS5.
As long as you're happy :)

I wasn't talking about the PS5 at all, by the way. I was just saying that the 3080 will be underpowered and underwhelming in a few years. It won't yield the results you think it will. Games are going to get way more demanding, and yes, the PS5 will struggle and probably drop to 30fps at lower resolutions again (like 1080p-1440p, etc.), and your precious little 3080 will be struggling to do 4K/60 at that point. It's not the GPU that changes, but the software that's being run on it.
 

Otre

Banned
Does the PS5 run Far Cry 6 at 4K ultra?

It's not about comparing platforms, nor do I care to. Digital Foundry said it's a dynamic 4K at 60fps, so I'll assume it's around 1800p most of the time. The point is to be aware that games are already passing the 10GB VRAM limit. I played the PC version of FC6 and did not notice a downgrade going down to high textures on my 3080.
 

Petopia

Banned
I don't know, did everybody forget about the performance hit Dying Light 2 gave the consoles, or is dementia settling in?
 
Honestly, 30fps doesn't bother me. Maybe because I have always gamed on console and so I am just used to it, I don't know. I would rather have 30fps and higher graphics settings than 60fps with less. It's great that with the new consoles most developers are giving us the option to choose what we want.
 

yamaci17

Member
10 GB will be fine for the entirety of the generation with CONSOLE-equivalent settings.

The PS5/XSX have a total 16 GB memory pool;

approximately 2.5 GB goes to the system itself,
leaving 13.5 GB of usable memory.
Most next-gen games will use a minimum of 3.5-4 GB of memory for CPU operations. With that, most games will have to make do with 9.5-10 GB of allocated VRAM. Be reminded that not all of the available memory can be used as VRAM on consoles. Game logic, physics, sound, most of that stuff is offloaded to RAM on PC, so it does not create an extra burden on total VRAM.

Most last-gen (PS4/Xbox One era) games use 2.5-4 GB of GPU memory with CONSOLE settings.
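As a quick sanity check, here's that budget in a few lines of Python (treat the 16 GB / 2.5 GB / 3.5-4 GB figures as the rough estimates above, not official specs):

```python
# Rough console memory budget using the estimates above (not official figures).
TOTAL_POOL_GB = 16.0             # unified memory pool on PS5 / Series X
OS_RESERVED_GB = 2.5             # approximate system reservation
CPU_SIDE_GB = (3.5, 4.0)         # game logic, physics, audio, etc.

usable = TOTAL_POOL_GB - OS_RESERVED_GB          # ~13.5 GB left for the game
gpu_budget = [usable - cpu for cpu in CPU_SIDE_GB]

print(f"usable for the game: {usable:.1f} GB")
print(f"left for GPU data:   {gpu_budget[1]:.1f}-{gpu_budget[0]:.1f} GB")
# usable for the game: 13.5 GB
# left for GPU data:   9.5-10.0 GB
```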



 

rodrigolfp

Haptic Gamepads 4 Life
10 GB will be fine for the entirety of the generation.

The PS5/XSX have a total 16 GB memory pool;

approximately 2.5 GB goes to the system itself,
leaving 13.5 GB of usable memory.
Most next-gen games will use a minimum of 3.5-4 GB of memory for CPU operations. With that, most games will have to make do with 9.5-10 GB of allocated VRAM. Be reminded that not all of the available memory can be used as VRAM on consoles. Game logic, physics, sound, most of that stuff is offloaded to RAM on PC, so it does not create an extra burden on total VRAM.

Most current-gen games STILL use 2.5-4 GB of GPU memory with CONSOLE settings.




RDR2 is last gen, not current. FH5 is cross-gen, leaning more on last gen than current.

 

yamaci17

Member
It's not about comparing platforms, nor do I care to. Digital Foundry said it's a dynamic 4K at 60fps, so I'll assume it's around 1800p most of the time. The point is to be aware that games are already passing the 10GB VRAM limit. I played the PC version of FC6 and did not notice a downgrade going down to high textures on my 3080.

The 3080 runs into a VRAM limit in FC6 at 4K with:

- ultra settings, the ultra texture pack AND ray tracing

Consoles are capable of light ray tracing; it is proven by lots of titles at this point. WD Legion itself runs with ray tracing on consoles. But have you ever stopped and wondered why FC6 on consoles does not have ray tracing? It has all the headroom it needs; it could easily have a 1080p 60 FPS ray tracing mode.

Look at how the RX 6600 XT performs with ray tracing.

Here is the kicker: they would have to disable the ULTRA texture pack to stay within the VRAM limit they came across. They chose the texture pack. This is why FC6 lacks ray tracing on consoles: it already maxes out the available VRAM the PS5/XSX have.

It is also the reason why the 3080's 10 GB buffer cannot handle textures + ray tracing together at 4K in that game. I've played that game at 4K with ultra textures on my 3070 and it was perfectly fine; even the 8 GB buffer is enough to push ultra textures + 4K together. Ray tracing adds an extra burden of approximately 2-3.5 GB of VRAM, which breaches the 10 GB buffer the PS5, XSX and 3080 can allocate to the game.

If you have run into limits WITHOUT ray tracing at 4K, most likely you have too many background applications consuming VRAM, or there are problems with your configuration.

To add further: AMD-branded games specifically use a ray-traced shadows implementation that hammers VRAM a lot. Supposedly, leaning on VRAM speeds up the ray tracing render, like some sort of RT shadow caching. In other words, the shadows are rendered and then mostly cached so that they don't have to be "real-timed" all the time. The same was the case for Godfall, and once again, the VRAM limit was breached there as well. This specific implementation works best when you have an ample amount of VRAM (which AMD GPUs have). But the PS5/XSX simply do not have that much VRAM to spare, hence Far Cry 6 omitted ray tracing on consoles, despite the consoles clearly being capable of it.

WD Legion was the EXACT opposite. It also had a 4K texture pack, and funny enough, they did not enable the texture pack on consoles. And despite being the heavier game, it still included ray tracing on consoles.

They're practically testing the waters and playing around to see what might be the best config for their games.

BUT, if they really had MORE than a 10 GB buffer budget, they could easily enable BOTH the textures and ray tracing on consoles. It is clear they cannot. And consoles do not have magic mumbo jumbo to make it happen. You still have to use some of that memory pool (my approximate guess is 2.5-5 GB) for game logic, sound, the physics engine and more.

And a final note: this is why the 10+6 GB split MAKES perfect sense for the Xbox Series X. 10 GB purely for VRAM operations, and 6 GB purely for CPU operations (2-2.5 GB for the system, 3.5-4 GB for pure CPU operations). Microsoft knew what they were doing. Developers have to stick with that 10 GB partition when it comes to GPU operations; otherwise, any operations that slip into the slower 336 GB/s partition will incur a performance penalty. I'm pretty sure the PS5 has a similar allocation for devs.
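To make the squeeze explicit, here's a tiny sketch in Python. The ~10 GB GPU budget and the 2-3.5 GB ray tracing overhead are the rough figures above; the base footprint for 4K ultra plus the HD texture pack is a made-up placeholder, just to show why one of the two features has to go:

```python
# Hypothetical FC6 budget check using the rough figures from this post.
GPU_BUDGET_GB = 10.0          # ~what PS5/XSX/3080 can give to GPU data
BASE_4K_ULTRA_GB = 7.5        # assumed footprint of 4K ultra + HD texture pack
RT_OVERHEAD_GB = (2.0, 3.5)   # estimated extra cost of ray tracing

for rt in RT_OVERHEAD_GB:
    total = BASE_4K_ULTRA_GB + rt
    verdict = "fits" if total <= GPU_BUDGET_GB else "over budget"
    print(f"textures + RT (+{rt} GB) = {total:.1f} GB -> {verdict}")
# Best case it barely squeezes in; worst case it spills past 10 GB,
# which is exactly when performance falls off a cliff.
```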
 

yamaci17

Member
RDR2 is last gen, not current. FH5 is cross-gen, leaning more on last gen than current.



My use of "current gen" was meant for the PS4/Xbox One gen. Since I'm on PC, and not a new, shiny console, I still feel like the shift to next gen has not started. Most games released are still on par with last gen.

Once we get to see games like Metro Exodus and Flight Simulator on a consistent basis, then I shall call those games last gen. For now, they're still current gen to me. But you're right, officially they're last gen...

I fixed the wording regardless, if that appeases you. No need to derail the thread.

Even the coveted 3.5 GB 970 still runs PS4/Xbox One games with fine performance. It was Kepler specifically that got sidetracked because of a really bad architecture. It took 8 years for Maxwell to start breaking in "some" games.

The GTX 1060 still provides 1.7-2x the performance of a PS4, just as it did when it was released in 2016. Ampere is equipped with proper next-gen technologies, and not only will it not age badly, it will actually age well. It has more advanced feature sets than both consoles, a thing that Kepler lacked. As a matter of fact, Kepler LACKED advanced feature sets that the PS4/Xbox One had! The freaking architecture cannot even run some DX12 games.

Comparing Kepler to Ampere is dishonest at best. One was worse than the consoles in terms of technology, and the other is BETTER than the consoles in terms of technology.

The lack of performance the GTX 780 & 780 Ti show has nothing to do with their VRAM. Only the GTX 770 with its 2 GB buffer is really getting destroyed. Kepler aged badly overall because of a really bad architecture, plain and simple. Look at how the GTX 970 and 980 still run most games fine. They're old, and you need to use optimized settings, just like consoles do.

Look at Far Cry 6 with low settings at 1080p, which is what the PS4/Xbox One target.

The 4 GB VRAM buffer is not even maxed out. The game barely uses 2.8 GB of VRAM, and the GTX 970 pushes 60+ frames. You can even have a mix of lows and mediums. It's an ancient GPU, but if you set the parameters right, it will still let you enjoy games, just as consoles do even after 7 years.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
30 is not a slideshow. It doesn't matter how many times you repeat it to yourselves. It won't be true.


No matter how many times you say it's not a slideshow, that doesn't make it true.

It means you need a new pair of glasses.
 

rodrigolfp

Haptic Gamepads 4 Life
My use of "current gen" was meant for the PS4/Xbox One gen. Since I'm on PC, and not a new, shiny console, I still feel like the shift to next gen has not started. Most games released are still on par with last gen.

Once we get to see games like Metro Exodus and Flight Simulator on a consistent basis, then I shall call those games last gen. For now, they're still current gen to me. But you're right, officially they're last gen...

I fixed the wording regardless, if that appeases you. No need to derail the thread.

Even the coveted 3.5 GB 970 still runs PS4/Xbox One games with fine performance. It was Kepler specifically that got sidetracked because of a really bad architecture. It took 8 years for Maxwell to start breaking in "some" games.

The GTX 1060 still provides 1.7-2x the performance of a PS4, just as it did when it was released in 2016. Ampere is equipped with proper next-gen technologies, and not only will it not age badly, it will actually age well. It has more advanced feature sets than both consoles, a thing that Kepler lacked. As a matter of fact, Kepler LACKED advanced feature sets that the PS4/Xbox One had! The freaking architecture cannot even run some DX12 games.

Comparing Kepler to Ampere is dishonest at best. One was worse than the consoles in terms of technology, and the other is BETTER than the consoles in terms of technology.

The lack of performance the GTX 780 & 780 Ti show has nothing to do with their VRAM. Only the GTX 770 with its 2 GB buffer is really getting destroyed. Kepler aged badly overall because of a really bad architecture, plain and simple. Look at how the GTX 970 and 980 still run most games fine. They're old, and you need to use optimized settings, just like consoles do.

Look at Far Cry 6 with low settings at 1080p, which is what the PS4/Xbox One target.

The 4 GB VRAM buffer is not even maxed out. The game barely uses 2.8 GB of VRAM, and the GTX 970 pushes 60+ frames. You can even have a mix of lows and mediums. It's an ancient GPU, but if you set the parameters right, it will still let you enjoy games, just as consoles do even after 7 years.

All cool, but I doubt anyone buys RTX cards with PS4 games in mind. They are already thinking of running PS5 games, so how these cards will perform in those games is what matters more.
 

yamaci17

Member
All cool, but I doubt anyone buys RTX cards with PS4 games in mind. They are already thinking of running PS5 games, so how these cards will perform in those games is what matters more.
The 10 GB buffer won't be enough for ray tracing + ultra next-gen textures, whether you have a console or a GPU. And we know that in the future, ray tracing will be niche for consoles. Hence, RTX 3080 owners will have to sacrifice ray tracing at some point, just like consoles do (they already have to, in FC6). In that light: yes, if people want reliable ray tracing in the future, they must invest in a GPU with a minimum of 12 GB of VRAM. 10 GB only saves the day with ray tracing off.

Ray tracing + ultra next-gen textures will clearly need a minimum of a 12 GB buffer at 1440p and a 16 GB buffer at 4K, which neither the PS5 nor the Series X has.

My answer was mostly about FC6. I'm saying that you can practically get double the FPS the PS5 can provide while matching the PS5's visual output. But you simply cannot run ray tracing on top of the PS5 visuals. The GPU has the grunt, but its buffer does not.

I don't know whether devs will focus on textures or ray tracing, though. By the time next-gen textures arrive with actual next-gen games, running ray tracing will be out of the question. Look how the 3080 struggles with Cyberpunk. That's what real ray tracing performance looks like. That's how future games will be.

Imagine adding ray tracing on top of Flight Simulator; it's already demanding as it is. Or look at Dying Light or GotG. Clearly, the consoles were designed with no ray tracing in mind. Otherwise, I would have expected them to have a 24 GB total buffer so that graphics operations could have a 16 GB buffer. Instead, they decided to be stingy (just like Nvidia and their 3000 lineup).

Then again, the stinginess also stems from the limits of what the GPU is capable of.

Do you think the GTX 980 would be better off with 8 GB? You already have to adhere to low-medium console settings to get good performance out of it. And when you use such settings, you see that your VRAM usage is low regardless of how much VRAM you have.

I remember trying RDR2 at console settings with my 8 GB 1080. The game was just sitting there, juggling 3.2 GB of total VRAM consumption. When I pushed the game further, I could get 6.5-7 GB of VRAM usage. Do you get what I mean?

The 3080, just like the 980, won't have that much extra juice to push things way further than consoles. At best, I'd say be happy that you get 2x the frames over consoles at equivalent settings. For ray tracing, it will be a no-go at 4K. Maybe at 1440p it might get by for a couple more years. With next-gen textures? No chance.
 

Petopia

Banned
Honestly, 30fps doesn't bother me. Maybe because I have always gamed on console and so I am just used to it, I don't know. I would rather have 30fps and higher graphics settings than 60fps with less. It's great that with the new consoles most developers are giving us the option to choose what we want.
Well, time to get an upgrade then. You'll see the light when you upgrade.
 
Well, time to get an upgrade then. You'll see the light when you upgrade.
I have a Series X, but I choose the Quality option over Performance. What can I say, I'm not overly sensitive to 30fps.
But at least with the new consoles we now get the choice of quality or performance.
 

yamaci17

Member
You tried to make a claim about how the 3080 will apparently struggle soon. Then you continued to beat around the bush with claims that jumped from 2013's 780 Ti, to the 1080 Ti, to now saying you don't know.

How things will play out is like this: the 3080 is 2x faster than a PS5. This is a hard fact that will not change next year, or 10 years from now. It will never stop being two times faster than a PS5. Therefore, it will for all time run any game that the PS5 also runs substantially better. Of course the card will age and you will have to adjust details, but that will happen at quality levels far, far beyond what any current console will do. 7 years from now, when we retire the PS5, a 3080 will still run games better and faster, because 7 years from now this GPU will still be 2x faster than a PS5.
While Ampere will age much, much better than Kepler, you also have the wrong take on this.

Even in the best case, Ampere will still lose somewhere in the range of 10-30% performance over the years. Maxwell lost approximately 15% performance relative to Pascal, and then both Pascal and Maxwell lost another 10% relative to Turing (approximate values based on the benchmarks I've observed). Kepler lost something like 30-40% directly when Maxwell launched, and with Pascal and Turing, the losses totaled 60-70% in certain games, with a peak of 100% in Doom Eternal.

The GTX 770 was exactly 2x faster than the PS4 in games that did not touch the then-next-gen feature set the PS4 had but the 770 did not. Naturally, nowadays the GTX 770 is barely 30% above the PS4 in new games. That's a whopping 70-point drop in its lead.

The 3080 nowadays is not even 2x faster than the PS5, actually; it's more like 70-80% faster. By the end of the generation, it will most likely be 35-45% faster than the PS5 (and this is the best case, one where Nvidia does not completely abandon the Ampere driver pathways; Nvidia did not abandon the Maxwell and Pascal pathways, and I do not expect them to abandon the Turing/Ampere pathways for a long time). But even then, Nvidia has the software muscle to extract more performance from their GPUs. This is why they lose performance over time: when Nvidia does that extra work, you see extra performance, but the performance you see is not the cards' actual, effective baseline.
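A quick sketch of what that kind of regression does to the headline multiplier, using the rough figures above (the 2x starting point and the 10-30% loss are estimates, not benchmarks):

```python
# How a launch-day performance lead shrinks if the PC GPU loses ground over time.
launch_ratio = 2.0                 # "the 3080 is 2x a PS5" at launch
for loss in (0.10, 0.20, 0.30):    # hypothetical regression vs. newer games/drivers
    print(f"{int(loss * 100)}% regression -> {launch_ratio * (1 - loss):.2f}x a PS5")
# 10% regression -> 1.80x a PS5
# 20% regression -> 1.60x a PS5
# 30% regression -> 1.40x a PS5  (i.e. roughly the 35-45% lead mentioned above)
```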

As I said, Ampere will not regress like Kepler did. And with ray tracing, it will always have a lead (a lead that the 3080 will not be able to properly capitalize on due to its low VRAM buffer).

Regardless, it will regress; there are no two ways about it. Let's hope it does not become another Kepler. That stuff was horrendous: the PS4 can practically be matched by a 750 Ti in some games, and then in other games it races with a 770.
 

yamaci17

Member
Relevant proof of how Kepler regressed:




You can see DECIMATION in some games. It is one of the most hideous architectures ever created in the GPU world. I'm glad I never got hold of one. It's just... horrendous. It makes me want to puke whenever I check its performance.

Then again... Ampere should not share a similar fate. I hope.

In games like RDR2 and Wolfenstein, you can see how the PS4 practically matches the GTX 770. THIS is a problem WITH the ARCHITECTURE, not with PC.

Equivalent GCN GPUs perform just like their CONSOLE counterparts. Look at the HD 7790: it still performs well, comfortably above the PS4. Or the R9 390, for that matter. They're still beasts.

Since I'm not a wizard, I cannot say how well Ampere will age. But judging by how well equipped it is for next gen, I would say it will age fine, just like Maxwell/Pascal.
 

Petopia

Banned
I have a Series X, but I choose the Quality option over Performance. What can I say, I'm not overly sensitive to 30fps.
But at least with the new consoles we now get the choice of quality or performance.
Well, on PC you get the best of both worlds, and Xbox games exist there too, if you're willing to stand their garbage storefront.
 

SteadyEvo

Member
Get your head checked.
No, OP is right. Playing RDR2 on a One X was rough. All the choppy horseback rides cross-country have taken a serious 🧐 toll on my eyes.

I waited to get a PS5 before playing TLOU 2 and it was worth the wait. The difference from 30 to 60 is night and day.

And I’ve never owned a gaming PC. No desire to build and don’t want to overpay for prebuilt. Over 30 years of consoles, my brain can only process a controller.
 

Godfavor

Member
Well, this gen consoles have more 60fps games than ever, or at least the option to choose from in the game settings. I would have liked all devs to optimize for 60fps instead (like Metro Exodus), rather than scaling their engines down from 30fps by cutting effects or draw distance (which is the common approach).
 

rofif

Can’t Git Gud


No matter how many times you say it's not a slideshow, that doesn't make it true.

It means you need a new pair of glasses.
Oh c'mon. This has nothing to do with eyesight.
It's rather the brain and your willingness to play for 30 minutes and get used to it... also, it's better with a controller than with mouse and keyboard.
Of course 60 is better, but then I could say 60 is a trash slideshow after having a 240Hz monitor for 6 months.
And you know what? 30 is perfectly fine if done correctly, with good motion blur and good controls. People shit on Bloodborne, but aside from its frame pacing issues, it controls very well for a 30fps game.
Or Uncharted 4? Perfectly fine at 30fps. Same with the new Horizon.
 

rodrigolfp

Haptic Gamepads 4 Life
Well, this gen consoles have more 60fps games than ever, or at least the option to choose from in the game settings. I would have liked all devs to optimize for 60fps instead (like Metro Exodus), rather than scaling their engines down from 30fps by cutting effects or draw distance (which is the common approach).
Than ever??? Pong, Atari, NES and SNES gens.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
It's rather the brain and your willingness to play for 30 minutes and get used to it... also, it's better with a controller than with mouse and keyboard.
Both are equally bad, it's just less noticeable with a controller since the aim assist does all the fine work.

There's higher input latency the lower the fps is.

Take a racing game as an example: the sense of speed is way slower at 30 fps than at higher frame rates. The higher the frame rate, the faster the image updates and the greater the sense of speed.

Of course 60 is better, but then I could say 60 is a trash slideshow after having a 240Hz monitor for 6 months.
Of course you can. You are getting used to something better, so of course 60 fps will feel worse.

A Ferrari is more fun to drive than a Fiat, but the Fiat is still a functional means of transportation.

You pay for luxury, a better experience.

30 is perfectly fine if done correctly, with good motion blur and good controls
If 30 is perfectly fine, then why does it need motion blur and good controls?

30 fps without motion blur is a stutter show, which is why games with this low a framerate need something to graphically hide the stuttering.

I've played so many 30 fps games, also with a controller. I switch between my PC and Xbox, with my PC of course having a higher framerate since it's a gaming PC.

I can play with 30 fps. As you say, you get used to 30 fps after playing for some time. But having to dial in on something versus being able to instantly enjoy something is a big difference.

I've never tried a 30 fps game that didn't have some sort of tanky feeling. My problem is probably that I am a sensitive person, so I notice small differences, but it's easy to feel the input latency in a 30 fps game versus a higher one.

Input lag for a 60 fps game starts at 16.67 ms of frame time, whereas for a 30 fps game it's 33.33 ms, which is a lot in comparison.
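For reference, here are the per-frame times those numbers come from; keep in mind end-to-end input latency is usually several of these stacked (input sampling, simulation, render, display), but it scales with the frame time:

```python
# Frame time in milliseconds for common frame rates.
# Total input-to-photon latency is normally a few of these stacked,
# but it grows roughly in proportion to the frame time.
for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")
#  30 fps -> 33.33 ms per frame
#  60 fps -> 16.67 ms per frame
# 120 fps -> 8.33 ms per frame
# 240 fps -> 4.17 ms per frame
```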

Games work against this by adding a high amount of aim assist.

People shit on Bloodborne, but aside from its frame pacing issues, it controls very well for a 30fps game.
Or Uncharted 4? Perfectly fine at 30fps. Same with the new Horizon.

"fine" is just settling with it.

I've never tried Bloodborne, but uncharted felt pretty tanking with it.

Destiny felt tanky. Any 30 fps game felt tanky, and input latency shows there is.

Of course, if you are used to it then it's fine for what it is.

But it's easier to go from 30 fps to higher than to settle for 30 fps after being used to playing at higher framerates.

Unless you are blind and have slow reactions.
 

KXVXII9X

Member
Honestly, I can barely tell a difference between 120/144 fps and 60 fps. I got a new laptop with a 144Hz screen and I can't really tell any difference. I can tell some difference between 30 and 60, but I remember my PS4, and games that held a consistent 30 fps felt smooth. I don't think every game has to be 60 fps or higher, especially when we get to true current-gen-only games. Of course multiplayer games should aim for 60 fps and up, but shorter single-player games? I don't see them needing to. As long as the framerate is consistent, I wouldn't mind as much. If I had my way, I would prefer less focus on photorealism and more on stylized graphics with 60 fps and good lighting.
 

rofif

Can’t Git Gud
Both are equally bad, it's just less noticeable with a controller since the aim assist does all the fine work.

There's higher input latency the lower the fps is.

Take a racing game as an example: the sense of speed is way slower at 30 fps than at higher frame rates. The higher the frame rate, the faster the image updates and the greater the sense of speed.

Of course you can. You are getting used to something better, so of course 60 fps will feel worse.

A Ferrari is more fun to drive than a Fiat, but the Fiat is still a functional means of transportation.

You pay for luxury, a better experience.

If 30 is perfectly fine, then why does it need motion blur and good controls?

30 fps without motion blur is a stutter show, which is why games with this low a framerate need something to graphically hide the stuttering.

I've played so many 30 fps games, also with a controller. I switch between my PC and Xbox, with my PC of course having a higher framerate since it's a gaming PC.

I can play with 30 fps. As you say, you get used to 30 fps after playing for some time. But having to dial in on something versus being able to instantly enjoy something is a big difference.

I've never tried a 30 fps game that didn't have some sort of tanky feeling. My problem is probably that I am a sensitive person, so I notice small differences, but it's easy to feel the input latency in a 30 fps game versus a higher one.

Input lag for a 60 fps game starts at 16.67 ms of frame time, whereas for a 30 fps game it's 33.33 ms, which is a lot in comparison.

Games work against this by adding a high amount of aim assist.

"Fine" is just settling for it.

I've never tried Bloodborne, but Uncharted felt pretty tanky.

Destiny felt tanky. Every 30 fps game felt tanky, and the input latency numbers show why.

Of course, if you are used to it then it's fine for what it is.

But it's easier to go from 30 fps to higher than to settle for 30 fps after being used to playing at higher framerates.

Unless you are blind and have slow reactions.
"just fine" is not a bad thing.
I am getting used to it pretty well but lower graphics will be a sore on my eyes for duration of the game.
It is just a sacrifice I am most likely willing to make but it is judged on game by game basis.
and sacrifice is maybe strong word... I have pretty easy "ignorance is bliss" mode when it comes to this with controller games. It is weird coming from 120hz but it is doable without much pain.
I did really had 240hz monitor but I put graphics over framerate... unless the 30fps mode feels really slow and bad.

In horizon, 30 was an easy choice. In elden ring, quality mode is out of the question. Performance is barely ok.

As for the question:
30 fps without motion blur is a stutter show, which is why games with this low a framerate need something to graphically hide the stuttering.
Well yes, that is exactly it.
Motion blur is used to help blend frames, hide the stutter show and CHEAT your brain, just like your brain is cheated when watching a movie.
To be honest, it was only at 240Hz that motion blur was not needed, but I realized that 60fps with motion blur looks the same as 240Hz without it, at least in Doom 2016 (looks, not feels).
Each frame cannot be a single still image when you play at only 30 or 60fps. Good motion blur behaves like the shutter speed setting on a camera: its goal is to capture as much of the movement that happened on screen in the timeframe of that frame, so 33.3 or 16.7 ms. If you display frozen, time-stopped frames with no motion blur, one after another every 33.3 ms, then of course it will look like stutter city. Your brain does not have the data needed to fill in the missing movement information.
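To put numbers on the shutter analogy (the full-frame case is what's described above; the 180-degree figure is just the classic film convention for comparison):

```python
# Motion blur as shutter time: the blur window is some fraction of the frame time.
# A "360-degree" shutter covers the whole frame interval, as described above;
# film convention is 180 degrees, i.e. half the frame time.
def blur_window_ms(fps: float, shutter_degrees: float) -> float:
    return (1000.0 / fps) * (shutter_degrees / 360.0)

for fps in (30, 60):
    full = blur_window_ms(fps, 360.0)   # whole frame interval
    film = blur_window_ms(fps, 180.0)   # classic film look
    print(f"{fps} fps: full-frame blur {full:.1f} ms, film-style blur {film:.1f} ms")
# 30 fps: full-frame blur 33.3 ms, film-style blur 16.7 ms
# 60 fps: full-frame blur 16.7 ms, film-style blur 8.3 ms
```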

And yeah, UC4 feels very tanky. With the new port, it feels like other 30fps games when you run it in 60fps mode. It's just an animation-priority game.
 

ACESHIGH

Banned
I built my rig with a GTX 760 in 2013. I should have gone with an R9 280; that GPU could still run games at console performance even today.
I had it until late 2019, then replaced it with an RX 580. I remember DOOM running like absolute dogshit on it. I couldn't believe my friend's PS4 was running the game at 60 fps.

The moral of the story is... wait until you see how next-gen games run on a GPU before upgrading. I have learnt my lesson and will wait for DDR5, the 5000-series NVIDIA GPUs, and a few UE5 games with DirectStorage running on PC before building a new rig.
 

Gamer79

Predicts the worst decade for Sony starting 2022
I agree that all games should be locked to 60fps or more. 30fps is bad
 

Bojji

Member
Bloodborne at 30 fps has more responsive controls than Elden Ring at 60 fps

LOL. You mean the 60 fps PS4 version running on a PS5? No other console version has stable performance.

The worst thing about 30fps is input lag. I can adjust to the framerate after some time, but high input lag just drives me mad (Guardians of the Galaxy...).
 

mansoor1980

Gold Member
LOL. You mean the 60 fps PS4 version running on a PS5? No other console version has stable performance.

The worst thing about 30fps is input lag. I can adjust to the framerate after some time, but high input lag just drives me mad (Guardians of the Galaxy...).
I meant the PC version of Elden Ring at 60 fps.
BTW, try Bloodborne and the controls will surprise you in a good way.
 
Watching 30fps cutscenes with proper framepacing and good motion blur - Good. Would recommend.

Watching 30fps cutscenes without proper framepacing and/or good motion blur - Janky, stuttery but tolerable.

Playing a game at 30fps with proper framepacing and good motion blur - Unresponsive, slow, barely tolerable.

Playing a game at 30fps without proper framepacing and/or good motion blur - DEATH

This is my personal opinion as a primarily PC gamer who has played almost all games locked to 60fps (or sometimes 50fps on graphically demanding titles) for as long as I can remember. Please put down any torches and/or pitchforks.
 

Bojji

Member
People saying 30fps hurts their eyes clearly never watch movies :D :D :D

Spoiled brats everywhere... C'mon, TS can't be serious... I laughed way too hard...

You don't interact with 24Hz movies. Games at 30fps are on the edge of tolerable responsiveness IF devs care about making input lag as low as possible. 60 fps should be the standard.
 
I replayed Bloodborne after Elden Ring PC and it took me about 10 minutes to get accustomed to the framerate. Sure was nice going back to 60 after I was done replaying, though! Lol
 

K2D

Banned
I'm playing The Last of Us Part 2 these days and my eyes are killing me! I'm trying to avoid camera movements at all costs.

I think the games I played all those years that ran at 30 fps and below caused my myopia (especially playing TES: Oblivion at 15-20 fps for more than 100 hours :messenger_tears_of_joy:). You don't believe it? Here is a literature review about the relation between screen time (use of digital devices) and myopia: link to scielo.br. They reviewed 9 studies with more than 100,000 participants in total. In conclusion, they can't say it's related for sure, but they suspect it may have some effect. Maybe someone will do a study about 30 fps vs 60 fps games and their effects on eye problems in the future. I think 30 fps does more damage than screen time or resolution.

I think devs should leave this old crap behind. I can handle some low-resolution textures here and there, but I can't take this 30 fps BS anymore! It's not healthy at all.

PS: Great game so far btw.

Edit for necromancers: After a suggestion made in this thread 1 year ago, I learned that playing a PS4 game at 30 fps on a 75Hz monitor increases the awfulness of 30 fps. I continued to play The Last of Us Part 2 (which you can't play on a PC yet) on a TV and it was a little more bearable. My thoughts on 30fps haven't really changed, but my frustration eased a bit.
I hope we see more 24 fps games. These 60/120/144 fps games hurt my sense of immersion.

 

Swift_Star

Banned
You don't interact with 24Hz movies. Games at 30fps are on the edge of tolerable responsiveness IF devs care about making input lag as low as possible. 60 fps should be the standard.
Cry me a river, will you? You people are too sensitive. 30fps is fine; it's here to stay. Maybe use a TV that has a game mode and you'll stop noticing input lag. Or use a decent monitor.
 

rodrigolfp

Haptic Gamepads 4 Life
Maybe it is on PC, I don't know. On PS5 it's fine; despite horrible performance and frame pacing, From games have always had decent input latency.
If the PC version has the same or worse lag than the PS5 version, then the game is indeed a technical disaster.
 