
VGTech: Resident Evil 3 Remake PS5 vs Xbox Series X|S Frame Rate Comparison (Next-Gen Update)

DaGwaphics

Member
So I guess the answer is no, you can't.
Your type likes to try and move the goalposts, so I won't acknowledge the rest of this post.

I just call out BS shots when I see them. 🤷‍♂️

Plus, I just don't give a shit when the averages are what they are. Any lows on Xbox would have to be matched by equal highs to maintain the higher average. LOL
 

DaGwaphics

Member
It's a real shot, no?
I don't know if PS5 drops to the same lows (hence the question I asked), but what I do know is that's a pretty big drop on XSX.

I don't know, but the stats can't be right for that shot, so it's BS just the same. The green frametime line there is clearly sub-30 across the board, which doesn't jibe with the minimum FPS provided (above 30). Like I said, something's not right there. Plus, the shots aren't even remotely in sync between the systems. 🤷‍♂️
 

adamsapple

Or is it just one of Phil's balls in my throat?
It's a real shot, no?
I don't know if PS5 drops to the same lows (hence the question I asked), but what I do know is that's a pretty big drop on XSX.

It is an extremely isolated case being milked for all it's worth.

2 out of the ~22 thousand frames drop that low (underlined). On average, SX still runs at a higher frame rate (16.67ms corresponds to the game running at 60 FPS).

Via VGTech:



Frame Time Counts | PS5            | Series X
16.67ms           | 17235 (78.09%) | 18666 (81.95%)
33.33ms           | 4803 (21.76%)  | 4063 (17.84%)
50ms              | 32 (0.14%)     | 41 (0.18%)
66.67ms           | 1 (0%)         | 4 (0.02%)
83.33ms           | 0 (0%)         | 2 (0.01%)
116.67ms          | 0 (0%)         | 0 (0%)
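For anyone who wants to sanity-check the "higher average" claim, the histogram above is enough to reconstruct it. A minimal sketch (counts copied from the table; `avg_fps` is an illustrative helper, not VGTech's tooling):

```python
# Frame time buckets (ms) -> frame counts, copied from VGTech's table.
ps5 = {16.67: 17235, 33.33: 4803, 50.0: 32, 66.67: 1, 83.33: 0, 116.67: 0}
sx  = {16.67: 18666, 33.33: 4063, 50.0: 41, 66.67: 4, 83.33: 2, 116.67: 0}

def avg_fps(hist):
    """Average FPS = total frames delivered / total elapsed time."""
    frames = sum(hist.values())
    total_ms = sum(ms * n for ms, n in hist.items())
    return frames / (total_ms / 1000.0)

print(f"PS5:      {avg_fps(ps5):.2f} FPS")
print(f"Series X: {avg_fps(sx):.2f} FPS")
```

Both averages land around 49–51 FPS, with Series X roughly 1.5 FPS higher, consistent with the point being made.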
 

DaGwaphics

Member
It is an extremely isolated case being milked for all it's worth.

2 out of the ~22 thousand frames drop that low (underlined). On average, SX still runs at a higher frame rate (16.67ms corresponds to the game running at 60 FPS).

Via VGTech:



Frame Time Counts | PS5            | Series X
16.67ms           | 17235 (78.09%) | 18666 (81.95%)
33.33ms           | 4803 (21.76%)  | 4063 (17.84%)
50ms              | 32 (0.14%)     | 41 (0.18%)
66.67ms           | 1 (0%)         | 4 (0.02%)
83.33ms           | 0 (0%)         | 2 (0.01%)
116.67ms          | 0 (0%)         | 0 (0%)

Hadn't noticed those stats before, LOL. I'll take the 1,000+ extra 16.67ms frames in exchange for those 14 slower frames any day.

I guess that white line in the graph is their one-second split; that must be what made the average look wrong. If the screen grab is accurate, almost all of the lows land in that one second, LOL.
 

S0ULZB0URNE

Member
It is an extremely isolated case being milked for all it's worth.

2 out of the ~22 thousand frames drop that low (underlined). On average, SX still runs at a higher frame rate (16.67ms corresponds to the game running at 60 FPS).



Frame Time Counts | PS5            | Series X
16.67ms           | 17235 (78.09%) | 18666 (81.95%)
33.33ms           | 4803 (21.76%)  | 4063 (17.84%)
50ms              | 32 (0.14%)     | 41 (0.18%)
66.67ms           | 1 (0%)         | 4 (0.02%)
83.33ms           | 0 (0%)         | 2 (0.01%)
116.67ms          | 0 (0%)         | 0 (0%)
I know about the graph, hence me asking for a screenshot of GAMEPLAY showing the PS5 dropping as low.

Again, I'm not concerned about the other things, but an FPS drop that low is concerning.
 

adamsapple

Or is it just one of Phil's balls in my throat?
I know about the graph, hence me asking for a screenshot of GAMEPLAY showing the PS5 dropping as low.

Again, I'm not concerned about the other things, but an FPS drop that low is concerning.

The numbers are there: SX has a lowest drop of 18 FPS for 2 frames, and PS5 has a lowest drop of ~24 FPS for a single frame.

That's the absolute lowest either goes, but only for two frames and one frame respectively.
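The frame time and FPS figures being traded here are just reciprocals of each other, which a one-liner makes obvious (the 55.56 ms value is our own back-conversion of the 18 FPS low, not a number from VGTech):

```python
# FPS is the reciprocal of frame time: fps = 1000 / frame_time_in_ms.
def ms_to_fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

print(ms_to_fps(16.67))  # ~60 FPS (a frame every 16.67 ms)
print(ms_to_fps(33.33))  # ~30 FPS
print(ms_to_fps(55.56))  # ~18 FPS
```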
 

Kuranghi

Member
Précis: the non-RT modes are fantastic. The HFR modes are great if you have VRR (I'm not sure a display could support HDMI 2.1 but not VRR, so it's kind of a moot point except for people playing on HDMI 2.0b displays with 1080p@120Hz modes). The RT modes are fine if you don't mind the framerate crashing in cutscenes and heavy gameplay scenes; they should've included a 30fps lock in the RT mode for those who want a stable fps, or dropped the res further to lock to 60 99% of the time.
 

welshrat

Member
Are we all still arguing over a gnat's dick? The consoles have never been closer, we have fabulous backwards compat on both (I'm signed up to Game Pass Ultimate and PS Plus Premium) and so many games, and yet we're here arguing over a couple of dropped frames. Weird.
 
I saw some console users who actively fought against the premise of having third or fourth options (RT+performance, RT+quality, or VRR quality modes, etc.). Whenever you suggest such stuff, someone will come out and say "play on muh PC". And when you do choose PC, they say "muh PC so complicated, I prefer my basic modes on console, those improvements are trash and not valuable so I don't even care lololol".
Late, but V-Sync should automatically be disabled in every game if you are using a VRR display.
 
I don't follow you. I find it incredibly stupid to use 4K CBR with RT at this level of performance on console. I literally don't understand why Capcom hasn't yet opted for a lower resolution and still persists in this approach.
It's native; the RT is what's checkerboarded.
 

yamaci17

Member
Late, but V-Sync should automatically be disabled in every game if you are using a VRR display.
V-Sync should not be disabled; it should always be there to sync out the odd frames here and there. If you play in a 120 Hz container and the game runs between 50–80 FPS, you're already free of V-Sync lag.

The only problem is the capped 40/60 FPS modes: those take the full lag wrath of V-Sync even if your screen is capable of VRR. In that case, you would need to cap the framerate just a bit below the refresh rate and let VRR take the reins instead (57–58 FPS for 60 Hz, or matching a locked 40 FPS to 80 Hz in a 120 Hz container so that 120 Hz V-Sync never engages). But that is advanced VRR stuff I personally don't think we will ever see implemented on console. You have all the freedom on PC for this: you can run practically every game in existence, even games from the 2000s, in a 144 Hz container, and in that container you can match a 40 FPS input to an 80 Hz output. It is simply glorious, and the input lag is super low.
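The capping rule described above (57–58 FPS for a 60 Hz screen, and so on) amounts to leaving a small margin under the refresh rate. A sketch, with the margin as a tunable assumption rather than a fixed rule:

```python
# Pick a frame cap a few FPS under the refresh rate so frame delivery
# stays inside the VRR window and fixed-refresh V-Sync never engages.
# The 3 FPS margin is one common rule of thumb, not the only valid choice.
def vrr_safe_cap(refresh_hz: float, margin_fps: float = 3.0) -> float:
    return refresh_hz - margin_fps

for hz in (60, 120, 144):
    cap = vrr_safe_cap(hz)
    print(f"{hz} Hz display -> cap at {cap:.0f} FPS "
          f"(~{1000.0 / cap:.2f} ms per frame)")
```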
 
V-Sync should not be disabled; it should always be there to sync out the odd frames here and there. If you play in a 120 Hz container and the game runs between 50–80 FPS, you're already free of V-Sync lag.

The only problem is the capped 40/60 FPS modes: those take the full lag wrath of V-Sync even if your screen is capable of VRR. In that case, you would need to cap the framerate just a bit below the refresh rate and let VRR take the reins instead (57–58 FPS for 60 Hz, or matching a locked 40 FPS to 80 Hz in a 120 Hz container so that 120 Hz V-Sync never engages). But that is advanced VRR stuff I personally don't think we will ever see implemented on console. You have all the freedom on PC for this: you can run practically every game in existence, even games from the 2000s, in a 144 Hz container, and in that container you can match a 40 FPS input to an 80 Hz output. It is simply glorious, and the input lag is super low.
V-Sync is only a negative when using VRR, though; it's a performance and input lag penalty.
 

poodaddy

Gold Member
V-Sync is only a negative when using VRR, though; it's a performance and input lag penalty.
No, there is a VRR range. VRR is wonderful, but when your frame rate drops below ~40 or exceeds your display's maximum refresh rate, VRR can't be utilized, and frame time variability and tearing become inevitable. Straight from Nvidia themselves: you shouldn't disable V-Sync when using VRR; just cap your frame rate to 3 FPS below your display's max refresh. V-Sync will then catch the frames that fall outside the VRR range. If you're concerned about input lag, which is valid, I suppose you could go for the "fast" V-Sync option in the driver, but I haven't personally used it. On console this is obviously not an option, which is why I wish rudimentary driver-level options were available to console users who opt in to them, but that's a pipe dream and will never happen :/.
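The range logic being described can be sketched as a simple bounds check (the 40 Hz floor and 120 Hz ceiling here are typical illustrative values; actual VRR windows vary by panel):

```python
# Is a momentary frame rate inside the display's VRR window?
def in_vrr_range(fps: float, vrr_min: float = 40.0, vrr_max: float = 120.0) -> bool:
    return vrr_min <= fps <= vrr_max

print(in_vrr_range(58))   # inside: VRR handles it, no tearing
print(in_vrr_range(24))   # below the window: judder/tearing risk
print(in_vrr_range(130))  # above max refresh: V-Sync (if on) catches it
```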
 
No, there is a VRR range. VRR is wonderful, but when your frame rate drops below ~40 or exceeds your display's maximum refresh rate, VRR can't be utilized, and frame time variability and tearing become inevitable. Straight from Nvidia themselves: you shouldn't disable V-Sync when using VRR; just cap your frame rate to 3 FPS below your display's max refresh. V-Sync will then catch the frames that fall outside the VRR range. If you're concerned about input lag, which is valid, I suppose you could go for the "fast" V-Sync option in the driver, but I haven't personally used it. On console this is obviously not an option, which is why I wish rudimentary driver-level options were available to console users who opt in to them, but that's a pipe dream and will never happen :/.
Why can't you just set a frame cap? Why do you need V-Sync to cap the frames? I thought V-Sync was used as a frame-cap method to stop screen tearing (which VRR solves).
 

yamaci17

Member
Why can't you just set a frame cap? Why do you need V-Sync to cap the frames? I thought V-Sync was used as a frame-cap method to stop screen tearing (which VRR solves).

If you set an aggressive enough frame cap, you won't get tears, yes (120 FPS on a 144 Hz container is enough), but there will still be "rogue" frames here and there. And most people do not use aggressive caps; they use super strict limits (141 FPS on a 144 Hz container). Such a strict limit will consistently shoot over 144 FPS, despite RivaTuner showing a constant 141 FPS output; it is impossible for it not to stray above 144 FPS or below 141 FPS with a limit that tight.

It really depends on your mindset. I always use a global 120 FPS cap on my 144 Hz container, so I (almost) never get tears with V-Sync disabled.

People who use strict limits will get tears here and there, which led them to believe that V-Sync is a hard requirement alongside G-Sync/FreeSync to get rid of tears. It's a bit of a misconception, but I've made my peace with it and don't argue about it much, since it has been spread as gospel thanks to Blur Busters and their "wisdom".

I literally had raging fights with some communities claiming I would not get rid of tears with a 120 FPS cap on a 144 Hz container. I even recorded slow-mo videos to show that it does not tear. They still believed I somehow cheated or did something different, etc.

So yeah, no point in arguing.

If you're running well below your refresh rate, you will never be affected by V-Sync's input lag behaviour anyway. Say you play at a locked 60 FPS in a 144 Hz container with V-Sync enabled: you won't get hit by the input lag penalty. That is why it became gospel; with that setup, you won't hit input-lag-bound states most of the time, which is why people suggest it (not me, I still prefer a 120 FPS cap).
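The "rogue frames" argument is easy to illustrate with a toy simulation: give a frame limiter some jitter and count how many frames arrive faster than the display's scanout interval. The ±0.4 ms jitter figure is an assumption for illustration, not a measurement:

```python
import random

# Count frames that arrive faster than the display's scanout interval.
# A strict cap (141 on 144 Hz) leaves so little headroom that jittered
# frames routinely beat the 6.94 ms interval; a 120 FPS cap does not.
def rogue_frames(cap_fps, refresh_hz, jitter_ms=0.4, n=10_000, seed=1):
    rng = random.Random(seed)
    target = 1000.0 / cap_fps        # intended frame time under the cap
    refresh = 1000.0 / refresh_hz    # display scanout interval
    times = (target + rng.uniform(-jitter_ms, jitter_ms) for _ in range(n))
    return sum(t < refresh for t in times)

print(rogue_frames(141, 144))  # strict cap: plenty of frames beat the scanout
print(rogue_frames(120, 144))  # aggressive cap: none do
```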
 
If you set an aggressive enough frame cap, you won't get tears, yes (120 FPS on a 144 Hz container is enough), but there will still be "rogue" frames here and there. And most people do not use aggressive caps; they use super strict limits (141 FPS on a 144 Hz container). Such a strict limit will consistently shoot over 144 FPS, despite RivaTuner showing a constant 141 FPS output; it is impossible for it not to stray above 144 FPS or below 141 FPS with a limit that tight.

It really depends on your mindset. I always use a global 120 FPS cap on my 144 Hz container, so I (almost) never get tears with V-Sync disabled.

People who use strict limits will get tears here and there, which led them to believe that V-Sync is a hard requirement alongside G-Sync/FreeSync to get rid of tears. It's a bit of a misconception, but I've made my peace with it and don't argue about it much, since it has been spread as gospel thanks to Blur Busters and their "wisdom".

I literally had raging fights with some communities claiming I would not get rid of tears with a 120 FPS cap on a 144 Hz container. I even recorded slow-mo videos to show that it does not tear. They still believed I somehow cheated or did something different, etc.

So yeah, no point in arguing.

If you're running well below your refresh rate, you will never be affected by V-Sync's input lag behaviour anyway. Say you play at a locked 60 FPS in a 144 Hz container with V-Sync enabled: you won't get hit by the input lag penalty. That is why it became gospel; with that setup, you won't hit input-lag-bound states most of the time, which is why people suggest it (not me, I still prefer a 120 FPS cap).
But in your example you're talking about frame caps WITHOUT VRR. I'm talking about using a normal frame cap in conjunction with VRR.
 
I'm really amazed that there's so much discussion nowadays over barely noticeable differences and tiny frame drops from 60fps. Compare that to, say, one gen ago, when almost all games were 30fps, most of those averaged something like 25fps, and no one batted an eye, lol. And let's not forget two gens ago, when games could sometimes barely run in the first place.

Can we not just enjoy our games anymore? These tiny differences between consoles, where you'd literally see the same experience on screen if you hadn't read the numbers, are not worth arguing over imho.

Great job, though, by DF and the other performance-capture channels at making themselves seem important over a subject that matters less than ever before.
 

yamaci17

Member
How are you getting screen tearing then?
Because of ROGUE frames.

The closer your frame cap is to your refresh rate, the more likely it is that rogue frames will EXCEED the refresh rate.

"G-SYNC (GPU Synchronization) works on the same principle as double buffer V-SYNC; buffer A begins to render frame A, and upon completion, […]

Upon its release, G-SYNC's ability to fall back on fixed refresh rate V-SYNC behavior when exceeding the maximum refresh rate of the display was built-in and non-optional. A 2015 driver update later exposed the option.

This update led to recurring confusion, creating a misconception that G-SYNC and V-SYNC are entirely separate options. However, with G-SYNC enabled, the "Vertical sync" option in the control panel no longer acts as V-SYNC, and actually dictates whether, one, the G-SYNC module compensates for frametime variances output by the system (which prevents tearing at all times; G-SYNC + V-SYNC "Off" disables this behavior; see G-SYNC 101: Range), and two, whether G-SYNC falls back on fixed refresh rate V-SYNC behavior; if V-SYNC is "On," G-SYNC will revert to V-SYNC behavior above its range, if V-SYNC is "Off," G-SYNC will disable above its range, and tearing will begin display wide."
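The fallback behavior that excerpt describes can be restated as a small decision table (purely illustrative; the range bounds and labels are assumptions, not Nvidia's API):

```python
# What the pipeline does depending on where the frame rate sits
# relative to the VRR range and on the "Vertical sync" toggle.
def gsync_behavior(fps: float, vrr_min: float, vrr_max: float, vsync_on: bool) -> str:
    if vrr_min <= fps <= vrr_max:
        return "VRR: refresh tracks frame rate, no tearing"
    if fps > vrr_max:
        # Above the range, the toggle decides: fall back or tear.
        return "fixed-refresh V-SYNC (laggy)" if vsync_on else "tearing display-wide"
    return "below VRR range: frame doubling (LFC) or tearing, panel-dependent"

print(gsync_behavior(90, 40, 144, vsync_on=True))
print(gsync_behavior(160, 40, 144, vsync_on=False))
```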
 
It literally takes away performance
What the fuck are you smoking? lol. No, V-Sync doesn't "take away performance"; it simply forces the graphics card to hold a frame until the monitor is done drawing. That has no performance cost whatsoever. E.g., if you can only get 92 FPS on a 144 Hz G-Sync monitor with V-Sync off, turning V-Sync on wouldn't change your 92 FPS.
 