
You can play all the games you want at 60 FPS on Bravia TV if you want

bitbydeath

Member
This has been the most interesting thread of the year!!!!
I welcome you to this thread.

If you start reading now you might be caught up by release!
 

Reizo Ryuu

Member
I dunno what's worse, OP's post, or people in here actually being OK with the PQ-destroying artifacts interpolation introduces...
Def the latter, OP didn't know better.

 

Kagey K

Banned
I welcome you to this thread.

If you start reading now you might be caught up by release!
That thread is so last year.
 
I'm pretty sure most 60fps modes in genres that work fine at 30fps have been using interpolated frames for ages, but we just haven't been told. When I was playing Sackboy recently on PS5 and watched back a video clip I had saved, the frame-rate looked nothing like what I experienced while playing; it looked as though the saved clip showed the real unique frames, rendered at about 30fps.

Same while playing Halo Infinite's 120fps mode: it felt no more responsive than Rage (id Tech 5) did at a true 60fps on PS3. Forza Horizon's 30fps mode felt more like 15-20fps in the short time I tried it, and the 60fps mode didn't feel as smooth as 60fps in any of the Gran Turismo games; it felt more like 40-45 in response. Even though they all visually looked like their claimed frame-rates, the feeling was off.
Are you fanboying (again) or what? Consistent 30 fps feels like consistent 30 fps. Inconsistent 30 fps feels like inconsistent 30 fps. It doesn't matter if it's on your favorite plastic box or not, it'll still feel the same. There's no magical sauce to this. Stop fanboying and realize it'll be the same on both.


Otherwise end up like OP, and be laughed at for placebo effects.
 

PaintTinJr

Member
Are you fanboying (again) or what? Consistent 30 fps feels like consistent 30 fps. Inconsistent 30 fps feels like inconsistent 30 fps. It doesn't matter if it's on your favorite plastic box or not, it'll still feel the same. There's no magical sauce to this. Stop fanboying and realize it'll be the same on both.


Otherwise end up like OP, and be laughed at for placebo effects.
Go back and re-read my first post - where I was clearly being independent of either console, until someone took issue with the Halo vs Rage comparison. Ratchet & Clank, Spider-Man, Miles Morales, and even games like GoW all feel like they are either triple-buffered in their 60fps modes or use some frame interpolation technique IMO, where the 60fps feels just like a flawless 30fps rather than a true 60fps, like the games of old definitely were - which I mostly played on PC back in the day.

If an engine is interpolating frames, and doing it cheaply, it would be virtually impossible to tell the frame-rate was interpolated just by frame-counting.

It is well known that 30fps racers aren't doing what they seem, because the rate of travel (at speed) versus the frame-rate feedback is massively undersampled, so assisted steering in those (Burnout-type) games fudges things to accommodate the lack of feedback and make a playable game. Interpolating frames in the engine for various situations or genres to fake a feedback response is along the same design lines, and as someone already mentioned, Killzone: Shadow Fall used this technique in single or multiplayer, IIRC.
 
Go back and re-read my first post - where I was clearly being independent of either console, until someone took issue with the Halo vs Rage comparison. Ratchet & Clank, Spider-Man, Miles Morales, and even games like GoW all feel like they are either triple-buffered in their 60fps modes or use some frame interpolation technique IMO, where the 60fps feels just like a flawless 30fps rather than a true 60fps, like the games of old definitely were - which I mostly played on PC back in the day.

If an engine is interpolating frames, and doing it cheaply, it would be virtually impossible to tell the frame-rate was interpolated just by frame-counting.

It is well known that 30fps racers aren't doing what they seem, because the rate of travel (at speed) versus the frame-rate feedback is massively undersampled, so assisted steering in those (Burnout-type) games fudges things to accommodate the lack of feedback and make a playable game. Interpolating frames in the engine for various situations or genres to fake a feedback response is along the same design lines, and as someone already mentioned, Killzone: Shadow Fall used this technique in single or multiplayer, IIRC.
TLDR: You are fanboying. Cool. Could have just said that from the get-go; it would have been easier to disregard your post from the beginning.


It's not about the tech behind the games; you only care about the specific plastic box behind them, which is why this conversation ends now.
 

Godfavor

Member
Samsung is the best and the only TV brand that really cares about gaming. You should see how much this helps Switch games. It's hard to find videos on YouTube and I don't feel like making my own.
My LG GX is quite acceptable with IFC in 1080p mode with the Switch; I was testing it with BOTW and it plays OK, with artifacts of course. But it has horrible input lag in 4K with the consoles. The artifacts are too distracting to keep it on even without the input lag, though.

Why the hell is there not a good algorithm to eliminate IFC artifacts? Philips did this 20 years ago with Natural Motion on their CRTs. The motion handling was insane.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Gonna be honest here, I've played with motion enhancement both on and off and it made no difference to my gameplay. I think input lag is overstated as a whole.

If you're playing something like say .. a turn based JRPG .. it wouldn't make a lot of difference to your experience.
 

HoofHearted

Member
I welcome you to this thread.

If you start reading now you might be caught up by release!
I know of that one - I meant this year
 

Fafalada

Fafracer forever
The quoted input lag of 20ms is impressive, would love to see it for myself.
It's mathematically impossible for games.
Minimal input lag for a modern 60Hz title is 33-50ms (cutting-edge titles usually get 50-60ms; I'm just making the point that it's 'possible' to go lower, just unlikely).
That makes the minimal input lag with interpolation (assuming 0 processing overhead) 66-100ms - and that's for 60Hz-native titles; for 30Hz you should double that number.

I don't know how the numbers are derived. If it's just the manufacturer stating the 'added delay from processing', that's possible (a 17ms processing cost for interpolation), but it doesn't tell you anything about actual input lag. If it's based on a test, it would require a 60Hz input to begin with and zero native input lag in the stream (so not how games work at all; the only thing that responds that fast is maybe a hardware mouse cursor). I suspect the former, though.
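A rough back-of-envelope of that floor in Python - the 2-3 frames of input-to-photon pipeline here are an illustrative assumption, not measured figures:

# Minimal input lag for a native 60Hz title, assuming a 2-3 frame
# input -> simulation -> render -> scanout pipeline (illustrative only).
frame_ms = 1000 / 60                  # ~16.7ms per frame at 60Hz
for pipeline_frames in (2, 3):
    print(f"{pipeline_frames} frames of pipeline ~= {pipeline_frames * frame_ms:.0f}ms")
# prints ~33ms and ~50ms, i.e. the 33-50ms floor above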
 

Kuranghi

Member
I don't know how the numbers are derived

They mean the lag added by the display only; they use something like this to calculate it:

 

Fafalada

Fafracer forever
They mean the lag added by the display only; they use something like this to calculate it:
OK, that then falls under the case I described here: it would require a 60Hz input to begin with and zero native input lag in the stream (since it's measured off the hardware device directly).
Unfortunately that also doesn't measure at all what the TV in this mode does to a game's input-to-photon latency, especially for a sub-60fps game (which will still be 2x the normal input latency + the number measured here) - though it is striking how god-awful the Sony display implementation must be for those numbers.
 

Kuranghi

Member
OK, that then falls under the case I described here: it would require a 60Hz input to begin with and zero native input lag in the stream (since it's measured off the hardware device directly).
Unfortunately that also doesn't measure at all what the TV in this mode does to a game's input-to-photon latency, especially for a sub-60fps game (which will still be 2x the normal input latency + the number measured here) - though it is striking how god-awful the Sony display implementation must be for those numbers.

I have no idea what you're talking about, I'm afraid, but cheers. I was stating how they measure it; I don't know the efficacy of the test, and I don't really care about input lag added by displays anyway - I'm more interested in how it's added in other forms.
 

RoadHazard

Gold Member
It's mathematically impossible for games.
Minimal input lag for a modern 60Hz title is 33-50ms (cutting-edge titles usually get 50-60ms; I'm just making the point that it's 'possible' to go lower, just unlikely).
That makes the minimal input lag with interpolation (assuming 0 processing overhead) 66-100ms - and that's for 60Hz-native titles; for 30Hz you should double that number.

I don't know how the numbers are derived. If it's just the manufacturer stating the 'added delay from processing', that's possible (a 17ms processing cost for interpolation), but it doesn't tell you anything about actual input lag. If it's based on a test, it would require a 60Hz input to begin with and zero native input lag in the stream (so not how games work at all; the only thing that responds that fast is maybe a hardware mouse cursor). I suspect the former, though.

Motion interpolation wouldn't double the lag, though; it would just add X ms to whatever the native game lag is. If a game ran at 30fps with 1000ms input lag (extreme example to make a point), interpolation wouldn't make that lag 2000ms. It would be 1000ms + the interpolation processing time.
 

Fafalada

Fafracer forever
Motion interpolation wouldn't double the lag, though; it would just add X ms to whatever the native game lag is. If a game ran at 30fps with 1000ms input lag (extreme example to make a point), interpolation wouldn't make that lag 2000ms. It would be 1000ms + the interpolation processing time.
Yeah, that was my bad (shouldn't be posting late at night -_-). The added latency would be +1 frame-time of whatever framerate the game is running at, with any additional processing time on top of that.
So 50ms give or take for a 30fps game on that Samsung panel, which indeed is not terrible for games that are already at 150ms+.
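Put as a quick sketch (Python, using the numbers from this exchange; the helper is purely illustrative):

# Added latency from TV interpolation = one source frame-time + processing,
# on top of whatever the game's native input lag already is (illustrative).
def lag_with_interpolation_ms(native_lag_ms, source_fps, processing_ms=17):
    return native_lag_ms + 1000 / source_fps + processing_ms

print(lag_with_interpolation_ms(150, 30))    # 30fps game at 150ms -> ~200ms
print(lag_with_interpolation_ms(1000, 30))   # the 1000ms example -> ~1050ms, not 2000ms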
 

PaintTinJr

Member
Yeah, that was my bad (shouldn't be posting late at night -_-). The added latency would be +1 frame-time of whatever framerate the game is running at, with any additional processing time on top of that.
So 50ms give or take for a 30fps game on that Samsung panel, which indeed is not terrible for games that are already at 150ms+.
It feels like a confusing topic to chat about, IMO, even just differentiating between in-engine interpolation and the topic of this thread, TV interpolation. In-engine interpolation has all the advantages of looking inside the black box of the problem to use pre-known info, like the motion vectors of upcoming animations, or to selectively interpolate just the foreground action over a repeated (or Gaussian-blurred) background when the world model-view matrix changes too. TV interpolation, by contrast, happens outside the box of the game engine and has to infer motion vectors, probably by needing to receive three frames - so a minimum delay of 33ms + 33ms + less than 16.6ms of processing with the current frame (in the case of 30 to 60) to infer the first interpolated 60Hz frame, plus the regular TV latency of 5ms minimum - because you need at least three samples to infer any pattern.

So AFAIK, in a theoretically optimal low-latency situation, a TV doing motion interpolation would need to just duplicate the first three 33.3ms frames - as needed - across the first five 16.6ms panel synchronisations before showing a first interpolated 16.6ms frame on the 6th sync, and then do the same on every other even sync (8th, 10th, ...) between the real 33.3ms frames supplied by the PC/console.

But even then, that's assuming the TV's chip could a) infer the info in less than the 16.6ms window before the next sync and b) consistently infer a frame - from the two former frames and the currently displayed frame - within that 16.6ms window. Which in theory - unless I've got lost in the frame buffering - would add no extra latency at all in an incrementally changing scene, but would add 67ms of latency at the start of every full scene change, which would look like stutter with the two sets of repeating frames. And if that's how it ever worked on TVs, then maybe there would be an argument that the occasional stutter would be less of an issue than the benefit to the gamer of being able to anticipate changes to their continuous (in-sync) inputs earlier - as a result of the motion interpolation getting info to the gamer ~16.6ms sooner than their brain would get it from viewing at a normal 30Hz - because that would almost be at the level of in-engine interpolation, AFAIK.
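If I sketch that timeline out (Python, purely illustrative of the scheme described above - real TV pipelines buffer and delay differently):

# 30fps source on a 60Hz panel; TV assumed to need three real frames
# buffered before it can infer motion, per the scheme described above.
for sync in range(1, 13):                    # panel refreshes, ~16.7ms apart
    real = (sync + 1) // 2                   # latest 33.3ms source frame available
    if sync <= 5 or sync % 2 == 1:
        shown = f"real frame {real}"         # duplicated/real frames
    else:
        shown = f"interpolated between {real} and {real + 1}"   # 6th, 8th, 10th... syncs
    print(f"sync {sync:2d} (~{sync * 16.7:5.1f}ms): {shown}")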
 

breakfuss

Member
I'm 2 weeks late seeing this but NO. Absolutely not. My Samsung and likely most other brands of TVs could do this shit years ago. This is embarrassing OP!
 

buenoblue

Member
I thought that nowadays 30fps games actually poll the controller at 60Hz or even 120Hz.
So even though the visuals are 30Hz, the input response is akin to 60Hz.
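If so, the usual pattern would be something like a fixed-rate input/simulation loop decoupled from rendering - a hypothetical sketch, not taken from any particular engine:

import time

INPUT_HZ = 120                       # poll input / step simulation at 120Hz
TICK = 1.0 / INPUT_HZ

def game_loop(poll_input, simulate, render):
    tick = 0
    next_time = time.perf_counter()
    while True:
        simulate(poll_input())       # input sampled every ~8.3ms
        if tick % 4 == 0:            # present only every 4th tick (~30fps visuals)
            render()
        tick += 1
        next_time += TICK
        time.sleep(max(0.0, next_time - time.perf_counter()))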
 

SeraphJan

Member
The only TV I would actually turn motion interpolation on for gaming is a Samsung TV, where the input lag is actually manageable.

Just for comparison, it normally has around 20ms of input lag (which most TVs do; it's not a monitor); with motion on it only adds 7ms, so 27ms total. I can't really notice any artifacts, but the judder reduction is real, and frame-pacing problems became less noticeable.

For comparison, on a Sony TV the input lag jumps to 120ms+ when you turn motion on.

So no, OP, that is a terrible idea.
 

Rickyiez

Member

bannable account

The only TV I would actually turn motion interpolation on for gaming is a Samsung TV, where the input lag is actually manageable.

Just for comparison, it normally has around 20ms of input lag (which most TVs do; it's not a monitor); with motion on it only adds 7ms, so 27ms total. I can't really notice any artifacts, but the judder reduction is real, and frame-pacing problems became less noticeable.

For comparison, on a Sony TV the input lag jumps to 120ms+ when you turn motion on.

So no, OP, that is a terrible idea.

WTF, just stop necro-ing useless threads like this, it's embarrassing. Let this die.
 

AngelMuffin

Member
I tried this out a few days ago with Bloodborne. It was pretty neat seeing what a 60fps BB might look like but the lag and judder when turning quickly made it pretty much unplayable.
 