
We've All Been Duped by LED LCD's Input Lag vs OLED for Gaming - Here's the Truth

Rea

Member
There are a few problems I have with the A80J. Despite the Sony having much smoother gradients, the LG C1 is better overall. Not just the C1, but past Sony OLEDs too; you name it: A8G, A9G, A8H, etc.

Most importantly, there is more base judder on this year's Sony sets, to a significant degree when compared to prior-year sets. I've tested this, and some on YouTube have as well. When playing games, where you can't use interpolation to help, it's especially bad. I've used Sony TVs with all their different processors and I immediately noticed a problem on the A80J.

Secondly, the upscaling is too heavy-handed and over-sharpened at the base level on XR chip TVs, and the AI processing can pick and choose which picture elements to enhance or de-emphasise, which again leads to an artificial, less pristine image compared with the original intended image.

XR is a step back in everything but HDR highlight enhancement, and Sony needs to get their head out of their ass! They're capable of so much better than what they delivered this year.
Currently I don't see any judder/stutter issue like the one you mentioned after the latest firmware update. I have the A80J as my main TV for watching movies and for my PS5. The only issue I'm currently having is the static logo dimming, but that only happens when playing games with a static HUD. It's mainly there to prevent burn-in, so I can live with that.
The upscaling doesn't look oversharpened. My brother has a CX, and when I watch on his TV it just looks blurry, whether in motion or on a static image. The XR chip also had some issues with 4K120Hz, but those are fixed now.
You're the only one I've seen saying the XR chip is a step back in everything other than HDR and that the C1 is better overall.
But the fact is that Sony is just better when it comes to watching movies and image processing; the C1 is better if someone wants pure gaming features. Many reviewers say the same thing.
 
I bet that most of the people here claiming they didn't experience any form of burn-in are wrong. It's probably happening, but it isn't severe and hasn't been noticed.

Oh god what will I do with my phantom burn in? Wonder if my 65" will fit through my window...


I'm also laughing at the fact that my LCD monitor is the only screen I have burn in issues with. OLED isn't the only magical TV technology that has issues. LCD panels experience burn in and failure over time as well.
 

Rea

Member
We all have OLED phones that also don't have burn in issues. I remember people like you complaining with the iPhone X launch about how it was doomed. I've never seen someone complain about burn in on an iPhone with a bunch of static elements.

With the new substructure and mitigation techniques OLED burn in is not something that normal users need to worry about. My LCD gaming monitor has burn in but my OLED TV is just fine.

LCD owners just insecure about those inky black levels.
Have to mention that my Samsung S20 has burn-in because a few months ago I spent a lot of time on NeoGAF and that fucking NeoGAF logo is stuck on my screen. :messenger_pouting:
Now I only come to GAF for updates once in a while.
 

marquimvfs

Member
Oh god what will I do with my phantom burn in? Wonder if my 65" will fit through my window...
I didn't mean to be offensive or disrespectful, I do believe that. See, most of the time someone's TV is turned on it's displaying some form of image, and that alone makes it hard to notice. Have you clicked some of the links that show burn-in on those screens? It's a specific grey tone designed to reveal it. If someone puts that image on their TV, they'll probably see some faint pattern that isn't visible otherwise. That's exactly the point: it's not visible in normal use in most cases.
And about burn-in being present in other display technologies, you're absolutely right. Even CRTs have it. But it was never as severe a problem as the one we see on OLED panels. On a CRT, for example, the same image needed to be shown on the same screen all day, for almost a decade, like supermarket software or an ATM, before the image was partially burned onto the screen.
 

Fare thee well

Neophyte
Jesus. Hard to get a gauge of the truth on this subject for someone who is genuinely curious. Everyone is so defensive, lol. I guess I'll roll into a Best Buy and just compare some day. If someone wanted to try an OLED monitor, what's the best way to do it? I've heard some TVs do what some over-priced 'computer' OLEDs do? Is that true also?
 
Currently I don't see any judder/stutter issue like the one you mentioned after the latest firmware update. I have the A80J as my main TV for watching movies and for my PS5. The only issue I'm currently having is the static logo dimming, but that only happens when playing games with a static HUD. It's mainly there to prevent burn-in, so I can live with that.
The upscaling doesn't look oversharpened. My brother has a CX, and when I watch on his TV it just looks blurry, whether in motion or on a static image. The XR chip also had some issues with 4K120Hz, but those are fixed now.
You're the only one I've seen saying the XR chip is a step back in everything other than HDR and that the C1 is better overall.
But the fact is that Sony is just better when it comes to watching movies and image processing; the C1 is better if someone wants pure gaming features. Many reviewers say the same thing.
Just to clarify, I never said the upscaling on LG was better, and you are correct that it's blurry in comparison.

However, I was comparing to old Sonys, which have even better upscaling. None of the AI-sharpened weirdness.

Motion is huge for me, and you get lower input lag on the C1, and Sony was missing HGiG support, etc. When comparing an XR chip TV side by side with an X1/X1 Extreme chip TV, you may see the judder. If not, you probably aren't sensitive to it.

It's a pick-your-poison thing: relatively poor motion on the A80J, or poor scaling/gradation on the C1. And both OLEDs have uniformity woes and things like pink tint. NEITHER TV is good enough for me, but if I had to choose, I'd choose a C1 and then just do any non-4K gaming on my Bravia X900E with the X1 chip. Which is stupid, so I have just stuck with my X900E, which has great motion and scaling/gradation.
 

Jigga117

Member


this oled tv exploded

all oleds are bad and dangerous

So have cellphones, TVs from other brands, and a lot more electronic products in the history of electronic products hooked to bad outlets and/or receiving surges. Great entertainment though.
 

rofif

Banned
[photo of the burn-in]


My C7. A TV not even 6 years old that I paid over $2000 for with a big burn-in shit stain in the middle of the screen amongst others. You can listen to these dudes in here try to downplay burn-in if you want.
Middle? What content would constantly be in the middle? That doesn't seem like content burn-in but heat; most heat is in the middle.
Anyway, 6 years is a fuck ton. I don't plan to keep using a TV as a monitor for 6 years. I'm sure it will last 2 or 3 and that's all I need. I'm also sure LG has improved the design and the anti-burn-in features. In fact these TVs kind of get more uniform with time, at least for the first months, because of the pixel uniformity compensation.

A 48" C1 is 1k now. You can have the best possible image quality and HDR for at least a few years, or a shit LCD forever. I would be buying this shit even if it exploded after the guaranteed 6 months.
 

rofif

Banned
Best were CRTs. I remember there were no lag issues, no brightness issues, no burn-in issues.

Now? I don't wanna think about it. Even when I am extremely annoyed with my Bravia and upper screen being dimmed issue. I have enough headache.

CRT....

CRTs had radiation and strobing, were dim, had even worse burn-in, and were huge. I still have one for some old games, and I think OLED is about a million times better.
 

Rea

Member
Just to clarify, I never said the upscaling on LG was better, and you are correct that it's blurry in comparison.

However, I was comparing to old Sonys, which have even better upscaling. None of the AI-sharpened weirdness.

Motion is huge for me, and you get lower input lag on the C1, and Sony was missing HGiG support, etc. When comparing an XR chip TV side by side with an X1/X1 Extreme chip TV, you may see the judder. If not, you probably aren't sensitive to it.

It's a pick-your-poison thing: relatively poor motion on the A80J, or poor scaling/gradation on the C1. And both OLEDs have uniformity woes and things like pink tint. NEITHER TV is good enough for me, but if I had to choose, I'd choose a C1 and then just do any non-4K gaming on my Bravia X900E with the X1 chip. Which is stupid, so I have just stuck with my X900E, which has great motion and scaling/gradation.
I have an X900F in my bedroom, and I agree: LCD tends to look smoother but blurrier in motion compared with OLED, due to the nature of OLED. Right now I prefer OLED's sharper image with better contrast. And my A80J destroys the X900F in everything; it's not even close.
The A80J also has some artifacts when "film mode = high" and "smoothness = 1 or more", according to Vincent from HDTVTest.
Maybe that issue was due to old firmware, but right now, after the latest update, I don't see any artifacts. Sony also fixed the banding issue with 4K120Hz.

My point is, those issues can be addressed with firmware updates and have nothing to do with the XR processor. The C1 also currently has an HDR issue, but LG promised to fix it with a firmware update.
You favor LG because of your gaming priorities, and I fully understand that the C1 is king if someone wants a TV with full gaming features, but the C1 is not overall better than the A80J.
 

Mister Wolf

Member
Middle? What content would constantly be in the middle? That doesn't seem like content burn-in but heat; most heat is in the middle.
Anyway, 6 years is a fuck ton. I don't plan to keep using a TV as a monitor for 6 years. I'm sure it will last 2 or 3 and that's all I need. I'm also sure LG has improved the design and the anti-burn-in features. In fact these TVs kind of get more uniform with time, at least for the first months, because of the pixel uniformity compensation.

A 48" C1 is 1k now. You can have the best possible image quality and HDR for at least a few years, or a shit LCD forever. I would be buying this shit even if it exploded after the guaranteed 6 months.

A desktop wallpaper. Do you see the taskbar icons burned in at the bottom as well?
 

GymWolf

Member
You've been fed straight bullshit.




The reason they're not around anymore is that they cost an awful lot of money and resources to produce and, simply put, lost the war against the much cheaper-in-every-way LEDs, which easily impressed uneducated customers with their (wrong) colors and (excessive) brightness in shopping centres.
They were also the No. 1 enemy of eco maniacs.


FTFY.

I have both a Kuro KRP-500M and a Panasonic VT50, and 60fps content on the Panasonic is just unparalleled: you combine CRT-like motion and its complete absence of motion blur with reference image quality on every imaginable parameter.
The results are as mind-blowing today as they were at launch, and it honestly just makes you bitter and angry at this awful industry.

Moral of the story: I'll never get used to OLED motion, let alone LCD's, and I'd resort to buying used late Panasonic plasmas until a new technology comes around.

Yeah, plasmas were great. I loved my VT20 and I always wanted a Pioneer Kuro.
 

Mister Wolf

Member
Why would you leave it on desktop? Why not have random wallpaper every 1 minute and blank black screensaver?

Why can't I just use a $2000+ display like every other display I've ever used in my entire life? Many of which were cheaper yet, funnily enough, more durable. I gotta coddle a TV now? Fuck OLED.
 

GymWolf

Member
I do things while my taskbar is pinned to the bottom. For hours.

Is hiding the taskbar the latest craze now?
Hiding the bar when not in use is the only thing I had to do, and it's hardly a compromise or a craze; it appears when you put the mouse on it, ffs.
 

Shmunter

Member
Hiding the bar when not in use is the only thing I had to do, and it's hardly a compromise or a craze; it appears when you put the mouse on it, ffs.
Must suck. Would hate that personally. My 32" curved beast is an info centre of productivity. I didn't go for real estate to hide shit.
 

GymWolf

Member
Must suck. Would hate that personally. My 32" curved beast is an info centre of productivity. I didn't go for real estate to hide shit.
You need to see the taskbar all the time?

For me it doesn't suck, because when using a PC for gaming/media playback/internet browsing I don't need to keep the taskbar in view at all times.
It's literally a non-issue for the majority of people: you still have to move the mouse down to select the window you want from the taskbar, and when I do that it magically appears with no delay. It was a feature in Windows well before OLEDs were a thing, so people actually use it even when they're not "forced" to.
 

YCoCg

Member
Unfortunately it's not micro enough. To fit 4K worth of LEDs into a panel, it needs to be something ridiculous size-wise. More than 100" from memory.
It's currently down to something like 85", and that reduction has happened over the past year and a half; once it gets to 55" it should be golden for most to adopt.
 

Shmunter

Member
You need to see the taskbar all the time?

For me it doesn't suck, because when using a PC for gaming/media playback/internet browsing I don't need to keep the taskbar in view at all times.
It's literally a non-issue for the majority of people: you still have to move the mouse down to select the window you want from the taskbar, and when I do that it magically appears with no delay. It was a feature in Windows well before OLEDs were a thing, so people actually use it even when they're not "forced" to.
Different use cases in that case. It’s all work and no play on the pc for me. I use cool toys to create zones so I can pin multiple apps around the screen without flipping for them. I really thrive on seeing everything in my view.
 

Shmunter

Member
It's currently down to something like 85", and that reduction has happened over the past year and a half; once it gets to 55" it should be golden for most to adopt.
Didn’t know that. It definitely needs to shrink to be viable across the range. If it can’t be widely adopted the price will remain too high for a consumer product.
 

Kuranghi

Member
Unfortunately it's not micro enough. To fit 4K worth of LEDs into a panel, it needs to be something ridiculous size-wise. More than 100" from memory.

Yeah, as I understand it they have problems reducing the pixel pitch any further without massively increasing manufacturing cost and/or compromising the per-pixel dimming, since it's partly achieved by the subpixels being a tiny proportion of the overall pixel area:


I think this is a new class of LED video wall, because the packaging of the LEDs is completely different than conventional LED video wall modules. The Canvas display is composed of Sony’s CLEDIS (Crystal LED Integrated Structure) technology. The pixel pitch is stated as 1.2mm. On a conventional 1.2mm-pitch LED video wall module, the RGB LED device will take up a fair degree of this real estate, perhaps 50-70 percent of the pixel area. With Canvas, the emitting area is only 1 percent of the pixel area! That’s right, 99 percent of the module is black, making for really high contrast potential, especially in well-lit environments.


One might think that having a pixel with only 1 percent emitting area might create a display with points of light, or a highly pixelated image. It does not appear this way at all; the CLEDIS display is quite smooth and continuous. If you get to about 2 feet away you can now see some structure in the pixels, especially on white content, but you don’t see a point of light surrounded by black.


The photo on the left below is from the Sony web site, showing the LED surrounded by black. The photo on the right is a section of white content that was displayed on the Canvas screen at InfoComm. This looks like most of the pixel area is white with some sort of structure in the corners of the pixel. That leads me to suspect that Sony has an optical layer on top of each module that is expanding the light from this tiny LED to fill the full pixel area. It could be a microlens array or diffuser, for example. This is all pure speculation on my part, mind you, but the result is a smooth continuous image with great contrast.

[photos: Sony Canvas pixel structure close-ups]


That's specifically about Sony Crystal LED Integrated Structure though (now just called Sony Crystal LED, I think), and I don't know if the consumer tech, whenever it comes, will be the same as that.

I don't know whether the Samsung "The Wall" MicroLED display they keep promising every year is the same thing or not, but they've got that down to 75", I believe. I think a 65" minimum is going to be fine, since the price will be so crazy that people buying it will have the space, or will make space for it specially.
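To put the pixel pitch numbers in perspective, here's a rough back-of-the-envelope sketch (Python; the 1.2 mm pitch is from the quoted article, the 85"/55" sizes are the ones mentioned in the posts above, and everything else is purely illustrative, not manufacturer data):

```python
import math

H_PIXELS, V_PIXELS = 3840, 2160   # 4K UHD resolution
MM_PER_INCH = 25.4

def diagonal_inches(pitch_mm):
    """Diagonal of a 4K panel built at the given LED pixel pitch."""
    return math.hypot(H_PIXELS * pitch_mm, V_PIXELS * pitch_mm) / MM_PER_INCH

def pitch_mm_for(diagonal_inches_target):
    """Pixel pitch needed to squeeze 4K into a given diagonal."""
    return diagonal_inches_target * MM_PER_INCH / math.hypot(H_PIXELS, V_PIXELS)

print(round(diagonal_inches(1.2)))     # ~208 -> a 1.2 mm pitch 4K wall is roughly a 208" display
print(round(pitch_mm_for(85), 2))      # ~0.49 -> an 85" 4K panel needs about a 0.49 mm pitch
print(round(pitch_mm_for(55), 2))      # ~0.32 -> a 55" 4K panel needs about a 0.32 mm pitch
```

Which is roughly why a 1.2 mm pitch wall lands in 200-inch-plus territory, and why the pitch has to shrink to around half a millimetre or less before MicroLED fits consumer sizes.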
 

GymWolf

Member
Different use cases in that case. It’s all work and no play on the pc for me. I use cool toys to create zones so I can pin multiple apps around the screen without flipping for them. I really thrive on seeing everything in my view.
If it's used strictly for work I get it, but who the hell buys a 55" panel or bigger for desk/office work?! Don't you get neck pain and sore eyes from using such a big screen up close?!

You are using a 32" so i'm trying to understand the scenarios presented in this very topic where someone buy a big ass oled tv to:

-Watch the desktop icons for hours to no end without doing anything else because reasons
-do office\desk work in a gigantic monitor from up close


Not gonna lie, the majority of people probably don't use OLED TVs for those types of usage, and that's why they're so successful even though everyone and their mother knows about the burn-in risks.
 

Catphish

Member
Man. I thought console threads were full of whiny manbabies. The way people react to slights against their TV, like someone kicked their fucking mom... :messenger_grinning_squinting:

I've been doing research on OLED v QLED for the last week or so, because I don't know shit about them, and am trying to make sure I know what I'm doing before dropping $2k+ on this madness.

I think I've read through three different TV threads on GAF, including this one. Some good info, but christ does one have to sift through a fuckton of shit-talking to find it. :messenger_dizzy:

In the end, I've decided to wait to see what next year brings. I was near-sold on the QN90A, but the lack of 2.1 HDMI ports and apparent likelihood of DSE is more risk than I'm willing to take.

We'll see what next year brings. In the meantime...

 

GymWolf

Member
Man. I thought console threads were full of whiny manbabies. The way people react to slights against their TV, like someone kicked their fucking mom... :messenger_grinning_squinting:

I've been doing research on OLED v QLED for the last week or so, because I don't know shit about them, and am trying to make sure I know what I'm doing before dropping $2k+ on this madness.

I think I've read through three different TV threads on GAF, including this one. Some good info, but christ does one have to sift through a fuckton of shit-talking to find it. :messenger_dizzy:

In the end, I've decided to wait to see what next year brings. I was near-sold on the QN90A, but the lack of 2.1 HDMI ports and apparent likelihood of DSE is more risk than I'm willing to take.

We'll see what next year brings. In the meantime...

I think people just don't want FUD spreading.

I don't think people give two fucks if Samsung sells 10x the quantity of LED TVs compared to Sony/LG OLEDs; I know I don't...
 

rofif

Banned
I totally disagree with this. And I still own a top-condition CRT, so I can easily compare motion clarity between CRT and LCD/OLED. OLED is nowhere near the perfection that CRT is.
Lol, I might've last connected my CRT last year... but I've been playing since the 90s.
I've experienced years of CRT monitors. I still think OLED looks better, motion or not.
If motion is better on CRT, I have a hard time seeing it.
I like playing with motion blur anyway; it makes no sense to me not to. Good motion blur is a benefit for me and makes motion seem more natural.

Besides, we went from this (awesome, I try to replay all my games at least once every year or two):
[image]


To this:
[images]


Even if CRT was sucking me off, OLED is still better... except for old games made for CRT. Those use pixels differently, like discussed in the other CRT topic... and my CRT does not do pixel games justice. I think it's too new.
And CRTs also had burn-in, haha.
Anyway, motion is the LEAST complaint I would have against OLED.
This looks pretty good. Not as good as it could be, but better than LCD. Just not better than CRT, but that's not that important in real, modern games with motion blur, TAA or other tech.
 

rofif

Banned
The A90J is pretty remarkable. I wasn't willing to pay almost twice the price of a C1 though.
Yeah, exactly, me neither. And now the C1 is 200 euros cheaper than when I got it in March... so I will most likely get a C2 next year, unless the A90J successor is reasonable and comes in 48". 55" would be fine too, but I don't have enough room on my desk :p
 

OmegaSupreme

advanced basic bitch
Yeah, exactly, me neither. And now the C1 is 200 euros cheaper than when I got it in March... so I will most likely get a C2 next year, unless the A90J successor is reasonable and comes in 48". 55" would be fine too, but I don't have enough room on my desk :p
Don't forget the C2 will also come in a 42-incher next year. I'm looking to use that as a monitor.
 

Keihart

Member
That is crazy, 3 LG OLED sets and literally no burn in for me, one 2016 model, one 2017 model, and one 2019 model. How the heck does that kind of burn-in happen?
Yeah, I mean, my set is kinda old, a C8, and I've clocked over 100 hours on Judgment and Death Stranding recently (not to mention all the other games over the years, like a bunch of R6S and whatnot), no burn-in. I'm still careful not to leave the screen sitting on pause menus or shit like that.
I have around 100 or more hours on GGS too, and that has a pretty static HUD.
 

cireza

Banned
I like playing with motion blur anyway
You don't have a choice on LCD/OLED panels anyway :messenger_grinning_smiling:
If motion is better on CRT, I have a hard time seeing it.
Then this is your problem, actually. I don't have any problem seeing the major difference in clarity between a CRT, which displays pretty much no motion blur at all, and an OLED, which still has a ton of it (even if it is better than LCD). We would not be seeing manufacturers trying all sorts of BFI methods (with pretty poor results honestly: you get a super dark screen, duplicated objects, etc.) to compensate if OLED had zero problems, by the way.

Any scrolling test, like the one you linked, will display this in the most obvious way.
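For anyone wondering where that clarity gap comes from: the usual explanation is display persistence. On a sample-and-hold panel each frame stays lit for the whole refresh interval, so when your eyes track motion the image smears by roughly (tracking speed) x (time each frame is lit). A tiny sketch of that math, with made-up but typical numbers, not measurements of any particular set:

```python
def blur_width_px(speed_px_per_s, refresh_hz, duty_cycle=1.0):
    """Approximate perceived smear, in pixels, for eye-tracked motion on a display
    that keeps each frame lit for duty_cycle / refresh_hz seconds."""
    persistence_s = duty_cycle / refresh_hz
    return speed_px_per_s * persistence_s

speed = 960  # px/s, a common scrolling-test speed

print(blur_width_px(speed, 60))         # 16.0 px -> 60 Hz sample-and-hold
print(blur_width_px(speed, 120))        # 8.0 px  -> 120 Hz sample-and-hold
print(blur_width_px(speed, 120, 0.5))   # 4.0 px  -> 120 Hz with 50% BFI (dimmer image)
print(blur_width_px(speed, 120, 0.05))  # 0.4 px  -> roughly CRT-like short persistence
```

That's also why BFI helps but costs brightness: shortening the lit time cuts the smear and the light output in the same proportion.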
 

Keihart

Member
You don't have a choice on LCD/OLED panels anyway :messenger_grinning_smiling:

Then this is your problem, actually. I don't have any problem seeing the major difference in clarity between a CRT, which displays pretty much no motion blur at all, and an OLED, which still has a ton of it (even if it is better than LCD). We would not be seeing manufacturers trying all sorts of BFI methods (with pretty poor results honestly: you get a super dark screen, duplicated objects, etc.) to compensate if OLED had zero problems, by the way.

Any scrolling test, like the one you linked, will display this in the most obvious way.
The lack of motion blur is actually the problem with OLEDs: they change pixels too fast, and all the smoothing stuff is actually there to fill that gap.
 

RoboFu

One of the green rats
Also: Sony’s OLED panels are made by LG. Always have been.

It doesn't matter whatever anecdotal situation anyone wants to put forth.

EACH PIXEL DEGRADES AT DIFFERENT RATES AND EACH ONE HAS A FINITE LIFE SPAN.

That alone is irrefutable

That means they get dimmer at different rates, which causes a form of "burn-in". They stop functioning as well at different rates, which causes screen uniformity issues that greatly affect solid-colored screens over time.

Those are facts of OLEDs. Of course different people will get different mileage depending on use cases.

Some people turn down the brightness, which extends the lifespan but at the cost of a darker image.

Some people just don't use their TVs as much, which is great if you can do that, but it still doesn't mean they will never have those issues.

My situation was that my wife and kids left the TV on channels with logos and black bars for days at a time. After about 5 months you could clearly see the shape of the black bars and logos. That's because the pixels in between the black bars got a lot more use, so they became noticeably dimmer.

Same as my plasma back in the day; black bars and news channels killed it pretty quickly.

Really, my wife's inability to turn off a TV is a big issue for me. 😂🤣
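A toy model of the differential-aging point above, just to show the shape of the argument; the decay rate is completely made up for illustration and is not real OLED lifetime data:

```python
import math

def remaining_luminance(lit_hours, decay_per_1000h=0.01):
    """Fraction of original brightness left after a pixel has been lit for lit_hours
    (simple exponential decay; the rate is a made-up illustrative constant)."""
    return math.exp(-decay_per_1000h * lit_hours / 1000.0)

hours = 5 * 30 * 12                              # ~5 months at 12 h/day, as in the post above
content_area = remaining_luminance(hours)        # pixels inside the picture, lit most of the time
bar_area = remaining_luminance(hours * 0.2)      # pixels under the black bars, lit far less often

print(round(content_area, 4))             # ~0.9822
print(round(bar_area, 4))                 # ~0.9964
print(round(bar_area - content_area, 4))  # ~0.0142 -> a roughly 1.4% brightness step at the bar edges
```

Nothing in the model fails outright; the unevenly-used pixels just drift apart in brightness, which is exactly the kind of difference that shows up on solid-colored screens.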
 