
HGIG on OLEDs may be a lie, sometimes

Whitecrow

Banned
Here I go again bringing some fresh 'news' (experiences) on HDR and TVs. Sorry, I'm a nerd about these things.
This may make no sense at first, but bear with me.

I'm sure almost everyone with an HDR TV and a console knows Vincent Teoh and already knows about HGIG tone mapping and how to correctly set up the TV and the console for proper HDR display.
But to put everyone here into context: what HGIG tone mapping on the TV (for those that have it) actually does is disable any tone mapping, so the TV follows the HDR PQ luminance curve as-is and simply clips at the maximum display luminance.
In the case of OLEDs, maximum luminance sits around 800 nits, so with HGIG everything above 800 nits will clip to full white and all that detail will be lost.

But the thing is, HDR content can be mastered as high as 4,000 nits (and the PQ format itself goes up to 10,000), and even if that maximum is just for future-proofing, we can't really know what maximum nit value HDR games are 'mastered' to.
So as soon as you engage HGIG and limit the console to output at your TV's max luminance, you are actually leaving detail out for the sake of luminance accuracy.
HDR TVs already have a tone mapping curve (on LG OLEDs, this is what you get with Dynamic Tone Mapping set to Off) which, instead of trying to get the luminance exactly right according to the PQ curve, tries to retain as much detail as it can from 1,000+ nit sources.
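To make the trade-off concrete, here is a minimal sketch of the two behaviours being described, using made-up numbers and a deliberately simple linear roll-off rather than any manufacturer's actual curve:

```python
# Illustrative sketch only: an assumed ~800-nit panel and a simple linear roll-off.
# Real TVs use more sophisticated curves; the point is the trade-off, not the math.

PANEL_PEAK = 800.0     # assumed OLED peak luminance in nits
SOURCE_MAX = 4000.0    # assumed mastering peak of the source

def hgig_clip(nits: float) -> float:
    """HGIG-style: follow the PQ luminance 1:1 and hard-clip at the panel peak.
    Everything above PANEL_PEAK becomes the same full-white value."""
    return min(nits, PANEL_PEAK)

def static_rolloff(nits: float, knee: float = 0.75 * PANEL_PEAK) -> float:
    """Static tone map (a simplified stand-in for the 'DTM off' behaviour above):
    accurate up to a knee point, then compress [knee, SOURCE_MAX] into the remaining
    headroom so highlights above the panel peak stay distinguishable."""
    if nits <= knee:
        return nits
    t = (nits - knee) / (SOURCE_MAX - knee)
    return knee + t * (PANEL_PEAK - knee)

for n in (300, 900, 1500, 4000):
    print(f"{n:>5} nits -> HGIG: {hgig_clip(n):6.1f} | roll-off: {static_rolloff(n):6.1f}")
# 900, 1500 and 4000 nits all land on 800 with the hard clip, but remain
# distinct (darker than intended, yet not lost) with the roll-off.
```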

I was thinking about this yesterday, and then I came to a stupid realization. HDR can make the image pop... but it was never about that pop! (or at least, not only)
It's called High Dynamic Range for... having high dynamic range, not high brightness. HDR is about detail! It's about the number of brightness levels and colors! And guess what, the image has more detail with HGIG off, with the default TV tone mapping.
(And the console tone mapping set accordingly).

Yes, the average brightness is lower, but overall the image looks a lot richer and fuller since there's a lot more data packed in. But obviously, since each game's HDR is different, results will differ.
Supporting this 'theory' is the fact that a lot of color science, such as chroma subsampling, is built around the human eye being a lot more sensitive to luminance than to color, so favoring 'quantity of detail' over brightness may have a reason to be a thing.

I'm actually testing it on FF7R, which looks amazing to me now, and GoW:R.
GoW:R didn't change much, but I would need to play more.

So, I invite you to try at least and see how it goes for you. I'm just throwing my thoughts and impressions here.

EDIT
Let me show you the tone mapping curves from a Vincent Teoh video about the LG E9:
[Screenshots of the measured tone mapping curves, with and without HGIG]
You can see that with HGIG, as I already mentioned, the TV only tracks the signal from 0 up to the 70 mark, accurately following the PQ curve.
But without HGIG, it sits below its corresponding brightness on the curve, yet handles the signal up to the 90 mark, hence retaining more detail.
What else do you need me to explain : /
 
Last edited:

DanEON

Member
You should only use HGIG in games that support it (games where you can set the HDR peak brightness). The PS5's HDR setting was supposed to do it for every game (but some games just ignore it, like Demon's Souls Remake).
IMO, on PS5, just adjust the HDR settings with HGIG on and forget it.
On PC, only use HGIG in games where you can set the peak brightness.
 

Tygeezy

Member
HGIG turns off tone mapping on the display. You use it for games that use the console settings for HDR, or games that allow you to set the peak brightness.

Some games don't support either, so you will want to turn HGIG off on the display so you aren't clipping the HDR highlights.
 

Swift_Star

Banned
Do you really see a difference? After your eyes adapt to it, it doesn't make any difference. I'd say leave the option on or off based on what looks better to you and forget about it. Thinking about this and constantly messing with the settings takes the joy out of it. The IQ will be amazing on both OLED and Neo QLED TVs regardless of HGIG being on or off.
 

rofif

Can’t Git Gud
HGIG does not really disable tone mapping. It's just an iteration of Tone Mapping Off.
But Tone Mapping Off still does tone map, at least on LG. At least it does not do it dynamically.
The game might request 4,000 nits and your TV will just tone map it down to 1,000 to be able to display anything.

But yeah, it's not about the pop. It's about the contrast and elements being at the correct luminosity.
We know that black will be 0 and the sun will be 100%, so we can give other values some numbers in between.

HDR is so fucking good on OLED, but everyone keeps talking about the fucking brightness while it does not matter that much.
 

Venuspower

Member
you are actually leaving detail out, for the sake of luminance accuracy.

Only if the game does not have proper HDR calibration / does not use the system tone mapper. As long as the game provides proper HDR calibration or uses the system tone mapper, no detail is lost, because the game will tone map the picture based on those settings. In this case, the TV does not need to do any tone mapping, since it already gets the picture "pre-chewed" from the source. This is why HGiG is activated in such cases: to prevent double tone mapping.
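As a rough sketch of that workflow (the Reinhard-style curve and the numbers are illustrative assumptions, not any engine's real code): when the game tone maps to the calibrated peak itself, the HGIG clip on the TV never actually removes anything, so there is no second tone map.

```python
# Illustrative only: a hypothetical game-side tone mapper aimed at the peak value
# the player entered during HDR calibration, followed by an HGIG display that
# simply passes the signal through.

CALIBRATED_PEAK = 800.0  # assumed value from the console / in-game HDR calibration

def game_tonemap(scene_nits: float, peak: float = CALIBRATED_PEAK) -> float:
    """Hypothetical Reinhard-style roll-off done by the game engine itself:
    output approaches 'peak' asymptotically, so nothing it sends exceeds the panel."""
    return scene_nits / (1.0 + scene_nits / peak)

def hgig_display(signal_nits: float, peak: float = CALIBRATED_PEAK) -> float:
    """HGIG display behaviour: show the signal as-is, clip only above the peak."""
    return min(signal_nits, peak)

for scene in (100, 800, 2000, 4000):
    shown = hgig_display(game_tonemap(scene))
    print(f"scene {scene:>5} nits -> displayed {shown:6.1f} nits")
# Because the game already compressed its range to the calibrated peak,
# the HGIG clip never triggers and no double tone mapping occurs.
```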
 
Last edited:

Perfo

Thirteen flew over the cuckoo's nest
Before even going for that, they have to solve that horrendous auto-dimming on the C series without resorting to tricks that could hurt the screen. Goddamn God of War is continuously doing it: bright one second, super dark the next. Wtf!!!!
 

Filben

Member
My experience was based mainly on Horizon Forbidden West:
When HGIG is supported (as it is by HFW), you get nice tone mapping without clipping (= blowing out and losing detail in bright scenes). When HGIG is not supported, there's no difference to tone mapping 'off'. Tone mapping 'on' will increase the brightness/luminance of bright scenes (like daylight skies) and really make them pop, but can cause clipping and loss of detail. However, since the overall picture is much brighter, this might be necessary if I play in a bright environment and tone mapping 'off' or HGIG makes the picture too dark.

So what I'm doing basically: leave it on HGIG, because in the best scenario, where games support it, I get the best tone mapping curve. If the game doesn't support it, there's no clipping. I only use 'on' if I play in summer and my living room is too bright during the day and I need that extra brightness given by tone mapping 'on'.
 
Last edited:

Dibils2k

Member
This is false. HGIG condenses the peak brightness to 800 nits (or whatever you set it to); the whole point is that it doesn't clip detail.

But the game needs to support it or have its own in-game max luminance setting. DTM just pushes the base brightness up and, depending on the scene, allocates the brightness as needed.
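A crude illustration of that difference (hypothetical gain logic, not LG's algorithm; real dynamic tone mapping analyses histograms and smooths over time), showing why DTM lifts average brightness but can fluctuate from scene to scene:

```python
# Purely illustrative: a naive per-frame gain to show why dynamic tone mapping
# raises average brightness but can vary between scenes.

PANEL_PEAK = 800.0

def dynamic_gain(frame_max_nits: float, strength: float = 0.5) -> float:
    """Hypothetical DTM-like gain: dim frames get pushed up toward the panel's
    capability, very bright frames get pulled down to fit. 'strength' blends the
    gain toward 1.0 so the effect is partial rather than absolute."""
    raw = PANEL_PEAK / max(frame_max_nits, 1.0)
    return 1.0 + strength * (raw - 1.0)

for frame_max in (200, 800, 2000):
    print(f"frame peak {frame_max:>4} nits -> gain {dynamic_gain(frame_max):.2f}")
# A dark scene (200-nit peak) gets boosted, a 2000-nit scene gets dimmed:
# higher average brightness overall, but it changes whenever the scene changes.
```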

Anything but HGIG looks off to me; you lose the depth and contrast. Yes, you need a somewhat dark environment to appreciate it though.

As far as I know, Sony and Samsung don't have true HGIG btw; they still do their own tone mapping.
 
Last edited:

Dibils2k

Member
Before even going for that, they have to solve that horrendous auto-dimming on the C series without resorting to tricks that could hurt the screen. Goddamn God of War is continuously doing it: bright one second, super dark the next. Wtf!!!!
this is the game, not TV related

it has some kind of a dynamic scene lighting
 

DeaconOfTheDank

Gold Member
You should only use HGIG in games that support it (games where you can set the HDR peak brightness). The PS5's HDR setting was supposed to do it for every game (but some games just ignore it, like Demon's Souls Remake).
IMO, on PS5, just adjust the HDR settings with HGIG on and forget it.
On PC, only use HGIG in games where you can set the peak brightness.
 
Tested many games which should, in theory, support HGIG on my calibrated LG CX.

Almost all of them failed. The picture is always way too dark. The only game that was calibrated correctly in HGIG is A Plague Tale: Requiem.
 
Last edited:

dotnotbot

Member
So as soon as you engage HGIG and limit the console to output at your TV's max luminance, you are actually leaving detail out for the sake of luminance accuracy.

The idea is that the game itself should tone map to your TV's max luminance capabilities, since it knows best how to do it. So we aren't leaving detail out; HGiG's purpose is actually the opposite: to give you the most detail you can get on your display, since tone mapping done by the game should be more accurate (because it has access to all the data provided by the game engine).

But that's just the theory; considering how developers are still a bit clueless about how to implement HDR properly, results are probably all over the place.
 
Last edited:
I was thinking about this yesterday, and then I came to a stupid realization. HDR can make the image pop... but it was never about that pop! (or at least, not only)
It's called High Dynamic Range for... having high dynamic range, not high brightness. HDR is about detail! It's about the number of brightness levels and colors! And guess what, the image has more detail with HGIG off, with the default TV tone mapping.
(And the console tone mapping set accordingly).

There's a lot of misunderstanding here. HDR is indeed all about high brightness... and dark darkness. You have to have both to have a truly HDR image, because that's where you get the high "range." The darkest darks combined with the brightest brights get you a true HDR image. Tone mapping is not actually anything good. It exists purely to "map" an HDR image onto a display that is not fully capable of displaying the true HDR image. A perfect HDR display would have no tone mapping at all.
 

Whitecrow

Banned
This is false. HGIG condenses the peak brightness to 800 nits (or whatever you set it to); the whole point is that it doesn't clip detail.

But the game needs to support it or have its own in-game max luminance setting. DTM just pushes the base brightness up and, depending on the scene, allocates the brightness as needed.

Anything but HGIG looks off to me; you lose the depth and contrast. Yes, you need a somewhat dark environment to appreciate it though.

As far as I know, Sony and Samsung don't have true HGIG btw; they still do their own tone mapping.

The idea is that the game itself should tone map to your TV's max luminance capabilities, since it knows best how to do it. So we aren't leaving detail out; HGiG's purpose is actually the opposite: to give you the most detail you can get on your display, since tone mapping done by the game should be more accurate (because it has access to all the data provided by the game engine).

But that's just the theory; considering how developers are still a bit clueless about how to implement HDR properly, results are probably all over the place.

There's a lot of misunderstanding here. HDR is indeed all about high brightness... and dark darkness. You have to have both to have a truly HDR image, because that's where you get the high "range." The darkest darks combined with the brightest brights get you a true HDR image. Tone mapping is not actually anything good. It exists purely to "map" an HDR image onto a display that is not fully capable of displaying the true HDR image. A perfect HDR display would have no tone mapping at all.

I think you are not understanding me at all. There's a reason HGIG and the default tone map curve are different things. The default tone map curve preserves a lot more detail than HGIG because it's designed to do that. It can make the TV show varying levels of brightness from 0 up to whatever it's designed to handle (more than 800 nits) by not following the HDR PQ curve, darkening mid-tones, and leaving more room for the highlights.

For example, if you have HGIG enabled on the TV and you calibrate a game that has HGIG tone mapping, you are telling the game to output a max luminance that corresponds to the display's capabilities, instead of, let's say, 1,500 nits.
With HGIG, the TV will display accurately from 0 to 800. Without HGIG it will display from 0 to 1,500: not accurately, but noticeably.

It doesn't matter if a game is HGIG compliant or not. After the game applies its tone mapping, it will send the TV brightness values from 0 to 800 to display. But TVs without HGIG are designed to retain detail up to more than 1,000 nits. So yeah, HGIG gets you no clipping and accuracy, but detail is lost, that's for sure.
 
Last edited:
HDR feels so half baked right now it's not even funny. I won't be buying an HDR display until one exists with the following requirements:

-3840x2160 at 240hz true 10 bit color
-classic subpixel arrangement (red-green-blue, no white subpixel, no triad or pentile layout etc)
-2000 nits full field sustained
-full RGB / 4:4:4 chroma (no subsampling)
-10,000+ dimming zones capable of shutting off completely
-true 1ms response times
-IPS preferably or VA if no outstanding artifacts are present

To me, that would be my ultimate display. Until then, everything else HDR feels really gimmicky. OLED in SDR accomplishes 99% of the "wow" factor just by virtue of having infinite contrast, but it lacks in peak brightness. QD-OLED is a step in the right direction with brightness, but a step backwards with subpixel arrangements. So I continue to wait and wait for the display of my dreams and it's getting really old. Shitty thing is, even if said display existed today, the software side is a mess. HDR in Windows on PC is in such a bad state right now and doesn't appear to be getting noticeably better any time soon. It's a shame.
 

fart town usa

Gold Member
Sony doesn’t have a true HGIG setting but this should help:

In the PS5 HDR menu: 15 clicks from 0, 15 clicks, 0/lowest value. TV menu: Gradation preferred on, HDMI signal format "Enhanced (VRR)", and copy the settings from RTINGS for the rest.
Ahhh, gotcha. Yeah, I don't know shit about modern TVs. The Bravia came in and I was able to get VRR set up, and tried out Ghostwire on the PS5. I'm assuming I need to get the video settings dialed in because the game honestly looked better on my Panasonic plasma, lol. I also can't figure out how to get the motion smoothing to stay OFF. My wife is watching GOT and it looks like a damn soap opera.

Gonna be googling shit for a while, lol. Bought a $1,600 TV and not sure if it's an upgrade when compared to my 12 year old TV. :messenger_grinning_sweat:
 

Jackgamer XXXX

Neo Member
I always use HGiG.

It gives the most accurate picture.

Remember HDR is meant to be enjoyed in a dark room (very little light).

If your room is very bright, then use whatever gives the brightest overall picture, such as dynamic tone mapping.
 

Pimpbaa

Member
Went through this shit with a ton of games since I got my LG CX, and HGIG almost always looks best. Except in games that do not use the system-level settings for HDR or have no in-game settings for it.
 

Jackgamer XXXX

Neo Member
I think you are not understanding me at all. There's a reason HGIG and the default tone map curve are different things. The default tone map curve preserves a lot more detail than HGIG because it's designed to do that. It can make the TV show varying levels of brightness from 0 up to whatever it's designed to handle (more than 800 nits) by not following the HDR PQ curve, darkening mid-tones, and leaving more room for the highlights.

For example, if you have HGIG enabled on the TV and you calibrate a game that has HGIG tone mapping, you are telling the game to output a max luminance that corresponds to the display's capabilities, instead of, let's say, 1,500 nits.
With HGIG, the TV will display accurately from 0 to 800. Without HGIG it will display from 0 to 1,500: not accurately, but noticeably.

It doesn't matter if a game is HGIG compliant or not. After the game applies its tone mapping, it will send the TV brightness values from 0 to 800 to display. But TVs without HGIG are designed to retain detail up to more than 1,000 nits. So yeah, HGIG gets you no clipping and accuracy, but detail is lost, that's for sure.
You misunderstand…

Read this article, it explains everything you need to know.

https://www.whathifi.com/advice/hgig-explained-what-is-hgig-how-do-you-get-it-and-should-you-use-it
 
Before even going for that, they have to solve that horrendous auto-dimming on the C series without resorting to tricks that could hurt the screen. Goddamn God of War is continuously doing it: bright one second, super dark the next. Wtf!!!!
That's the game. I remember Horizon Zero Dawn doing that with the PS4 and thinking my OLED was exhibiting that behavior.
 
That's the game. I remember Horizon Zero Dawn doing that with the PS4 and thinking my OLED was exhibiting that behavior.
Does that mean gamers are better off going with OLEDs from Sony or Samsung? I'm not in the market for an OLED at the moment, but at some point I hope to be, and I'd hate to spend all that money only to have to deal with the screen bouncing back and forth in brightness.
 
Does that mean gamers are better off going with OLEDs from Sony or Samsung? I'm not in the market for an OLED at the moment, but at some point I hope to be, and I'd hate to spend all that money only to have to deal with the screen bouncing back and forth in brightness.
It'll bounce between light and dark on an LCD as well. That's not ABL.

I'd go with an OLED, no doubt. I like my Samsung S95B, but maybe wait for the second gen.
 

b0uncyfr0

Member
It all comes down to the game ultimately.

A lot of HDR games get overly bright with just DTM; if you don't like that, switch to HGiG.
Some HDR games have better highlights with HGiG; I noticed that especially with older games that support HDR.

Some SDR games don't benefit at all from forced HDR; that's when you switch the HDR mode on your LG TV to HLG mode. Love this mode.

I personally test each game in every mode to see what's best. Is it annoying? Yes. But I'll be getting the best out of my CX.
 

Perfo

Thirteen flew over the cuckoo's nest
That's the game. I remember Horizon Zero Dawn doing that with the PS4 and thinking my OLED was exhibiting that behavior.
Seriously? What's the point? It makes everything so difficult to see at times :_
Another bug they haven't solved yet is the audio output device, which regardless of what I do always comes out as Stereo and nothing else. The only game where I can't freely select 5.1 lol

Fantastic game btw
 

Kupfer

Member
One thing I've been wondering since I bought an OLED TV: does HDR / HGIG do anything for me at all if I have my brightness reduced to only 60%? I love my LG OLED48CX, so I want to use it as long as possible, and I remember setting up the TV and PS5 according to Vincent's video - except for the brightness, which he has at 100% and I have at only 60%.

The picture is great, no question, but honestly I'm not sure I see a difference between HDR and non-HDR. Maybe I would if I set up two of the same TVs side by side with the same picture, but with just my one TV in a relatively dark room, I haven't had the WOAH experience with games that supposedly use HDR that others seem to have.
 
Last edited:

buenoblue

Member
I stopped following professional calibration tips a long time ago. They all seem super dull and lifeless. I like my images to pop. Is it technically correct? Probably not, but just use what looks best. I set the PS5 to 4,000 nits and let my TV do the rest and it looks great.
 

Whitecrow

Banned
One thing I've been wondering since I bought an OLED TV: does HDR / HGIG do anything for me at all if I have my brightness reduced to only 60%? I love my LG OLED48CX, so I want to use it as long as possible, and I remember setting up the TV and PS5 according to Vincent's video - except for the brightness, which he has at 100% and I have at only 60%.

The picture is great, no question, but honestly I'm not sure I see a difference between HDR and non-HDR. Maybe I would if I set up two of the same TVs side by side with the same picture, but with just my one TV in a relatively dark room, I haven't had the WOAH experience with games that supposedly use HDR that others seem to have.
If you reduce your brightness to 60% in HDR, you are simply not getting the best out of your display. I can't tell how different that is from SDR.
Ultimately, the goal is to be happy with what you are getting.

But objectively speaking, HDR is capable of far more realistic images than SDR. And the WOW factor may depend on a few things. For example, a person already used to playing at max brightness and contrast in SDR won't get much more out of HDR at first glance.
 

amscanner

Member
In the case of OLEDs, maximum luminance sits around 800 nits, so with HGIG everything above 800 nits will clip to full white and all that detail will be lost.

Only if you don't care at all about PS5 HDR calibration and in-game HDR calibration. Almost every modern game has in-game HDR calibration, and some first-party games refer to the console's system-level HDR calibration settings. So if you set them correctly, the brightest highlight in the game lands at 800 nits and there's no clipping or lost detail. If you use tone mapping instead, it just compresses the game's default HDR range (0 to 1,000-4,000 nits, depending on the game) into the 0-800 nits of your OLED TV, which is not ideal compared to HGIG.
 
Last edited:

Tarnpanzer

Member
It really depends on the game. If it supports HGiG I use HGiG. If not, I use DTM.

I compare DTM with HGiG for every game I play at the beginning. A good place for a comparison is a bright scene, like looking at the sky. If the HGiG implementation is good, DTM is not brighter in these scenes when switching between the two. If HGiG is not supported and you switch to DTM, the screen should be brighter with DTM. In that case the picture is more accurate with DTM in my opinion (because a bright sky should always be as bright as possible, not dim), so I use DTM instead of HGiG.

If the HGiG implementation is decent and you use DTM, you just get blown-out details in bright scenes, and darker scenes are actually brighter than the creator's intent.
 
It's all an enormous shit show. Your TV, receiver, game, and console all need to be set up properly, or something will look fucked up.

When you get that picture right, though, it's worth it. Until they change the firmware on your TV.
 

kainslayer

Neo Member
I've wasted enough time of my life dealing with purist settings. I don't agree with the majority of the settings they suggest; just adjust your settings so they work for you, your living space, and your eyes... HGIG is too dark in Ragnarok on the C1, I agree on that... On PC, for example, I like to use digital vibrance from the Nvidia driver menu, and also ReShade for my games... Fuck creator's intent, that's why we have mods. I'll give an exaggerated example: play Zelda BotW at 4K with fake HDR and ray tracing... and then go see the creator's intent in 720p on the Switch.
 

Whitecrow

Banned
Honestly, since my starting point in this settings shitshow was accuracy, I never used DTM, since it's the mode that distorts the so-called 'creator's intent' the most.
IMO, having a bad HDR implementation is one thing, and having a good HDR implementation that we don't like is another.

It happens to me a lot of times in a lot of games: things like 'I would add more contrast' or 'I'd change the exposure of that scene', or whatever. For me it's easy not to be happy.
But in reality, the color grading process to create a good-looking image is a really hard task, and it's incredibly easy to end up with results that don't please everyone.

Right now, I just get along knowing what my settings are doing so I can choose whatever I prefer.
 
Last edited:
HDR feels so half baked right now it's not even funny. I won't be buying an HDR display until one exists with the following requirements:

-3840x2160 at 240hz true 10 bit color
-classic subpixel arrangement (red-green-blue, no white subpixel, no triad or pentile layout etc)
-2000 nits full field sustained
-full RGB / 4:4:4 chroma (no subsampling)
-10,000+ dimming zones capable of shutting off completely
-true 1ms response times
-IPS preferably or VA if no outstanding artifacts are present

To me, that would be my ultimate display. Until then, everything else HDR feels really gimmicky. OLED in SDR accomplishes 99% of the "wow" factor just by virtue of having infinite contrast, but it lacks in peak brightness. QD-OLED is a step in the right direction with brightness, but a step backwards with subpixel arrangements. So I continue to wait and wait for the display of my dreams and it's getting really old. Shitty thing is, even if said display existed today, the software side is a mess. HDR in Windows on PC is in such a bad state right now and doesn't appear to be getting noticeably better any time soon. It's a shame.
Keep waiting until 2050 at the earliest and enjoy your black-and-white TV until then. In the meantime we'll enjoy our crappy new TVs from last year with our crappy HDR implementations and other abysmal specs.
 
Keep waiting until 2050 at the earliest and enjoy your black-and-white TV until then. In the meantime we'll enjoy our crappy new TVs from last year with our crappy HDR implementations and other abysmal specs.
It's almost as if you didn't read my post, hmm? Your HDR implementations are garbage today, and OLED achieves 99% of the true wow factor people look for in displays, e.g. true black, pristine viewing angles, and super-fast response times. HDR is a joke today that nobody takes seriously because, on the display side, almost nothing does it right. Look up what Dolby Vision 12-bit can achieve and link me to a single consumer-priced television or monitor that can hit those marks. I'll be waiting, guy.
 

Buggy Loop

Member
But the thing is, HDR content can be mastered as high as 4,000 nits (and the PQ format itself goes up to 10,000), and even if that maximum is just for future-proofing, we can't really know what maximum nit value HDR games are 'mastered' to.
So as soon as you engage HGIG and limit the console to output at your TV's max luminance, you are actually leaving detail out for the sake of luminance accuracy.
HDR TVs already have a tone mapping curve (on LG OLEDs, this is what you get with Dynamic Tone Mapping set to Off) which, instead of trying to get the luminance exactly right according to the PQ curve, tries to retain as much detail as it can from 1,000+ nit sources.

Sorry if this sounds ignorant, but since I don't have any HDR displays, maybe I don't "get it". So isn't it fucking stupid for game devs or for movies to set nits to these levels if the tech will crush it anyway? Like, isn't there a normalized level that would fit any display?

What's the point of 10,000 nits anyway? This meme below is already a problem with current SDR, I find; do we really need to melt our eyes?

[attached meme image]
 

Whitecrow

Banned
Sorry if this sounds ignorant, but since I don't have any HDR displays, maybe I don't "get it". So isn't it fucking stupid for game devs or for movies to set nits to these levels if the tech will crush it anyway? Like, isn't there a normalized level that would fit any display?

What's the point of 10,000 nits anyway? This meme below is already a problem with current SDR, I find; do we really need to melt our eyes?

[attached meme image]
It's not the devs or the movies. It's the standard itself. And it's designed in a different way than SDR.

SDR uses a gamma of 2.2 or 2.4 to distribute luminance levels from black to pure white, but that's a relative formula, since it doesn't take into account what the actual luminance of black or white is. It only cares about the difference between one brightness level and the next. That is, the same formula works equally well over a range of 0 to 100 nits (which is actually the SDR container) as over a range of 300 to 400, meaning that each display, when not calibrated, is in reality outputting a different image depending on its contrast, max brightness... etc.

HDR wanted to fix that somehow and went with an absolute formula instead of a relative one, the PQ curve in this case, which translates code values to nits and tells the display exactly how many nits to output, giving a more standardized image across all displays. This PQ curve establishes a corresponding luminance for every code value from 0,0,0 to 1023,1023,1023 (10-bit HDR) and up to 4095,4095,4095 in 12-bit Dolby Vision.
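For reference, the PQ curve described here is SMPTE ST 2084. A small self-contained sketch of its EOTF (signal level to nits) shows how absolute the mapping is compared to SDR gamma; the constants are from the standard, while the helper names and example values are mine:

```python
# PQ (SMPTE ST 2084) EOTF: maps a normalized signal level (0.0-1.0) to an absolute
# luminance in nits. Constants come from the standard; helper names are just for
# this example.

m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(signal: float) -> float:
    """Absolute luminance, in nits, for a PQ-coded signal level in [0, 1]."""
    p = signal ** (1.0 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)

def gamma_to_nits(signal: float, display_white: float, gamma: float = 2.2) -> float:
    """SDR-style relative curve: the same signal lands on different nit values
    depending on how bright the display's white happens to be."""
    return display_white * (signal ** gamma)

for level in (0.50, 0.70, 0.75, 0.90, 1.00):
    print(f"{level:.0%} of the PQ range -> {pq_to_nits(level):7.0f} nits")
# Roughly: 50% ~ 92 nits, 70% ~ 620 nits, 75% ~ 980 nits, 90% ~ 3900 nits,
# 100% = 10000 nits. Unlike gamma, the same code value means the same luminance
# on every display; anything above the panel's peak has to be tone mapped or clipped.

print(gamma_to_nits(0.5, 100), gamma_to_nits(0.5, 400))  # ~21.8 vs ~87.1 nits
```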

The thing about 10,000 nits is future-proofing. It takes into account possible advances in technology and the possibility of screens being capable of outputting more nits in the future, afaik.

I hope all of this makes sense :messenger_grinning_sweat: I'm not an expert, and things are a bit more complex than this, but I think it's a decent summary.
 
Last edited: