
It's a shame HDR (for me) is so hit or miss in games. Some help needed

Stafford

Member
HDR can be a great improvement for a game, if the implementation is well done. There are games out there that truly benefit from it, and I wish that were the case for every game. Or, if the implementation is just shit, add an option in-game to disable it, instead of us constantly having to switch it on/off at the system level depending on the game.

I'm playing Borderlands 3 now and the HDR is hit and miss at the same time. It hits with fantastic-looking explosions and general special FX. It misses with the black levels, which are raised compared to SDR. So where a lodge in the Hammerlock DLC looks nice and moody in SDR, with the lights on and darkness in the places where there's no light source, in HDR it's just overly bright, making the night sky too bright as well.

I have my Xbox Series X calibrated with the HDR calibration app, but I might have set it up wrong, and maybe someone can tell me if I have. So here we go:

TV: LG OLED C9

HDR picture mode: Game
Dynamic tone mapping: HGIG

Xbox Series X HDR Game Calibration

So in the Minimum luminance part it tells you to adjust the image until you can barely make out the checkerboard pattern. However, I was told to put this all the way to the left so you don't see a pattern at all. Is this OK?

In the Maximum luminance part I move it to the right until it's just one solid white box and stop there, and the same for the second luminance box.

There's absolutely a chance it's just the game, but it's a shame. Too many games still suffer from this.
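For anyone wondering what those two sliders actually feed into, here is a rough Python sketch of the kind of mapping a game can apply to the calibrated values when the TV is in HGIG mode; the peak value, the knee position, and the roll-off function are illustrative assumptions, not the console's actual code:

```python
# Illustrative sketch only: roughly how a game can use the min/max luminance
# values from the console's HDR Game Calibration app when the TV is in HGIG mode
# (the TV then adds no tone mapping of its own). Numbers are assumptions.

def rolloff_to_display(scene_nits: float,
                       max_nits: float = 800.0,   # assumed calibrated peak (C9-class)
                       min_nits: float = 0.0,     # OLED can reach true black
                       knee: float = 0.67) -> float:
    """Map scene luminance (nits) into the display's [min_nits, max_nits] range."""
    shoulder_start = knee * max_nits
    if scene_nits <= shoulder_start:
        out = scene_nits                          # darks and midtones pass through
    else:
        # compress highlights so they approach, but never exceed, the calibrated peak
        excess = scene_nits - shoulder_start
        headroom = max_nits - shoulder_start
        out = shoulder_start + headroom * (excess / (excess + headroom))
    return max(out, min_nits)                     # a raised minimum lifts everything near black

# e.g. a 4,000-nit explosion lands just under the 800-nit peak instead of clipping flat
print(round(rolloff_to_display(4000.0), 1))
```

The last line of the function is the part that relates to the checkerboard question: with the minimum slider all the way down, the game is allowed to output true black, while a raised minimum acts as a floor under the shadows.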
 

j0hnnix

Member
Definitely agree with you, OP. I follow HDTVTest and find his videos very informative. Sorry if you already tried this, but if you haven't, give his video and explanation a whirl.

 
Last edited:

TrueLegend

Member
Well, you should set the minimum luminance until you can barely make out the checkerboard pattern. It's definitely not the game, I think. Most games have good HDR. The problem could be your TV. OLEDs don't handle near-black greys perfectly, so they may be elevating your black levels; it's a known thing, especially on LG OLEDs. It could be the game too, if for some reason they ended up screwing it up for Series X. Basically, the tone mapping is the problem, but what is causing it could be many things. If it's just an elevated black level, that's your TV, but you are saying that you find the night sky bright, which should not be a problem on a C9 unless it too is rendered as a shade of grey. How bright do you find the night sky, is it overly bright? Edit: The video above was posted while I was typing.
 
Last edited:

AJUMP23

Member
I feel like HDR on my CX is calibrated pretty well. On MLB The Show, though, the TV doesn't like the menu gradient and will flicker at the edge at times.
Overall I am good with HDR.
 

Stafford

Member
Definitely agree with you, OP. I follow HDTVTest and find his videos very informative. Sorry if you already tried this, but if you haven't, give his video and explanation a whirl.


Yeah, I remember seeing this video when I had just gotten my XSX and tried to get that trainwreck AC Valhalla looking proper in terms of HDR. I never succeeded, because the black levels truly are elevated with HDR there; such a shame.
Well, you should set the minimum luminance until you can barely make out the checkerboard pattern. It's definitely not the game, I think. Most games have good HDR. The problem could be your TV. OLEDs don't handle near-black greys perfectly, so they may be elevating your black levels; it's a known thing, especially on LG OLEDs. It could be the game too, if for some reason they ended up screwing it up for Series X. Basically, the tone mapping is the problem, but what is causing it could be many things. If it's just an elevated black level, that's your TV, but you are saying that you find the night sky bright, which should not be a problem on a C9 unless it too is rendered as a shade of grey. How bright do you find the night sky, is it overly bright? Edit: The video above was posted while I was typing.
Sadly, more often than not I get HDR that just ruins the picture. Most of the time it's elevated black levels, yeah. AC Valhalla is a real offender here; sometimes it's just a choice by the devs, like in the RE2 Remake.

Dead Space, now there's a horror game that had perfect black levels. It's gotten to the point where I'd pay money to get games looking like SDR but with the benefits of HDR.

The night sky in AC Valhalla, for example, is almost like daytime, though even in SDR it's far from a dark sky. Most of the time for me it's areas that should be pitch black, or just dark, that have these raised black levels. Interestingly enough, Xbox's own Auto HDR feature has been more hit than miss for me.


Excellent posts so far, and I'd also add that if VRR is on, that will also elevate the black levels slightly.
Really? I had no idea. So if I disable "enable variable refresh rate" in the Xbox settings, it should improve a little?
 

acm2000

Member
HDR can be a great improvement for a game, if the implementation is well done. There are games out there that truly benefit from it, and I wish that were the case for every game. Or, if the implementation is just shit, add an option in-game to disable it, instead of us constantly having to switch it on/off at the system level depending on the game.

I'm playing Borderlands 3 now and the HDR is hit and miss at the same time. It hits with fantastic-looking explosions and general special FX. It misses with the black levels, which are raised compared to SDR. So where a lodge in the Hammerlock DLC looks nice and moody in SDR, with the lights on and darkness in the places where there's no light source, in HDR it's just overly bright, making the night sky too bright as well.

I have my Xbox Series X calibrated with the HDR calibration app, but I might have set it up wrong, and maybe someone can tell me if I have. So here we go:

TV: LG OLED C9

HDR picture mode: Game
Dynamic tone mapping: HGIG

Xbox Series X HDR Game Calibration

So in the Minimum luminance part it tells you to adjust the image until you can barely make out the checkerboard pattern. However, I was told to put this all the way to the left so you don't see a pattern at all. Is this OK?

In the Maximum luminance part I move it to the right until it's just one solid white box and stop there, and the same for the second luminance box.

There's absolutely a chance it's just the game, but it's a shame. Too many games still suffer from this.

Your TV only hits 500-800 nits in HDR; ideally you want 1000-1500 to get a proper HDR effect.
 
Last edited:
Definitely agree with you, OP. I follow HDTVTest and find his videos very informative. Sorry if you already tried this, but if you haven't, give his video and explanation a whirl.


If you have to change the HDR settings from game to game, then it means something is wrong with HDR.
 

TrueLegend

Member
Yeah, I remember seeing this video when I had just gotten my XSX and tried to get that trainwreck AC Valhalla looking proper in terms of HDR. I never succeeded, because the black levels truly are elevated with HDR there; such a shame.

Sadly, more often than not I get HDR that just ruins the picture. Most of the time it's elevated black levels, yeah. AC Valhalla is a real offender here; sometimes it's just a choice by the devs, like in the RE2 Remake.

Dead Space, now there's a horror game that had perfect black levels. It's gotten to the point where I'd pay money to get games looking like SDR but with the benefits of HDR.

The night sky in AC Valhalla, for example, is almost like daytime, though even in SDR it's far from a dark sky. Most of the time for me it's areas that should be pitch black, or just dark, that have these raised black levels. Interestingly enough, Xbox's own Auto HDR feature has been more hit than miss for me.



Really? I had no idea. So if I disable "enable variable refresh rate" in the Xbox settings, it should improve a little?
  1. Most likely, you have seriously messed up your TV's HDR. By all accounts, it should be good if not great.
  2. You might be one of those guys; not to insult you, but look at the most popular SweetFX presets for games on thelazy.net, because the site is full of people who like a dark aesthetic and kill bloom and shadow detail. Maybe you are not used to seeing shadow detail and have not yet developed a taste for it (which is actually what makes premium OLEDs premium). A lot of people in the Dark Souls community shit on the remaster because it brings out shadow detail, which they think makes the game look bad. Some people hated the volumetric fog in the Shadow of the Colossus remake because it wasn't flat like the original, making it less aesthetically appealing to them.
  3. Perhaps you are also not aware that mastering monitors do not have OLED screens, so actual (0,0,0) blacks are rarer than you think.
  4. RE2 has stunning HDR. It is as good as it gets, but I have it on PC so I can't say how the Series X version does it. The Series X has great HDR, and Valhalla actually uses the right kind of night lighting, but I FULLY understand it's not to your taste, as half of the Skyrim ENBs exist to address the darkness of night. It is slightly brighter in Assassin's Creed games for playability reasons, but they have a great HDR implementation (HDTVTest has a video on AC Valhalla calibration). If by chance you had AC Origins and said you found some of the pyramid tombs not dark enough, then without a doubt I would know that you don't have a taste for shadow detail, rather than speculating it, or that your setup may just be messed up.
 

recursive

Member
Your TV only hits 500-800 nits in HDR; ideally you want 1000-1500 to get a proper HDR effect.
It is more complicated than just nits.

 

TrueLegend

Member
With perfect blacks I would pick an OLED any day over any LED TV for the best HDR experience.
That's the most noob argument going around the internet. Here is the news for idiots like you: 99% of movies are still shot in 2K (yes, all that 4K you see is upscaled from 2K Arri raw), and most HDR movies are mastered below 1000 nits; in fact, 90% of scenes barely break 400 nits. It's usually the sun that goes beyond 1000 nits.
 

HeisenbergFX4

Gold Member
That's the most noob argument going around the internet. Here is the news for idiots like you: 99% of movies are still shot in 2K (yes, all that 4K you see is upscaled from 2K Arri raw), and most HDR movies are mastered below 1000 nits; in fact, 90% of scenes barely break 400 nits. It's usually the sun that goes beyond 1000 nits.
Yeah, thanks, I have seen fairly high-nit TVs in the 1500 range, and I'm pretty sure I know what I prefer, and it's OLEDs.

Nice, instantly going to name-calling, but I'm pretty happy with the current TVs I own and have tested over the last couple of years.
 
HDR is hit or miss everywhere; all games need to come with an advanced calibration tool, but most games don't.
 
Last edited:

TrueLegend

Member
I found it works way better on console than on PC.
Mostly with HGIG.
Yes, because it's an absolute pain in the ass to get it right on PC. It took me more than a month, because I thought it was just broken in Windows 10. The reason is you need the SDR and HDR color profiles from your monitor/TV manufacturer and then use custom settings on the PC. Then launch a game with HDR enabled and exit it, and your screen will display both SDR and HDR correctly at the same time.
 
Last edited:

TrueLegend

Member
Yeah, thanks, I have seen fairly high-nit TVs in the 1500 range, and I'm pretty sure I know what I prefer, and it's OLEDs.

Nice, instantly going to name-calling, but I'm pretty happy with the current TVs I own and have tested over the last couple of years.
Have you seen the content, you idiot? I am talking about content, not TVs. Most content is not mastered at that kind of luminance level. The Matrix is the best there is, and even it doesn't have scenes hitting 1500 nits.
 

nkarafo

Member
I have a PC and I never saw a single example of a game looking good in HDR.

I think the only game I ever saw looking good was Shadow of the Colossus on PS4.
 

HeisenbergFX4

Gold Member
Have you seen the content, you idiot? I am talking about content, not TVs. Most content is not mastered at that kind of luminance level. The Matrix is the best there is, and even it doesn't have scenes hitting 1500 nits.
Nope, have never seen true HDR content, ever.

the naked gun facepalm GIF


Kind of done with this btw

I hate to ignore people here, because maybe one day they might add something of use, but you did it in record time.
 
Last edited:

Stafford

Member
  1. Most likely, you have seriously messed up your TV's HDR. By all accounts, it should be good if not great.
  2. You might be one of those guys; not to insult you, but look at the most popular SweetFX presets for games on thelazy.net, because the site is full of people who like a dark aesthetic and kill bloom and shadow detail. Maybe you are not used to seeing shadow detail and have not yet developed a taste for it (which is actually what makes premium OLEDs premium). A lot of people in the Dark Souls community shit on the remaster because it brings out shadow detail, which they think makes the game look bad. Some people hated the volumetric fog in the Shadow of the Colossus remake because it wasn't flat like the original, making it less aesthetically appealing to them.
  3. Perhaps you are also not aware that mastering monitors do not have OLED screens, so actual (0,0,0) blacks are rarer than you think.
  4. RE2 has stunning HDR. It is as good as it gets, but I have it on PC so I can't say how the Series X version does it. The Series X has great HDR, and Valhalla actually uses the right kind of night lighting, but I FULLY understand it's not to your taste, as half of the Skyrim ENBs exist to address the darkness of night. It is slightly brighter in Assassin's Creed games for playability reasons, but they have a great HDR implementation (HDTVTest has a video on AC Valhalla calibration). If by chance you had AC Origins and said you found some of the pyramid tombs not dark enough, then without a doubt I would know that you don't have a taste for shadow detail, rather than speculating it, or that your setup may just be messed up.
I have to go now, so for now I'll keep it short. Later I'll write a longer reply.

When I fired up ACV, right at the beginning of the game with the boy, the room looked ridiculously bad. There were very few light sources, and yet the areas that should be pitch black were light grey; you can't tell me that's how it's supposed to look. Then when I watched an HDTVTest video on it, which was in SDR, I saw a world of difference; the room was indeed dark as hell.

I'd say in ACV that has nothing to do with shadow detail; it's just really bad. But that's not everywhere. Outdoor areas during the daytime look really good!
 
Last edited:

TLZ

Banned
That's the most noob argument going around the internet. Here is the news for idiots like you: 99% of movies are still shot in 2K (yes, all that 4K you see is upscaled from 2K Arri raw), and most HDR movies are mastered below 1000 nits; in fact, 90% of scenes barely break 400 nits. It's usually the sun that goes beyond 1000 nits.

Have you seen the content, you idiot? I am talking about content, not TVs. Most content is not mastered at that kind of luminance level. The Matrix is the best there is, and even it doesn't have scenes hitting 1500 nits.
Stop with the idiot crap already.
 
HDR is a certification, just like THX. If you can self-calibrate the picture to get the most out of it, then you can achieve the same levels as an HDR setting, correct me if I'm wrong. I just had to ditch HDR to get a 4K signal through an EDID to my ultrawide; after tinkering it looks fucking awesome. HDR is no worry if you've got a certified TV.
 

Keihart

Member
It looks very different depending on the game; some games have close to zero real implementation of it.
You can clearly see it in other games. I have a theory that some games are designed with HDR in mind; U4 and Lost Legacy have colors on the main characters' clothes that are not possible without HDR, and they default to a similar variant in normal mode.

Not long ago I was playing FFXV in HDR and it was a whole lot of nothing; maybe some more vibrant grass here and there, but pretty underwhelming. Good HDR makes the regular version look washed out.
 

Keihart

Member
That's the most noob argument going around the internet. Here is the news for idiots like you: 99% of movies are still shot in 2K (yes, all that 4K you see is upscaled from 2K Arri raw), and most HDR movies are mastered below 1000 nits; in fact, 90% of scenes barely break 400 nits. It's usually the sun that goes beyond 1000 nits.
IMAX, though; no idea if any of the home versions are mastered from those. Endgame was digital IMAX (a smaller camera in exchange for some resolution), so it shouldn't have been an issue there, at least.
(I meant the scenes, obviously; I have no idea if there is a full movie filmed on IMAX besides the digital ones.)
 
Last edited:

Kuranghi

Member
I can already tell what your issue is: we are so used to the old way of mapping an image for SDR since the advent of LCD TVs. As far as I know, the idea was to make things pure white to show "brightness" and pure black to show shadows; they did this because otherwise our old shitty LCD TVs would look insanely grey and flat, so they were trying to increase the perceived contrast.

You do set the min luminance all the way to the left, as that's the minimum black level; otherwise you will have elevated blacks when the game wants pure black. This is especially important for OLEDs or FALD LCD TVs that can turn their backlight zones fully off.

I checked with Spider-Man: MM and R&C: Rift Apart, both games which use the PS5's HDR calibration page, and there are a few errant scenes where there is proper black crush with the min luminance set to 0. I thought that meant a setting of 0 was causing the issue, but if you raise the min luminance a few clicks and check those scenes it doesn't fix the black crush, so that's just an artistic error as far as I'm concerned, and it being the same dev makes that very likely imo.

The reason the black levels look raised is that the image now has more dynamic range, so really you are seeing those shadowed areas as they should/could look. When something is supposed to be pitch black, it still should be in HDR. Are there any areas of BL3 where you actually see pure black within the frame, as in, in a dark room the pixels are off completely?

You can find people complaining about this with HDR in Jedi: Fallen Order because it makes a lot of the shadows brighter, but when the game is actually supposed to be pitch black it is, and it looks fantastic.
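To put a rough number on the "more dynamic range" point above, here is a small sketch; the 0.05 signal value, the gamma, and the nit figures are illustrative assumptions, not taken from any particular game:

```python
# Illustrative sketch: why shadow detail that was effectively invisible in SDR can
# read as "raised blacks" in HDR. The numbers are assumptions for illustration.

def sdr_display_nits(signal: float, peak_nits: float = 100.0, gamma: float = 2.2) -> float:
    """SDR: a gamma-encoded signal (0..1) shown on a ~100-nit reference display."""
    return peak_nits * (signal ** gamma)

# A shadow detail the SDR grade places at signal 0.05 ends up around 0.14 nits,
# essentially black on most screens. An HDR grade is free to place that same
# detail at, say, 0.5-1 nit, where an OLED can actually resolve it, so the area
# looks brighter even though nothing is "wrong".
print(f"{sdr_display_nits(0.05):.2f} nits")
```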
 
Last edited:

Excess

Member
Have you seen the content, you idiot? I am talking about content, not TVs. Most content is not mastered at that kind of luminance level. The Matrix is the best there is, and even it doesn't have scenes hitting 1500 nits.
First of all, HDR has no real standard, unlike Dolby Vision, which is what causes most of what the OP is describing, e.g. "why does one game look different in HDR than another?" The gamma curve is logarithmic, so with HDR you have to scale it properly with metadata, and a lot of that depends on how the content was mastered. With games you can at least attempt to adjust this, but much of it is purely subjective and non-technical.

The point is that whether or not a screen can output 1500 nits doesn't really matter, because the adjustments are logarithmically arbitrary, at least until we have some widely adopted standard such as Dolby Vision or HLG.
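For what it's worth, "logarithmic" can be made concrete: HDR10 signals are encoded with the SMPTE ST 2084 (PQ) curve. A small sketch using the published PQ constants (the sample nit values are just for illustration) shows how compressed the top end is; going from 1,000 to 1,500 nits only moves the encoded signal from roughly 0.75 to 0.80:

```python
# SMPTE ST 2084 (PQ) encoding, as used by HDR10. Constants are from the PQ spec;
# the sample luminance values below are illustrative only.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Map absolute luminance (0..10,000 nits) to a normalized PQ signal (0..1)."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2

for nits in (1, 100, 400, 1000, 1500, 4000, 10000):
    print(f"{nits:>6} nits -> PQ signal {pq_encode(nits):.3f}")
```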
 

Kuranghi

Member
Sadly, more often than not I get HDR that just ruins the picture. Most of the time it's elevated black levels, yeah. AC Valhalla is a real offender here; sometimes it's just a choice by the devs, like in the RE2 Remake.

There are definitely many games with either subpar HDR or HDR that's just straight up "broken" (mostly the calibration tools are just insufficient), and the only way to know is to ask nerds who are in the know and investigate this sort of stuff, tbh. I do my own investigations and use my own experience, but then temper it with as many other reputable sources as I can to find out the "truth".

RE2R (and probably RE3R too, but I haven't spent 6+ hours examining HDR in that, only RE2R), after years of research (lol), is, imo, just broken due to not being able to control the black level via the HDR calibration screen. The SDR mode doesn't have the bright highlights of HDR, of course, but everything else is better in SDR; if you have the calibration page set up correctly then it looks fantastic and has way more depth to the image than HDR mode, just due to having proper black levels, like in the pre-release videos by Capcom.

Go back to RE2R and try these settings for the SDR calibration page; don't worry about what it says on screen:

* Set the 1st screen to -1 from maximum.
* Set the 2nd screen to minimum, no ifs or buts on this one; it MUST be set to minimum to avoid elevated blacks. You can see on the 3rd screen that if you set it anywhere except minimum, the blacks are raised immediately.
* Since you are on console, set the 3rd screen to default, then check what that looks like in-game; go back and lower it by 1 and compare, and repeat until there is a massive difference in shadows, i.e. there will be some crushing, at which point go back up one pip. The best place to check this is in the parking garage of the police station: look at Leon's back and you'll see there is a massive difference in shadows at some point as you reduce it.

Hope that helps. If you like, PM me when you have a new game you are trying in HDR and I'll share what I've learned about it.
 

Pagusas

Elden Member
It's sadly true. I love HDR, but its rollout has been a giant cluster and it's still rarely done right. Hitman 3 was basically unplayable until I got a ReShade profile that corrected its black levels. One of the forgotten benefits of gaming on a PC: being able to tweak things like that.
 
Last edited:

Kuranghi

Member
It's sadly true. I love HDR, but its rollout has been a giant cluster and it's still rarely done right. Hitman 3 was basically unplayable until I got a ReShade profile that corrected its black levels. One of the forgotten benefits of gaming on a PC: being able to tweak things like that.

Mate, nu-Hitman is basically my favourite modern game and I still haven't continued H3 due to this (got to Dartmoor and called it quits; even with 0.8 gamma it looks like I have grease on my screen/glasses). It's not really even the HDR that's the problem, the whole gamma curve is just fucked. The eye adaptation interacts badly with the gamma/tonemapping, and in HDR it's worse; everything looks even more grey.

Normally I'd just go with SDR, but the issue is present there too on my display, which is a FALD VA LCD. I've checked on my OLED too, though, and it's still present, just slightly less so due to how OLED tech works. I've been in direct conversation with IOI devs and submitted tickets to the support staff about it, and I'm told a fix is coming, but it's been 6 months so I'm losing hope at this point.

Can you link me the ReShade profile please? I normally hate those, but since I consider this broken rather than an artistic choice, I may try it this time.
 

rofif

Can’t Git Gud
With perfect blacks I would pick an OLED any day over any LED TV for the best HDR experience.
Yep. It's a big deal, and even at "only" 700-800 nits peak on OLED, when you exit a cave in Uncharted 4 it blows me away how bright and saturated it looks.
With OLED, HDR became a setting I look for in every new game.
 

Pagusas

Elden Member
Mate, nu-Hitman is basically my favourite modern game and I still haven't continued H3 due to this (got to Dartmoor and called it quits; even with 0.8 gamma it looks like I have grease on my screen/glasses). It's not really even the HDR that's the problem, the whole gamma curve is just fucked. The eye adaptation interacts badly with the gamma/tonemapping, and in HDR it's worse; everything looks even more grey.

Normally I'd just go with SDR, but the issue is present there too on my display, which is a FALD VA LCD. I've checked on my OLED too, though, and it's still present, just slightly less so due to how OLED tech works. I've been in direct conversation with IOI devs and submitted tickets to the support staff about it, and I'm told a fix is coming, but it's been 6 months so I'm losing hope at this point.

Can you link me the ReShade profile please? I normally hate those, but since I consider this broken rather than an artistic choice, I may try it this time.

I completely agree with normally hating ReShade, but this one was designed specifically to fix this issue while keeping the game natural-looking (no gimmicks or stupid add-ons).



Before:
3-1611412775-854220770.jpeg


After:
3-1611412777-1241847237.jpeg


 
Last edited:
To me the shame is how Windows handles HDR; your desktop applications (icons, and the desktop wallpaper too) look ugly and washed out when you enable the HDR colors option. This should be a per-application setting if they can't do better.
 

Pagusas

Elden Member
To me the shame is how Windows handles HDR; your desktop applications (icons, and the desktop wallpaper too) look ugly and washed out when you enable the HDR colors option. This should be a per-application setting if they can't do better.

Have you used it recently? The fall update last year brought a new color-accurate SDR-to-HDR conversion to Windows (the same one they use on Xbox); it fixed all the color issues for me. Don't forget you can manually set the SDR-to-HDR brightness conversion via the HDR settings option in your display options inside Windows.
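For context, here is a loose sketch of what that SDR-to-HDR brightness setting conceptually controls when HDR is on: SDR content is linearized and pinned to a chosen white level in nits before being composited into the HDR signal. The slider-to-nits mapping and the 240-nit figure are assumptions for illustration, not Microsoft's actual implementation:

```python
# Illustrative sketch: desktop SDR content rendered inside an HDR signal is
# linearized and scaled to a chosen "SDR white" level before compositing.
# The 240-nit default below is an assumption, not the Windows value.

def srgb_to_linear(c: float) -> float:
    """Standard sRGB decoding for a single 0..1 channel value."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def sdr_channel_to_nits(c: float, sdr_white_nits: float = 240.0) -> float:
    """Scale linearized SDR so full white lands at the chosen nit level."""
    return srgb_to_linear(c) * sdr_white_nits

# Full white -> 240 nits; mid grey (0.5) -> ~51 nits. Set the white level too low
# and the whole desktop looks dim and washed out next to HDR content.
print(sdr_channel_to_nits(1.0), round(sdr_channel_to_nits(0.5), 1))
```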
 

Stafford

Member
There are definitely many games with either subpar HDR or HDR that's just straight up "broken" (mostly the calibration tools are just insufficient), and the only way to know is to ask nerds who are in the know and investigate this sort of stuff, tbh. I do my own investigations and use my own experience, but then temper it with as many other reputable sources as I can to find out the "truth".

RE2R (and probably RE3R too, but I haven't spent 6+ hours examining HDR in that, only RE2R), after years of research (lol), is, imo, just broken due to not being able to control the black level via the HDR calibration screen. The SDR mode doesn't have the bright highlights of HDR, of course, but everything else is better in SDR; if you have the calibration page set up correctly then it looks fantastic and has way more depth to the image than HDR mode, just due to having proper black levels, like in the pre-release videos by Capcom.

Go back to RE2R and try these settings for the SDR calibration page; don't worry about what it says on screen:

* Set the 1st screen to -1 from maximum.
* Set the 2nd screen to minimum, no ifs or buts on this one; it MUST be set to minimum to avoid elevated blacks. You can see on the 3rd screen that if you set it anywhere except minimum, the blacks are raised immediately.
* Since you are on console, set the 3rd screen to default, then check what that looks like in-game; go back and lower it by 1 and compare, and repeat until there is a massive difference in shadows, i.e. there will be some crushing, at which point go back up one pip. The best place to check this is in the parking garage of the police station: look at Leon's back and you'll see there is a massive difference in shadows at some point as you reduce it.

Hope that helps. If you like, PM me when you have a new game you are trying in HDR and I'll share what I've learned about it.
Thanks for the RE2 tips. I do have to mention that I only played the demo, but apparently HDR-wise it's the same.

Back then I still had the Sony 930E, and a year after that the Samsung Q9FN. It was exactly the same on those sets, and they aren't OLED, so yep, definitely a game thing. I kept hearing Digital Foundry say how amazing the HDR was, but no mention of the ugly-as-hell raised black levels.

Take the demo for RE3: in the loading screens you see the street and it's dark as hell, like night time should look, and the highlights really stand out. But once in the game itself, it's one big elevated-black-levels mess.
I can already tell what your issue is: we are so used to the old way of mapping an image for SDR since the advent of LCD TVs. As far as I know, the idea was to make things pure white to show "brightness" and pure black to show shadows; they did this because otherwise our old shitty LCD TVs would look insanely grey and flat, so they were trying to increase the perceived contrast.

You do set the min luminance all the way to the left, as that's the minimum black level; otherwise you will have elevated blacks when the game wants pure black. This is especially important for OLEDs or FALD LCD TVs that can turn their backlight zones fully off.

I checked with Spider-Man: MM and R&C: Rift Apart, both games which use the PS5's HDR calibration page, and there are a few errant scenes where there is proper black crush with the min luminance set to 0. I thought that meant a setting of 0 was causing the issue, but if you raise the min luminance a few clicks and check those scenes it doesn't fix the black crush, so that's just an artistic error as far as I'm concerned, and it being the same dev makes that very likely imo.

The reason the black levels look raised is that the image now has more dynamic range, so really you are seeing those shadowed areas as they should/could look. When something is supposed to be pitch black, it still should be in HDR. Are there any areas of BL3 where you actually see pure black within the frame, as in, in a dark room the pixels are off completely?

You can find people complaining about this with HDR in Jedi: Fallen Order because it makes a lot of the shadows brighter, but when the game is actually supposed to be pitch black it is, and it looks fantastic.
Yeah, I remember Jedi: Fallen Order. I didn't like the look of the very beginning of the game; brightness-wise it didn't look right to me. In SDR this looked better. But apparently this is how the devs meant it to look?

As for other games, thanks man, I appreciate that a lot. I hate it when I buy a new game and see some people say the HDR is amazing, getting my hopes up, and then when I try it out I already see elevated black levels in the splash screens, whereas in SDR they are pitch black. But I do think it's worth spending some time getting it right, because it can look amazing.

And I assume using HGIG instead of DTM On is the way to go on the C9, right?

Soon I plan to start Gears 5 Hivebusters, and I want the HDR to look the best it can. But I am not sure where to put the three settings in the game itself.

Oh, as for Borderlands 3: there are some areas where the black levels actually look fine, so maybe this is indeed how the devs meant it to be. Which would not be my personal preference, but alas. I could take photos of it, but since it's HDR there's not really a point in doing that, I think.
 
Last edited:

Utherellus

Member
I recently bought a budget-tier HDR10-capable monitor for my PC, and it sure is time-consuming to set it up properly (given how bad the Windows implementation is),

but when it works, it looks beautiful, even on my cheap-arse "HDR" 350-nit monitor.

I wish HDR in Win11 worked better and more games implemented it.
 
Last edited:

Kuranghi

Member
Thanks for the RE2 tips. I do have to mention that I only played the demo, but apparently HDR-wise it's the same.

Back then I still had the Sony 930E, and a year after that the Samsung Q9FN. It was exactly the same on those sets, and they aren't OLED, so yep, definitely a game thing. I kept hearing Digital Foundry say how amazing the HDR was, but no mention of the ugly-as-hell raised black levels.

Take the demo for RE3: in the loading screens you see the street and it's dark as hell, like night time should look, and the highlights really stand out. But once in the game itself, it's one big elevated-black-levels mess.

Yeah, I remember Jedi: Fallen Order. I didn't like the look of the very beginning of the game; brightness-wise it didn't look right to me. In SDR this looked better. But apparently this is how the devs meant it to look?

As for other games, thanks man, I appreciate that a lot. I hate it when I buy a new game and see some people say the HDR is amazing, getting my hopes up, and then when I try it out I already see elevated black levels in the splash screens, whereas in SDR they are pitch black. But I do think it's worth spending some time getting it right, because it can look amazing.

And I assume using HGIG instead of DTM On is the way to go on the C9, right?

Soon I plan to start Gears 5 Hivebusters, and I want the HDR to look the best it can. But I am not sure where to put the three settings in the game itself.

Oh, as for Borderlands 3: there are some areas where the black levels actually look fine, so maybe this is indeed how the devs meant it to be. Which would not be my personal preference, but alas. I could take photos of it, but since it's HDR there's not really a point in doing that, I think.

HGIG and DTM are something I can't really speak to, as Sony doesn't expose that as a setting on my sets, but from what I've seen from Vincent, HGIG, if it's supported by the console/game, is going to give the best image relative to your TV's performance; DTM will look brighter but won't be accurate at all.

There is a setting similar to DTM on older Sony sets, but generally I would always leave DTM off because it messes with skin tones, hope that helps. My TV is calibrated, so turning stuff like that on makes the calibration worthless and introduces problems.
 

Toots

Gold Member
It's true. I just turned it off on mine because I'm tired of messing with it.
Same here :/
I was going crazy in RDR 2 trying to get it right, but I had bright days with "blue" nights, or dark nights with dark days (I know it reads like I'm quoting Buddy Holly :messenger_grinning_sweat: ). Everything was too bright or too dark... I tried x amount of presets but never got it right.

Then I read somewhere that there were dudes with pro equipment who could come to your place and set everything up perfectly, at the small cost of like 300 bucks... That's when I thought nvm, I'll just turn it off...
 

Rikkori

Member
Really? I had no idea. So if I disable "enable variable refresh rate" in the Xbox settings, it should improve a little?
Yes. For example:
1-jpg.508183


Left is ON, right is OFF. Sadly the guy who first exposed this deleted (?) his YouTube channel, so the videos are gone; he had some nice tests for OLED & LCD TVs when gaming.
Same here :/
I was going crazy in RDR 2 trying to get it right, but I had bright days with "blue" nights, or dark nights with dark days (I know it reads like I'm quoting Buddy Holly :messenger_grinning_sweat: ). Everything was too bright or too dark... I tried x amount of presets but never got it right.

Then I read somewhere that there were dudes with pro equipment who could come to your place and set everything up perfectly, at the small cost of like 300 bucks... That's when I thought nvm, I'll just turn it off...
RDR 2's HDR is sadly broken. SDR is much better in that game, particularly for LCD users. If you're on console, then you get hit with the double whammy of an undefeatable vignette as well, so it's a complete mess.
 

Stafford

Member
Yes. For example:
1-jpg.508183


Left is ON, right is OFF. Sadly the guy who first exposed this deleted (?) his YouTube channel, so the videos are gone; he had some nice tests for OLED & LCD TVs when gaming.

RDR 2's HDR is sadly broken. SDR is much better in that game, particularly for LCD users. If you're on console, then you get hit with the double whammy of an undefeatable vignette as well, so it's a complete mess.

Wtf, that's really bad. Can't LG patch this, or is this on MS to do? I would rather leave VRR on for the games that really benefit from it, but for games that don't need it, such as Gears 5, I'll disable it in the Xbox settings for sure.

Maybe I could try lowering the TV brightness just a little, or would that not really do the job?

As for RDR2, that sucks. I thought they had greatly improved the HDR, but it's not fully there yet. I heard that the daytime was great, but night was still washed out, or just too bright for nighttime. A shame.
 
Last edited:

rofif

Can’t Git Gud
Wtf, that's really bad. Can't LG patch this, or is this on MS to do? I would rather leave VRR on for the games that really benefit from it, but for games that don't need it, such as Gears 5, I'll disable it in the Xbox settings for sure.
It's not that bad in reality, but that is what VRR does to OLED.
It gets more washed out the lower the refresh rate goes.
 

Gamer79

Predicts the worst decade for Sony starting 2022
I recently got a Hisense HG9. The HDR is incredible. My old TV said it was HDR but never really showed true HDR color. My new TV is insane. OLEDs are nice, but I find them too dim.
 

Stafford

Member
It's not that bad in reality, but that is what VRR does to OLED.
It gets more washed out the lower the refresh rate goes.
And that is with both SDR and HDR? Is this something they can't fix with firmware?

I tested it today with Borderlands 3 DLC #2, and I thought I was seeing things; it definitely did seem like the black levels were deeper when I disabled it.
 

rofif

Can’t Git Gud
And that is with both SDR and HDR? Is this something they can't fix with firmware?

I tested it today with Borderlands 3 DLC #2, and I thought I was seeing things; it definitely did seem like the black levels were deeper when I disabled it.
SDR and HDR, doesn't matter. That's an issue with some LCD FALD displays too, not only OLED.
Just start up Resident Evil 2 with VRR; there is an internal 60/120 Hz switch in real time. You will notice the gamma changing a bit at 60 Hz.
That's the reality. It used to flicker when the FPS changed radically, but LG managed to patch that to some degree.
It's only dark greys changing (gamma), so it's not a huge issue. I was afraid of it too, but it's fine.
 
Last edited:

Connxtion

Member
And that is with both SDR and HDR? Is this something they can't fix with firmware?

I tested it today with Borderlands 3 DLC #2, and I thought I was seeing things; it definitely did seem like the black levels were deeper when I disabled it.
Are we on about the CX gamma-shift issue when using VRR & 120 FPS mode?

If so, they can't fix it, as it's a panel issue. The C1 has it also, it seems 🤷‍♂️🙈, but they added an extra option under additional settings in the picture settings of the TV; bottom line, it doesn't work.

So you have 2 choices: disable 120 Hz or disable VRR. I just live with it, since it's really only visible in menus. When playing games you don't notice it.

It's a crap situation, as I spent 1400 on this TV only to find out it had this bloody issue. Not one review of this TV talked about it or even acknowledged it. But even with this fault it's a damn nice TV, and VRR is brilliant for crappy frame rates.

Edit:
As for HDR, 99% of games look great out of the box, but in AC I had to bump the in-game HDR settings. I just wish the game's night time was actually dark.

Also, the longer you use the TV, the better the blacks get; it seems you need to break in an OLED.
 
Last edited:

TIGERCOOL

Member
I have a PC and I never saw a single example of a game looking good in HDR.

I think the only game I ever saw looking good was Shadow of the Colossus on PS4.
Most computer monitors can't display true HDR. They don't get bright enough, and they don't have local dimming zones for those bright highlights and realistic contrast. I have a DisplayHDR 400-certified monitor with a wide colour gamut, and it still pales in comparison to my TCL R615 because of the peak brightness and local dimming zones. I keep HDR off for PC gaming.
 