
Why are game trailers not in HDR?

Do you like HDR?

  • Yes

    Votes: 73 79.3%
  • No

    Votes: 8 8.7%
  • Haven't tried it yet.

    Votes: 11 12.0%

  • Total voters
    92

RCU005

Member
I just realized that no one releases their trailers with HDR. I know many companies are just releasing them in 4K, but they should add HDR, too. After all, both PS5 and Xbox Series X fully support it, as well as PC.

Is there a reason for it? Do they just not want to?

When I bought my TV, I didn't have any HDR content, so I watched some Youtube videos and wow! It looked so good. Some movie trailers look so much better with HDR on, although there are no official movie trailers released with HDR either! Why?
 
Last edited:

ChiefDada

Member
My guess is the additional bandwidth consumption. It's a shame, because some games' HDR implementations are truly transformative; God of War (2018), for example.
 
  • Like
Reactions: Rea

jaysius

Member
I'm one of the outliers that doesn't like a blown out contrast, HDR is a bad feature that was sold at the time when the industry was scrambling to find the NEXT BIG THING to make people want to throw away perfect 1080p sets.
 
I'm one of the outliers that doesn't like a blown out contrast, HDR is a bad feature that was sold at the time when the industry was scrambling to find the NEXT BIG THING to make people want to throw away perfect 1080p sets.
I for one am perfectly content with SDR. I think I actually like how SDR looks more than HDR. A lot of games just look washed out and screwed up with HDR and yes I’ve tried multiple TVs (including OLED) to make sure it wasn’t my particular screen or calibration
 

jaysius

Member
I for one am perfectly content with SDR. I think I actually like how SDR looks more than HDR. A lot of games just look washed out and screwed up with HDR and yes I’ve tried multiple TVs (including OLED) to make sure it wasn’t my particular screen or calibration
I really don't get HDR, we were striving for ACCURATE color and picture back when 1080p was fresh and new, and now we're throwing that out the fucking window and blowing out all the colors.

HDR is needed because people were told they need it.

I too have seen a variety of TVs and monitors with HDR and it's not great.
 
  • LOL
Reactions: Rea

Meicyn

Gold Member
I really don't get HDR, we were striving for ACCURATE color and picture back when 1080p was fresh and new, and now we're throwing that out the fucking window and blowing out all the colors.

HDR is needed because people were told they need it.

I too have seen a variety of TVs and monitors with HDR and it's not great.
HDR and accuracy are two entirely different things. HDR allows for more accuracy when we’re talking about duplicating what we see vs what is displayed. The market being flooded with budget TVs featuring piss poor HDR implementation is not an indictment of HDR. You had awful SDR TVs as well. My first HDTV was a set that took both 720p and 1080p, but the actual resolution of the LCD panel was 768p, so all content was either stretched or compressed to fit the native resolution.

HDR is great… if you have the proper hardware and the product has good implementation. Ratchet and Clank Rift Apart is an example of how HDR can be done right and looks incredible. Red Dead Redemption 2 is an example of HDR done wrong, and it looks worse than its SDR counterpart.
 
I'm one of the outliers that doesn't like a blown out contrast, HDR is a bad feature that was sold at the time when the industry was scrambling to find the NEXT BIG THING to make people want to throw away perfect 1080p sets.
You sound like someone who still defends 1080p vs 4k when the difference is night and day, particularly on large displays lol
 

Pagusas

Elden Member
I really don't get HDR, we were striving for ACCURATE color and picture back when 1080p was fresh and new, and now we're throwing that out the fucking window and blowing out all the colors.

HDR is needed because people were told they need it.

I too have seen a variety of TVs and monitors with HDR and it's not great.
you are right, you don’t get it.
 
Last edited:

Rykan

Member
You sound like someone who still defends 1080p vs 4k when the difference is night and day, particularly on large displays lol
That's a weird comparison. 4k is simply better than 1080p. The game will look the same but with a much better quality image.

This is very different from HDR, where the results are largely based on implementation and user calibration. Personally, HDR was so much hassle in so many games that I've stopped using it altogether.

As for the OP: It's probably because HDR is less common on computer screens and cellphone screens which is where most trailers are watched.
 
I'm one of the outliers that doesn't like a blown out contrast, HDR is a bad feature that was sold at the time when the industry was scrambling to find the NEXT BIG THING to make people want to throw away perfect 1080p sets.
HDR was one of those technologies that used to be so iffy that the sun, the moon, and the cables between them all had to align for it to work.
PS5 has it mostly figured out, all the way to controlling the display properly, when you have a good display at least.
Series X needs a bit more work, like enabling HDR always for both game and UI (why can't they do this yet?).
PC has dropped the ball completely on HDR, and that's where all the YouTube videos come from.

But now that it has been brought to my attention I need to see an example of a game trailer that is in HDR.
 
Last edited:
That's a weird comparison. 4k is simply better than 1080p. The game will look the same but with a much better quality image.

This is very different from HDR, where the results are largely based on implementation and user calibration. Personally, HDR was so much hassle in so many games that I've stopped using it all together.

As for the OP: It's probably because HDR is less common on computer screens and cellphone screens which is where most trailers are watched.
I used that argument because it's a common go-to for people (particularly with gaming) to justify outdated equipment.

What games did you have trouble with for hdr? I know some are out there but I just haven't come across them I guess
 

Rykan

Member
I used that argument because it's a common go-to for people (particularly with gaming) to justify outdated equipment.

What games did you have trouble with for hdr? I know some are out there but I just haven't come across them I guess
The Last of Us Remastered had a pretty different look when HDR was turned on. A little more "recent" example would be Resident Evil 2. Turning on HDR crushed the black levels pretty hard. I've been told it looks great if set up well, but I gave up after a bit of tinkering.

To be fair, some games look really good with HDR. Final Fantasy XV especially looked fantastic with HDR turned on.
 
The Last of Us Remastered had a pretty different look when HDR was turned on. A little more "recent" example would be Resident Evil 2. Turning on HDR crushed the black levels pretty hard. I've been told it looks great if set up well, but I gave up after a bit of tinkering.

To be fair, some games look really good with HDR. Final Fantasy XV especially looked fantastic with HDR turned on.
I didn't play resident evil 2, have you tried since the next gen updates last week?

Seems like hdr is pushed harder than ever, particularly with 4k pretty much becoming more of a standard
 

Rykan

Member
I didn't play resident evil 2, have you tried since the next gen updates last week?

Seems like hdr is pushed harder than ever, particularly with 4k pretty much becoming more of a standard
I wish!

I moved to a different country about 9 months ago. I couldn't bring my PS5 and my Series X, so I had to sell both of them.

Now I only have a used PS3 I picked up for cheap because PS4's are sold out as well. Least I don't have to worry about HDR now, right?
 

ParaSeoul

Member
HDR10, HDR10+, HLG, Dolby Vision ?

HDR is still a fucking mess, it'll be worth looking into once there's a fully agreed upon standard.
Yeah, it is, but what does that have to do with HDR being good when it's implemented well? Gaming-wise I think it's a little better, on console anyway. HDR10 seems to be what's used most commonly.
 
Last edited:

CamHostage

Member
Everything I know about HDR is that it's a pain in the ass to work with, it doesn't translate back to looking great on a non-HDR monitor, and it looks great only when carefully done to look great (or automated to look great, which isn't necessarily "looking great" for the content so much as just a general, "Oh, that looks great" because the algo cannot be 100% natural to the content but it can pick up stuff people like at a quick glance like real deep, crushed blacks.) Games are great for HDR because they actively render right to the display (and yet that still gets screwed up,) but once you sever that communication between the hardware and the display, you have locked recorded footage and are locked into whatever constraints come with what/how you captured. Also, I believe it's lossy, even though the range is higher it transforms the content rather than having an additive channel, so you would think HDR would be just SDR plus a high-dynamics range and would be like resolution where you could scale back down to the old way easily, but I don't believe that is so comfortable a case.

So, for publishing a trailer to the internet which will mostly be watched on phones or work laptops, it makes sense why HDR just isn't happening with game trailers. (I believe you would have to master it start-to-finish in HDR to really get it right, not just apply an HDR pass of the completed footage. Also, I'm not sure how cutting actually works in HDR, if there are times where cutting between different dynamic ranges is something to be careful of, similar to cutting on depth if you have a 3D project?)

A game clip or even a livestream, that makes more sense because you can narrowcast to a select customer base looking for HDR footage. A trailer is too important to hit perfectly every time a new eyeball falls upon it, so you would want none of the HDR fibbergibbity and standards-bickering to impede the first impressions of your big title.

(*Please feel free to correct any/all the technical fuck-ups I made in the post above. I know people who grumble about having to work with HDR, but I have little personal experience with it...)
 
Last edited:

Reallink

Member
HDR? Bruh that's the least of the problems, you see this shit?



They spend 100's of millions of dollars making these games only to unveil and market them with 64Kbps Youtube streams that look like AI water color paintings. WTF happened to the 4K livestreams, they killed that shit with the Covid bandwidth crackdown and are never bringing it back.
 
Last edited:

lucius

Member
To get great HDR in many games you need to get 3 settings right, and you have to go back and forth between all 3 to get it the best, so it's not worth it to many people. If you watch most PC streams, hardly anyone is playing in HDR. I always turn it on, but I've had to fiddle with it so much in some games it's dumb. You also need to fiddle with the system-wide HDR settings, which don't always affect the game depending on the game. Yep, it's been a mess.
 

Keihart

Member
TLDR, youtube kinda sucks at HDR right now and thus, almost nobody bothers.
So until Youtube decides to fix their shit, you aint getting consistent HDR content, including trailers.
 
Last edited:

MistBreeze

Member
HDR is a great concept, but it was released too early.

I think consumer TVs must reach at least 5000-nit highlights for it to be viable.

Honestly, I don't know if that will ever happen.

To me, contrast is the most important factor in picture quality.

I have a Sony X900E. It's a great TV, but when I watched the same content on my parents' 2012 Samsung plasma, it made my X900E look washed out.

(Both are calibrated.)
 

RPSleon

Member
I just realized that no one releases their trailers with HDR. I know many companies are just releasing them in 4K, but they should add HDR, too. After all, both PS5 and Xbox Series X fully support it, as well as PC.

Is there a reason for it? Do they just not want to?

When I bought my TV, I didn't have any HDR content, so I watched some Youtube videos and wow! It looked so good. Some movie trailers look so much better with HDR on, although there are no official movie trailers released with HDR either! Why?
My assumption would be that HDR content displayed on SDR screens can look worse than plain old SDR.

A lot of people don't have HDR displays.

HDR is undeniably better than SDR, though. I don't know why people would choose no in the poll.
 

RoadHazard

Gold Member
I really don't get HDR, we were striving for ACCURATE color and picture back when 1080p was fresh and new, and now we're throwing that out the fucking window and blowing out all the colors.

HDR is needed because people were told they need it.

I too have seen a variety of TVs and monitors with HDR and it's not great.

Eh no. The real world is MUCH brighter and has a MUCH higher dynamic range than what the best screens in the world can even come close to, so if you're talking accuracy HDR is closer to real life than SDR. Or it CAN be. But then it depends on the implementation and the content of course. HDR doesn't have to mean "make the colors unnaturally vivid" at all. It just means you can display a larger range of colors and brightness at the same time (but still far from what our eyes can actually perceive).
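To put rough numbers on that point about range: HDR10 encodes absolute luminance with the SMPTE ST 2084 "PQ" transfer function, which maps 0–10,000 nits into a 0–1 signal. A quick sketch in Python (the constants are the published spec values; the ~0.51 and ~0.75 landmarks are well-known reference points):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal in [0, 1].
# Constants as published in the ST 2084 / HDR10 specification.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map a luminance value (0..10000 nits) to a PQ code value (0..1)."""
    y = max(nits, 0.0) / 10000.0   # normalize to the 10k-nit PQ ceiling
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# SDR reference white (~100 nits) lands near half the code range, leaving
# the upper half of the signal for highlights that SDR simply can't carry.
print(f"{pq_encode(100):.3f}")    # ~0.51
print(f"{pq_encode(1000):.3f}")   # ~0.75
print(f"{pq_encode(10000):.3f}")  # 1.000
```

That's the sense in which HDR is "closer to real life": the signal itself has headroom for a sky or a specular glint that SDR's 100-nit reference white has no way to represent.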
 

Haggard

Member
HDR is a great concept, but it was released too early.

I think consumer TVs must reach at least 5000-nit highlights for it to be viable.
ever heard of tonemapping?
The difference between SDR and HDR on a higher end TV is night and day, and has been for years....
 
Last edited:

rofif

Gold Member
I really don't get HDR, we were striving for ACCURATE color and picture back when 1080p was fresh and new, and now we're throwing that out the fucking window and blowing out all the colors.

HDR is needed because people were told they need it.

I too have seen a variety of TVs and monitors with HDR and it's not great.
HDR allows for an accurate color space.
The goal is for the TV to be a window. If you look out a window, some elements are 500 nits, a shadow can be black, a flower can be 2k nits and the sky 10k nits (examples).
Now look at SDR, and everything is 150 nits evenly.

In reality, bright elements reflect more light than dark ones. Even HDR of only 1000 nits gives waaaay more leeway than SDR. Black can be not illuminated at all, and the sky can be 800.
You exit the pitch-black cave in Uncharted and look at the cave opening outside that's 800 nits, and it looks incredibly alive.

I know that opening would be more like 10k nits in reality, but sitting at home, even 800 nits is great. I don't think there is much reason to chase super-bright TVs. The important part is contrast, and OLEDs get it right.
 

rofif

Gold Member
The Last of Us Remastered had a pretty different look when HDR was turned on. A little more "recent" example would be Resident Evil 2. Turning on HDR crushed the black levels pretty hard. I've been told it looks great if set up well, but I gave up after a bit of tinkering.

To be fair, some games look really good with HDR. Final Fantasy XV especially looked fantastic with HDR turned on.
HDR looks great in RE2, if on OLED. The first slider on the test image should be set until the box disappears; 6 ticks for me. The 2nd box stays at the default middle.
Looks great. Nothing is crushed. Some parts are washed out, but that's how the devs designed those few rooms.
 

rofif

Gold Member
HDR is a great concept, but it was released too early.

I think consumer TVs must reach at least 5000-nit highlights for it to be viable.

Honestly, I don't know if that will ever happen.

To me, contrast is the most important factor in picture quality.

I have a Sony X900E. It's a great TV, but when I watched the same content on my parents' 2012 Samsung plasma, it made my X900E look washed out.

(Both are calibrated.)
5000 would be more life like true. Having 0 and 5000 nits elements in a scene would be like looking out the window.
But keep in mind that you play games at night in a dark room. Even 1000 nits is already super bright. You don’t want daylight window on night room :p
 

Kuranghi

Gold Member
I would guess it's because most HDR implementations are done after the game is mostly complete. We might now be mostly past the initial stage of HDR implementations, which consisted of hastily converting the SDR output to HDR with automated tools; pipelines can now render everything internally in HDR (or rather, in way more range than even HDR10 can usually display) from the start of production. But that doesn't mean devs don't still primarily focus on the SDR output, because that's the display most people still have.

Maybe it's also just to save marketing money. YouTube can have an SDR and an HDR version of a video, along with different resolution versions of each, so if there are 7 resolutions, that adds up to an extra 3 files for the 1080p, 1440p, and 2160p HDR versions. (I don't believe you can have an HDR video with a resolution below 1080p on YT, but someone please correct me if that's wrong; it might just be the marketers' choice, since there aren't many devices that support HDR but have a resolution under 1080p.) Having the team also master an HDR version and do QA checks on it would probably increase costs not insubstantially.
 

YCoCg

Member
Is there a reason for it? Do they just not want to?
The main method of watching is YouTube and sadly YouTube sucks when it comes to handling SDR and HDR, it's been a long requested feature to allow videos to have both a SDR and HDR profile for advanced uploads but well, you know YouTube.
 

Kuranghi

Gold Member
5000 would be more life like true. Having 0 and 5000 nits elements in a scene would be like looking out the window.
But keep in mind that you play games at night in a dark room. Even 1000 nits is already super bright. You don’t want daylight window on night room :p

Aye totally, I think specular highlights and the sun IRL can easily reach 10000 nits even. 1000 nits IS super bright in a dark room but the point is you wouldn't be showing things at 4000 nits plus for more than a fraction of a second to get the point across (a lightning strike for example) or you would only actually make a tiny tiny portion of the displays area be that bright so you don't go blind.

I got "roasted" by people the other day over how "700 nits is more than bright enough in a dark room"; they just don't really understand what increasing the dynamic range of an image means, I guess 🤷‍♂️. Why is 700 nits "enough" to display content that's mastered to 1000, 4000 or 10000 nits, when SDR capped out at 100 nits in the source while pre-HDR TVs could already display way more than 100 nits in smallish areas (not per pixel like OLED, of course, but still great FALD sets)? Fuck knows, but I just stopped replying, because it hurts my head to engage with people who don't even understand the basics.

I'm basically one level above imbecile on the depth of my understanding of HDR on a technical level but somehow I'm a bloody expert compared to most engaging in arguments about how bright displays need to go to show the full range of HDR content (99% of them being OLED owners ofc :messenger_smirking:).

Whoa, sorry for ranting at you about things unrelated to your post rofif :LOL:
 
I'm one of the outliers that doesn't like a blown out contrast, HDR is a bad feature that was sold at the time when the industry was scrambling to find the NEXT BIG THING to make people want to throw away perfect 1080p sets.
On cheaper TV sets that advertise HDR features, all it does is blow up the backlight, brightness, and contrast sliders to max which completely ruins the picture and burns your retinas. You need a set that is at least ~$1,000 to understand what HDR can really add to the experience.
 

Kuranghi

Gold Member
The main method of watching is YouTube and sadly YouTube sucks when it comes to handling SDR and HDR, it's been a long requested feature to allow videos to have both a SDR and HDR profile for advanced uploads but well, you know YouTube.

Are you saying they aren't separate versions? I thought they were separate files according to the youtube downloader I use. Is the SDR one just tone-mapped from the HDR one?
 

jaysius

Member
HDR:

  • No agreed-upon standard
  • Multiple companies fighting to make a standard
  • varies widely across TV models/brands
  • successful implementation varies widely from content to content
  • pushed by the industry as the new hotness
  • has frothing fanbois.
If the fucking industry would nut up and create one standard instead of each fighting to push its own HDR, it would be much clearer.

HDR and 3DTVs: same energy.
 
Last edited:

YCoCg

Member
Are you saying they aren't separate versions? I thought they were separate files according to the youtube downloader I use. Is the SDR one just tone-mapped from the HDR one?
Yes, when you upload a video with HDR metadata, YouTube automatically creates a tone-mapped SDR version. However, this isn't always accurate, so it's been a long-time request from uploaders to be able to upload separate masters for HDR, SDR and even things like Dolby Vision, so each can be mastered properly to look its best without automatic tone mapping.
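To illustrate why an automatic HDR-to-SDR conversion can disappoint (YouTube's actual pipeline isn't public, so this is only a stand-in): a fixed filmic curve, like the well-known Hable "Uncharted 2" operator sketched below, squeezes the whole HDR range into SDR with one set of constants, whereas a hand-mastered SDR grade can place midtones and highlights wherever the colorist wants.

```python
# Hable ("Uncharted 2") filmic tone-mapping curve, a classic fixed operator.
# YouTube's real HDR->SDR conversion is not public; this stand-in just shows
# why a single fixed curve can't flatter every piece of content equally.
A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30

def hable(x: float) -> float:
    """The raw filmic curve; hable(0) == 0 and it rolls off smoothly."""
    return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F

def hdr_to_sdr(linear: float, exposure: float = 2.0, white: float = 11.2) -> float:
    """Map linear HDR luminance to a 0..1 SDR value with fixed constants."""
    return hable(exposure * linear) / hable(white)
```

Everything above `white / exposure` in linear light gets crushed toward 1.0 regardless of what the content actually needs, which is exactly the kind of one-size-fits-all decision uploaders want to take back from the platform.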
 
HDR is a great concept, but it was released too early.

I think consumer TVs must reach at least 5000-nit highlights for it to be viable.

Honestly, I don't know if that will ever happen.

To me, contrast is the most important factor in picture quality.

I have a Sony X900E. It's a great TV, but when I watched the same content on my parents' 2012 Samsung plasma, it made my X900E look washed out.

(Both are calibrated.)
5000 nits to be viable? LOL

I'm not sure about the newer ones, but lg cx panels run 800-1000 nits

Go watch something like planet earth with/without hdr on (I'd say dolby vision but your TV doesn't have it) the x900e and tell me which looks better

You're literally just making shit up
 

Kuranghi

Gold Member
Yes, when you upload a video with HDR metadata YouTube will automatically create a tone mapped SDR version, however this isn't always accurate, so it's been a long time request from uploaders to be able to upload metadata profiles for HDR, SDR and even things like Dolby Vision so each can be mastered properly to look their best without automatic tone mapping.

Not sure its the same thing but I've seen comments from the "Jacob + Katie Schwarz" channel saying they provide their own HDR LUTs, thats not for SDR though obviously.
 

spawn

Member
Probably has to do with YouTube and other video streaming sites. My TV has HDR, and if I upload a saved video to YouTube, it just doesn't have the same look to it.
 

AGRacing

Member
I just realized that no one releases their trailers with HDR. I know many companies are just releasing them in 4K, but they should add HDR, too. After all, both PS5 and Xbox Series X fully support it, as well as PC.

Is there a reason for it? Do they just not want to?

When I bought my TV, I didn't have any HDR content, so I watched some Youtube videos and wow! It looked so good. Some movie trailers look so much better with HDR on, although there are no official movie trailers released with HDR either! Why?
It's worse than that.. they often are introduced to the public for the first time on low quality YouTube streams.. so they're not even in 4K quality.

So much of the real fidelity improvements of next gen games on modern TVs are not being properly represented on YouTube.... Horizon : FW is a great recent example of this. I've yet to see a video of this game that does justice to how it actually presents on the TV.
 
I'm one of the outliers that doesn't like a blown out contrast, HDR is a bad feature that was sold at the time when the industry was scrambling to find the NEXT BIG THING to make people want to throw away perfect 1080p sets.
HDR is more accurate and allows for far greater detail in both dark and bright scenes.
if it looks too "blown out" to you, reduce the HDR intensity in-game or peak brightness in the TV's menu. you have a lot of control over how "contrast-y" an image looks.

ps i still have a 9th gen kuro. never use it because 1080p sucks.
 
HDR and accuracy are two entirely different things. HDR allows for more accuracy when we’re talking about duplicating what we see vs what is displayed. The market being flooded with budget TVs featuring piss poor HDR implementation is not an indictment of HDR. You had awful SDR TVs as well. My first HDTV was a set that took both 720p and 1080p, but the actual resolution of the LCD panel was 768p, so all content was either stretched or compressed to fit the native resolution.
I had a TV like that too 😣 it was horrible!

HDR is great, but you need a TV that does it well enough for it to shine.

Also, Windows looks horrible when you enable HDR (on Win 10 at least): whatever they do to "tone map" SDR content just makes it look dull, in a way that it did not before. Because of this I keep disabling the option on Windows, and I enjoy HDR content through the TV apps, the PS5... devices whose OSes handle it correctly.

MS are really sh*tting the bed here. It's as if adding support for true color had made 256-color content look worse for some reason. There must be some excuse, but I honestly don't care for it; applications should just enable it as needed. I don't want all my colors screwed up because I want some games to look shinier.
 