
4K UHD/Blu-ray/DVD Community Thread: Bringing the Theater Home!

Great idea for a thread! The RealorFake4K site sounds like a great resource. I don't have a 4K set yet, but I was hoping to pick up the Unforgiven UHD release and I'm glad to see it in the "real" column!

WATCH THE VIDEOS FIRST.

Seriously, everyone who works in film, or imaging science, or display engineering, or who even remotely considers themselves a home theater nerd (a.k.a. thread members) needs to watch the videos. They will help you understand where the content you watch came from, and what pixel count versus real resolution actually means.

Also there is an American Cinematographer article about it here.

http://yedlin.net/ResDemo/

Direct download of video 1
Direct download of video 2

Warning: Around 2GB each. Watch on your best display device if you can. Doesn't need to be 4K because the cinematographer breaks it all down by zooming into details in part 2. Super impatient or ADHD? Skip straight to part 2.


And welcome. :)
 

Tapejara

Member
WATCH THE VIDEOS FIRST.

Seriously, everyone who works in film, or imaging science, or display engineering, or who even remotely considers themselves a home theater nerd (a.k.a. thread members) needs to watch the videos. They will help you understand where the content you watch came from, and what pixel count versus real resolution actually means.

Also there is an American Cinematographer article about it here.

http://yedlin.net/ResDemo/

Direct download of video 1
Direct download of video 2

Warning: Around 2GB each. Watch on your best display device if you can. Doesn't need to be 4K because the cinematographer breaks it all down by zooming into details in part 2. Super impatient or ADHD? Skip straight to part 2.


And welcome. :)

I'll be sure to watch these and read the article. Thanks for the links!
 

Trago

Member
When I finally get me a 4K TV and a UHD player, it's on like Donkey Kong.

I'm having trouble finding out which UHD players support Dolby Vision, since that's the new hotness.
 
Here's my setup.
-Insignia NS-43D420NA16 43 inch 1080p HDTV
-Sony STR-DH550 5.2 surround sound receiver
-Panasonic DMP-BD903P-K Smart Network Blu-ray/DVD Disc Player (I also have an Xbox One and PlayStation 3 if I want to watch Blu-rays on those instead)

Here's my Blu-ray collection.

And my DVD collection.
 

F34R

Member
I'm on a Vizio P65-C1 (65", obviously), playing with my Samsung K8500 for discs and Vudu/Netflix for streaming 4K.

I'm at 70 (disc based) movies so far.
 

Westonian

Member
My current setup:


TV: Vizio M70-D3
AVR: Denon AVR-X3200
Speakers: Athena AS-F1 (L/R and surrounds) AS-C1 (center) AS-P400 (sub)
Blu-ray: Oppo UDP-203

Plus all the other crap: DirecTV DVR, PS4, RetroUSB AVS (HD NES), SNES (with SCART RGB to HDMI), Wii-U, and Switch.

I have no idea what my Blu-ray collection is at. I stopped keeping up with it on Blu-ray.com years ago.
 
Don't forget not all transfers are equal; the same thing happened when Blu-ray first launched as well.

Check here for PQ reviews:
HiDefDigest.com
bluray.com
 
Enabling Atmos on my Xbox One S turns off optical audio, so my headphones stop working. If I turn optical back on, I stop getting Atmos audio. Usually I forget to switch it back.

Wonder if there is a workaround?

Got about 10 UHD discs now. It's kind of fun to collect them again; been a minute since I bothered with Blu-ray.
 
Bless you OP for making this thread, subbed.
Glad that people seem to like it. I've been meaning to make this thread for a long time.

Don't forget not all transfers are equal; the same thing happened when Blu-ray first launched as well.
Yeah, that's one of the reasons I made this thread, to help people find info on different transfers in releases.

Subbing this thread, although I haven't purchased a standalone UHD player yet.
Well, the thread is not just for 4K, but regular ol' Blu-ray and DVDs, too!
 
Here's a comparison between the DVD, original Blu-ray, and the latest Blu-ray release of The Terminator. It shows how much good a new transfer can do for a movie.

Here's the DVD version of The Terminator (upscaled to 1080p):
[screenshot]

Here's the original Blu-ray release:
[screenshot]

And here's the remastered Blu-ray:
[screenshot]

Again, the DVD:
[screenshot]

The original Blu-ray:
[screenshot]

And the remastered Blu-ray:
[screenshot]


Images sourced from https://caps-a-holic.com
 

Kambing

Member
I stopped watching GoT after season 1... at this point I feel that we are bound to get a 4K version, right guys? RIGHT?
 

BobLoblaw

Banned
Next year is the year that I upgrade to 4K. Gonna get LG's next OLED, the 65" C7 equivalent, as well as a UHD player. I've already got the Atmos/DTS:X receiver and sound system set up, thankfully. Stocking up on UHD movies this Black Friday. :)
 

Westonian

Member
Next year is the year that I upgrade to 4K. Gonna get LG's next OLED, the 65" C7 equivalent, as well as a UHD player. I've already got the Atmos/DTS:X receiver and sound system set up, thankfully. Stocking up on UHD movies this Black Friday. :)
Is your AVR Dolby Vision ready?
 
I'm going to pick up Pacific Rim this weekend on UHD. I hear it's a real showcase; may also grab Ghost in the Shell. I know Blue Planet II probably won't hit UHD until early next year, but after Planet Earth II it's one of my most anticipated releases. I can't wait until more films start shooting in 8K. 8K downsampled to 4K with HDR looks fucking stunning.
 
Why are all the best movies fake 4k?

"fake 4k" is kind of a shitty term.

Most of the industry operates on a 2K pipeline, from what I understand. I'm sure J & DPP can explain it more clearly, but basically, 2K looks great as it is, and it's easier for pretty much the entire industry to finish & distribute stuff in 2K than to rebuild whole production pipelines for 2x the resolution when many exhibitors can't show at that resolution, and the cost of finishing at that resolution is higher.

(Plus, the part that kinda has to be said even though results vary quite a bit from person to person: a lot of people honestly can't tell the diff between 2K & 4K resolution at the theater.)
 

Realyst

Member
"fake 4k" is kind of a shitty term.

Most of the industry operates on a 2K pipeline, from what I understand. I'm sure J & DPP can explain it more clearly, but basically, 2K looks great as it is, and it's easier for pretty much the entire industry to finish & distribute stuff in 2K than to rebuild whole production pipelines for 2x the resolution when many exhibitors can't show at that resolution, and the cost of finishing at that resolution is higher.

(Plus, the part that kinda has to be said even though results vary quite a bit from person to person: a lot of people honestly can't tell the diff between 2K & 4K resolution at the theater.)
Case in point: John Wick 2 was finished with a 2K process (2K DI), and is one of the best looking transfers out right now. It runs neck-and-neck with Planet Earth 2 (shot mostly in 4K) as my number one reference disc.
 

Hex

Banned
"fake 4k" is kind of a shitty term.

Most of the industry operates on a 2K pipeline, from what I understand. I'm sure J & DPP can explain it more clearly, but basically, 2K looks great as it is, and it's easier for pretty much the entire industry to finish & distribute stuff in 2K than to rebuild whole production pipelines for 2x the resolution when many exhibitors can't show at that resolution, and the cost of finishing at that resolution is higher.

(Plus, the part that kinda has to be said even though results vary quite a bit from person to person: a lot of people honestly can't tell the diff between 2K & 4K resolution at the theater.)

Agreed.
On home titles, you genuinely will not notice, as they still look amazing, though it is easy to develop a thing in your head where you insist that you can tell.
The bottom line is that with both true 4K and 4K from a 2K master, some discs look better than others.
My personal go-tos right now are Sicario and Planet Earth II.
 

Westonian

Member
My favorite outside of Planet Earth is Miss Peregrine's Home for Peculiar Children. The detail, colors, and contrast are really demo-worthy.
 

StudioTan

Hold on, friend! I'd love to share with you some swell news about the Windows 8 Metro UI! Wait, where are you going?
I haven't bought a Blu-ray since getting my JVC faux-K projector. Even though it's using pixel shifting to get to 4K, the picture is incredible.

I think I'm at over 80 UHD movies at this point.
 
What? Hardly. That movie was shot with Red's 8K cameras with a 4K DI. It could hardly be any less fake, for the live-action scenes at least.

At least according to realorfake4k it is. I know it was shot with good cameras, but apparently the masters were set to 2k or some crap, which I find hard to believe (yet, somehow, don't put it past anyone). I guess we'll see when it's out on disc, but there's at least some confusion with this one, apparently.
 

F34R

Member
At least according to realorfake4k it is. I know it was shot with good cameras, but apparently the masters were set to 2k or some crap, which I find hard to believe (yet, somehow, don't put it past anyone). I guess we'll see when it's out on disc, but there's at least some confusion with this one, apparently.

Special effects shot in 2K, rest of film 4K, 4K DI. There is a review already out for the Vudu stream of it:

http://ultrahd.highdefdigest.com/50098/guardiansofthegalaxyvol24k.html
 

GTI Guy

Member
WATCH THE VIDEOS FIRST.

Seriously, everyone who works in film, or imaging science, or display engineering, or who even remotely considers themselves a home theater nerd (a.k.a. thread members) needs to watch the videos. They will help you understand where the content you watch came from, and what pixel count versus real resolution actually means.

Also there is an American Cinematographer article about it here.

http://yedlin.net/ResDemo/

Direct download of video 1
Direct download of video 2

Warning: Around 2GB each. Watch on your best display device if you can. Doesn't need to be 4K because the cinematographer breaks it all down by zooming into details in part 2. Super impatient or ADHD? Skip straight to part 2.


And welcome. :)

So what I gather from this is that anything above 2k video is pointless?
 

GTI Guy

Member
No. Just that pixel count numbers by themselves don't mean anything.

Edit coming soon.

Sure... I guess maybe I should restate that: all things being equal (all of the elements that go into an image or a video other than the pixel count), the difference is imperceptible. Perhaps that is still incorrect, though.
 
Sure... I guess maybe I should restate that: all things being equal (all of the elements that go into an image or a video other than the pixel count), the difference is imperceptible. Perhaps that is still incorrect, though.

Skip the edit, new post.

"realorfake4k" means nothing. Not as far as image detail goes. It literally means nothing.

To this (trash) website, "real" 4k means this:

1) shot on cameras with a 4K (or higher) sensor (doesn't mean that sensor is actually capturing 4K of detail).
2) Workflow was 4K (or higher...but doesn't mean the algorithms they are using preserve detail at 4K or higher).
3) The mastered format (DI or Digital Intermediate) is 4K or higher.

To that site, that means 'real' 4K. The (false) implication is that this means a more detailed picture.

It doesn't mean that.

Why?

Some reasons, among others, are:

1) The sensor in the camera could be noisy (by design or limitation) and 4K resolution is not achieved.
2) The optics in the camera could be insufficient for 4K resolution.
3) The conditions (lighting, which affect the sensor et al) or settings the cinematographer and camera operators are using may not be able to achieve 4K resolution.
4) The workflow, comprising dozens if not more elements of processing, including the algorithms used, may not preserve 4K resolution even if the source photography provides 4K or greater resolution.

STOP PIXEL COUNTING AS A MEASURE OF DETAIL. IT DOES NOT WORK THIS WAY.

Per the ASC article:

"No other information is given about how those images were created and nobody asks any questions. Well, have they both been mastered in 4K? Or are we comparing what it's like to use 2K as a source for a 4K master? Is one thing 2K all the way through and the other 4K all the way through? If we're comparing them back-to-back, is this being done on a 4K projector? Doesn't that mean the 2K is being scaled to 4K? How is that being done, as there are different scaling algorithms which all affect the image differently? I've seen multiple situations where filmmakers or studio decision makers are shown something that's meant to be a comparison and they are being shown this not by a technology expert from their own company but by a vendor who stands to gain by whatever decision is made based on the demo they are giving. That's not really a fair comparison situation. So these decisions are not only being made with entrenched presuppositions about what makes an image look the way it does, but these false comparisons that are only nominally scientific and actually more of a marketing manipulation."

Don't be a tool for marketing. And don't use a goddamn website that does nothing but compare pixel counts in acquisition, workflow, and mastering to decide which fucking discs are worth buying. Yet we have way more posts talking about "realorfake4k" as a useful decision-making tool than posts talking about how movies are actually made.

It's pointless.

Just watch the videos.
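To make the scaling question from that quote concrete, here's a quick Python sketch (needs Pillow 9.1+ for the Resampling enum). "frame_2k.png" is just a placeholder for whatever 2K-ish frame grab you want to test; the only point is that the three "4K" outputs will not be identical, because the choice of algorithm is part of the result.

```python
# Illustration only: the same source scaled to a 3840-pixel-wide frame
# with three different resampling filters. Compare the outputs yourself.
from PIL import Image

src = Image.open("frame_2k.png")                 # placeholder: any 2K-ish still
target = (3840, src.height * 3840 // src.width)  # keep the aspect ratio

for name, resample in [("nearest", Image.Resampling.NEAREST),
                       ("bilinear", Image.Resampling.BILINEAR),
                       ("lanczos", Image.Resampling.LANCZOS)]:
    src.resize(target, resample=resample).save(f"upscaled_{name}.png")
```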
 

Realyst

Member
My TV has HDR10 but not Dolby Vision. What am I missing out on?
As I understand it, it basically comes down to two main differences: static vs dynamic HDR metadata, and 10 vs 12 bit color.

HDR10 is an open format that uses a fixed (static) set of HDR metadata, which won't take full advantage of your display's nit capability. Movies may be mastered at up to 4000 nits, while your display may only be capable of somewhere between 500 and 1000 nits of luminance. The HDR10 metadata won't take your display's capabilities into consideration when adjusting scene-to-scene dynamic range. Dolby Vision is a proprietary HDR format that uses extra processing within your display to know exactly how much luminance to apply to a scene based on your display's nit capability, thanks to its use of dynamic metadata. Right now, this only works for TVs with the embedded Dolby Vision chip. Not sure if there will be a software solution for this in the future.

As for color, Dolby Vision supports 12-bit color depth versus HDR10's 10-bit, so it can describe finer gradations. This may be a moot point, as currently there are no consumer-grade displays capable of anything beyond 10-bit color.

There is at least one open HDR format with dynamic metadata being worked on (HDR10+) that should be able to compete with Dolby Vision, but who knows when it will be adopted by the BDA for use on UHD discs.

I haven't really seen any A/B testing to tell you if Dolby Vision is noticeably better, but I'm sure it's capable of much more than the current HDR10 standard.
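Here's a toy sketch of the static-vs-dynamic difference, since it clicked for me once I put numbers on it. This is not how HDR10 or Dolby Vision actually compute anything (real tone mapping works on the PQ curve and is far more sophisticated), and the nit values below are made up for illustration.

```python
# Hypothetical 700-nit display showing content mastered at 4000 nits.
DISPLAY_PEAK = 700.0

def tone_map(pixel_nits, assumed_peak):
    # Crude linear compression: squeeze [0, assumed_peak] into what the
    # display can show; never brighten content that already fits.
    scale = min(1.0, DISPLAY_PEAK / assumed_peak)
    return min(pixel_nits, assumed_peak) * scale

scenes = {"dim interior": 180.0, "sunlit exterior": 3200.0}  # peak highlight per scene

for name, brightest in scenes.items():
    static = tone_map(brightest, assumed_peak=4000.0)      # one curve for the whole film
    dynamic = tone_map(brightest, assumed_peak=brightest)  # curve fit to this scene
    print(f"{name:16} static: {static:6.1f} nits | dynamic: {dynamic:6.1f} nits")
```

With one static curve the dim scene gets squashed down to ~31 nits for no reason; with per-scene metadata it passes through untouched and only the bright scene gets compressed.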
 

Dpp1978

Neo Member
To add to Beer Monkey's post: if all else is equal, there is more information in a native 4K image. The issue is whether it is useful information, by which I mean: is it perceivable by human vision at a normal viewing distance? The videos posted demonstrate that any increase in useful information is negligible.

A little background.

4K is not new technology. In the early '80s, when ILM were experimenting with digital effects work, they ran tests and found that to capture all the detail from motion picture film they had to scan it at 4K resolution: that is, around 4,000 discrete photosites between the perforations of the film frame. If 4K video is a resource hog today, you can imagine how bad things were 30-odd years ago. But they also found there was no significant perceivable difference when the film was scanned at 2K (around 2,000 photosites across the frame, which is almost identical in spatial resolution to 1080p) and recorded back to film. That meant that at 2K all the useful information had been captured, so it became the standard. It was still a massive amount of data for the time, but it was manageable.

Nobody outside the small community doing this work knew or cared about this. It is only when companies have to sell something that these things get hyped. The easiest way to sell something new is to show its benefits. But many (most?) of those holding the purse-strings are not technically minded. The guys in the trenches know all this stuff anyway and they'll do their own tests. But they rarely get to make financial decisions. You have to find a way to condense all the information into a digestible chunk for the money-men to consume. Tell them there is better resolution and give them a nice easy way to measure it. It is basic psychology. Give them a bigger number and most people will be impressed.

When it trickles down and becomes a consumer item, the same thing happens. 480i/576i had been the SD standard for decades; then, over the space of around 15 years, we (as average consumers) went from SD to ED to HD to Full HD to UHD. We are now in the bizarre position that the devices we have at home are "higher resolution" than the content being made. Think about that.

At the advent of home video you had a number of formats but I will stick with VHS and Laserdisc as those are the only two which had any real legs on the consumer market. VHS has about 240 lines of useful information per picture height and Laserdisc has around 400. Colour information is stored at much lower resolution (as low as 40 lines for VHS) than brightness information. This is because we are more sensitive to brightness (luminance) than we are to colour (chrominance). Both elements are bundled together and stored as a composite video signal which has to be converted back into its component parts to be viewed. How well this is done depends on how good the decoder in your equipment is.

This is obviously far less visual information than 35mm film. When projected on a screen a standard 35mm release print resolves between 700 and 1000 lines of perceivable information per picture height. Colour is, due to its nature, at full resolution. Obviously far better than home video of the time.

DVD arrived and offered 480p/576p video. It stored colour as component information rather than composite, meaning luminance and chrominance are stored separately. Colour is at half resolution, which is okay as, again, we are less sensitive to colour than we are to image brightness. Colour is also separated into, and stored as, two channels which can be cleanly converted back to the three primaries. For practical purposes this is a non-issue, as the difference between component and native RGB is almost imperceptible.
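For what it's worth, DVD (like Blu-ray and UHD after it) stores that reduced-resolution colour as 4:2:0, i.e. half the chroma resolution in each direction. A rough sample count for one NTSC frame, just to put numbers on "half resolution":

```python
# Sample counts for one 720x480 NTSC DVD frame.
W, H = 720, 480

luma = W * H                          # Y: full resolution            -> 345,600
chroma_full = 2 * W * H               # Cb+Cr if stored at full res   -> 691,200
chroma_420 = 2 * (W // 2) * (H // 2)  # Cb+Cr as actually stored      -> 172,800

print(f"luma samples             : {luma:,}")
print(f"chroma at full resolution: {chroma_full:,}")
print(f"chroma at 4:2:0          : {chroma_420:,}")
print(f"saving vs full-res colour: {1 - (luma + chroma_420) / (3 * W * H):.0%}")
```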

It has the benefit of being digital which means, as long as the signal path is clean and your equipment is up to spec, you can be confident the signal out is the same as the signal encoded onto the disc. You could have no such confidence with analogue video. There is a reason NTSC was disparagingly said to stand for "never twice the same colour".

The move to digital required a sampling size for the video, which was set at slightly less than 8 bits per pixel per channel. This means for each primary channel there are 219 possible degrees of intensity, combining to around ten and a half million possible colours (full-range 8-bit video has 256 degrees of intensity per colour and is what most computer monitors use). It uses a colour gamut defined as Rec. 601 and MPEG-2 compression at a maximum data rate of 9.8 Mbit/s.

Much better than what came before but still a long way from competing with 35mm.

HD formats started to appear, but I'll only look at Blu-ray as it is the one which survived (which still irks some HD DVD fans). It raised the spatial resolution to 1080p, which is very nearly the same as full 2K digital cinema video (to the point they can pretty well be used interchangeably), but kept the 8-bit colour depth. It moved to a slightly wider colour gamut (Rec. 709), so it has the potential for better colour fidelity. It offers better compression codecs, notably h.264, which is more efficient than MPEG-2 by a significant margin. That means you can store the same amount of video at the same quality in a smaller space. It has a bandwidth of up to 54 Mbit/s, which is far more than the rates streaming sites typically use even for 4K video.
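To put those ceiling bit rates in perspective (9.8 Mbit/s for DVD, 54 Mbit/s for Blu-ray, plus the 128 Mbit/s UHD figure mentioned further down), here's the back-of-the-envelope maths for a two-hour film at each format's maximum rate. Real discs average well below these peaks, so treat them as upper bounds rather than typical sizes.

```python
# Upper-bound video size for a two-hour film at each format's peak bit rate.
RUNTIME_S = 2 * 60 * 60  # two hours, in seconds

for fmt, mbit_per_s in [("DVD (MPEG-2)", 9.8),
                        ("Blu-ray (h.264)", 54),
                        ("UHD Blu-ray (h.265)", 128)]:
    gigabytes = mbit_per_s * RUNTIME_S / 8 / 1000  # Mbit -> MB -> GB (decimal)
    print(f"{fmt:20} up to ~{gigabytes:5.1f} GB of video")
```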

In the meantime the shift to digital projection in cinema was well underway. The main difference between the 2K DCPs used to show films and Blu-ray video is not one of spatial resolution: it is in colour gamut, colour depth and compression.

DCPs use a colour gamut called P3, which is wider than Blu-ray's, and store colour at 10 or even 12 bits per channel. For comparison, 10-bit video offers 1024 degrees of intensity per colour as opposed to Blu-ray's 219. That is over a billion different possible colours. 12-bit has 4096 degrees for a total of over 68 billion colours! Of course you are still limited by the constraints of the gamut you are working within, but you have a ridiculous level of precision.
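That colour-count arithmetic, worked through with the same figures (219 usable levels for video-range 8-bit, as above):

```python
# Levels per channel, cubed, gives the total number of representable colours.
for label, levels in [("8-bit (video range)", 219),
                      ("8-bit (full range)", 256),
                      ("10-bit", 1024),
                      ("12-bit", 4096)]:
    print(f"{label:20} {levels:5} levels/channel -> {levels ** 3:,} colours")
```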

The biggest problem with a lower bit depth comes when you have subtle gradations of colour. Think of a sky at sunset where it goes from a deep red at the horizon to a rich blue as you look upwards. If there aren't enough degrees of intensity available to accurately reproduce this, you get colour banding, which is, to my mind, the ugliest video artefact. There are ways to minimise it, but that is outside the scope of this post. A higher bit depth can eliminate it altogether.
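You can get a feel for why the extra bits matter with a quick sketch: take a subtle gradient across a UHD-wide frame (the range below is made up to stand in for a patch of sunset sky), snap it to each bit depth, and count how many distinct code values survive. Fewer distinct values means wider, more visible bands.

```python
# Quantise a subtle brightness ramp at different bit depths and count
# the distinct steps that survive across a 3840-pixel-wide frame.
WIDTH = 3840
LO, HI = 0.20, 0.35  # a gentle ramp; values chosen purely for illustration

def distinct_steps(bits):
    levels = 2 ** bits
    codes = {round((LO + (HI - LO) * x / (WIDTH - 1)) * (levels - 1))
             for x in range(WIDTH)}
    return len(codes)

for bits in (8, 10, 12):
    steps = distinct_steps(bits)
    print(f"{bits:2}-bit: {steps:4} steps -> bands roughly {WIDTH // steps} px wide")
```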

DCPs use JPEG 2000 compression, where each frame is encoded as a still image, played back at a bit rate of up to 250 Mbit/s. It is not lossless, but it does not rely on temporal encoding, so there is less likelihood of motion artefacts than there would be on consumer-grade video.
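That figure is easier to appreciate per frame. At cinema's usual 24 fps:

```python
# 250 Mbit/s spread across 24 individually encoded JPEG 2000 stills per second.
bitrate_mbit_s = 250
fps = 24

per_frame_mbit = bitrate_mbit_s / fps  # ~10.4 Mbit for every single frame
per_frame_mb = per_frame_mbit / 8      # ~1.3 MB per frame
print(f"~{per_frame_mbit:.1f} Mbit (~{per_frame_mb:.2f} MB) available per frame")
```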

Sad as I am to say, all things considered a DCP at 2K is at least as good as a 35mm print. In many ways it is better.

I'll skip 3D, not because it is not interesting but because it is a whole topic on its own.

So now we have 4K in the home when most cinemas still run 2K projectors. We have 10- or even 12-bit video in the home. We have a wide colour gamut in the home, with the intent to move to Rec. 2020, which is far wider than anything commonly seen even in a commercial setting. We have HDR, which is, again, something most of the very best commercial screens do not have.

Apart from sheer scale and issues around compression, the latter of which is negligible with the h.265 codec used on UHD discs at the bandwidth provided (up to 128 Mbit/s), we have video on a home format which competes with commercial cinema on almost every level. We have higher spatial fidelity than most of the content produced (although that is a moot point really).

It is crazy.

Sound is another issue to look at, and I purposely have not covered IMAX as that is its own thing apart from mainstream theatre. But I think this is enough for one post.
 
There's some phenomenal information in this thread. Thank you to everyone who has taken the time to post said information.

Really excited to see GITS this week.

Has anyone else seen Power Rangers? I was blown away by the opening scene on my LG OLED, and the colours of the suits were gorgeous; I honestly expected much more muted colours.
 

captive

Joe Six-Pack: posting for the common man
Skip the edit, new post.

"realorfake4k" means nothing. Not as far as image detail goes. It literally means nothing.

To this (trash) website, "real" 4k means this:

1) shot on cameras with a 4K (or higher) sensor (doesn't mean that sensor is actually capturing 4K of detail).
2) Workflow was 4K (or higher...but doesn't mean the algorithms they are using preserve detail at 4K or higher).
3) The mastered format (DI or Digital Intermediate) is 4K or higher.

To that site, that means 'real' 4K. The (false) implication is that this means a more detailed picture.

It doesn't mean that.

Why?

Some reasons, among others, are:

1) The sensor in the camera could be noisy (by design or limitation) and 4K resolution is not achieved.
2) The optics in the camera could be insufficient for 4K resolution.
3) The conditions (lighting, which affect the sensor et al) or settings the cinematographer and camera operators are using may not be able to achieve 4K resolution.
4) The workflow, comprising dozens if not more elements of processing, including the algorithms used, may not preserve 4K resolution even if the source photography provides 4K or greater resolution.

STOP PIXEL COUNTING AS A MEASURE OF DETAIL. IT DOES NOT WORK THIS WAY.

Per the ASC article:

"No other information is given about how those images were created and nobody asks any questions. Well, have they both been mastered in 4K? Or are we comparing what it’s like to use 2K as a source for a 4K master? Is one thing 2K all the way through and the other 4K all the way through? If we’re comparing them back-to-back, is this being done on a 4K projector? Doesn’t that mean the 2K is being scaled to 4K? How is that being done, as there are different scaling algorithms which all affect the image differently? I’ve seen multiple situations where filmmakers or studio decision makers are shown something that’s meant to be a comparison and they are being shown this not by a technology expert from their own company but by a vendor who stands to gain by whatever decision is made based on the demo they are giving. That’s not really a fair comparison situation. So these decisions are not only being made with entrenched presuppositions about what makes an image look the way it does, but these false comparisons that are only nominally scientific and actually more of a marketing manipulation."

Don't be a tool for marketing. And don't use a goddamn website that does nothing but compare pixel counts in acquisition, workflow, and mastering to decide which fucking discs are worth buying. Yet we have way more posts talking about "realorfake4k" as a useful decision-making tool than posts talking about how movies are actually made.

It's pointless.

Just watch the videos.
While I largely agree with what you have to say, I'm so tired of seeing the bolded. A 4K image is roughly equivalent to an 8MP image. That's not exactly demanding for a lens; pretty much any professional cinematography lens should be able to easily out-resolve that.

I see this a lot in still photography: people saying some of Nikon's lenses can't resolve the D800, which is 36MP, which simply isn't true. I've used rather old lenses with an 80-megapixel sensor and they resolved it just fine.
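Quick sanity check on those megapixel figures (the D800 sensor dimensions are from memory, so double-check them):

```python
# Pixel counts behind the comparison above.
uhd_4k = 3840 * 2160  # consumer UHD "4K"
dci_4k = 4096 * 2160  # DCI 4K container
d800 = 7360 * 4912    # Nikon D800 sensor

for name, pixels in [("UHD 4K", uhd_4k), ("DCI 4K", dci_4k), ("Nikon D800", d800)]:
    print(f"{name:10} ~{pixels / 1e6:.1f} MP")
```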
 

captive

Joe Six-Pack: posting for the common man
Added this to the "Real or Fake 4K" link in the OP. Is that fine?

It's fine. All else being equal, a movie shot in 4K or above and mastered in 4K should look better than an upscaled 2K master. Obviously other factors come into play.

Honestly, Blu-rays look great on my 106". The real upgrade for me is the wider color gamut we get with HDR.
 
It's fine. All else being equal, a movie shot in 4K or above and mastered in 4K should look better than an upscaled 2K master. Obviously other factors come into play.

Likewise, a movie shot on film and mastered in 2K or 1080p should look better than an upscaled DVD.

But then we have the first Terminator Blu-ray. Which does, but just *barely*.
 