MadVR HTPC Enthusiasts' Thread

Most 4K UHD titles are mastered at 1000 nits or 4000 nits, which refers to MaxCLL (Maximum Content Light Level) IIRC. So even when the content shoots up to 4000 nits, only a portion of the screen has bright pixels, and the FALL (Frame Average Light Level) can still be low.
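
As a rough sketch of how those two stats are computed from pixel data (my own illustration in Python, with made-up values; not from the thread):

```python
import numpy as np

# Made-up content: 24 frames of 10x10 pixels, each pixel an (R, G, B)
# triple in linear nits. Mostly dim, with one tiny 4000-nit highlight.
rng = np.random.default_rng(0)
frames = rng.uniform(0, 120, size=(24, 10, 10, 3))
frames[0, 0, 0] = (4000, 3500, 3000)

per_pixel = frames.max(axis=-1)        # brightest subpixel of each pixel
maxcll = per_pixel.max()               # brightest subpixel in the whole content
fall = per_pixel.mean(axis=(1, 2))     # frame average light level, per frame

print(maxcll)       # 4000.0 -- set by a single pixel
print(fall.max())   # ~130 -- the lone highlight barely lifts the average
```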

Movies like The Dark Knight, Kong: Skull Island and Blade Runner 2049 were graded on a mastering monitor able to reach 4000 nits, but judging by the MaxCLL value (the brightest subpixel in the whole picture), they do not have any content above 1000 nits; all three are under 500 nits. Even The Shallows only reaches around 2500 nits.

On the other hand, films like Mad Max and Batman v Superman actually have content above 4000 nits; for instance, the scene below has a MaxCLL > 4000 nits. It won't make viewers uncomfortable watching it in a bat cave with a projector (most of them < 100 nits) or in a living room with a TV and plenty of ambient light.

I'm no expert in this topic either, so I can't offer an answer to bro Gavin's questions, but below is some information I quoted from AVSForum which I think is quite relevant and worth sharing (I take notes when I find a good post and keep them in Notion):

For SDR, the standard target in dedicated rooms is 50 nits, not 100 nits, despite the fact that the Blu-ray was mastered to 100 nits. This works because power gamma is relative, unlike PQ, which is absolute.

So those with significantly more than 50 nits available will tend to have two different calibrations: one for SDR at 50 nits with the iris closed to maximize on/off contrast, and one for HDR with the iris fully open (sometimes even with a different lamp mode, if they can deal with the increased temperature/noise) at as many nits as they can get (though some prefer to keep the iris somewhat closed to lower the black floor and increase native contrast with good tone mapping).

For example, my SDR calibration is set to 50 nits (iris at -13) and my HDR calibration is set to 120 nits (iris fully open).

If I used 120 nits (or even 100 nits) for SDR, not only would it be too bright with SDR content, but I would lose a lot of on/off contrast and raise my black floor unnecessarily.

Blu-rays are 100% mastered to 100 nits; that's the SDR standard for consumer content.

BUT when you calibrate a projector in a dedicated room, you use the same reference white as in a cinema (for SDR), which is 48 nits.

This is not a problem, because power gamma is relative, unlike PQ.

In a non-dedicated room with ambient light, people use 100 nits or more, whether with projectors or flat-panel TVs.

In a dedicated room (no ambient light, reflections treated), the calibration standard for SDR content is 48 nits, not 100 nits, even if the content was mastered to 100 nits.

To recap:

2K SDR content is mastered to 48 nits for cinema because there is no ambient light (P3), and to 100 nits for Blu-ray because ambient light is expected in consumers' living rooms (Rec.709/BT.1886). Reference white for home calibration in a dedicated room (usually with a projector, no ambient light) is 48 nits. For a flat TV with ambient light, it's 100 nits (or more).

4K HDR content is mastered to 106 nits for cinema (P3 or BT2020), and to 1000-4000 nits for UHD Blu-ray (BT2020 container/PQ).

Both are different grades (cinema vs consumer), but we can use 50 nits for Blu-ray/HD content in a dedicated room because power gamma is relative, unlike PQ gamma, which is absolute. The main difference is ambient light vs no ambient light.
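
To put that relative-vs-absolute point in formula terms, here's a small sketch using the standard definitions (a BT.1886-style power gamma and the ST 2084 PQ EOTF); the 0.5 test value is arbitrary:

```python
import numpy as np

def power_gamma_nits(v, peak_nits, gamma=2.4):
    # Relative: the same code value scales with whatever peak you calibrate to.
    return peak_nits * v ** gamma

def pq_eotf_nits(v):
    # Absolute: ST 2084 decodes each code value to one fixed luminance.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = np.asarray(v) ** (1 / m2)
    return 10000 * (np.maximum(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

v = 0.5
print(power_gamma_nits(v, 100))  # ~19 nits on a 100-nit calibration
print(power_gamma_nits(v, 48))   # ~9 nits on a 48-nit calibration: same look, scaled down
print(pq_eotf_nits(v))           # ~92 nits, regardless of what the display can do
```

That's why a 48/50-nit SDR calibration still tracks the 100-nit master, while PQ content has to be tone mapped whenever the display can't reach the mastered levels.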


I actually didn't know it could be disabled, sorry. My living room TV (>5 years old) is just for show; it's only on occasionally for my wife to watch YouTube clips.

Thanks bro Matrix, that's a lot of info you've shared; I will digest it slowly.

I think the gist of it is that the midtones cannot be mapped significantly brighter even if the projector system is capable of high nits on screen.

All that extra headroom is really only reserved for the peak brightness highlights.

The question then is: for typical midtone images with an APL of 20-30%, none of that extra nit headroom is being called into use at all.

I really wonder how the image would look in SDR (50 nits), vs HDR on a 50-nit PJ, vs HDR on a 100+ nit PJ.

The comment about a very high calibrated brightness raising the black floor is something I think is very important.

How does this trade-off play out: high nits that are perhaps only rarely used, vs a black floor that is maybe more appreciated across wider content and usage?

To illustrate what I mean, here's a picture and some data about ADL, which I think means the average brightness of the image. A full screen of maximum-brightness white would be 100%.

That image is only at 5.9%! And it's a full-daylight outdoor scene, albeit overcast; not even a night or shadowy scene.
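
If I understand the stat correctly (my assumption: ADL is the mean linear pixel level relative to a full frame of peak white), it would be computed like this, with a made-up flat frame standing in for the screenshot:

```python
import numpy as np

# Hypothetical linear pixel levels: 0.0 = black, 1.0 = peak white.
frame = np.full((1080, 1920), 0.059)   # flat stand-in for the overcast scene

adl = frame.mean() * 100               # average level vs full-screen white
print(f"ADL = {adl:.1f}%")             # -> ADL = 5.9%
```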

How would it look on a 50-nit calibrated PJ vs a 100+ nit calibrated PJ?

I guess ideally one would want the high peak brightness without sacrificing the black floor.

Here’s the link for the full thread.

https://www.avsforum.com/threads/the-sony-vs-jvc-projectors-comparison-thread.3049760/

Matrix, I found that scene in Batman v Superman, but with madVR's OSD turned on it wasn't that bright when I tried to analyze it. Not sure whether that's because I was streaming it in Dolby Vision from Movies Anywhere. For anyone who wants to look, it's at the 1:45 timecode.

I thought I would share the scene that I used to debug bright scenes on my projector. It's accessible to anyone who has Netflix. It's from Selling Sunset S5:E3, "Coming for All Your Con", at the 4:08 timecode.

It's a scene of a lifeguard sitting against a very bright background. If you look on the left, I have turned on madVR's OSD.

It shows that the frame has a peak of 2,320 nits.

LG's DTM does a great job on bright scenes like this.

Bro Sammy, are you referring to "measured frame" in the madVR OSD? If yes, that's odd, as the AVS guys measured more than 4000 nits (in fact, in BT2020 it should be < 4000 nits, but with the gamut conversion the brightness increases and results in > 4000 nits, IIRC). Here is the pic:

They are probably right, but I'm using build 113 of madVR, which has a much simpler OSD, and it's not showing 4,744 nits. It could also be that I'm using streamed Dolby Vision from Movies Anywhere. If I can dig out the disc, I will try again.

I will see whether I can do a quick measurement this weekend; I'm not sure whether I have this disc.

The more complicated OSD doesn't depend on the madVR build; you can get more HDR info by creating an empty folder or file named showhdrmode in the madVR folder, and create another empty folder named showrendersteps to see how much time each render step takes.

Mathias seems to like using empty folders; you can also create an empty folder named fastblur to cut some rendering time (but this one might cause issues, so use it with care; you can delete the folder if you encounter problems).
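
In case it saves someone a trip to Explorer, here's a small sketch of toggling those flag folders from Python; the install path is a placeholder, so point it at wherever your madVR folder actually is:

```python
from pathlib import Path

madvr_dir = Path(r"C:\madVR")   # placeholder -- use your actual madVR folder

# As described above: showhdrmode adds extra HDR info to the OSD,
# showrendersteps adds per-step render timings.
for flag in ("showhdrmode", "showrendersteps"):
    (madvr_dir / flag).mkdir(exist_ok=True)

# fastblur trades quality for speed and may cause issues; delete to revert:
# (madvr_dir / "fastblur").mkdir(exist_ok=True)
# (madvr_dir / "fastblur").rmdir()
```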

I finally found the disc and all is well. I'm not getting > 4,000 nits, but 3,434 nits in HDR converted to SDR BT2020. The problem previously was that I was streaming in Dolby Vision from Movies Anywhere, and the Dolby Vision file was being tone mapped down to about 1,000 nits (LLDV) before being passed to madVR. Here is my output from the UHD disc:

I didn't know about the empty folders and the additional HDR info. Thanks!

3,434 nits for BT2020 makes perfect sense; P3 and BT709 show higher luminance levels due to gamut conversion (highest for BT709).
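
A rough sketch of why the measured peak rises with the gamut conversion (this uses the standard linear-light BT.2020 to BT.709 matrix; the saturated red test pixel is my own made-up example):

```python
import numpy as np

# Standard linear-light BT.2020 -> BT.709 conversion matrix.
M = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

rgb_2020 = np.array([1.0, 0.0, 0.0])   # fully saturated BT.2020 red at frame peak
rgb_709 = M @ rgb_2020                 # same colour expressed in BT.709 primaries

print(rgb_2020.max())   # 1.00 -> the BT2020 measurement
print(rgb_709.max())    # ~1.66 -> the same pixel measures brighter in BT709
```

Colours that sit outside the smaller gamut need channel values above 1.0 to be represented, so the "brightest subpixel" reading goes up even though the picture hasn't changed.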

You are most welcome, bro Gavin. I just came across the article below, which is very informative. Though it still doesn't answer your question, it's great fun to discover the many studies it covers, including the human eye's ability to detect black levels, with different initial adaptation levels taken into consideration, etc.
High dynamic range television for production and international programme exchange

In the beginning of the article, it also corrects the misconception about HDR being simply brighter pictures:

The misconception about HDR being simply brighter pictures arises from the fact that the maximum luminance capability is indeed much higher than standard dynamic range (SDR) television. However, this higher maximum is primarily used by the highlight regions of images. While the highlights will indeed appear brighter [1], they are nearly always small in region, and the overall image may not necessarily appear brighter. This is because the overall appearance of an image’s brightness is dominated by the average brightness, not the small regions usually occupied by highlights. One type of highlight is the specular reflection. The advantages of having more accurate specular reflections enabled by HDR include better surface material identification [2] as well as in depth perception, even with 2D imagery [3] [4].

Interesting question.
Presuming we start with a modern movie that was shot digitally with high dynamic range, say Batman v Superman, which has highlights close to 4,000 nits.

The SDR image (from the 1080p Blu-ray) is what the studio wants you to see. A human, maybe the colorist, has taken the 0-4,000 nit source and compressed it into SDR's 100 nits before pressing it onto the physical media. The look of the SDR Blu-ray is what that human wanted you to see. Their decision.

For HDR on a 50-nit projector, you have a choice, because you can do your own tone mapping from the 0-4,000 nit UHD Blu-ray source. You have a variety of tone mapping options:

  1. Most projectors have static curves. My BenQ had 5; Epson has something like 10. With the BenQ, I fiddled with the curve for every movie until I got the best one before watching. Some JVCs allow you to load a custom curve.
  2. Some projectors (LG, JVC) have dynamic tone mapping, i.e. frame-by-frame analysis and tone mapping by the projector. Your picture reflects the decisions of the creator of the DTM algorithm. For JVC, you can alter this algorithm in subtle ways.
  3. madVR and Lumagen use powerful external processors to do dynamic tone mapping. Both have a myriad of options to adjust how you want the picture to look. After processing the HDR, the output for projectors is generally SDR but in a Rec. 2020 colorspace, which should be as much as a projector can handle.
  4. A few Chinese projectors now support Dolby Vision, and Epson etc. support HDR10+. For the appropriate content, the projector should tone map according to a human's decision about how the DV or HDR10+ metadata is encoded. Of course, Lumagen and madVR are free to improve on the studio's decisions.

It's possible for Lumagen or madVR to perfectly replicate the human colorist who did the SDR mapping, but since humans vary, it's unlikely that they will arrive at exactly the same decisions.

For HDR on a 100-nit projector, all of the above applies, but the highlights can be tone mapped brighter since you have more dynamic range. So the image should look more like a higher-dynamic-range LCD or OLED. However, if you tell madVR or Lumagen to tone map to a maximum of 50 nits, then despite having 100 nits available, it will still look like the 50-nit projector. That might be useful for those who don't like the HDR look.


Some related trivia on Zack Snyder's Batman v Superman:

There are at least three UHD releases. The original one mirrored the theatrical release. Then they released the Ultimate Edition, which added about 30 minutes, making a 3-hour movie whose storyline made more sense. This was in preparation for Justice League.

Finally, there is a 2021 remastered edition. It seems good old Zack was not happy with the colorist who did the 2016 extended edition, especially with his remake of Justice League coming up. So he remastered it with more Rec. 2020 colors to make it look similar to the colors of Justice League. If you have a Rec. 709 display, or a P3 display that doesn't cover a lot of those colors, you won't get much from the new UHD Blu-ray.

I think this highlights that what we see is the decision of a human, and that human is sometimes not the director. So I think it's OK for madVR or Lumagen to just produce eye candy, whether or not it looks like what the studio would have wanted.


This is how a madVR custom curve looks (one example in the attached photo below). The custom curve feature was first introduced to madVR in build 123; the example below illustrates how a frame with a peak of 2230 nits maps to a 100-nit display. Users can tweak the curve by adjusting the knee and steepness. madVR offers curves at 110, 509, 2230 and 10000 nits; for scenes whose frame peaks fall between these four values, it calculates the curve with a built-in algorithm. (For each of the four frame peaks, it further breaks down into x1, x2, x4, x8 and x16 mappings, so the user ends up needing to define 20 curves.) From build 137 onwards, madVR offers predefined custom tone mapping curves, so users can skip the tedious process of defining 20 custom curves; the latest build is now 169, with almost all of the predefined curves finalized for users to select.

The curve gives you a rough idea of how the 0-2230 nit HDR picture is mapped to a 0-100 nit display.
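
To make the knee/steepness idea concrete, here's a minimal sketch of a curve in the same spirit (my own toy formula, not madVR's actual algorithm, whose internals aren't public): 1:1 below the knee, then a power-law roll-off that squeezes everything from the knee up to the frame peak into the remaining display headroom:

```python
import numpy as np

def tone_map(nits, frame_peak=2230.0, display_peak=100.0,
             knee=15.6, steepness=0.5):
    """Toy knee + roll-off curve, mapping scene nits to display nits.

    1:1 below the knee; above it, [knee, frame_peak] is compressed
    into [knee, display_peak]. The steepness exponent controls how
    quickly the highlights are squeezed (illustrative choice only).
    """
    nits = np.asarray(nits, dtype=float)
    t = np.clip((nits - knee) / (frame_peak - knee), 0.0, 1.0)
    rolled = knee + (display_peak - knee) * t ** steepness
    return np.where(nits <= knee, nits, rolled)

print(tone_map([10, 15.6, 100, 1000, 2230]))
# -> [10.0, 15.6, ~32.1, ~71.9, 100.0]
```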


Nice, thx for sharing the curve bro

You are most welcome, bro Foodie!

I think it makes a lot of sense; it comes back to the average APL/ADL mapping vs the peak highlight mapping.
If we look at the 15.6 nit input point, it follows the straight-line linear map to a 15.6 nit output, and the curve is applied from roughly there up to 100 nits. So anything from 15.6 to 2230 nits of input is mapped into 15.6 to 100 nits.

Meaning that for low ADL/APL scenes below 15.6 nits, mapped on the straight-line EOTF, the image should look identical between HDR and SDR?

I would think so, and this is for scenes that go up to 2230 nits; the majority of movie scenes won't go that high. So for scenes with a frame peak of 110 nits, it can map nearly 1:1 without issue on a 100-nit display; 509-nit scenes can keep more of the range 1:1 than 2230-nit scenes, and so on. I attach the other three graphs below for reference, after a quick numeric sketch of the same point:
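
```python
# Made-up heuristic, not madVR's real knee placement: the 1:1 region
# shrinks as the frame peak climbs further above the display peak.
display_peak = 100.0
for frame_peak in (110, 509, 2230, 10000):
    knee = display_peak / (frame_peak / display_peak) ** 0.5
    print(f"frame peak {frame_peak:>5} nits -> roughly 1:1 up to ~{knee:.0f} nits")
# 110 -> ~95, 509 -> ~44, 2230 -> ~21, 10000 -> ~10
# (indicative only; the real 2230-nit curve above puts the knee nearer 15.6)
```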



Pardon my 'theoretical questions': in this case, 0-15.6 nits of input is mapped to 0-10.7 nits (or maybe I'm reading the Y axis wrong). Does that mean the output image in the midtones may look slightly dimmer than the intended input?

Here is a sample EOTF curve from RTINGS. Here the input and output are relative (0-1), not absolute nits. But they typically try to keep the midtones (up to ~60%) linear on the straight-line EOTF, and then do a soft-knee clip to the display's maximum capability.

I understand that to mean that the midtones, say 0-10 fL or 0-30 nits (wherever the knee is), are not compressed in the PJ image, and it is only the highlights that are severely compressed to fit into the remaining projector headroom, whether that's 30-50 nits or 30-100+ nits?

So the type of curve, where the knee starts, and how hard or soft the clip is will determine how the HDR actually looks. E.g. the blue curve is a hard clip, while the green curve is a soft, gradual clip.
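
As a side-by-side of those two clip styles on the RTINGS-style relative scale (hypothetical knee and display max, just to show the shapes):

```python
import numpy as np

display_max, knee = 0.75, 0.6   # hypothetical relative values

def hard_clip(x):
    # Track the source 1:1, then clamp everything above the display max.
    return np.minimum(x, display_max)

def soft_knee(x):
    # 1:1 up to the knee, then roll [knee, 1.0] of input smoothly
    # into the remaining [knee, display_max] of output.
    t = np.clip((x - knee) / (1.0 - knee), 0.0, 1.0)
    rolled = knee + (display_max - knee) * (1.0 - (1.0 - t) ** 2)
    return np.where(x <= knee, x, rolled)

x = np.array([0.2, 0.6, 0.8, 1.0])
print(hard_clip(x))   # [0.2, 0.6, 0.75, 0.75] -- highlights flatten out
print(soft_knee(x))   # [0.2, 0.6, ~0.71, 0.75] -- highlights keep gradation
```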

It would be fun if madVR allowed you to experiment with these and see how the HDR image looks.

Here you can use this…
https://drive.google.com/file/d/1qL3EmkF2R23ACIDbh2l5S9NgZ0o4tk07/view?usp=sharing

I have just confirmed the same scene on my OLED on Disney+, using the OLED’s native DV processing…

The image looks identical.

It looks vastly different from Sammy's and Desray's calibrated versions.

As you have correctly observed, the LG projector version looks much darker… it may not be to your or others' liking… It's perfectly adjustable with iris/gamma and energy-saving/tone-mapping-curve adjustments… but I have tweaked it so that it suits my environment, on a 135" screen in the living room…

But in person, the face looks exactly like an African lady's; there is no "presence of any red" in the image…

One thing I have not seen is Foodie's version of the calibrated image…

One conclusion: tweak to your heart's content and choose whichever curve you like, as long as you are happy with the image's balance between detail, black levels, tone mapping and contrast.

Enjoy!!
