*Official Thread* Lumagen Radiance Pro Video Processor

I think our discussion on getting the “right” BT 2020 is more appropriate in the other thread: *Why it is so hard to calibrate HDR on a Projector and more*

Let me briefly point out the way the Lumagen Radiance Pro does HDR, since this thread is on the Radiance Pro. There are two ways to do it. One is to calibrate to SDR 2020 using the SDR test patterns built into the Lumagen (advocated by the founder of Lumagen), and the other is to use calibration software (in my case Chromapure). Each has a different take on how HDR should be calibrated on a projector. The punchier, more detailed but more “reddish” tone that many have commented on is largely due to the use of expanded color modes, or what is mostly known as wide color gamut (WCG). SDR in 8-bit can only reach within the Rec 709 gamut; BT 2020 holds far more shades of red, green and blue (the primaries) and of the secondaries.
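
To put “far more shades” in rough numbers: comparing the area of the triangles spanned by the Rec 709, DCI-P3 and BT 2020 primaries in CIE xy chromaticity gives a quick sense of how much larger the wide gamuts are. A minimal sketch in Python (the shoelace-area comparison is just my own illustration, not anything the Lumagen itself computes):

```python
# Chromaticity (x, y) of the red, green, blue primaries,
# taken from ITU-R BT.709, DCI-P3 and ITU-R BT.2020.
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def gamut_area(primaries):
    """Area of the triangle spanned by the primaries in CIE xy (shoelace formula)."""
    (xr, yr), (xg, yg), (xb, yb) = primaries
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2.0

print(f"DCI-P3  / Rec 709 area ratio: {gamut_area(DCI_P3) / gamut_area(REC709):.2f}")
print(f"BT.2020 / Rec 709 area ratio: {gamut_area(BT2020) / gamut_area(REC709):.2f}")
```

In xy terms, P3 covers roughly 1.4x and BT 2020 roughly 1.9x the Rec 709 triangle, which is where all those extra shades of the primaries and secondaries live.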

To date (as of 2019), doing a 3D LUT in an HDR colorspace is NOT perfect in any home-based calibration software for projectors, which is to say that none of us can firmly claim that this IS or IS NOT how HDR PQ should look. In the Lumagen camp, SDR 2020 is the way to go, as Lumagen believes a projector cannot deliver a good HDR image because of its low peak luminance (hovering around 400 to 500 nits), making it nigh impossible to do a proper 3D LUT calibration using purely HDR test patterns. Instead, the Lumagen uses its own proprietary PQ EOTF curve to reproduce HDR within an SDR 2020 container. The result is a more natural PQ presentation that is closer to SDR content (almost like Rec 709). Depending on the scene, SDR 2020 shines when specular highlights are displayed, capturing those moments well.
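
For reference, the standard PQ curve that these workflows remap is defined in SMPTE ST 2084. Here is a small sketch of that standard curve in Python (the Lumagen's own tone-mapping curve is proprietary, so this only shows the reference EOTF it is working against):

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code: float) -> float:
    """PQ code value (0..1) -> absolute display luminance in nits."""
    e = code ** (1.0 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

def pq_inverse_eotf(nits: float) -> float:
    """Absolute display luminance in nits -> PQ code value (0..1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

# SDR reference white (100 nits) sits near code 0.51, and even a 500-nit
# projector tops out around code 0.68 -- the rest of the PQ range is
# headroom the display simply doesn't have, hence the need to tone map.
print(f"100 nits -> PQ code {pq_inverse_eotf(100.0):.3f}")
print(f"500 nits -> PQ code {pq_inverse_eotf(500.0):.3f}")
```

That last point is the whole argument in a nutshell: a 400 to 500 nit projector can only reproduce roughly the bottom two-thirds of the PQ code range natively, so something (the Lumagen, or the projector's own tone mapping) has to compress the rest down.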

For Chromapure, it has a set of HDR10 tone curves (HDR10-Projector, HDR 2020, HDR10 with black compensation) and advocates using those in conjunction with its own set of HDR test patterns suited to HDR displays. The software approach is more versatile, as it targets the general display market: TVs, projectors, even a good gaming PC monitor. The results will differ depending on the content you play, often showing more vivid colors and warmer tones bordering on “red”.

At the end of the day, BT 2020 should push for “more shades of primaries and secondaries” whenever HDR metadata is flagged by the source to the display. It is up to each display (projector or TV) how well it does its tone mapping. We are used to Rec 709 because for the longest time we have been watching Blu-ray content authored in Rec 709 at 8-bit YCC 4:2:0. The moment we are presented with more vivid colors, we tend to ask ourselves: is this correct? It doesn't look natural at all, especially the skin tones. But at times you can't deny that the luscious green pastures, the bright burning cinders and the luminous stars in the dark skies look breathtaking.

And yes, both JVC and Sony projectors are capable of displaying P3 colors, albeit not 100%, as no modern display can show all those shades and hues, certainly not a low-nit display like a projector. The safest way to calibrate is still SDR (using Rec 709 or SDR 2020), depending on your display and the accuracy of your color probe. I will say that if your display is capable of “more than Rec 709 colors”, then one should calibrate to BT 2020, but make sure to set the target reference gamut in your calibration software to BT 2020 before you start. So far, I'm still experimenting with SDR 2020 and HDR 2020 calibration, and the former produces more stable results, because the color probe's limitations make a proper HDR 3D LUT calibration difficult. I believe there is a probe out there that can do it, but it will definitely not be cheap.

Keep your observations and comments coming. :grinning: