For those who have followed this thread: A newbie who seeks to learn TV calibration - need advice, you will know I don't own an LG OLED TV, so it is harder for me to dispense useful advice on how to do a proper calibration for a TV display. I know XP member Petetherock paid an ISF-certified calibrator to help calibrate his TV set, so Gil aka skull_candy might want to drop Pete a PM to discuss whether he can engage the same calibrator's services. With that out of the way, let's get back to the main topic of HDR calibration on a projector.
Differences in display calibration between a TV and a Projector
If you ask any projector owner what the hardest part of calibrating his projector is, almost everyone, me included, will tell you: HDR. And there are good reasons why a projector, compared to its TV brethren, finds it harder to showcase the glory of HDR. The simplest culprit is, of course, "brightness" (luminance), or the lack thereof. This, we all know. That is why there is demand for laser projectors that can provide a more consistent and brighter output than lamp-based models like the JVC N series (NX5, NX7 & NX9). But if a laser projector is the answer to achieving a better HDR image, why is it still so hard to calibrate HDR on a projector? The answer unfortunately remains the same - not enough headroom in the brightness (luminance) department to reproduce the right color gamut at a given level of luminance. This is not the fault of the projector; it is simply how a home theater projector is designed for home use. Just imagine using a data projector with very high lumens in a light-controlled environment, or worse, a pitch-dark man-cave: you will give up watching any movie within minutes because your eyes will be squinting. Hence a suitable home theater projector will usually put out around 1,700 to 2,500 lumens, enough brightness to fill a large screen - e.g. 135" to 150" - while remaining comfortable enough for viewers to sit through the entire length of the movie without suffering a headache! So that is why the projector is designed this way, and those are the limitations that come along with the design. The key benefit of a laser projector is consistency: it maintains a more constant lumen output over a longer period without the drastic drop in lumens you get from the mercury lamp module in a lamp-based projector. In short, there is less incidence of color shift over time.
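To put the luminance gap in perspective, here is a rough back-of-the-envelope sketch (my own illustration, assuming a matte 1.0-gain 135" 16:9 screen and ignoring lens and filter losses) of how projector lumens translate into on-screen nits:

```python
import math

def screen_area_m2(diagonal_in: float, aspect_w: int = 16, aspect_h: int = 9) -> float:
    """Projected image area in square metres for a given diagonal (inches)."""
    d = math.hypot(aspect_w, aspect_h)
    width_in = diagonal_in * aspect_w / d
    height_in = diagonal_in * aspect_h / d
    return width_in * height_in * 0.0254 ** 2  # square inches -> square metres

def on_screen_nits(lumens: float, diagonal_in: float, gain: float = 1.0) -> float:
    """Approximate peak luminance (cd/m^2) on a matte (Lambertian) screen."""
    return lumens * gain / (math.pi * screen_area_m2(diagonal_in))

# Hypothetical 2,000-lumen home theater projector on a 135" 16:9 screen
print(round(on_screen_nits(2000, 135)))   # ~127 nits
```

Even an optimistic 2,000 calibrated lumens works out to roughly 120-130 nits on a screen that size, which is why the 1,000 to 4,000-nit targets of HDR masters are simply out of reach for a projector.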
Tone-Mapping is a must-have for Projectors
So can projector owners still enjoy good HDR content on a big screen? The answer is yes, but with the use of tone mapping. In fact, a good tone-mapping algorithm is more important for projector owners than for OLED TV owners. By and large, HDR content is authored with 4K TV owners in mind rather than projector owners, considering what I have already mentioned earlier - i.e. the lack of luminance for an accurate rendition of a wide color gamut (WCG), which usually falls within the BT2020 colorspace. DCI-P3, which is usually used for cinema projection of HDR content, also falls within the boundary of the BT2020 colorspace. Unless we are using a reference monitor capable of reproducing a 4,000-nit master to get truly close to 100% of DCI-P3, no conventional TV in the market can get there, let alone a low-nit display like a projector. In short, it is easier for TV owners to get good HDR images with very minimal effort. More often than not, the TV already comes with a preset for HDR viewing and will switch to that preset when an HDR10 or Dolby Vision flag is detected. This is also why you don't see many 4K HDR calibration threads for good 4K TVs from prominent makers like LG and Samsung. Of course, I am not saying there isn't a need, but rather that it is less of a concern to obtain relatively good HDR images out of the box compared to a projector. One can always enlist Vincent Teoh's help since he has reviewed and calibrated many 4K HDR TVs over the years.
As mentioned previously in my various calibration posts back in the XP forum days, tone mapping is a form of compression: it compresses the dynamic range and color gamut of a movie's master (say, authored at 4,000 nits) to fit the display capabilities of a projector, which could be 1/10 of that or less. If you still don't get what I am saying, think of the JPEG vs RAW image files captured by your camera. JPEG is highly compressed; certain details are lost in translation, but it still retains much of the content you framed when you took the shot. The RAW image, by contrast, will always look somewhat "muted" and requires further processing afterwards - e.g. in Photoshop - to make the colors more vibrant. A JPEG image already has some processing algorithm applied to make it look more vibrant. Recall the camera feature called HDR? Translate this to the projector and you will begin to see the similarity between camera HDR and projector HDR. So JPEG processing can, in a way, be viewed as a form of tone mapping that performs "on-the-fly" scene-by-scene or frame-by-frame analysis of the image pixels captured by the camera and bakes an HDR-like rendition into a "still" image. Many people call this computational photography, and Pixel phone owners will know how good the photos from their camera s/w can look.
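To make the "compression" idea concrete, here is a minimal sketch (my own illustration, not any manufacturer's actual algorithm) of a static tone-mapping curve: it passes dark and mid tones through 1:1 up to a knee point, then rolls off highlights so a 4,000-nit master fits into an assumed ~150-nit projector:

```python
def tone_map(scene_nits: float,
             display_peak: float = 150.0,   # assumed projector peak luminance
             knee: float = 0.75) -> float:
    """Map a scene-referred luminance (nits) into the display's range.

    Below `knee * display_peak` the mapping is 1:1; above it, highlights
    are progressively compressed so they never exceed the display's peak.
    """
    knee_nits = knee * display_peak
    if scene_nits <= knee_nits:
        return scene_nits                     # dark/mid tones pass through
    # Simple asymptotic roll-off for the highlights above the knee
    headroom = display_peak - knee_nits
    excess = scene_nits - knee_nits
    return knee_nits + headroom * excess / (excess + headroom)

for nits in (50, 100, 400, 1000, 4000):
    print(nits, "->", round(tone_map(nits), 1))   # 4,000 nits lands just under 150
```

The exact placement of the knee and the shape of the roll-off is where each manufacturer's "secret sauce" differs; the point is simply that everything above the knee gets squeezed, which is exactly the "details lost in translation" trade-off of the JPEG analogy.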
The “proposed” way to calibrate HDR on a Projector
There must be a way, or else why would third-party calibration s/w like ChromaPure, Calman and LightSpace, to name a few, all be rolling out HDR calibration modules? That's right! It is possible, but probably not in the fashion you envisage. Before my encounter with the Lumagen Radiance Pro, I was also struggling with HDR calibration. It was hit-or-miss for the most part. You will realize in an instant when something is not "quite right". For instance, blue is the hardest color to calibrate imo, especially once we delve into the BT2020 colorspace. Different nit levels take on a different shade and tone of blue. If the projector does not have the "headroom" to display the wider spectrum of blue during calibration, it will take on a "purplish" hue or tone. This is a very common color anomaly when HDR calibration is done at the wrong color gamut intensity, since human eyes are more sensitive to shifts in color hue as the luminance changes. A typical Blu-ray movie is mastered in YCC 4:2:0 at 8-bit and conforms to the Rec709 colorspace. This is easier to calibrate since most conventional displays these days have no problem covering almost 100% of the Rec709 color gamut; projectors achieve that with relative ease, and even a 1080p projector these days can attain it. But with the advent of 4K HDR content, the mastering level can be as high as 10,000 nits for a movie authored in Dolby Vision, or 4,000 nits for HDR10. No consumer-grade TV can reproduce such HDR images faithfully to the Director's intent, let alone a projector.
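As a rough illustration of why BT2020 is so much more demanding than Rec709, here is a quick sketch comparing the three gamuts using their published CIE xy primaries (note that triangle area in xy is only a crude, non-perceptual way to compare gamut sizes, but it makes the point):

```python
# CIE 1931 xy chromaticity coordinates of the R, G and B primaries
GAMUTS = {
    "Rec709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3": [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(primaries):
    """Area of the gamut triangle in xy space (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

bt2020_area = triangle_area(GAMUTS["BT2020"])
for name, primaries in GAMUTS.items():
    print(f"{name}: {triangle_area(primaries) / bt2020_area:.0%} of the BT2020 triangle")
```

In xy-area terms Rec709 covers roughly half of BT2020 and DCI-P3 roughly 70%, which is why a projector that sails through a Rec709 calibration can still fall apart - especially in the blues - once BT2020 patterns are thrown at it.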
I discovered that in order to calibrate a projector display for 4K HDR viewing, we need to calibrate it using BT2020 within an SDR container - i.e. SDR2020. This is how the Oppo 203/205 is able to reproduce brighter and more colorful images when the user invokes the "Playback HDR content in SDR2020" option. When we do that, we change the way gamma is calibrated for HDR content. How so? Recall that HDR utilizes the PQ curve (ST2084) as its Electro-Optical Transfer Function (EOTF), while SDR utilizes the familiar power gamma curve, which behaves predictably across the signal range. It is easier to calibrate colors when we have more control over how the gamma curve reacts at each IRE level (10IRE to 100IRE). So what you can do is select the BT2020 colorspace and choose a familiar gamma like 2.3 or 2.4 to work with, depending on the viewing environment; for a dark room, a power gamma of 2.4 is recommended. Choose 100% intensity for the color gamut, which will be used for your 1D or 3D LUT CMS. By choosing SDR2020, the color patterns used for the CMS will be more accurate, since gamma now follows a fixed power curve (2.4) without the fluctuations that occur if we follow the PQ curve, which changes at every stimulus level and even flattens out once it reaches a certain point. It is these changes in the PQ curve that affect the chromaticity of the colors and cause the resultant shift in color hue and intensity. Most conventional calibration s/w and colorimeters simply may not have the capability to resolve such intense colors due to the lack of luminance - recall the limitation of a projector that I have mentioned time and again. And this is the reason why HDR calibration on a projector is sooooo damn hard and at times impossible. My advice for those who are unable to do a proper HDR calibration is to stick to SDR Rec709 instead, or use SDR2020 for a change. If you use SDR2020 and the result is poor, this is likely attributable to the colorimeter you are using, and you will need to change to a better one. I recommend the i1Display Pro (also known as the i1D3), which will not burn a hole in your pocket but is able to accomplish the task with a faster reading time (especially for near-black in the 10 - 15 IRE range).
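For reference, here is a small sketch of the two transfer functions being compared above - the SMPTE ST2084 PQ EOTF used by HDR10 and a plain 2.4 power gamma (relative to an assumed 150-nit projector peak; the peak value is my own illustrative number):

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """PQ signal (0..1) -> absolute luminance in nits (0..10,000)."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10_000 * y ** (1 / M1)

def power_gamma(signal: float, gamma: float = 2.4, peak_nits: float = 150.0) -> float:
    """SDR-style power curve, relative to the display's own calibrated peak."""
    return peak_nits * signal ** gamma

for pct in (10, 25, 50, 75, 100):          # IRE-style stimulus levels
    s = pct / 100
    print(f"{pct:3d}%  PQ: {pq_eotf(s):8.1f} nits   gamma 2.4: {power_gamma(s):6.1f} nits")
```

Notice how the PQ column is absolute and runs from a fraction of a nit all the way to 10,000 nits (with 75% signal already demanding about 1,000 nits), while the power curve simply scales with whatever peak you calibrated to; that predictability is what makes the SDR2020 approach so much easier for the meter and the CMS to deal with.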
Dynamic Tone-Mapping (DTM) ain’t perfect…
DTM on a projector is not perfect. Anything with "dynamic" as a prefix will involve compromises and trade-offs. Here, they come in the form of clipping. Yes! White and black clipping are by-products, since the projector (in the case of JVC's FrameAdapt HDR feature) or the Lumagen Radiance Pro VP has to analyze, scene by scene, what the Average Picture Level (APL) is and what the highest recorded (brightest) nit level within the scene is. This requires a lot of computational prowess, is CPU-intensive, and is very hard to get right all the time. In some scenes there will be very intense specular highlights - say an incandescent lamp lit brightly at 1,000 nits - while the APL of the scene hovers around 200 nits for the most part; that gap between 1,000 nits and 200 nits has to be handled by the DTM algorithm. This is why it is so tough, and that is also why I am super impressed by JVC's and Lumagen's proprietary DTM, which do a very respectable job of bringing HDR content as close to the Director's intent as possible.
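Purely as a conceptual sketch (not JVC's or Lumagen's actual algorithm, whose internals are proprietary), a frame-adaptive tone mapper might pick its roll-off knee from the measured APL and peak of each scene, along the lines of the static tone_map() idea from the earlier sketch:

```python
def dynamic_tone_map(scene_nits: float, scene_apl: float, scene_peak: float,
                     display_peak: float = 150.0) -> float:
    """Per-scene variant of the earlier static curve.

    The knee is placed just above the scene's APL (capped below the display
    peak), so average content is left largely untouched and only the span
    between the APL and the scene's measured peak is compressed into
    whatever headroom the projector has left.
    """
    knee_nits = min(scene_apl * 1.2, 0.8 * display_peak)   # protect mid-tones
    if scene_nits <= knee_nits:
        return scene_nits
    headroom = display_peak - knee_nits
    excess = scene_nits - knee_nits
    span = max(scene_peak - knee_nits, 1e-6)
    # Compress the knee-to-scene-peak range into the remaining headroom
    return knee_nits + headroom * (excess / span)

# A scene averaging 200 nits with a 1,000-nit specular highlight
for nits in (100, 200, 600, 1000):
    print(nits, "->", round(dynamic_tone_map(nits, scene_apl=200, scene_peak=1000), 1))
```

Real implementations obviously do far more than this (temporal smoothing between scenes, black-level handling, gamut mapping), but even this toy version shows why a bright highlight sitting far above the scene's APL forces clipping or compression trade-offs somewhere.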
For TVs, this is a walk in the park since the more expensive ones come with Dolby Vision decoding. This makes it much less of a problem, because Dolby Vision carries dynamic metadata containing a scene-by-scene analysis of the entire movie, encoded in the enhancement layer of the disc itself. So all the TV needs to do is "read" and "execute" the metadata and display the image as close to the Director's intent as possible (when properly calibrated).
The day a projector is able to incorporate Dolby Vision, or its variant called Low Latency Dolby Vision (LLDV), into its display chip, we may not need DTM anymore, since LLDV presumably uses the same set of dynamic metadata used on a TV. I believe this is possible so long as Dolby does not demand a high royalty premium to incorporate LLDV into a projector. So far, we have already seen "middleware" gadgets like the HDFury Vertex2 and Diva which can "spoof" the display EDID to enable player-led LLDV, where sources like the NVIDIA Shield TV, Apple TV 4K and Zidoo Z9X, to name a few, perform the Dolby Vision processing on the source itself before sending the result to the display. Of course, the PQ will always be inferior compared to standard Dolby Vision decoding performed by the TV itself.
Hope this discourse will change the way you look at how HDR is implemented and calibrated on a projector.