Tonight I did a check on my madVR and realized something was wrong with it: changing different settings resulted in no PQ change. I uninstalled it completely, reinstalled, and set it up from scratch with some modifications from my previous settings. I also took a few photos, for you all to see the differences.
Taking a closer look at bro Sammy's and bro Desray's photos of the same scene, it seems I have a bit of black crush - the outline of the scarf is not visible. I need to revisit this scene tomorrow to see whether it's a camera issue or whether it really is invisible.
@Ronildoq bro Bryan, do you have a sample of this scene? I don't have the Black Panther movie; my demo clip is only a few minutes long and covers the movie's opening scenes. I'd also like to see how my projector + madVR perform on this scene if you have a demo clip for me to play.
Ya, only a few have the luxury of owning a reference monitor, so it's not surprising Samsung boosted some colors to add pop and please the average consumer, as shown in Vincent's video. After all, not many people care about director's intent (and without a reference monitor, they can't tell anyway), while eye candy can boost sales, so why not.
I agree with Gavin. I wonder if there is a website that showcases scenes using the Sony reference LCD monitor that Vincent and others like him use as a comparison, rather than us posting comments based primarily on personal preference. Once we have a baseline to reference, we will know how far each display deviates from it. The rest of the variables, like wanting more highlights in this scene or less of something else, become entirely moot.
My personal take on HDR, DTM and projectors as displays…
TL;DR version of it… There is no way to get a proper, much less a reference, image on a low-nit display like a projector. The moment we allow each manufacturer to come up with its own iteration of the DTM algorithm, there is no such thing as a reference or perfect image. A projector is unable to do 1,000 or 4,000 nits like a reference Sony monitor, which does not really need any DTM. This is exactly why DTM results will vary from display to display the moment HDR is utilized. So instead of pursuing something that is arbitrary and beyond one's control, the best way to implement DTM for HDR content is to have HDR converted to SDR using a power gamma of a certain value - e.g. 2.2, 2.3 or 2.4 - and then apply scene-by-scene or frame-by-frame analysis (DTM) to come up with a more accurate representation of the image, closer to reference. This is what madVR and the Lumagen Radiance Pro are doing, and I still believe this is the only way to watch HDR content.
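For anyone curious, here is a minimal Python sketch of that idea, just to show the shape of the pipeline (decode PQ to absolute nits, roll the highlights off to the projector's peak, then re-encode with a power gamma). It is a toy with made-up peak/knee numbers, not madVR's or Lumagen's actual algorithm.

```python
import numpy as np

# PQ (SMPTE ST 2084) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode a PQ-encoded signal (0..1) to absolute luminance in nits."""
    ep = np.power(np.clip(e, 0.0, 1.0), 1.0 / M2)
    return 10000.0 * np.power(np.maximum(ep - C1, 0.0) / (C2 - C3 * ep), 1.0 / M1)

def tone_map(nits, peak_display=50.0, knee=0.75):
    """Toy roll-off: pass through below the knee, softly compress highlights above it."""
    knee_nits = knee * peak_display
    return np.where(
        nits <= knee_nits,
        nits,
        knee_nits + (peak_display - knee_nits) * np.tanh((nits - knee_nits) / 1000.0),
    )

def to_sdr_signal(nits, peak_display=50.0, gamma=2.4):
    """Re-encode the tone-mapped luminance with a power gamma for an SDR-style output."""
    return np.power(np.clip(nits / peak_display, 0.0, 1.0), 1.0 / gamma)

signal = np.array([0.1, 0.4, 0.6, 0.75, 0.9])   # example PQ code values
mapped = tone_map(pq_to_nits(signal), peak_display=50.0)
print(to_sdr_signal(mapped))
```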
I have been calibrating various 4K HDR-compatible projectors on my own over the years, and I have witnessed the limitations in getting HDR to look right. This is why a JVC projector's own proprietary DTM algorithm like Frame Adapt HDR will never achieve the best HDR image: there are so many limitations as it tries to conform to the PQ EOTF curve, which is so much harder than tracking a normal gamma curve. I'm sure LG has a pretty good DTM for their laser projector, but it will still face the limitations I mentioned above. This is why some brighter scenes at certain APL levels will yield good results in one movie but not in another.
I haven't done any calibration myself, so I can't really contribute on the technical aspects of SDR gamma vs the HDR EOTF curve.
However, I have some fundamental questions that I hope the combined expertise here can help answer.
One of the key differences between a TV and a projector, besides peak brightness, is the screen area. A 110in image has 4x the screen area of a 55in TV. I learned recently that lumens = screen brightness in foot-lamberts x screen area in sq ft.
Take a quality 55in TV displaying a calibrated HDR image. If a projector were actually able to output enough lumens to reach the same on-screen nits on a 110in image in HDR, it would be putting out 4x the lumens of the 55in image.
I think we can probably agree this would be way too bright and uncomfortable to watch. This is the theoretical upper limit for a projector HDR image to be equivalent to the TV image.
Another approach, perhaps, is to aim for the same total lumens output, which means the PJ's 110in image brightness (nits, fL) will be 1/4 of the TV's, since the lumens are distributed over 4x the screen area.
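A rough back-of-the-envelope sketch of that relationship, assuming a 16:9 screen, gain 1.0, no other losses, and using 200 nits purely as an example target:

```python
import math

NITS_PER_FL = 3.426  # 1 foot-lambert ~= 3.426 cd/m^2 (nits)

def screen_area_sqft(diagonal_in, aspect=16 / 9):
    """Area of a 16:9 screen in square feet from its diagonal in inches."""
    height = diagonal_in / math.sqrt(1 + aspect ** 2)
    return height * (height * aspect) / 144.0

def lumens_for_nits(nits, diagonal_in, gain=1.0):
    """Lumens needed to reach a given on-screen luminance (matte screen assumed)."""
    return (nits / NITS_PER_FL) * screen_area_sqft(diagonal_in) / gain

# Same on-screen nits on 110in needs ~4x the lumens of a 55in image
print(round(lumens_for_nits(200, 55)))    # ~525 lm
print(round(lumens_for_nits(200, 110)))   # ~2,100 lm

# Same total lumens spread over 4x the area -> 1/4 the nits
print(round(screen_area_sqft(110) / screen_area_sqft(55), 1))   # ~4.0
```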
My 2nd question, based on the above approach: compare a PJ image with 50 nits peak brightness vs one with a very impressive (theoretical) 200 nits peak brightness, both at the same 110in image size.
For the 50-nit image, I would think, the way the EOTF curve works, the image from 0-70% IRE is mapped onto the first 0-30 nits, and the 70-100% IRE highlights are mapped to 30-50 nits. This is already slightly brighter than, or similar to, a bright SDR image.
For the 200-nit image, I don't think it makes sense to map 0-70% IRE to 0-120 nits and then the 70-100% highlights to 120-200 nits.
If that doesn't make sense, is the tone mapping then supposed to map 0-70% IRE to a similar 0-30 nits (or maybe 0-40 nits), with the remaining 70-100% IRE using 40-200 nits?
If so, is the difference really only in the highlight brightness? And with the large screen, those highlights will have 4x or more lumens than the equivalent TV image, making them potentially blinding, especially when watching in a bat cave.
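To make that second interpretation concrete, here is a toy mapping I made up (real tone mappers do not work directly off IRE like this; it is only to show where the extra headroom would go):

```python
def toy_map(ire, peak_nits, midtone_anchor=30.0, knee_ire=70.0):
    """Keep 0-70% IRE anchored to ~0-30 nits; spend the rest of the
    display's peak only on the 70-100% IRE highlights (toy model)."""
    if ire <= knee_ire:
        return midtone_anchor * (ire / knee_ire)
    return midtone_anchor + (peak_nits - midtone_anchor) * (ire - knee_ire) / (100.0 - knee_ire)

for ire in (25, 50, 70, 85, 100):
    print(ire, round(toy_map(ire, 50), 1), round(toy_map(ire, 200), 1))
# On both a 50-nit and a 200-nit display the midtones land in the same
# 0-30 nit range; only the 70-100% IRE highlights differ (30-50 vs 30-200).
```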
Your questions are all rather technical, and I'm not sure that is the correct way to peg IRE levels to lumens; I don't think anyone here will have an answer for you. Just remember that for comfortable viewing, the projector needs to provide at least 16-22 fL for SDR content and at least 16 fL (about 55 nits) for what I would term "watchable" HDR content. The bigger the screen, the more light output you will need from your projector. Screen gain will help to a certain extent, but only up to a point (it should not exceed 1.3 imo).
A good level would be around 30 fL and above (which translates to about 102 nits). My JVC NZ7 is able to churn out 96 nits in high lamp mode, and that is already pretty impressive for a 2,200-lumen projector using a laser light source.
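For reference, fL and nits are just a unit conversion apart (1 fL ≈ 3.426 cd/m², i.e. nits), so the figures above can be sanity-checked like this:

```python
NITS_PER_FL = 3.426  # 1 foot-lambert ~= 3.426 cd/m^2 (nits)

def fl_to_nits(fl):
    return fl * NITS_PER_FL

def nits_to_fl(nits):
    return nits / NITS_PER_FL

print(round(fl_to_nits(16)))     # ~55 nits  - the "watchable HDR" floor above
print(round(fl_to_nits(30)))     # ~103 nits - the "good level" figure
print(round(nits_to_fl(96), 1))  # ~28 fL    - the NZ7 measurement quoted above
```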
DTM is all about compression, which I believe you already know… it is akin to the recent topic of dynamic range compression (DRC) that we talked about in the other thread, the one that delves into DialNorm etc.
Bottom line: a typical home theater laser projector simply doesn't have the light output to light up a big screen, 150" for instance, as it becomes exceedingly hard to maintain even a reasonable 70-80 nits for HDR content.
If you want brightness and the ability to preserve the dynamics, go for an OLED TV, which won't exceed 100". If you want to go the projector route, you need DTM to preserve the specular highlights and shadow detail, albeit at the expense of brightness.
I'm not that technical, but I'll give your first question a shot from what I know.
Whether an image on a screen is blinding or not comes down to foot-lamberts, not nits. It also depends on how much ambient light there is. In a bat cave you need a minimum of about 12 foot-lamberts, anything over 22 foot-lamberts starts to become blinding, and a good target is about 16 foot-lamberts.
Given a projector's spec, say my LG laser projector at 2,700 lumens, the brightness (in a completely dark room) depends on how far the screen is from the projector. The closer I move the projector to the screen, the smaller and therefore brighter the image will be, though there is a limit set by the lens design in terms of minimum focusing distance. Minimum focusing distance notwithstanding, if I brought most projectors close enough to the screen, the image would measure more than 22 foot-lamberts and be blinding. The only reason projectors are not blinding is that they are normally far enough away from the screen to land in the 12-22 foot-lambert range.
Ambient light in the room makes a big difference. You have to achieve a higher foot-lambert figure on the screen if you have ambient light. To get the equivalent of that ideal 16 foot-lambert image on your screen with ambient light present, you may need 50 foot-lamberts.
So when selecting a projector, there are many factors to consider, including the projector's lumen output, the size of your screen, the distance from projector to screen and the ambient light. Before my current LG, which outputs 2,700 lumens (less in its more accurate modes), I could not get a satisfying picture during the daytime. For my room, a projector has to be around 3,000 lumens to project satisfyingly during the day. However, if I use my daytime mode at night, the image would be blinding because I would exceed that 22 foot-lamberts.
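As a rough illustration of how lumens, screen size and gain interact (assuming a 16:9 matte screen with gain 1.0 and treating the rated lumens as what actually reaches the screen; the 1,800-lumen "accurate mode" figure below is just a made-up example):

```python
import math

def screen_area_sqft(diagonal_in, aspect=16 / 9):
    """Area of a 16:9 screen in square feet from its diagonal in inches."""
    height = diagonal_in / math.sqrt(1 + aspect ** 2)
    return height * (height * aspect) / 144.0

def on_screen_fl(lumens, diagonal_in, gain=1.0):
    """On-screen luminance in foot-lamberts: lumens per sq ft times screen gain."""
    return lumens * gain / screen_area_sqft(diagonal_in)

# e.g. 2,700 spec lumens vs a hypothetical 1,800-lumen accurate mode on a 110in screen
for lumens in (2700, 1800):
    print(lumens, round(on_screen_fl(lumens, 110), 1), "fL")
# ~75 fL and ~50 fL respectively - both far above the 12-22 fL bat-cave range,
# which is why a daytime mode looks blinding at night.
```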
The same is true for my Panasonic OLED. I have calibrated both a day and a night mode. If I watch the OLED night mode in the daytime (no curtains), it's too murky and dark. If I watch the OLED day mode in darkness, it's too bright, and the headlights of cars on screen are blinding.
For a projection system, the other issue is the screen's reflectivity, or gain. All I know is that the light falling on the screen is measured in foot-candles (lumens per square foot), and the luminance coming back off the screen in foot-lamberts is roughly that figure multiplied by the gain. Perhaps @Alf can help.
You will come across people saying that some movies are very bright or very dark. The Meg is a very bright movie, whereas The Batman is very dark. It depends very much on what the director decides to impose on the audience, and most directors will not want to blind the audience. Displays have become so bright now that a badly designed movie can cause visual discomfort, but well-designed high-nit scenes are usually fine because average light levels are controlled. For example, if there is a starfield in something like Star Trek, it's ok for some of those stars to be, say, 4,000 nits, because they are about a square millimetre each, and relative to the black background the overall screen brightness is low enough to be comfortable. Displays which can't produce the 4,000 nits will look less realistic. Similarly for small fires and explosions.
So, in answer to your question about the 110 inch image being blinding, it generally won't be in movies, because even if the 55in TV outputs 4,000 nits, it will only be over a tiny area of the frame, like a starfield or a few small highlights such as fires or lights. A big explosion or sun shot would definitely be toned down or shown only briefly to avoid visual discomfort in the audience. A projection system with lower maximum nit capability would not capture the same highlights as a TV and would be less realistic, but they would still be quite visible.
I think it's a power issue. My 65 inch OLED already consumes 500 W, so if it had to throw a full-field image at maximum brightness it might need 4,000+ W and I would have an oven in my living room. But that 10% window is important for star fields, small fires and lights to make them look realistic. Besides, as mentioned before, displaying a full-field image at 1,000 nits would probably blind the audience.
OLEDs have ABL (Automatic Brightness Limiter) circuitry, which is designed to protect the panel and results in a brightness drop as you go from a 10% window to a full-screen image. For example, 700 nits on a 10% window might drop to around 150 nits full screen on an OLED TV.
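A toy illustration of that behaviour (the curve below is made up for illustration, not any manufacturer's actual ABL logic):

```python
def abl_limited_nits(requested_nits, window_pct,
                     peak_small=700.0, peak_full=150.0):
    """Toy ABL: allow the full peak on small windows and taper toward a
    much lower full-screen limit as the bright area grows (made-up curve)."""
    window = max(0.0, min(window_pct, 100.0)) / 100.0
    if window <= 0.10:
        limit = peak_small
    else:
        # Simple interpolation between the 10%-window peak and the full-screen peak
        limit = peak_small + (peak_full - peak_small) * (window - 0.10) / 0.90
    return min(requested_nits, limit)

for pct in (5, 10, 25, 50, 100):
    print(pct, round(abl_limited_nits(700, pct)))
# 5% and 10% windows reach the full 700 nits; a full white field is capped near 150 nits.
```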
Thanks Sammy. The prevailing message has always been that larger projector screens require more lumens. What I was trying to say, not so directly, is that the 16 fL minimum on-screen brightness doesn't exactly scale linearly.
Also, I just recently realized that nits and fL are just a unit conversion apart… as if it were not confusing enough.
The common 16 fL (really 14.6 fL) threshold is roughly 50 nits. A 110in screen is about 36 sq ft. Assuming a screen gain of 1.0 for easy math, that needs 16 x 36 = 576 lumens. So 576 lumens gives about 50 nits on a 110in screen. 1,100+ calibrated lumens is a common result I see in PJ reviews.
A 55in TV has 1/4 of the area. With the same 576 lumens into the room, that would be about 200 nits. For an entry-level TV, a 400-nit brightness spec is normal, and working backwards we are again at about 1,100 lumens.
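Running those numbers with the same assumptions (16:9 geometry, gain 1.0) lands in the same ballpark as the figures above:

```python
import math

NITS_PER_FL = 3.426  # 1 foot-lambert ~= 3.426 nits

def screen_area_sqft(diagonal_in, aspect=16 / 9):
    height = diagonal_in / math.sqrt(1 + aspect ** 2)
    return height * (height * aspect) / 144.0

def nits_from_lumens(lumens, diagonal_in, gain=1.0):
    return (lumens * gain / screen_area_sqft(diagonal_in)) * NITS_PER_FL

print(round(screen_area_sqft(110), 1))     # ~35.9 sq ft, i.e. about 36
print(round(nits_from_lumens(576, 110)))   # ~55 nits on a 110in screen
print(round(nits_from_lumens(576, 55)))    # ~220 nits on a 55in-sized area
print(round(nits_from_lumens(1100, 55)))   # ~420 nits, close to an entry-level TV spec
```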
Absolutely agree that ambient light makes a big difference to the required screen brightness, critically so for a PJ image. I was speaking purely from a bat-cave perspective, for maximum image quality and performance.
I think as screen sizes get larger, the target fL or nits on screen for an image that is comfortable to the eye, in a bat-cave situation, goes lower. Otherwise, for the same nits, the total lumens get too high and the image becomes too bright and glaring.
I suspect that is why Bryan finds he likes a lower calibrated fL image: his screen is large, and therefore the total light into the room is higher.
Personally, I've been finding that I seem to have plenty of light even with my old e-shift5 JVC, which from reviews puts out about 1,100 lumens on low lamp. On my 150in screen, that is about 16 fL, and that 16 fL can already get glaringly bright on SDR and HDR.
In fact, I have stopped using high lamp mode for HDR, crazy as that sounds. I will get it calibrated eventually and then see how the numbers add up.
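A quick sanity check of that 150in figure with the same lumens-to-fL arithmetic (16:9 screen, gain 1.0 assumed):

```python
import math

# 150in 16:9 screen area in square feet
height_in = 150 / math.sqrt(1 + (16 / 9) ** 2)
area_sqft = height_in * (height_in * 16 / 9) / 144

print(round(1100 / area_sqft, 1), "fL")          # ~16.5 fL from 1,100 lumens
print(round(1100 / area_sqft * 3.426), "nits")   # ~56 nits
```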
I see… the JVC lamp module is not bad and can last quite long. The first few batches were very poor quality, though; I had to call JVC and have the lamp replaced at least 3 times before I gave up and switched to Sony.
I was also trying to make the point that very high-lumen projectors, say 3,000-4,000 lumens in a home theater, won't blind you even in a bat cave, because film directors will put only small dots or small flames at those high brightness levels. Similarly, a star field with a few 4,000-nit dots on a 4,000-nit Sony reference monitor will be fine and will look more dramatic than the same star-field dots at around 100 nits on a projector screen. Flames will burn brighter at their centers, and explosions that hit 4,000 nits for a brief second will look nice and realistic. Overall, with a brighter projector, you should get more of that HDR look of high-nit TVs without being blinded. Whether one likes the HDR look of high-nit TVs is debatable. However, if the video puts out 16 fL of white over a large area and that causes eye-glare discomfort, then that may be an error on the part of the film's grading or editing.