So what is Dolby Vision and what’s the hype compared to HDR10?
In a nutshell, Dolby Vision (DV) is a hybrid form of HDR that derives its HDR base from HDR10 (an open standard). By adding a separate enhancement layer on top of the HDR10 "base layer", DV is born. There are some technical differences that make DV reign supreme over its HDR10 counterpart. Let's see:
Colorimetry: P3^ up to Rec 2020
'^' - For most player-led decoding of the DV layer (LLDV), the colorimetry uses ICtCp (derived from IPT) within the BT.2100 container. It is close to the DCI-P3 colorimetry most of us have come to know. The best-known examples are streamers like the Apple TV 4K, nVidia Shield TV 2019 and Roku Ultra 2020, players like the Oppo 203 (which gives you a choice between player-led and TV-led), and of course, on the media-player front, the models that run the RealTek RTD1619DR chipset, like the Zidoo Z9X.
Mastering Luminance: Usually up to 4,000 nits, although it can go as high as 10,000 nits (no display is capable of that at the moment, let alone the content)
Display Luminance: Usually between 200 nits and 1,000 nits, depending on the type of TV or projector in use. For instance, to achieve a brighter image for more brilliant HDR imagery, an LCD panel with FALD (full-array local dimming) and multiple zones will be best, while its OLED counterpart will not get as bright but wins on the contrast that makes the picture pop.
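For the curious, the ICtCp encoding mentioned in the footnote above can be sketched in a few lines. This is a minimal illustration of the BT.2100 pipeline (linear BT.2020 RGB → LMS → PQ → ICtCp) for a single pixel; the function and variable names are my own, and a real implementation would of course operate on whole frames.

```python
# Sketch of BT.2100 ICtCp encoding for one linear BT.2020 RGB pixel.
# RGB components are linear light, normalized so 1.0 = 10,000 nits.

# PQ (ST 2084) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(y: float) -> float:
    """Inverse EOTF: linear light (0..1) -> PQ signal (0..1)."""
    yp = max(y, 0.0) ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

def rgb_to_ictcp(r: float, g: float, b: float):
    # Crosstalk matrix: BT.2020 RGB -> LMS
    l = (1688 * r + 2146 * g + 262 * b) / 4096
    m = (683 * r + 2951 * g + 462 * b) / 4096
    s = (99 * r + 309 * g + 3688 * b) / 4096
    # Non-linear encoding of each component with the PQ curve
    lp, mp, sp = pq_encode(l), pq_encode(m), pq_encode(s)
    # L'M'S' -> ICtCp (intensity + two chroma axes)
    i = 0.5 * lp + 0.5 * mp
    ct = (6610 * lp - 13613 * mp + 7003 * sp) / 4096
    cp = (17933 * lp - 17390 * mp - 543 * sp) / 4096
    return i, ct, cp
```

A handy sanity check: a full-scale white pixel (1, 1, 1) comes out as I = 1 with both chroma components at zero.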
Another important difference between DV and HDR10 is that DV carries "dynamic" metadata as opposed to the "static" metadata of HDR10. The TL;DR version is that dynamic metadata allows scene-by-scene, real-time analysis by the display (OLED/LCD TV) or source (Blu-ray player) to determine the peak white and black levels for each scene of a movie, and then adjust the display settings accordingly to provide the best PQ in terms of colors and contrast. HDR10, on the other hand, receives "static" metadata, which tells the display to work within the parameters the movie was authored to in the studio. Since it is not "dynamic", one set of numbers (brightness or luminance) determines the relative brightness of the "entire" movie based on the PQ curve (the ST 2084 standard).
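To make that PQ curve a little less abstract, here is a tiny sketch of the ST 2084 EOTF, which maps a normalized code value to absolute luminance in nits (the function name is mine; the constants come straight from the standard):

```python
# ST 2084 (PQ) EOTF: normalized signal value (0..1) -> luminance in nits.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Decode a PQ-encoded signal (0.0..1.0) to display luminance (nits)."""
    e = signal ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return 10000.0 * y  # PQ is defined up to 10,000 nits

# Half of the signal range is only ~92 nits: the curve spends most of its
# code values on the darker end, where our eyes are most sensitive.
for sig in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {sig:.2f} -> {pq_eotf(sig):8.1f} nits")
```

This is why a movie mastered at 4,000 nits still leaves headroom all the way up to 10,000 nits at the top of the curve.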
Image source derived from: The HDR video ecosystem tracker - FlatpanelsHD - courtesy of member Yoeri G
The confusing state of Dolby Vision…
So far, I have touched on the basics of DV and its benefits over HDR10. Now let's talk a little more in-depth about DV itself. Recall that I mentioned LLDV (Low Latency Dolby Vision); this is in fact a variant of DV. As you can see, DV, unlike HDR10, is not FREE: TV and 4K UHD player manufacturers have to pay royalty fees to Dolby before their devices can decode the DV layer of the content (e.g. a 4K UHD Blu-ray disc). But Dolby knew its ultimate specs, like content authored at 10,000 nits and up to 12-bit bit-depth, would never be realized any time soon. Compounding the issue are the limitations of the hardware processors present in most TVs, 4K UHD Blu-ray players and media streaming devices. Sony OLED TVs produced in 2016 (like the A1 and X1) will not have the ability to run full 4K/60fps content because they are only capable of decoding up to 4K/30fps.
Dolby likes to complicate matters, just like the way it complicated Dolby Atmos when it first rolled out circa 2013-2014. Dolby provided a white paper and required manufacturers of Atmos-enabled speakers to conform to certain requirements, like an HPF crossover at 180Hz, before they could be certified as true Dolby Atmos-enabled speakers :P. I guess history repeats itself with the Dolby Vision standards. The problem with Dolby Vision is the complexity of the way Dolby rolls out its DV implementation. Believe it or not, there are at least 6 known profiles. Within a given profile, the maximum codec level of the base layer (BL) or enhancement layer (EL) is restricted by that profile. If you have used the MakeMKV software to make back-up copies of your precious Blu-ray or 4K UHD Blu-ray titles, you should be familiar with terms like H.265 Main10 Profile, Level 4.1/5.1, etc.
Now let us break down some of the profiles that Dolby has dished out over the years.
First is the single-layer implementation (Profiles 5 & 8) that most LLDV-based sources prefer, e.g. Netflix, Apple TV+ and Disney Plus. Besides streaming devices, media players that come with DV (LLDV) decoding capability, such as the Zidoo Z9X and nVidia Shield TV 2019, also utilize this type of DV implementation. There have been rumors that the new Xbox Series X/S consoles will also utilize LLDV (player-led) as the preferred form of DV instead of the unadulterated version. If this is true, then the built-in 4K UHD Blu-ray drive may not be able to read the dual-layer format (usually Profile 7) containing the DV stream for the TV to process and decode.
If we look carefully at the table, you can see that the maximum Base Layer (BL) supporting the HDR10 layer is 10-bit H.265 (HEVC) at codec levels 4.2 up to 6.2, while the Enhancement Layer (EL) supports anything between 4.1 and 5.1. Within the EL, there are the Full EL and Minimum EL variants.
In essence, the EL is a 10-bit video bitstream that carries the residual between the source and the BL, plus the "dynamic" metadata for DV. An enhancement layer whose residual signal is zero, meaning the decoder does not need to process a residual at all, is called a minimum enhancement layer (MEL). If the residual signal is non-zero, it is called a full enhancement layer (FEL).
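Conceptually (and leaving aside the actual RPU composing math, which is far more involved), the MEL/FEL distinction boils down to whether the decoder has a residual to add back on top of the base layer. A toy sketch, with hypothetical code values:

```python
# Toy illustration of MEL vs FEL reconstruction. This is NOT the real DV
# composer (which applies mapping curves from the RPU metadata); it only
# shows the residual idea: MEL = nothing to add, FEL = refine the BL.

def reconstruct(base_layer, residual):
    """Combine BL samples with EL residual samples."""
    if all(r == 0 for r in residual):
        # MEL: the residual is zero, so the decoder can skip it entirely
        # and only apply the dynamic metadata to the base layer.
        return list(base_layer)
    # FEL: add the residual back to refine the base layer.
    return [b + r for b, r in zip(base_layer, residual)]

bl = [100, 200, 300, 400]      # hypothetical BL code values
mel = [0, 0, 0, 0]             # MEL: nothing to add
fel = [2, -1, 3, 0]            # FEL: non-zero refinement
print(reconstruct(bl, mel))    # [100, 200, 300, 400]
print(reconstruct(bl, fel))    # [102, 199, 303, 400]
```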
An example of a typical MKV file encoded with a DV stream:
Disc Title: Gladiator.2000.MULTI.COMPLETE.UHD.BLURAY-EXTREME
Disc Size: 92,459,896,796 bytes
Extras: Ultra HD
BDInfo: 0.5.8.7 (compatible layout created by DVDFab 10.0.8.4)
Length: 2:50:56.495 (h:m:s.ms)
Size: 79,038,529,536 bytes
Total Bitrate: 61.65 Mbps
Codec / Bitrate / Resolution / Frame rate / Aspect ratio / Codec profile & level / Chroma subsampling / Bit depth / HDR format / Color space
Base Layer: MPEG-H HEVC Video / 43319 kbps / 2160p / 23.976 fps / 16:9 / Main 10 Profile 5.1 High / 4:2:0 / 10 bits / HDR / BT.2020
Enhancement Layer: MPEG-H HEVC Video / 6826 kbps / 1080p / 23.976 fps / 16:9 / Main 10 Profile 5.1 High / 4:2:0 / 10 bits / Dolby Vision / BT.2020
Yes, another interesting bit in the way the MKV encodes its DV stream (the EL): the resolution defaults to 1080p, while the BL (HDR10) preserves the 4K (2160p) resolution. This is because, for Profile 7, the maximum EL codec level is capped at 5.1. These limits are controlled by the various Dolby Vision Levels (DVLs). Again, using the Gladiator 4K UHD 1:1 rip as an example, we know the disc contains an EL because it has a DV layer (at Profile 7) encoded at Main 10 Profile 5.1 High. Look at the 5th column (first table, titled "Constraints on codec level") and you will see the maximum Dolby Vision Level is "09". If we cross-reference that with the DVL table (see table below), you will see that at DVL = 09 the maximum resolution is set at 3840 x 2160 @ 60fps, and the maximum bitrate for a Profile 5.1 High-tier stream ranges between 40 Mbps and 130 Mbps.
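The level check described above, i.e. does a stream's resolution and frame rate fit under a given DVL, can be sketched as a simple pixel-rate lookup. Only the level-09 entry below comes from the table discussed here; the table layout and the helper name are my own illustration, not a copy of Dolby's specification.

```python
# Illustrative DVL check based on maximum luma pixel rate.
# Only level 09 (3840 x 2160 @ 60fps) is taken from the DVL table
# discussed in the article; it is expressed as width * height * fps.

MAX_PIXEL_RATE = {
    9: 3840 * 2160 * 60,   # DVL 09: 3840 x 2160 @ 60fps
}

def fits_level(width: int, height: int, fps: float, level: int) -> bool:
    """Return True if the stream's pixel rate fits under the given DVL."""
    return width * height * fps <= MAX_PIXEL_RATE[level]

print(fits_level(3840, 2160, 23.976, level=9))  # True: a typical UHD Blu-ray film
print(fits_level(3840, 2160, 120, level=9))     # False: exceeds DVL 09
```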
Which is superior? Full unadulterated Dolby Vision or Low Latency Dolby Vision?
The answer: it depends. If you have a fully compliant OLED or LCD display, it is good to let the TV do all the DV processing instead of the source (the 4K UHD Blu-ray player), because the display almost always performs more consistently across the board when it comes to image upscaling and resolving individual pixels. There is a reason why you are paying so much for a FALD QLED/OLED TV: the ability to render the whitest whites and the blackest blacks with accuracy and finesse.
For projector users, it is a Hobson's choice: LLDV is the only way to get "dynamic" metadata working on our displays. We need an EDID manager, like the uber-expensive Lumagen Radiance Pro video processor or the more affordable HD Fury Vertex2 or Diva, to "spoof" the display EDID so that a source like the Apple TV 4K or even the Zidoo Z9X believes a DV-capable display has been detected, processes the EL of the media file within the source, and sends the processed signal to the display, which in this case is the projector.
In short, a player-led (LLDV) device will have limitations in terms of processing power. The side effects include unsightly "banding", less punchy colors and uneven luminance levels, which may cause the overall image to be too dark or too bright if the source (a media player or streaming device) fails to decode the various DV profiles and their accompanying DVLs properly. As the name (LLDV) implies, "low latency" requires less computational power, which can equate to less-than-ideal PQ.
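The banding mentioned above is easy to demonstrate with a quick quantization experiment (the names here are mine): take a smooth gradient and count how many distinct steps survive at a given bit depth. Fewer steps means wider, more visible bands on slow gradients like skies and dark fades.

```python
# Quantize a smooth 0..1 gradient to a given bit depth and count the
# distinct output levels that survive. Fewer distinct levels = coarser,
# more visible banding on a slow gradient.

def quantize(values, bits):
    levels = (1 << bits) - 1
    return [round(v * levels) / levels for v in values]

gradient = [i / 999 for i in range(1000)]  # 1000-sample smooth ramp

for bits in (8, 10):
    steps = len(set(quantize(gradient, bits)))
    print(f"{bits}-bit: {steps} distinct steps")
```

At 8 bits the 1000-sample ramp collapses to 256 steps, while 10 bits preserves every sample, which is exactly why HDR formats insist on 10-bit (or better) pipelines.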
To date, the Oppo 203 is the only 4K UHD player I know of that allows both formats of DV (full DV and LLDV) to co-exist in a single solution. So for those who are still holding on to one, you should keep it until 4K streaming becomes a mainstay and we see the demise of physical media. It seems Dolby is not going to rest on its laurels when it comes to making the implementation of DV even more complex: it is now looking at a new DV variant styled "Dolby Vision IQ", which seeks to utilize the light sensors inside a TV to dynamically adjust the HDR picture based on the content and the ambient light conditions in the viewing area.
Now that you know more about Dolby Vision and its quirks, you can decide for yourself if it is worth the hype.