In light of the recent HDMI 2.1 incompatibility between many TVs and the current gaming consoles (PS5 and Xbox Series X), a case in point: turn on Variable Refresh Rate (VRR) and it will either degrade HDR or the display may drop HDR altogether. With 8K gaming becoming the “new kid on the block”, how much HDR are we willing to sacrifice for a better frame rate? For those sitting on the fence, fearing that the HDMI 2.1 incompatibility might hurt your movie-viewing experience…fear not. These incompatibility issues (VRR vs. HDR) come from the gaming side and do not really affect movie-lovers.
So this topic is geared towards console owners…which one will you sacrifice: bit depth (8-bit vs 10/12-bit), which ultimately determines how visible “banding” is, or the wider color gamut (photo-realistic scenes with optimized whites and blacks)?
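To make the trade-off concrete, here is a back-of-the-envelope sketch (in Python) of the raw, uncompressed video data rate at various bit depths, measured against HDMI 2.1’s nominal 48 Gbps link. The numbers deliberately ignore blanking intervals, FRL encoding overhead, chroma subsampling, and Display Stream Compression (DSC), so treat them as rough approximations rather than exact signaling figures.

```python
# Back-of-the-envelope HDMI bandwidth check.
# Raw rate = width x height x refresh x (3 color channels x bits per channel).
# NOTE: this ignores blanking intervals, FRL encoding overhead, chroma
# subsampling (4:2:2 / 4:2:0) and DSC, so real on-wire rates are higher;
# these are rough approximations only.

HDMI_2_1_GBPS = 48  # nominal HDMI 2.1 link rate

def raw_gbps(width: int, height: int, refresh_hz: int, bit_depth: int) -> float:
    """Uncompressed RGB/4:4:4 pixel data rate in Gbit/s."""
    bits_per_pixel = 3 * bit_depth  # R, G, B channels
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = [
    ("4K @ 120 Hz,  8-bit", 3840, 2160, 120,  8),
    ("4K @ 120 Hz, 10-bit", 3840, 2160, 120, 10),
    ("4K @ 120 Hz, 12-bit", 3840, 2160, 120, 12),
    ("8K @  60 Hz, 10-bit", 7680, 4320,  60, 10),
]

for label, w, h, hz, depth in modes:
    rate = raw_gbps(w, h, hz, depth)
    verdict = "fits" if rate <= HDMI_2_1_GBPS else "exceeds 48 Gbps"
    print(f"{label}: {rate:5.1f} Gbit/s ({verdict})")
```

Even with this optimistic accounting, 8K at 60 Hz in 10-bit already blows past the link, which is why 8K gaming forces compression, chroma subsampling, or a lower bit depth: exactly the banding-versus-fidelity trade-off posed above.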