Originally Posted by crofter:
“I know about the shortage of previous generation native content - but what helped the uptake of 1080p was console gaming. We haven't really got "true 4K" consoles, although that might change in the coming 12 months.
With regard to HDR - demos have wowed, but you've got to remember that is on top-of-the-range TVs and only showing a few select snippets. Real-world HDR material has been a different story, with manufacturers slapping HDR on products that are not capable of showing HDR material.
Even the TVs that are capable have shown massive flaws - edge-lit TVs are simply not fit for purpose with a full HDR movie or TV show, as HDR exaggerates the already known weaknesses of the tech, and even FALD TVs have limitations - they can go bright enough, but they can't at the same time go dark enough.
OLED TVs are not bright enough to show HDR content mastered at 1,000 or 4,000 nits, so they simply clip the missing detail. Then of course we have Dolby Vision and HDR10, plus the rumoured dynamic HDR10, which may or may not require you to buy an entirely new TV and equipment ...
With all that in mind, you could forgive people for thinking they may just sit on the fence until all this settles down - so I would argue that HDR has done more damage than good thus far ... it is one big mess.”
Gaming had nothing to do with the adoption of 1080p as a broadcast (and disc-based) standard. Console gaming, specifically the PS3, helped Sony to win the BD vs HD-DVD format war. Most games on the PS3 (to begin with) were 720p rather than 1080p anyway.
There is a standard which sets the minimum requirement for a TV to have the UHD Premium badge stuck on it in showrooms. Within UHD Premium there are two standards for the HDR part of the requirement: one that OLEDs meet and one that LCDs meet. It is a classic fudge, with opposing commercial interests each wanting a spec that their chosen display technology can meet.
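If you want the two paths spelled out, here's a rough sketch in Python using the peak-brightness and black-level figures usually quoted for the Ultra HD Premium certification (roughly 1,000 nits peak with 0.05 nit blacks for the LCD path, or 540 nits peak with 0.0005 nit blacks for the OLED path). Treat the numbers as my recollection of the spec rather than the official wording; the function name is just mine.

# Rough sketch of the two HDR "paths" in the Ultra HD Premium spec.
# The thresholds are the commonly quoted figures, not the official text.

def meets_uhd_premium_hdr(peak_nits: float, black_nits: float) -> bool:
    """Return True if a display satisfies either of the two HDR paths."""
    lcd_path = peak_nits >= 1000 and black_nits <= 0.05      # bright panel, modest blacks
    oled_path = peak_nits >= 540 and black_nits <= 0.0005    # dimmer panel, near-perfect blacks
    return lcd_path or oled_path

print(meets_uhd_premium_hdr(peak_nits=1000, black_nits=0.05))    # typical FALD LCD -> True
print(meets_uhd_premium_hdr(peak_nits=600, black_nits=0.0005))   # typical OLED -> True
print(meets_uhd_premium_hdr(peak_nits=400, black_nits=0.1))      # edge-lit budget set -> False

The point of the sketch is simply that the same badge can be earned in two very different ways, which is why an OLED and a FALD LCD with very different peak brightness can both carry it.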
As far as I know, to qualify for the UHD Premium label a TV must be able to handle HDR10: not necessarily to display it fully, but to interpret it and display it correctly within the limitations of the panel being used. A display also has to be able to reproduce more than 90% of the DCI-P3 colour gamut, which the latest LCD and OLED panels do. Dolby Vision is built on HDR10 with higher specs (12 bits, the Rec.2020 colour gamut and higher peak brightness). No panel yet exists that can display Dolby Vision at its full specification; as far as I know that's still quite a few years away.
The point of having HDR10 or Dolby Vision capable TVs now is that properly mastered HDR content can be viewed on today's TVs (as best as they're able to reproduce it) and that same content is ready for the improved displays of the future (though I have no doubt the studios have a lot to learn about mastering for HDR). As far as I know, though the studios have said they'll support it, there are no titles encoded with Dolby Vision as yet.
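On the "ready for future displays" point: HDR10 and Dolby Vision both carry luminance on the SMPTE ST 2084 (PQ) curve, which maps code values to absolute brightness all the way up to 10,000 nits, so a grade done today already contains highlight information that tomorrow's panels can show. A quick Python sketch of the PQ decode, using the constants from the standard (the framing around them is mine):

# SMPTE ST 2084 (PQ) EOTF: normalised code value -> absolute luminance in nits.
# The constants come from the published standard.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(code: float) -> float:
    """Decode a normalised PQ code value (0.0-1.0) to luminance in cd/m2 (nits)."""
    e = code ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return 10000.0 * y     # the PQ curve tops out at 10,000 nits

print(round(pq_to_nits(0.5), 1))   # mid code value ~ 92 nits
print(round(pq_to_nits(1.0), 1))   # full code value = 10,000 nits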
About maximum brightness: at the moment it's not possible to have both true blacks and a 10,000 nit maximum brightness. The ratio between the darkest blacks and the brightest whites is more important than the absolute value of the maximum brightness. For me it was far more important to have true blacks than a very bright display. I prefer to watch TV in a darkened room, so maximum brightness is simply not an issue for me. I chose OLED over LCD. It's also HDR10 and Dolby Vision compatible, as I think all of the 2016 LG range is.
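To put rough numbers on why the ratio matters more than the peak (the panel figures below are illustrative round numbers, not measurements of any particular set):

# Illustrative contrast-ratio arithmetic.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Simple on/off contrast: brightest white divided by darkest black."""
    return float("inf") if black_nits == 0 else peak_nits / black_nits

print(contrast_ratio(1000, 0.05))    # FALD LCD example: 20,000:1
print(contrast_ratio(600, 0.0005))   # OLED-ish example: 1,200,000:1
print(contrast_ratio(600, 0.0))      # true zero-output black: infinite

A dimmer panel with genuinely black blacks still ends up with a far larger ratio than a much brighter panel whose blacks leak light.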
I don't believe that, in your scenario, the latest 2016 OLED displays simply clip the detail above their maximum brightness with correctly mastered HDR material. But, inherently, LCDs simply cannot do true, zero-output blacks: light always leaks and you always get dark greys instead.
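For what "simply clipping" would actually mean versus rolling off: a hard clip throws away everything above the panel's peak, while a roll-off compresses the top of the range so a 1,000 or 4,000 nit master keeps some highlight gradation. A minimal Python sketch; the soft-knee curve here is a generic one of my own choosing, not any manufacturer's actual tone-mapping algorithm:

def hard_clip(scene_nits: float, display_peak: float) -> float:
    # Everything above the panel's peak is simply lost.
    return min(scene_nits, display_peak)

def soft_rolloff(scene_nits: float, display_peak: float, knee: float = 0.75) -> float:
    # Track the source up to a knee point, then compress the remainder so
    # bright highlight detail is squeezed into the panel's headroom rather
    # than discarded.
    knee_nits = knee * display_peak
    if scene_nits <= knee_nits:
        return scene_nits
    excess = scene_nits - knee_nits
    headroom = display_peak - knee_nits
    return knee_nits + headroom * (excess / (excess + headroom))

for nits in (100, 500, 800, 1000, 4000):
    print(nits, hard_clip(nits, 650), round(soft_rolloff(nits, 650), 1))

With a hard clip everything above the panel's peak flattens to the same value; with the roll-off, a 1,000 nit highlight and a 4,000 nit highlight still render as different shades, just compressed near the top.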