In theory, does HDR video content have to be bright? If an OLED display has essentially infinite contrast, doesn’t that mean that if you ran the display at, say, 100 cd/m², you could get the same HDR effect as an LCD running at 2000 cd/m² with 1000:1 contrast? The OLED would arguably have even better dynamic range, and the lower peak output wouldn’t hurt your eyes.
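To put rough numbers on that, here's a back-of-the-envelope comparison in photographic stops. The OLED near-black level is an assumption (OLED black isn't literally zero once you measure it; 0.0005 cd/m² is just a plausible stand-in):

```python
import math

def stops(peak_nits, black_nits):
    """Dynamic range in photographic stops (doublings of light)."""
    return math.log2(peak_nits / black_nits)

# LCD: 2000 cd/m^2 peak with 1000:1 native contrast -> ~2 cd/m^2 black
lcd = stops(2000, 2000 / 1000)
# OLED: 100 cd/m^2 peak with an assumed near-black of 0.0005 cd/m^2
oled = stops(100, 0.0005)

print(f"LCD:  {lcd:.1f} stops")   # ~10 stops
print(f"OLED: {oled:.1f} stops")  # ~17.6 stops
```

So even at a twentieth of the peak brightness, the OLED spans several stops more range; the LCD's floor rises with its peak.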
Basically, it seems like HDR is misused as a marketing tool to sell inferior TVs and/or to make TVs pop in the store. Am I way off base here? I was watching HDR and it kinda hurts my eyes, and my display only peaks around 1000 cd/m². I can’t imagine 2,000 or 4,000 cd/m² on some of the mastering monitors out there. I pity the colorists haha.
The really bright parts of the picture should be limited to specular highlights: things like the sun, bright lights, bright skies, etc. Used sparingly and sensibly instead of gratuitously, HDR does have the intended effect without (necessarily) making the whole picture brighter all the time. Contrast still matters, so I would personally prefer an OLED over an LCD even if the latter could achieve higher peak luminance (all assuming viewing in dim conditions, i.e. home cinema).
Yeah, it seems that infinite contrast is, in a sense, already HDR, regardless of total light output, as long as the display is 10-bit or more, which allows for better shadow detail and smoother gradations. So in a dark room, why would you need any peak brightness higher than, say, 500 cd/m² (or even less) on an OLED? For video games it might make sense, if you wanted to create the sensation of actual light and force your eyes to adapt from a dark environment to a bright one or vice versa. But for movies, the cinematographer is controlling contrast for a reason. So letting highlights hit a little higher might make sense, but it will require different thinking from cinematographers.
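For what it's worth, the PQ curve that HDR10 uses (SMPTE ST 2084) backs this up: it's an absolute-luminance encoding, so a given 10-bit code value always means the same cd/m² no matter what display you have. Everything in the signal below your display's peak can be reproduced exactly; only highlights above it get clipped or tone-mapped. A minimal sketch of the PQ EOTF, using the constants from the standard:

```python
# SMPTE ST 2084 (PQ) EOTF constants
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    """Map a PQ code value to absolute luminance in cd/m^2 (nits)."""
    e = code / (2**bits - 1)          # normalize code value to 0..1
    ep = e ** (1 / m2)
    return 10000 * (max(ep - c1, 0) / (c2 - c3 * ep)) ** (1 / m1)

# Full-range code 1023 maps to the PQ reference peak of 10,000 nits;
# mid-range codes land at quite modest luminances.
for code in (0, 256, 512, 768, 1023):
    print(f"code {code:4d} -> {pq_to_nits(code):9.2f} nits")
```

Most of the code range sits well under a few hundred nits, which is why a dim-room OLED at ~500 cd/m² can show the bulk of a graded HDR image as mastered.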