How to Calibrate HDR on Windows 10? (PG35VQ)


Viewing 11 posts - 1 through 11 (of 11 total)
  • Author
    Posts
  • #24574

    Lucass
    Participant
    • Offline

Bought a PG35VQ recently and I wonder if it's possible to calibrate its HDR mode on Windows 10 for content consumption.

Up to 12-bit color, 1000-nit peak brightness.

Quantum-dot display.

I've already calibrated for SDR; it would be nice if I could do it for HDR too.

I just can't figure out how I should calibrate this.

Thanks!

    #24576

    Vincent
    Participant
    • Offline

You can't… unless you can set the display to accept HDR input, turn off its internal HDR mapping to whatever the panel supports, and keep the HDR backlight on, all at the same time. AFAIK none of these consumer monitors do that.
You are limited to the factory HDR processing, with minor grayscale tweaks.

    #24601

    AstralStorm
    Participant
    • Offline

    Technically you can apply a calibration. And that’s about it.

Most HDR screens do not actually do any special processing for in-gamut sRGB input. However, Windows 10 HDR has a nasty property of scaling the EDID-reported gamut to fit sRGB without using a matrix for the conversion.
(I reported this in the Feedback Hub; please upvote. It makes the SDR desktop look terrible on some monitors due to clipping.)
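A proper conversion between two gamuts goes through a 3x3 matrix between the primaries. A minimal Python sketch, using the standard D65 sRGB-to-BT.2020 matrix from ITU-R BT.2087 (the function name is mine):

```python
import numpy as np

# Standard 3x3 matrix for converting linear-light sRGB to linear-light
# BT.2020 (D65 white point), per ITU-R BT.2087.
SRGB_TO_BT2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def srgb_linear_to_bt2020(rgb):
    """Map a linear-light sRGB triplet into BT.2020 coordinates."""
    return SRGB_TO_BT2020 @ np.asarray(rgb, dtype=float)

# Pure sRGB red lands well inside the BT.2020 gamut; that placement is
# exactly what a naive gamut-scaling (non-matrix) approach gets wrong.
print(srgb_linear_to_bt2020([1.0, 0.0, 0.0]))
```

Note that white maps back to white (each matrix row sums to 1), so only chromatic colors are redistributed.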

Mind you, the calibration likely won't work properly, as the backlight levels are modulated by the screen.

    #24830

    AstralStorm
    Participant
    • Offline

Actually, I was wrong. You can apply an ICC profile for SDR-to-HDR mapping in Windows 10 2004, and probably earlier too.

Create a profile using madVR with HDR mode on, then add the profile as an ICC Advanced Color profile using the Windows calibration controls. Windows will then apply these values in HDR mode to color-map SDR into the wide gamut, so you get a correct look for sRGB apps that end up not bypassing the compositor. As far as I can see, the compositor does *not* apply the profile to Direct3D or Vulkan applications. And if an app requests the BT.2020 ST.2084 color space and retrieves a profile, it will get the Advanced Color one.

Of course, it's not a 3D LUT, so don't expect miracles from it: the white point will be slightly off as the backlight goes dimmer or brighter. It should still be pretty good for most displays.
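For reference, the ST.2084 signal mentioned above is the PQ curve, which encodes absolute luminance up to 10,000 cd/m². A minimal sketch of the PQ inverse EOTF in Python (constants per SMPTE ST 2084; the function name is mine):

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Encode absolute luminance (cd/m^2) to a PQ signal value in 0..1."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# SDR-range luminances occupy only the lower part of the PQ code range,
# which is why the compositor has room to map SDR content into HDR:
print(round(pq_encode(100), 3))    # ~0.508
print(round(pq_encode(1000), 3))   # ~0.752
print(round(pq_encode(10000), 3))  # 1.0
```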

    • This reply was modified 3 years, 10 months ago by AstralStorm.
    #26106

    nelldrip
    Participant
    • Offline

Surely there are limits, but it is technically possible.
By operating the VCGT even in HDR, you can optimize the tone map and keep the white balance correct.

I posted the details below a while ago; please refer to it.

    calibrate HDR display on Windows HDR mode

BenQ's EW3270U and Philips' 328P6VUBREB/11 were calibrated this way, and they look almost the same as a PA32UCX calibrated with the official ASUS calibration application.

    #26135

    Wire
    Participant
    • Offline

What are the basic challenges of getting ICC CM to work with HDR? Assuming a distinction between using ICC tech to get more pleasing color versus realizing a high-tolerance response…

    #26138

    nelldrip
    Participant
    • Offline

Generally, in HDR mode such as Windows', ST.2084 / Rec.2020 is used for both receiving and transmitting the image, and it is mapped to the range the monitor can display.
In this case there is no mechanism for ICC to intervene.
However, the VCGT can still be used at the video card output.

HDR display tone maps are not always desirable and can be significantly improved by simply modifying the tone map and white balance with the VCGT.
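To make the idea concrete: a VCGT is just a per-channel 1D ramp uploaded to the video card. A minimal sketch, with made-up illustration values rather than measured corrections:

```python
def build_ramp(gain, gamma, size=256):
    """One channel of a video-card gamma table, as 16-bit entries."""
    ramp = []
    for i in range(size):
        x = i / (size - 1)          # normalized input level
        y = min(1.0, gain * x ** gamma)
        ramp.append(round(y * 65535))
    return ramp

# e.g. pull blue down ~5% to warm up a too-cool factory white point,
# while leaving red and green as identity ramps:
red   = build_ramp(1.00, 1.0)
green = build_ramp(1.00, 1.0)
blue  = build_ramp(0.95, 1.0)
```

The same mechanism can bend the tone curve (via the gamma term), which is what makes the tone-map correction described above possible even when the HDR pipeline itself ignores ICC.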

    #26140

    Wire
    Participant
    • Offline

    Ah sure

    My question presumed there was an insertion point for ICC tech. So there is no such point because the signaling is end-to-end and closed like old-school TV?

I never thought about it, but you need the "content consumer's" display to have a standard personality, the control of which lives on the far side of HDCP? So you get what you get as far as that personality is concerned; hopefully they do it well. But you might measure it to see how well it does.

In Dolby Vision, there is the idea of windows/frames (I don't know the jargon) for tone mapping? Like every "scene" has metadata for rendering?

    Like the system goes to a place that the CIE can’t account for?

For example, in CG rendering there's the problem of metamerism in the most general sense: spectral-to-tristimulus is mostly (but not completely) a one-way street, with multiple spectra producing the same tristimulus. If you are compositing a scene's reflections, a tristimulus representation has a huge pitfall in that effects can add up in visually odd ways. IOW, color goes "wrong" for all the right reasons.

So consider the CIE model and the effects of a very wide brightness range: say scotopic vision, where your sensitivity is a radical adaptation to darkness, or, more pertinently for HDR, brightness so high it leaves spots before your eyes. You can imagine building scenes that exploit these effects artistically. But nothing I've so far come across in ICC approaches accounts for such adaptations.

Basically, the future would seem to be about rendering intents, and Dolby Vision is already down this path. Without a way into the dynamics, what does "color management" even mean? Yet the industry still has to have ways to describe its technology.

Seems like a can o' worms.

    #26141

    Vincent
    Participant
    • Offline

    Ah sure

    My question presumed there was an insertion point for ICC tech. So there is no such point because the signaling is end-to-end and closed like old-school TV?

IF (a big if) you could enable the HDR "backlight" (FALD on, max brightness allowed and so on) and at the same time disable the HDR-to-whatever-the-panel-supports translation (the tone mapping), you could profile the display and store its behaviour in an ICC profile.
With such an ICC you could make a LUT3D that takes HDR data in and outputs a translation for your panel. It would be like HDR tone mapping, but made by you. Then, if the TV or monitor allowed uploading such a LUT to its internal storage, you could enjoy it.

That is the ICC insertion point: a data source for making a LUT3D.
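The LUT3D idea can be sketched as a toy stand-in in Python, where a simple Reinhard-style roll-off plays the role of the measured ICC data (the grid size, roll-off, and names are all mine, for illustration only):

```python
import numpy as np

N = 17  # a common 3D LUT grid size

def rolloff(v):
    """Stand-in tone mapping; a real LUT would encode measured behaviour."""
    return v / (1.0 + v)

# Build the cube: each grid node stores the output RGB for that input RGB.
grid = np.linspace(0.0, 1.0, N)
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
lut = rolloff(lut * 4.0)  # pretend the HDR input spans 4x the panel range

def apply_lut(rgb):
    """Nearest-node lookup; real pipelines interpolate trilinearly."""
    idx = np.clip(np.round(np.asarray(rgb) * (N - 1)).astype(int), 0, N - 1)
    return lut[idx[0], idx[1], idx[2]]

# Peak input gets compressed to 0.8 of the panel's range by the roll-off:
print(apply_lut([1.0, 1.0, 1.0]))
```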

AFAIK there is no TV that allows HDR backlight on with HDR tone mapping off in a generic, "open" way or via the OSD menu. There may be some proprietary API for paid software like CalMAN or LightSpace.
If there were such displays, it would be very cool.

For "fake HDR" displays, the panel and backlight are actually SDR, so we can profile the display's behavior in detail. Then we can make a LUT3D for the HDR-to-display-capabilities translation and use that LUT3D in software that allows it.

    #26144

    Wire
    Participant
    • Offline

If I follow you, fake HDR is like still HDR photography, which was not about high-dynamic-range media but about rendering the world into a lower range. A computational form of Ansel Adams' Zone System.

    What is the point of real HDR? The one key use I can see is that it lets display response be scaled from dark to very bright viewing environments.

Both are about rendering intents, beyond tristimulus accuracy.

You are saying that fake HDR doesn't go black enough or bright enough. FALD is a refinement of dynamic contrast which avoids the pumping of a dynamic iris, but it seems like it should be very useful for scaling to very bright conditions.

If the old IRE view of contrast still holds, that vision adapted to a particular point is satisfied by about 100:1 CR, then multiply by 10 to get good scaling in controlled lighting: 1000:1 for common displays. For high brightness, it is more important to place the 100:1 window well above ambient. For a dark surround, you need absolute black performance, with rendering maybe getting a little tricky to compromise, because dark-surround vision is adapted only to a display which is constantly varying. IOW, sliding the contrast window up for bright conditions is very practical, but you run into the electro-mechanical limits of aperture tech for dark.

    The contrast window seems to be an aspect the CIE model doesn’t thoroughly account for? At least I haven’t run across the vernacular for it.

I use my displays in a room with windows that get very bright during the day, and my displays appear to hold their response well from lowest brightness to highest. I find being able to run the display brightness via DDC essential to usability. And I find that color doesn't visibly suffer across the range, but while 300 cd/m2 is plenty for reading and video, it's not enough for image editing in the brightest conditions.

On the flip side, when it's dark at night and I become adapted to the display, I lose the ability to resolve color almost as badly as when it's too bright: white gets weird and the gamut seems to flatten.

Nothing I've read so far about colorimetry discusses these effects.

    #26145

    Vincent
    Participant
    • Offline

If I follow you, fake HDR is like still HDR photography, which was not about high-dynamic-range media but about rendering the world into a lower range. A computational form of Ansel Adams' Zone System.

It's SDR. The HDR translation in these fake HDR displays usually means "cut" out-of-gamut values, not "compress" them. It is not like an "HDR photo".

    What is the point of real HDR? The one key use I can see is that it lets display response be scaled from dark to very bright viewing environments.

    That one.

You are saying that fake HDR doesn't go black enough or bright enough.

If your display has a static contrast window of 1000:1 or 3000:1, divide HDR400's peak by those quantities… and you'll see.
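The arithmetic, spelled out (assuming the nominal HDR400 peak of 400 cd/m²):

```python
# Black floor = peak white / static contrast ratio.
peak_nits = 400.0
blacks = {cr: peak_nits / cr for cr in (1000, 3000)}
for cr, black in blacks.items():
    print(f"{cr}:1 static contrast -> black floor {black:.2f} cd/m^2")
# 1000:1 gives 0.40 cd/m^2 and 3000:1 gives 0.13 cd/m^2, orders of
# magnitude above the ~0.001 cd/m^2 that true HDR grading displays reach.
```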

FALD is a refinement of dynamic contrast which avoids the pumping of a dynamic iris, but it seems like it should be very useful for scaling to very bright conditions.

But when you mix powered-on and powered-off zones, the results can be disgusting, with halos, unless the base static contrast ratio is very high (like those VA TVs with 6000:1) and the number of FALD zones is very, very high. Thousands of them, maybe even tens of thousands, are needed.

There are native HDR panels with 1,000,000:1 static contrast, like those used for grading (dual-panel tech from Panasonic and others).

If the old IRE view of contrast still holds, that vision adapted to a particular point is satisfied by about 100:1 CR, then multiply by 10 to get good scaling in controlled lighting: 1000:1 for common displays. For high brightness, it is more important to place the 100:1 window well above ambient. For a dark surround, you need absolute black performance, with rendering maybe getting a little tricky to compromise, because dark-surround vision is adapted only to a display which is constantly varying. IOW, sliding the contrast window up for bright conditions is very practical, but you run into the electro-mechanical limits of aperture tech for dark.

    The contrast window seems to be an aspect the CIE model doesn’t thoroughly account for? At least I haven’t run across the vernacular for it.

I use my displays in a room with windows that get very bright during the day, and my displays appear to hold their response well from lowest brightness to highest. I find being able to run the display brightness via DDC essential to usability. And I find that color doesn't visibly suffer across the range, but while 300 cd/m2 is plenty for reading and video, it's not enough for image editing in the brightest conditions.

On the flip side, when it's dark at night and I become adapted to the display, I lose the ability to resolve color almost as badly as when it's too bright: white gets weird and the gamut seems to flatten.

Nothing I've read so far about colorimetry discusses these effects.

It is not a "point"… but it is true that we measure point-like.
The halo effect of a big screen mixing bright and dark areas at the same time is visible. You can see it. I think there is a Munsil Blu-ray test disc showing this (Starfield, or something like that). That's why they need those VERY expensive true HDR displays for grading: true non-FALD, 1000 nits and 1/1000 nit at the same time on different parts of the screen.
FALD could be very good on a very high-contrast VA TV for consuming HDR content, but it has disadvantages for grading because you do not see what your deliverables contain… and that is what they are paid for.

    • This reply was modified 3 years, 6 months ago by Vincent.


Display Calibration and Characterization powered by ArgyllCMS