Different white point in SDR and HDR setting


#30189 · Stanek (Participant)

Hello.

First of all, thank you for the amazing DisplayCAL calibration program.

I have a question regarding the difference in white point between SDR and HDR mode. In SDR the white point seems correct, while in HDR white looks bluish. To get a correct white point in HDR, I have to select “Reset video card gamma table” from the DisplayCAL tray icon. The white point is set up using the monitor controls as follows:
RGB: 47, 50, 44.

Applying the profile or not makes no visible difference to the white point in SDR, but in HDR it changes a lot. Why is that?
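For context on what “Reset video card gamma table” does: conceptually, it loads identity (linear) ramps into the GPU's per-channel lookup tables, undoing the grey-scale calibration curves DisplayCAL installed. A minimal sketch of the ramp data involved, assuming the common Windows layout (three 256-entry arrays of 16-bit values, as consumed by the Win32 GDI function `SetDeviceGammaRamp`):

```python
# Sketch of the identity ramp that "Reset video card gamma table" effectively
# installs. On Windows, DisplayCAL's calibration curves and their reset both go
# through SetDeviceGammaRamp, which takes three 256-entry, 16-bit ramps
# (one per RGB channel). This only builds the data; the API call is omitted.

def identity_gamma_ramp():
    """Build a linear 256-entry, 16-bit ramp for one channel."""
    # Scale 0..255 so that entry 255 maps exactly to 65535 (255 * 257 == 65535).
    return [i * 257 for i in range(256)]

ramp = identity_gamma_ramp()
print(ramp[0], ramp[128], ramp[255])  # 0 32896 65535
```

With the identity ramp loaded, the display shows its uncalibrated native response, which is why resetting it changes the white point you see.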

My screen is a Samsung C27HG70 updated to the newest firmware (1025). My output is RGB, 10-bit.

I calibrated using DisplayCAL with the settings attached to this post. I was calibrating in SDR mode.

#30199 · Vincent (Participant)

HDR mode is a locked factory translation from Rec. 2020 PQ to your panel's capabilities, hence it is not correctable and I wouldn't mess with it. It is as bad (or as good) as out of the box.

As a general rule with all these fake-HDR monitors: do not use HDR mode at all. Learn to use madVR and let DisplayCAL generate a LUT3D to handle the conversion between Rec. 2020 PQ and your SDR (though possibly wide-gamut) display's capabilities.
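To make the LUT3D idea concrete: a 3D LUT is a lattice of output colors indexed by input RGB, and the renderer interpolates between lattice points. A simplified sketch of trilinear LUT3D application (madVR's internals are more involved; the lattice size and identity LUT here are just for illustration):

```python
import numpy as np

def apply_lut3d(rgb, lut):
    """Trilinearly interpolate one RGB triple (values in [0, 1]) through a
    3D LUT of shape (N, N, N, 3) -- the basic operation a LUT3D renderer
    performs per pixel."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    i0 = np.floor(pos).astype(int)
    i1 = np.minimum(i0 + 1, n - 1)
    f = pos - i0  # fractional position inside the lattice cell
    out = np.zeros(3)
    # Blend the 8 surrounding lattice points, weighted by proximity.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0])
                     * (f[1] if dg else 1 - f[1])
                     * (f[2] if db else 1 - f[2]))
                idx = (i1[0] if dr else i0[0],
                       i1[1] if dg else i0[1],
                       i1[2] if db else i0[2])
                out += w * lut[idx]
    return out

# Identity LUT: each lattice point stores its own coordinates, so the
# interpolation returns the input unchanged. A real LUT3D from DisplayCAL
# would instead store the Rec. 2020 PQ -> display transform at each point.
n = 17
g = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
print(apply_lut3d([0.25, 0.5, 0.75], identity))  # ≈ [0.25 0.5 0.75]
```

The point of the workflow is that this per-pixel transform happens in the video renderer, in SDR mode, so the monitor's locked HDR pipeline never gets involved.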

Also, presets on a display do not share a white point; each preset needs its own profile and its own grey calibration. (DisplayCAL calibrates grey, that's all. The resulting ICM lets apps color manage themselves, or apps with LUT3D support can use another calibration derived from the DisplayCAL ICM.)

#30208 · Stanek (Participant)

    Thanks Vincent for your response.

This monitor is not an OLED, but it has a number of local dimming zones and its brightness goes a bit over 600 nits in HDR, while SDR tops out around 350 nits, so there is a visible benefit to using HDR on it.

My question was different: why is the white point in Windows so different in HDR mode when the profile is enabled? I didn't measure it exactly with a colorimeter, but it looks well above 6500 K, something like 8000 K if not higher. In SDR mode, switching the profile on and off does not make a big difference, which is the expected behavior since the monitor gets very close to 6500 K without profiling.
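For readers wanting to put numbers on "looks like 8000 K": correlated color temperature can be estimated from a measured CIE 1931 xy chromaticity with McCamy's approximation. A small sketch (the second chromaticity is a made-up "bluish" example, not a measurement of this monitor):

```python
def mccamy_cct(x, y):
    """Approximate correlated color temperature (K) from CIE 1931 xy
    chromaticity using McCamy's cubic formula, which is reasonably
    accurate for typical display white points."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# D65 white (x=0.3127, y=0.3290) lands close to 6500 K; a bluish white
# sits at smaller x, y and yields a noticeably higher CCT.
print(round(mccamy_cct(0.3127, 0.3290)))  # ~6500
print(round(mccamy_cct(0.295, 0.305)))    # well above 7000 (hypothetical bluish white)
```

So a shift that "looks like 8000 K" corresponds to only a few hundredths in xy, which is why it is so visible side by side yet hard to judge without a colorimeter.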

If it's not possible to calibrate the monitor in HDR mode, could DisplayCAL disable its calibration by default when switching to HDR mode? It's not a big hassle to do it manually, but when I start a full-screen application in HDR I sometimes forget to turn off profiling. Or would it be more viable to use a different profile type (currently XYZ LUT + matrix)?

One last word on how I use this computer. I actually use madVR, but only for SDR content; HDR mode I only use for playing games. The factory HDR calibration is good enough for games, so I don't really mind not having that mode calibrated. I calibrate my monitor mainly for Photoshop and Lightroom; better-looking films are just a bonus.

Thanks to anyone who reads this long post.




Display Calibration and Characterization powered by ArgyllCMS