novideo_srgb: GPU-side LUT-Matrix-LUT calibration

Viewing 15 posts - 1 through 15 (of 31 total)
  • #33364

    dogelition
    Participant
    • Offline

    I figured out how to use an undocumented NVIDIA API to achieve full LUT-Matrix-LUT calibration on the GPU side, which means you can effectively get almost perfect color accuracy (on a well-behaved display) globally, with no performance cost, including in fullscreen applications.
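To make the idea concrete, here is a rough numerical sketch of what a LUT-Matrix-LUT pipeline does per pixel: a degamma 1D LUT decodes to linear light, a 3x3 matrix remaps the gamut, and a regamma 1D LUT re-encodes. All names, LUT sizes, and values here are illustrative, not NVIDIA's actual API:

```python
import numpy as np

def apply_lut(values, lut):
    """Linearly interpolate a 1D LUT (shape (N, 3)) per channel."""
    n = lut.shape[0]
    out = np.empty_like(values)
    for c in range(3):
        out[..., c] = np.interp(values[..., c], np.linspace(0, 1, n), lut[:, c])
    return out

def lut_matrix_lut(rgb, degamma, matrix, regamma):
    linear = apply_lut(rgb, degamma)             # decode to linear light
    mapped = np.clip(linear @ matrix.T, 0, 1)    # 3x3 gamut mapping
    return apply_lut(mapped, regamma)            # re-encode transfer function

# Round-trip example: pure-power 2.2 degamma/regamma and an identity matrix
ramp = np.linspace(0, 1, 256)
degamma = np.stack([ramp ** 2.2] * 3, axis=1)
regamma = np.stack([ramp ** (1 / 2.2)] * 3, axis=1)
pixel = np.array([[0.5, 0.25, 1.0]])
out = lut_matrix_lut(pixel, degamma, np.eye(3), regamma)
print(np.round(out, 3))  # close to the input, since the matrix is identity
```

With a real calibration, the matrix would be derived from the display's measured primaries and the LUTs from its transfer function.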

    I ended up turning this into an easy-to-use tool where you simply have to select the ICC profile for your monitor and the desired gamma, enable the “Clamped” checkbox, and the program does the rest. Calibration based on the primaries reported in the EDID is supported as well, but this is probably less interesting for the users here :). Dither controls are also included (though there already are other tools/ways to change these).

    I think I can’t post a direct link due to the spam filter, but you can find it on GitHub under ledoge/novideo_srgb. The README includes some notes as to how you should calibrate/profile your monitor in DisplayCAL to achieve the best possible results.

    #33403

    Vincent
    Participant
    • Offline

AMD hardware has had the same capability since the AVIVO engine in 2005, maybe earlier, but the control panel only allows sRGB simulation.
Would you mind looking into providing that functionality for AMD cards in future novideo_srgb releases?
Thank you in advance.

PS: if you could dump the 16-bit LUT-MATRIX-LUT in an open format, it would (in theory) allow uploading it (truncated to n bits) to some monitors with HW calibration, as long as there were a public vendor SDK, like Dell or HP provided in the past.

    #33412

    dogelition
    Participant
    • Offline

I looked into AMD’s Linux driver code a while ago, and according to that, only very recent (latest-gen?) GPUs actually have custom degamma support; for all other ones, the hardware only supports the sRGB EOTF. So there’s not much point in figuring out their undocumented API (which, as far as I can tell from the function names, doesn’t even have a way of setting a custom degamma LUT on new GPUs) when the same can be achieved another way: edit the EDID to set the primaries to the measured ones, use CRU (or the manual registry method) to override the EDID, activate the driver’s sRGB clamp option, and then do a VCGT calibration on top of that. Other than the primaries being limited to 3 decimal digits of precision, this should work pretty much as well as using the API.
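For reference, the 3x3 part of such a clamp can be derived from chromaticity coordinates roughly like this. This is a generic sketch, not code from the driver or from novideo_srgb, and the "measured" values here are just sRGB's own primaries, so the result comes out as the identity:

```python
import numpy as np

def rgb_to_xyz_matrix(xy_primaries, xy_white):
    """Build a linear-RGB -> XYZ matrix from chromaticity coordinates."""
    def xy_to_xyz(x, y):
        return np.array([x / y, 1.0, (1 - x - y) / y])
    prims = np.column_stack([xy_to_xyz(x, y) for x, y in xy_primaries])
    white = xy_to_xyz(*xy_white)
    scale = np.linalg.solve(prims, white)  # luminance weight of each primary
    return prims * scale

srgb_prims = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
d65 = (0.3127, 0.3290)

m_srgb = rgb_to_xyz_matrix(srgb_prims, d65)
m_display = rgb_to_xyz_matrix(srgb_prims, d65)  # substitute measured xy here

# Linear sRGB -> linear display RGB: the 3x3 the GPU applies between the LUTs
clamp = np.linalg.inv(m_display) @ m_srgb
print(np.round(clamp, 6))  # identity here, since both gamuts are sRGB
```

With actual measured primaries for `m_display`, the off-diagonal terms of `clamp` would be nonzero, which is what compresses the wide gamut down to sRGB.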

    As for dumping the calibration data into a file, sure, I could pretty easily do that. Though novideo_srgb won’t even start when the NVAPI dll is not present, so it would probably make more sense to make a separate utility that’ll work no matter what GPU you have and can only export the data.
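As a sketch of what such an export could look like (a format invented here purely for illustration, not anything novideo_srgb actually emits, with normalized values that a consumer could requantize to n bits):

```python
import numpy as np

def write_block(f, name, arr):
    """Write one named block: a header line with the row count, then rows."""
    f.write(f"{name} {arr.shape[0]}\n")
    for row in arr:
        f.write(" ".join(f"{v:.6f}" for v in row) + "\n")

def export_lml(path, degamma, matrix, regamma):
    # Values are stored normalized to [0, 1]; a consumer could requantize
    # to n bits, e.g. round(v * (2**n - 1)), for a monitor's HW calibration.
    with open(path, "w") as f:
        write_block(f, "DEGAMMA", degamma)
        write_block(f, "MATRIX", matrix)
        write_block(f, "REGAMMA", regamma)

ramp = np.linspace(0, 1, 16)
identity_lut = np.stack([ramp] * 3, axis=1)  # 16-entry identity 1D LUT
export_lml("calibration.txt", identity_lut, np.eye(3), identity_lut)
```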

    #34075

    SuspiciousPixel
    Participant
    • Offline

Hi, in your README:

    Update README.md · ledoge/[email protected] · GitHub

    For the gamma options to work properly, the profile’s tone response curves must report the black point accurately. Using DisplayCAL, this means that Profile type has to be set to Curves + matrix or Single curve + matrix, and Black point compensation must be disabled (enable Show advanced options under Options to see these settings). Tone curve should be set to some (any) target. For Curves + matrix profiles, it can also be set to As measured instead, but that’ll probably require a larger testchart to not get worse accuracy.

Is that note for AMD users, since it’s on a red background, or does it apply to Nvidia as well?

I’m an Nvidia user, but for now I created an .icc using Single curve + matrix. Previously I’ve always created profiles using XYZ LUT + matrix. I have no idea what bearing this will have on the end result, but from what I understand, XYZ LUT + matrix should be more accurate.

Thanks for creating this tool – it has no impact on FPS in games compared to dwm_lut.

    #34087

    dogelition
    Participant
    • Offline

Not sure how you ended up there, but you’re looking at a diff of an old version. Please refer to the current README on the repo’s main page – XYZ LUT profiles can be used directly without any problems now. And there is no AMD support at all.

    #34091

    MW
    Participant
    • Offline

    Is 2.2 gamma better than “as measured”? Why is that?

    #34436

    caiokao
    Participant
    • Offline

    Hi, @dogelition

    I have some questions on how to use novideo_srgb.
    (I’m new to this profiling world. Sorry for the silly questions)

In the ‘readme’ you say NOT to use the ICC profile loaded into Windows or any other application.

    So, the correct steps should be:

    -Create an ICC Profile using DisplayCal (D65, do not emulate black point, etc)
    -DO NOT install this created profile in Windows, or let DisplayCal manage it
    -Run novideo_srgb > advanced > Use ICC profile > select the profile created by DisplayCal > check calibrate gamma to > OK > clamp

    Is that correct?

If the above is right, why do I need to run another profiling?
Isn’t novideo_srgb applying the profile AND clamping to sRGB, or does it just use it as a reference or something like that?

    #34437

    dogelition
    Participant
    • Offline

    Is that correct?

    Yep, that’s correct.

    If the above is right, why do I need to run another profiling?

    You don’t need to – this is optional in case you want to have an ICC profile to use in color managed applications (instead of e.g. the default Windows sRGB profile) with the novideo_srgb calibration active. Such a profile should then give you even better color accuracy, as it can correct for the non-linearities of the display. The color space transform enabled via novideo_srgb does use the data of your ICC profile to transform colors to sRGB, but it can only be “perfect” if the display is entirely linear.

    #34438

    caiokao
    Participant
    • Offline

    Yep, that’s correct.

    Got it!

    You don’t need to – this is optional in case you want to have an ICC profile to use in color managed applications (instead of e.g. the default Windows sRGB profile) with the novideo_srgb calibration active. Such a profile should then give you even better color accuracy, as it can correct for the non-linearities of the display. The color space transform enabled via novideo_srgb does use the data of your ICC profile to transform colors to sRGB, but it can only be “perfect” if the display is entirely linear.

Do I need to change any DisplayCAL settings for this second profiling?
After it’s created, can/do I need to activate it in Windows color management? Or is it just for specific applications?

I’m just trying to understand how it works.
Doesn’t novideo_srgb use the first ICC profile to “correct” the colors and then clamp to sRGB?

    #34439

    dogelition
    Participant
    • Offline

    Do I need to change any DisplayCal settings for this second profiling?

    I would suggest setting “Tone curve” to “As measured”, as I’m not sure how/if the VCGT calibration would interfere with the novideo_srgb calibration, since they both happen on the GPU level. Apart from that, just use the same settings.

After it’s created, can/do I need to activate it in Windows color management? Or is it just for specific applications?

    Doesn’t matter how you do it, as long as the applications that should use the profile are configured to use it – either by telling them to read it from the Windows settings or by adding it in their own settings. I assume that most color managed applications will use it if you just add it in the Window settings, without requiring any additional configuration.

Doesn’t novideo_srgb use the first ICC profile to “correct” the colors and then clamp to sRGB?

    Yes, but its accuracy is limited because the GPU only supports a “simple” (1D LUTs + matrix based) color space transform instead of a “complex” (3D LUT) transform that you usually need for optimal results. With an ICC profile on top you will get the latter in color managed applications.
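A toy illustration of the difference: a 3x3 matrix acts the same on every pixel, while a 3D LUT can apply a different correction in each region of the color cube. The grid size and the tweak below are made up, and real LUTs come from profiling (with trilinear interpolation rather than nearest-neighbour lookup):

```python
import numpy as np

N = 9  # 3D LUT grid size per axis (illustrative)
grid = np.linspace(0, 1, N)
# Identity 3D LUT: lut3d[i, j, k] == [grid[i], grid[j], grid[k]]
lut3d = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
lut3d[-1, 0, 0] = [0.95, 0.02, 0.0]  # local correction at pure red only

def sample_nearest(rgb):
    """Nearest-neighbour 3D LUT lookup (real code would interpolate)."""
    idx = np.rint(np.asarray(rgb) * (N - 1)).astype(int)
    return lut3d[tuple(idx)]

print(sample_nearest([1.0, 0.0, 0.0]))  # corrected entry near pure red
print(sample_nearest([0.0, 1.0, 0.0]))  # untouched: identity elsewhere
```

No single matrix plus per-channel 1D LUTs could reproduce a correction that is local to one hue like this, which is why the ICC profile on top still helps in color managed applications.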

    #34441

    caiokao
    Participant
    • Offline

But what’s the difference between using novideo_srgb this way and an ICC profile in Windows plus DWM_LUT?

(I’m currently doing it this way: DisplayCAL profiling > loaded into Windows > 3D LUT created > DWM_LUT. I don’t know if I’m doing it right.)

    #34445

    MW
    Participant
    • Offline

But what’s the difference between using novideo_srgb this way and an ICC profile in Windows plus DWM_LUT?

Non-linear correction will be handled by ICC-capable software instead of DWM, so basic apps will get only partial correction (1D LUTs + matrix).

(I’m currently doing it this way: DisplayCAL profiling > loaded into Windows > 3D LUT created > DWM_LUT. I don’t know if I’m doing it right.)

There’s no right way; it depends on your needs. Loading the profile in DisplayCAL is wrong, though. You can let DWMLUT take exclusive control of the corrections, since it’s fully featured that way: check apply VCGT in DisplayCAL, and in the system color management settings set a generic sRGB profile as your display profile.

    #34514

    dr04e606
    Participant
    • Offline

Does anyone have any experience of using novideo_srgb on laptops with switchable graphics (AMD iGPU + Nvidia dGPU / Intel iGPU + Nvidia dGPU)?

    Are there any limitations or quirks?

    What will actually happen when you’re on an iGPU? Will your monitor gamut get unclamped until you switch back to dGPU?

    #34550

    Jaylumx
    Participant
    • Offline

Is there a way that the clamp can be applied only in SDR mode and disabled in HDR? It’s the only thing stopping me from using what could be an extremely useful piece of software.

    #34571

    dogelition
    Participant
    • Offline

Is there a way that the clamp can be applied only in SDR mode and disabled in HDR? It’s the only thing stopping me from using what could be an extremely useful piece of software.

    I don’t have access to a monitor that supports HDR at the moment, so it’s kinda hard for me to add proper support for something like that. I can give it a shot (but can’t guarantee that it’ll work properly and reliably).

    I’ve had one user report that when enabling HDR in Windows, novideo_srgb would then show the clamp as being enabled even though it was disabled previously. IIRC that might have been caused by/related to the Night Light feature in Windows? I’d appreciate it if you could try to reproduce that, so I can get a better idea of how I have to handle HDR.

Display Calibration and Characterization powered by ArgyllCMS