2022-01-04 at 13:00 #33364
I figured out how to use an undocumented NVIDIA API to achieve full LUT-Matrix-LUT calibration on the GPU side, which means you can effectively get almost perfect color accuracy (on a well-behaved display) globally, with no performance cost, including in fullscreen applications.
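For anyone curious what the matrix stage actually does: here's a minimal sketch (plain NumPy, not the tool's actual code) of deriving the 3×3 gamut-clamp matrix from chromaticity coordinates. The display primaries below are made up for illustration; in practice they come from the ICC profile or the EDID. The matrix operates in linear light, which is why it sits between a degamma LUT and a regamma LUT.

```python
import numpy as np

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
    # Build an RGB -> XYZ matrix from primary and white chromaticities.
    def xyz(xy):
        x, y = xy
        return np.array([x / y, 1.0, (1 - x - y) / y])
    prim = np.column_stack([xyz(xy_r), xyz(xy_g), xyz(xy_b)])
    # Per-channel scales so that R = G = B = 1 lands on the white point.
    scale = np.linalg.solve(prim, xyz(xy_w))
    return prim * scale

# sRGB primaries with D65 white
srgb = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60),
                         (0.15, 0.06), (0.3127, 0.3290))
# hypothetical measured wide-gamut display primaries, same white
disp = rgb_to_xyz_matrix((0.68, 0.32), (0.265, 0.69),
                         (0.15, 0.06), (0.3127, 0.3290))

# Matrix mapping linear sRGB values into the display's native RGB,
# clamping the effective gamut to sRGB.
clamp = np.linalg.solve(disp, srgb)
```

Since both spaces share the same white point, the matrix maps (1, 1, 1) to (1, 1, 1), i.e. white stays untouched and only the saturated colors get pulled in.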
I ended up turning this into an easy-to-use tool where you simply have to select the ICC profile for your monitor and the desired gamma, enable the “Clamped” checkbox, and the program does the rest. Calibration based on the primaries reported in the EDID is supported as well, but this is probably less interesting for the users here :). Dither controls are also included (though there already are other tools/ways to change these).
I think I can’t post a direct link due to the spam filter, but you can find it on GitHub under ledoge/novideo_srgb. The README includes some notes on how to calibrate/profile your monitor in DisplayCAL to achieve the best possible results.
2022-01-05 at 17:22 #33403
AMD hardware has used the same pipeline since the AVIVO engine in 2005, maybe earlier, but the control panel only allows sRGB simulation.
Would you mind providing that functionality for AMD cards in future novideo_srgb releases?
Thank you in advance.
PS: if you could dump the 16-bit LUT-MATRIX-LUT in an open format, it would (in theory) allow uploading it (truncated to n bits) to some monitors with HW calibration, as long as a public vendor SDK existed, like Dell or HP provided in the past.
2022-01-05 at 19:47 #33412
I looked into AMD’s Linux driver code a while ago, and according to that, only very recent (latest-gen?) GPUs actually have custom degamma support; for all other ones, the hardware only supports the sRGB EOTF. So there’s not much point in figuring out their undocumented API (which, as far as I can tell from the function names, doesn’t even have a way of setting a custom degamma LUT on new GPUs) when the same can be achieved another way: edit the EDID to set the primaries to the measured ones, use CRU (or the manual registry method) to override the EDID, activate the driver’s sRGB clamp option, and then do a VCGT calibration on top of that. Other than the primaries being limited to 3 decimal digits of precision, this should work pretty much as well as using the API.
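The precision limit comes from how the EDID stores chromaticities: each coordinate is a 10-bit binary fraction (value / 1024), so the quantization step is just under 0.001. A quick sketch of the round trip (the measured coordinate is made up):

```python
# EDID stores each CIE xy chromaticity coordinate as a 10-bit
# binary fraction, i.e. an integer n with coordinate = n / 1024.

def to_edid(coord: float) -> int:
    # Encode a measured xy coordinate as the 10-bit EDID value.
    return round(coord * 1024)

def from_edid(raw: int) -> float:
    # Decode back to a float; this is what the driver actually sees.
    return raw / 1024

# e.g. a measured red primary x = 0.6789
raw = to_edid(0.6789)        # 695
recovered = from_edid(raw)   # 0.6787109375, error well under 0.001
```

So the worst-case error per coordinate is 1/2048, which is why overriding the EDID gets you almost, but not quite, the precision of setting the matrix directly through the API.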
As for dumping the calibration data into a file, sure, I could pretty easily do that. Though novideo_srgb won’t even start when the NVAPI DLL is not present, so it would probably make more sense to build a separate utility that works no matter what GPU you have and can only export the data.
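Just to illustrate what such an export could look like: a hypothetical open container (not anything novideo_srgb actually defines) holding the degamma LUT, the 3×3 matrix, and the regamma LUT in one JSON file, with LUT entries quantized to 16-bit integers so a monitor-side tool could truncate them further to its own bit depth:

```python
import json

def export_calibration(path, degamma, matrix, regamma):
    # degamma/regamma: three channels of floats in [0, 1];
    # matrix: 3x3 row-major floats. LUTs are stored as 16-bit ints.
    data = {
        "degamma": [[round(v * 65535) for v in chan] for chan in degamma],
        "matrix": matrix,
        "regamma": [[round(v * 65535) for v in chan] for chan in regamma],
    }
    with open(path, "w") as f:
        json.dump(data, f, indent=2)

# Identity pipeline as a minimal example: 2-entry ramps, unit matrix.
ramp = [[0.0, 1.0]] * 3
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
export_calibration("calibration.json", ramp, identity, ramp)
```

A real format would also need metadata (LUT size, bit depth, whether the matrix applies in linear light), but the point is just that all three stages fit naturally in one small, vendor-neutral file.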