Graphics card LUT vs. display LUT


  • #19082

    Egor S.
    Participant

    Hello everyone,

    I have a question regarding the graphics card LUT.

    Does the graphics card's LUT really have a bit depth of 16 bits these days?

    So far I had assumed that it only has 8 bits.

    In DisplayCAL's “Profile Information”, a bit depth of 16 bits is specified in the “graphic card gamma table” line. The profile loader is also set to 16 bits.

    So if the graphics card's LUT already has 16 bits, there should be 256 distinct tonal values at the 8-bit display input, even if the tone curve in the vcgt tag is not linear.

    That would mean that hardware-calibratable displays with an internal 10-bit, 12-bit or 16-bit LUT offer no advantage, even when using a program that can edit the monitor's internal LUT, right? (I am aware that DisplayCAL cannot utilize the internal monitor LUT.)

    Hmmm… while I'm thinking about it, there actually must be an advantage of the internal display LUT over the graphics card's LUT, independent of its bit depth.

    As the display input has only 8 bits, it does not help that the graphics card's LUT has 16 bits. If the tone curve is manipulated in the graphics card's LUT, fewer than 256 distinct values remain for the 8-bit display signal. That's right, isn't it?
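
    A quick back-of-the-envelope check of that reasoning (my own sketch in Python, with a made-up gamma-style curve standing in for the vcgt contents):

    ```python
    import numpy as np

    # 256 input codes entering a (here hypothetical) 16-bit GPU LUT
    levels_in = np.arange(256) / 255.0

    # Made-up non-linear VCGT curve, stored at 16-bit precision
    vcgt = np.round(levels_in ** (1 / 1.1) * 65535) / 65535

    # An 8-bit link (e.g. DVI) quantizes the LUT output to 8 bits
    levels_out = np.round(vcgt * 255).astype(int)

    # Fewer than 256 distinct codes survive: some inputs collapse onto
    # the same output value while other output values are skipped
    print("distinct 8-bit output levels:", len(np.unique(levels_out)))
    ```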

    The reason I'm asking is that I have tried both: hardware calibration using the internal display LUT with a different program, and calibration using the graphics card's LUT via DisplayCAL. Neither of them was ideal in terms of gradation characteristics on my display. At the moment I calibrate my display with a different program that adjusts the display's internal LUT, and make a profile with DisplayCAL afterwards. However, the results are not better. But still, theoretically this is the best way, isn't it?

    Thanks a lot!

    Egor

    PS. @ Florian:

    The colorimeter correction information with graph is such a cool new feature!

    #19085

    Vincent
    Participant

    The GPU LUT may have 16, 14, 12 or 10 bits to store the VCGT contents (possibly truncated to 14/12/10 bits)… BUT you have to output that correction to the display in a proper way.
    If the GPU-to-display link is limited to 8 bits, because the monitor does not accept 10-bit/channel input or because you use DVI, the GPU LUT output is going to be truncated to 8 bits.
    Unless there is some kind of temporal dithering, this will result in banding caused by the calibration.

    With a GPU LUT of >=10 bits plus temporal dithering at the output, calibration should be smooth and visually equivalent to HW calibration.
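
    A minimal sketch of the idea behind temporal dithering (illustrative only, not how any specific driver implements it): the fractional part that an 8-bit link would throw away is preserved in the average over frames.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A corrected grey level needing more than 8-bit precision,
    # e.g. 20.3/255 coming out of a high-bit-depth GPU LUT
    target = 20.3

    truncated = int(target)  # plain 8-bit truncation: always 20

    # Temporal dithering: show 20 or 21 on successive frames so the
    # average intensity matches the 20.3 target
    frames = np.floor(target + rng.random(1000)).astype(int)

    print("truncated:", truncated)          # 20
    print("dithered mean:", frames.mean())  # ~20.3
    ```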

    But there are some things that current GPU LUTs cannot do, like gamut emulation.
    Some displays with internal HW calibration offer simple gamut emulation of idealized colorspaces like sRGB/AdobeRGB etc.
    The simplest way is a LUT-matrix-LUT. I think ATI cards from 2005 (AVIVO engine) onwards should have some kind of HW dedicated to this, but I'm not sure if it is exposed to apps. For example, they have an sRGB emulation based on EDID data: read the EDID data (native gamut primaries), “believe it”, emulate sRGB… It goes by a weird name, though; you had to disable “color temp” in the Crimson driver to see it, or at least that used to be the case. It works at desktop level, so no application support is needed (that's good). Newer NVIDIAs should have something like that. AFAIK it is not user-customizable, just EDID data to emulate sRGB.
    If this functionality were exposed to apps, programs like the DisplayCAL tray app could use it.
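
    For illustration, a sketch of what such a LUT-matrix-LUT pipeline does (Python; the gamma values and the sRGB-to-native matrix are placeholders I made up, a real implementation would derive the matrix from the EDID primaries):

    ```python
    import numpy as np

    # Placeholder matrix mapping linear sRGB to the panel's (assumed wider)
    # native primaries; rows sum to 1 so that white is preserved.
    SRGB_TO_NATIVE = np.array([[0.72, 0.26, 0.02],
                               [0.05, 0.92, 0.03],
                               [0.02, 0.09, 0.89]])

    def emulate_srgb(rgb, srgb_gamma=2.2, native_gamma=2.2):
        """LUT-matrix-LUT: decode TRC, gamut-map, re-encode TRC."""
        linear = np.asarray(rgb, dtype=float) ** srgb_gamma  # 1st 1D LUT
        native = np.clip(SRGB_TO_NATIVE @ linear, 0.0, 1.0)  # 3x3 matrix
        return native ** (1.0 / native_gamma)                # 2nd 1D LUT

    # A fully saturated sRGB red is "toned down" in native panel coordinates
    print(emulate_srgb([1.0, 0.0, 0.0]))
    ```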

    Other monitors with HW calibration can store a 3D LUT for simulating another device's colorspace, mostly a cube with 17 nodes per side.
    Although you can use a software 3D LUT in GPU shaders, like madVR does, this functionality is application-dependent; it is not exposed at desktop level (this is a difference from the EDID-based sRGB gamut emulation provided by some GPU drivers). That means a GPU shader 3D LUT is limited to certain apps.
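
    As a sketch of what such a shader does per pixel (Python for readability; the 17×17×17 identity cube below is a stand-in, a real LUT3D would hold the actual colorspace emulation):

    ```python
    import numpy as np

    N = 17
    grid = np.linspace(0.0, 1.0, N)
    # Identity 17x17x17 LUT: lut3d[i, j, k] == (grid[i], grid[j], grid[k])
    lut3d = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

    def sample_lut3d(lut, rgb):
        """Trilinear interpolation of an N^3 LUT at rgb in [0,1]^3."""
        n = lut.shape[0] - 1
        pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * n
        i0 = np.minimum(pos.astype(int), n - 1)  # lower corner of the cell
        f = pos - i0                             # fractional position in cell
        out = np.zeros(3)
        for dr in (0, 1):                        # blend the 8 cell corners
            for dg in (0, 1):
                for db in (0, 1):
                    w = ((f[0] if dr else 1 - f[0]) *
                         (f[1] if dg else 1 - f[1]) *
                         (f[2] if db else 1 - f[2]))
                    out += w * lut[i0[0] + dr, i0[1] + dg, i0[2] + db]
        return out

    print(sample_lut3d(lut3d, [0.2, 0.5, 0.8]))  # identity -> ~[0.2 0.5 0.8]
    ```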

    Also, there is (“was”: I have not tested W10 1903 since it was updated to solve the LUT loading issues) an issue where the GPU LUT contents get cleared by some apps (games), or where MS truncates the contents to 8 bits when the display goes to standby, or when a 3rd-party app like X-Rite i1Profiler or basICColor tries to load something into the GPU LUT: the contents get truncated to 8 bits, and even if you close those apps, DisplayCAL cannot fix it unless you reboot. That means banding if you have a non-linear GPU calibration.

    So HW calibration inside the monitor is better. Unfortunately, some HW calibration solutions (Dell, BenQ… etc.) are not prepared to deal with the low-to-nonexistent QC of these non-premium brands. Their calibration apps expect a fairly well-behaved uncalibrated response that can be corrected with 10 to 20 patches per black-to-primary ramp, and that is not always the case. Some displays need more uncalibrated measurements, and unless you do it that way you get a disgusting green-magenta cast in the grey ramp (but the grey ramp is smooth: no banding, just “tinted grey”). You'll need GPU LUT + temporal dithering (like the one you get with DisplayCAL at the medium or slow setting) to get a proper calibration on them, on top of their HW calibration.
    This is not a HW issue: these displays should have at least 1024 LUT entries in their LUT-matrix-LUT HW, so the HW is able to solve those issues. It's a SW issue (you need more patches). That means more calibration time, and AFAIK these brands are NOT going to go beyond 20 patches per black-to-primary ramp.

    #19141

    Egor S.
    Participant

    Hello Vincent,

    Thank you for your detailed answer. Your posts are, as always, very helpful.

    Yes, my display is connected via DVI and therefore accepts only an 8-bit input.

    Well, unfortunately my NEC P221W is the kind of display that obviously needs more measurements in the darks. The results are not really bad, but they are not perfect either. Although I already use 1148 patches for the profiling, which I think is quite a lot, I will have to experiment a bit, maybe with custom-made test charts. Moreover, I will think about dithering.

    Have a nice weekend!

    Egor

    #19143

    Florian Höch
    Administrator

    I highly doubt you'll get any benefit from increasing the number of profiling patches beyond the default 175 with most VA or IPS computer monitors.

    #19144

    Vincent
    Participant

    Well, unfortunately my NEC P221W is the kind of display that obviously needs more measurements in the darks. The results are not really bad, but they are not perfect either. Although I already use 1148 patches for the profiling, which I think is quite a lot, I will have to experiment a bit, maybe with custom-made test charts. Moreover, I will think about dithering.

    There are two sets of measurements:
    - calibration patches (for grey), controlled by the calibration speed setting
    - profiling patches (user-controlled, the ones you named), not related to calibration (useful for making a LUT3D for madVR or Resolve, maybe, but not for desktop calibration)

    If the P221W has some kind of HW calibration and it is not working as intended because some issues in the grey ramp went unnoticed, and even after DisplayCAL some of these issues remain, decrease the calibration speed (more grey calibration patches).
    But… I don't know the severity of what you see. Maybe medium speed is enough.

    #19160

    Egor S.
    Participant

    Thank you both for your help!

    Well, the gray scale isn't completely bad on my NEC P221W. What bothers me is that my second monitor, a very cheap MEDION non-wide-gamut display, has an almost perfectly smooth gray scale, whereas the NEC has some issues in the darks.

    With DisplayCAL I usually make an XYZ LUT type ICC profile with an auto-optimized test chart consisting of 1148 patches. All calibration settings are set to “as measured”, as the calibration itself is done by different software (using the display's 10-bit LUT).

    What I will try next time is to decrease the number of profiling patches, as Florian says 175 is enough.

    Instead of having a huge number of profiling patches, I will decrease the calibration speed to increase the number of calibration patches, as Vincent suggests. What bothers me a little is that I will have to set the tone curve from “as measured” to a specific value (L* in my case), so the GPU LUT will not remain perfectly linear. But on the other hand, I obviously need some corrections here, so if this helps, everything is fine.

    However, one thing is not quite clear to me. As I use a LUT profile type with a huge number of profiling patches, the deviation from a perfect grayscale should have been registered during profiling, so the “errors” should finally disappear. But they don't.

    #19168

    Florian Höch
    Administrator

    As I use a LUT profile type with a huge number of profiling patches, the deviation from a perfect grayscale should have been registered during profiling, so the “errors” should finally disappear. But they don't.

    You’ll have to explain in a bit more detail what that means.

    #19184

    Egor S.
    Participant

    I mean that the XYZ values for the monitor's gray scale, which is not perfectly smooth and neutral after the calibration, are stored in the ICC profile's LUT.

    When I open an image which has a gray scale in it, the CMM of the graphics software / OS should be able to look up, in the B2A tag of the ICC profile, which RGB values are required at the output to get a perfect grayscale. Of course, for that the corresponding values have to be stored in the B2A tag of the ICC profile, which I thought is the case when the ICC profile is made from a huge number of patches.

    So the RGB value R=20, G=20, B=20 in the image must become, for instance, R=20, G=21, B=19 at the graphics card output (after passing through the CMM), so that a neutral gray is actually displayed.

    #19188

    Vincent
    Participant

    So the RGB value R=20, G=20, B=20 in the image must become, for instance, R=20, G=21, B=19 at the graphics card output (after passing through the CMM), so that a neutral gray is actually displayed.

    In the same way that a high-bit-depth GPU LUT + dithering is needed for smooth GPU calibration, color management engines need something “more” than the solution you propose: there are going to be rounding errors.

    One of the solutions is temporal dithering, as LR or Capture One do; it's a universal solution. Another one is 10 bits end to end.

    If you use GIMP, or Firefox, or your camera vendor's RAW software, or PS without working 10-bit end to end… it's very likely that the corrections needed for your monitor will suffer some rounding/truncation error and some kind of banding will arise.
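
    A tiny example of that rounding problem (the 1.03 green-channel gain is a number I made up for illustration); dithering or a true 10-bit path would keep the fractional part instead of rounding it away:

    ```python
    import numpy as np

    gain = 1.03  # hypothetical per-channel correction requested by the profile

    g_in = np.arange(64)                       # dark grey input codes
    g_out = np.round(g_in * gain).astype(int)  # forced back to 8-bit codes

    # With a gain above 1 some output codes get skipped (with a gain below 1,
    # inputs collapse onto the same code); either way a smooth gradient no
    # longer advances one code per step -> banding.
    skipped = sorted(set(range(int(g_out.max()) + 1)) - set(g_out.tolist()))
    print("skipped output codes:", skipped)
    ```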

    That's why most HW cal solutions build extremely idealized matrix profiles with equal TRCs. HW cal “should” provide a good calibration, so they choose smoothness over slight inaccuracies.

    *IF you have a smooth, banding-free GPU calibration (no banding in MS Paint with the Lagom gradient or similar) on your 2nd monitor, the one w/o HW cal*, you can do the same for your NEC: GPU cal (DisplayCAL) on top of HW cal.
    The drawback is that you may lose the fast HW cal switching provided by SpectraView II or whatever software you use.

    #19201

    Egor S.
    Participant

    Dear Vincent, dear Florian, thank you both for your tips!

    After I managed to install my new profile, the gray scale looks much better.

    To achieve that, I changed two things.

    First, I decreased the calibration speed to medium, as Vincent suggested. Moreover, I decreased the number of profiling patches, as Florian suggested, to save some time during profiling.

    Second, I used my i1DisplayPro instead of the i1Pro2 for the calibration, to increase the measurement accuracy in dark tones.

    Well, the result is still not perfect, but almost. In any case, it is much better than before. Yeah!

