Calibrating 10-bit displays


This topic contains 4 replies, has 2 voices, and was last updated by  Bruno (@bruno) 3 weeks, 2 days ago.

Viewing 5 posts - 1 through 5 (of 5 total)
  • #19765

    Bruno (@bruno)
    Participant

    Hi,

I am in the process of selecting components for a new computer which I will use for photo and video editing. I already have a 10-bit display, but I am still a bit confused about how 10-bit color is supported by the GPU and Windows.

Is there a benefit for calibration when using a 10-bit display? I would suppose that, because of the higher precision, banding after calibration would be less likely. And as far as I know, on consumer GPUs Windows 10 supports 10-bit only for games and video. Does this also apply to calibration?

    I would be grateful for any advice!

    #19766

    Florian Höch (@fhoech)
    Administrator

    Hi,

    Is there a benefit for calibration when using a 10-bit display? I would suppose that, because of the higher precision, banding after calibration would be less likely.

    There is definitely a benefit to higher bit depth during calibration.

    And as far as I know, on consumer GPUs Windows 10 supports 10-bit only for games and video. Does this also apply to calibration?

    Not quite. Consumer GPUs need to support 10 bit (and in some cases more) for things like HDR as well. Notwithstanding Windows and driver bugs/quirks/limitations, in many cases it should be possible to either make use of the higher bit depth (over DisplayPort and HDMI) for calibration, or at least use dithering (AMD, nVidia) to achieve the same effect. In the case of nVidia under Windows 10, dithering support is flaky though. In my experience, it works after a reboot or after logging out and back in (see the respective thread in the nVidia forums, or the sticky topic here on how to enable dithering for nVidia via the registry – in my case, I didn’t need to do that). Note also that color managed applications working in less than 10 bit can introduce their own quantization artifacts/banding if they don’t apply dithering as well (the 10-bit framebuffer in Photoshop is only available with certain graphics cards, like nVidia Quadro and possibly AMD).
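    To illustrate why dithering can substitute for a genuinely higher bit depth, here is a minimal NumPy sketch (the ramp range, noise type, and window size are illustrative assumptions, not how any GPU actually implements it – real hardware uses spatial/temporal patterns rather than white noise, but the averaging principle is the same):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A smooth near-black ramp covering only four 8-bit steps, sampled at
    # high precision -- the kind of gradient where banding is most visible.
    ramp = np.linspace(0.0, 4 / 255, 4096)
    step = 1 / 255  # one 8-bit quantization step

    # Plain rounding to 8 bits: long runs of samples collapse onto the
    # same output code, producing visible bands.
    quantized = np.round(ramp / step) * step

    # Random (1-LSB rectangular) dithering: add noise of one step's width
    # before rounding, trading banding for fine noise.
    noise = (rng.random(ramp.shape) - 0.5) * step
    dithered = np.round((ramp + noise) / step) * step

    # Compare local averages over 64-sample windows: the dithered signal
    # tracks the original ramp far more closely than the banded one.
    def block_err(x):
        blocks = x.reshape(-1, 64).mean(axis=1)
        target = ramp.reshape(-1, 64).mean(axis=1)
        return np.abs(blocks - target).mean()

    print(block_err(quantized) > block_err(dithered))  # True
    ```

    The per-pixel error of the dithered signal is no smaller, but its local average is unbiased, which the eye perceives as a smooth gradient instead of discrete bands.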

    #19767

    Bruno (@bruno)
    Participant

    Thank you for your quick response!

    Maybe I still lack understanding of the details of color management… With 8 bit it seemed somehow clear to me how color management works: there is a function mapping each 8-bit RGB input value to the corresponding 8-bit output value of the corrected color (e.g. a neutral gray of 30,30,30 is mapped to something like 28,26,27 to compensate for the inaccuracy of the display). But what is the concept when using a 10-bit display? Are the 8-bit color values of the applications mapped to 10-bit values on the GPU? Can ArgyllCMS/dispcalGUI access the 10-bit LUT even if full 10-bit color is not available to applications?

    #19790

    Florian Höch (@fhoech)
    Administrator

    Calibration bit depth and application color management bit depth are independent of one another. Only if both the color managed application and the calibration use more than 8 bits internally will you get a banding-free result (as long as the display doesn’t introduce its own quantization artifacts/banding).

    nVidia recently added 10-bit per channel (30-bit color) support even for GeForce cards with their latest drivers (436.xx and newer), which is great news: it means that (e.g.) Photoshop can display banding-free color managed images (tested under Win10 with a GTX 1070), something previously restricted to Quadro (and AMD) graphics cards.

    #19791

    Bruno (@bruno)
    Participant

    Thank you for clarifying.



Display Calibration and Characterization powered by ArgyllCMS