Dithering not enabled on macOS when using calibrated profiles?

#12853

    hugociss
    Participant

    Has anyone noticed that dithering appears not to be enabled on macOS when using non-default calibrated profiles on 8-bit monitors such as the MacBook Pro built-in display? Gradients appear band-free and smoother with the system default profiles than with calibrated profiles from either the ColorMunki Display software or DisplayCAL (the gradient images were viewed in Photoshop to avoid Apple's crushed-blacks bug in Preview, etc.).

    And is there a way to enable dithering for non-default profiles if this is the case?

    #12856

    Vincent
    Participant

    I think dithering is not enabled at all in OSX desktop color management, whatever profile you use. That's why this kind of complaint about TRCs (tone response curves) in profiles is so common among OSX users.
    System default profiles are likely to store an idealized response: perfectly neutral, infinite contrast, and simple power-law TRCs. In that situation the color management corrections applied to a gradient are minimal.
    The same applies to monitor driver ICM profiles (Windows) or auto-generated EDID ICC profiles (OSX).
    DisplayCAL lets you do the same when profiling: use idealized (less accurate) profiles with just one TRC and black point compensation. For example, a "Gamma + matrix" profile with BPC=ON should be close to the "fake" TRCs in Apple's default display profiles.
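
    To put rough numbers on that idea, here is a minimal Python/NumPy sketch (the gamma values are made up, not taken from any real profile): when the profile's TRC matches the simple power law the source content already uses, the per-channel correction collapses to the identity and 8-bit rounding costs nothing, while a measured TRC bends every value and rounding then collapses codes.

    ```python
    import numpy as np

    codes = np.arange(256) / 255.0   # normalized 8-bit input codes
    src_gamma = 2.2                  # assumed source encoding (illustrative)

    # Sketch of the correction: decode the source, re-encode through the
    # display profile's TRC (primaries/matrix ignored for simplicity).
    # Idealized profile: TRC identical to the source's power law -> identity.
    ideal = 255 * (codes ** src_gamma) ** (1 / 2.2)
    # Measured profile: a slightly different "real" TRC (2.35 is made up).
    measured = 255 * (codes ** src_gamma) ** (1 / 2.35)

    print(len(np.unique(np.round(ideal))))     # 256: every code survives rounding
    print(len(np.unique(np.round(measured))))  # fewer: some codes collapse -> banding
    ```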

    Also, default profiles store "no calibration" for the graphics card, so there is an identity relation between the RGB values meant to be sent to the display and the RGB values at the card's output.
    Custom display profiles (X-Rite software, DisplayCAL, etc.) usually store a non-trivial calibration to correct your display's behavior: that is why you created them!
    Unless your graphics card (the card that actually drives the outputs, which is important for laptops) has a LUT with more than 8 bits per entry and dithered output, calibration will lead to repeated values and truncation/rounding errors. Some cards have this feature (ATI/AMD), others don't (all Intel iGPUs AFAIK, and it is very likely to happen on NVIDIA cards over an 8-bit link).
    The only way to get rid of this is to use a monitor with internal HW calibration (working and accurate HW calibration, which is not so common and not so cheap) or a graphics card with a LUT with more than 8 bits per entry and dithered output.
    If the card driving the outputs in your Mac is not one of these, then choosing a white point near native and a gamma close to native may minimize these truncation errors.
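
    As a small illustration of that calibration truncation (Python/NumPy; the 92% blue scaling is a hypothetical whitepoint correction, not a value from any real profile):

    ```python
    import numpy as np

    levels = np.arange(256)                           # 8-bit LUT input codes
    blue_vcgt = 0.92 * levels                         # hypothetical calibration: scale blue toward a warmer whitepoint
    lut_8bit = np.round(blue_vcgt).astype(np.uint8)   # what an 8-bit-per-entry LUT can actually store

    print(len(np.unique(lut_8bit)), "of 256 blue codes remain")  # -> 236 (20 codes lost)
    # Distinct inputs now map to the same output, which shows up as banding.
    # A >8-bit LUT with dithered output, or a monitor applying the correction
    # internally, keeps the full tonal resolution.
    ```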

    #12857

    hugociss
    Participant

    Immensely informative and helpful reply!

    I am running on an integrated Intel Iris GPU, and after creating a Gamma + Matrix profile from existing Single curve + Matrix measurement data (albeit without black point compensation, as that option was greyed out for some reason), I can confirm that gradients now appear even smoother than with the uncalibrated system defaults.

    Now I'm wondering: when does macOS make use of dithering? Only in certain colour-managed applications?

    Also, out of curiosity, does this mean that when using a higher-end GPU with a LUT of more than 8 bits plus dithered output, and/or when calibrating a higher-end monitor with internal calibration, a Single curve + Matrix profile type won't cause any (additional) banding or gradient-smoothness artifacts?

    And somewhat off-topic, but is there a trend of WLED displays having native white points slightly cooler than 6500K? Anecdotal observations of a single Dell P2715Q and of various MacBook Pro displays in Apple Stores and in public suggest that nearly all have native white points around 7000K-7100K. I'm wondering if anyone has insight into that.

    #12863

    Vincent
    Participant

    Now I'm wondering: when does macOS make use of dithering? Only in certain colour-managed applications?

    Like any other OS, it uses dithering when an app requests GPU dithering. The app, not the OS. For example Lightroom or Capture One (OSX & Win).
    Photoshop on newer MacBooks is supposed to offer "10-bit-like" functionality with a dither feature thanks to the Intel iGPU OpenGL drivers written by Apple (Photoshop "believes" it has 10-bit functionality). Unfortunately this dithering trick (over an 8-bit link) is not enabled in the OpenGL drivers for AMD/NVIDIA consumer cards on Windows even though the hardware is capable of it (in Lightroom, for example), so you have to pay the Quadro/FirePro "ransom".

    Also, out of curiosity, does this mean that when using a higher-end GPU with a LUT of more than 8 bits plus dithered output, and/or when calibrating a higher-end monitor with internal calibration, a Single curve + Matrix profile type won't cause any (additional) banding or gradient-smoothness artifacts?

    There are two sources of error: LUT hardware and rounding errors from color-managed apps.

    The first is solved with HW calibration, or with DisplayCAL's calibration on a GPU with a >8-bit LUT and dithering. Let's call it "calibration truncation error".

    The second is caused by the app's rounding errors when doing color management (the single curve + matrix trick and others). Let's call it "color management truncation error".
    If an app uses a "10-bit" feature (Photoshop) or "10-bit-like" dithering features (LR & Capture One, or some video renderers on Windows/OSX), you can use more complex profiles with 3 TRCs and everything should look smooth.
    (Of course, extremely different per-channel TRCs in the profile may cause some unsolvable rounding errors, but that should be uncommon with a proper calibration.)
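
    A toy Python/NumPy sketch of why that dithering matters (the gamma numbers and the narrow tonal range are made up; this is not how any particular app implements it):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A smooth ramp over a narrow tonal range, held in float by a
    # colour-managed app after an illustrative per-channel TRC correction.
    ramp = np.linspace(0.40, 0.60, 1024)
    managed = ramp ** (2.2 / 2.4)

    plain = np.round(managed * 255).astype(int)   # straight truncation to 8 bit
    dithered = np.round(managed * 255 + rng.uniform(-0.5, 0.5, ramp.size)).astype(int)

    print("plain:   ", plain[:32])
    print("dithered:", dithered[:32])
    # The plain ramp shows long runs of identical codes (visible steps); the
    # dithered ramp alternates neighbouring codes, so the local average still
    # follows the smooth curve and the steps are no longer visible.
    ```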

    And somewhat off-topic, but is there a trend of WLED displays having native white points slightly cooler than 6500K? Anecdotal observations of a single Dell P2715Q and of various MacBook Pro displays in Apple Stores and in public suggest that nearly all have native white points around 7000K-7100K. I'm wondering if anyone has insight into that.

    I think that such a cool white is actually the native whitepoint for those WLED sRGB backlights. Native whitepoint = maximum contrast, and it's also cheaper to use the native white if it looks "white" (not pink or green), even if it is a cool white.
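
    For anyone who wants to sanity-check such readings, McCamy's approximation converts CIE 1931 xy chromaticity to correlated colour temperature. A short Python sketch; the second chromaticity pair is just an illustrative slightly bluer white, not a measured MacBook panel:

    ```python
    # McCamy's approximation: CCT from CIE 1931 xy chromaticity.
    def cct_mccamy(x, y):
        n = (x - 0.3320) / (0.1858 - y)
        return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

    print(round(cct_mccamy(0.3127, 0.3290)))  # D65 white -> ~6500 K
    print(round(cct_mccamy(0.3064, 0.3166)))  # illustrative cooler native white -> ~7000 K
    ```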
