What is the RGB gray balance?


  • #11403

    Никита Горских
    Participant

    Hi!

    Can someone please explain in detail what the “RGB gray balance” graph in a profile verification report means, and how to interpret it? At first I assumed it should correlate with the color-temperature-against-gray-level graph, but now it seems that is not the case.

    The reason I ask is because when my screen was calibrated using an i1 Display Pro with a CCMX measured with the help of a spectrometer, the gray balance didn’t show significant deviations before calibration, and was reported to be virtually perfect after.

    However, when I later calibrated with my ColorMunki Display, the gray balance is reported to be somewhat unstable after calibration, although the color temperature graphs look OK in both cases and visually, I can’t see any significant differences in gray-scale between the two profiles. Please see the attached reports.

    Thanks!

    Attachments:


    #11428

    Vincent
    Participant

    “RGB gray balance” answers the question: “Do the grey-scale measurements track what is recorded in the profile’s TRCs?”
    “RGB gray balance (>= 1% luminance) combined Δa*00 and Δb*00 range” (also called “a*b* range”, or just “range”/“grey range” in other software and reviews) tells you whether there are big shifts between the measured data and the TRC data. It takes the maximum shift in Δa* and Δb* as a measure of the total/worst color shift in grey, hence the name “range”.
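    The “worst combined shift” idea can be sketched as follows. This is a minimal illustration, not DisplayCAL’s actual code: the sample values are made up, and plain CIELAB Δa*/Δb* stand in for the Δa*00/Δb*00 terms that the real report derives from CIEDE2000.

    ```python
    import math

    # Hypothetical (a*, b*) values for a grey ramp: what the instrument
    # measured vs. what the profile's TRCs predict for the same steps.
    measured = [(0.1, -0.2), (0.4, 0.3), (-0.2, 0.5), (0.3, -0.4)]
    predicted = [(0.0, 0.0), (0.1, 0.1), (0.0, 0.2), (0.1, -0.1)]

    def grey_range(measured, predicted):
        """Worst combined a*/b* shift across the grey ramp ("range")."""
        worst = 0.0
        for (ma, mb), (pa, pb) in zip(measured, predicted):
            da, db = ma - pa, mb - pb
            worst = max(worst, math.hypot(da, db))
        return worst

    print(round(grey_range(measured, predicted), 3))  # → 0.361
    ```

    A single number like this hides *where* on the ramp the shift occurs, which is why the report also plots the per-step graph.
    
    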

    If you check “Evaluate gray balance through calibration only:”, then the evaluation runs against a “perfect” neutral grey (a* = b* = 0) at the measured brightness (so dL* = 0).
    This is a way to check the calibration for “color tints” in the grey ramp (usually a magenta or green color cast, but sometimes blue). There are usually too few measurements to find “calibration-induced banding” caused by low bit-depth LUTs or a lack of dithering, so visual inspection is better for that task.
    With this option checked, “grey range” reports the actual color shift of grey relative to the color of white (255). In other words, it evaluates whether grey is neutral with respect to its white.
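    The calibration-only evaluation can be sketched like this (hypothetical Lab readings, not from any real report): the target is a* = b* = 0 at the measured L*, so only the chromatic components contribute to the per-step error.

    ```python
    import math

    # Hypothetical (L*, a*, b*) measurements of a grey ramp.
    grey_ramp = [(10.0, 0.3, -0.5), (35.0, 0.1, 0.4),
                 (60.0, -0.2, 0.2), (90.0, 0.4, 0.6)]

    def calibration_tint(grey_ramp):
        """Per-step distance from perfect neutral (a* = b* = 0).
        L* is taken as measured, so dL* contributes nothing."""
        return [math.hypot(a, b) for _, a, b in grey_ramp]

    for (L, a, b), tint in zip(grey_ramp, calibration_tint(grey_ramp)):
        print(f"L*={L:5.1f}  tint={tint:.3f}")
    ```

    A step with a large `tint` value is where a green/magenta (or blue) cast would be visible in the grey ramp.
    
    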

    ***

    For example, you could have an extremely accurate XYZLUT+matrix profile that stores the monitor’s behavior in high detail.
    The measurement report will plot very low dE, dC and so on (an exception is “assumed whitepoint”). Even the grey range will be low. That is because you are evaluating whether the profile matches the monitor’s calibrated behaviour, not whether the calibration itself is good.
    But such an accurate profile may store a bad calibration, with color tints in grey and huge green-magenta shifts across the grey scale. If you check “Evaluate gray balance through calibration only”, you will be able to see that in the report.

    Another example: if you have a visually good, neutral grey gradient in non-color-managed environments, without banding or other calibration artifacts, but you see a green/magenta cast in grey gradients in color-managed apps, then the report with “Evaluate gray balance through calibration only” checked may look good while the unchecked report may not (there are also situations where both look good and you still see banding or color casts in gradients in color-managed apps).
    In those situations, and in order to avoid color casts in gradients, you may want to use a “single curve” matrix profile, because those profiles store TRCs that are neutral (actual measured L* data, but a* and b* forced to 0).
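    The “a*b* forced to 0” idea has a neat consequence worth spelling out: with a* = b* = 0, the Lab-to-XYZ conversion collapses to a pure scaling of the white point, so every grey step in the TRC has exactly the white’s chromaticity. A minimal sketch, assuming a D50 white point:

    ```python
    # Assumed D50 profile white point (illustrative, not from the reports).
    D50 = (0.9642, 1.0000, 0.8249)

    def f_inv(t):
        """Inverse of the CIE Lab nonlinearity."""
        d = 6.0 / 29.0
        return t ** 3 if t > d else 3.0 * d * d * (t - 4.0 / 29.0)

    def neutral_xyz(L_star, white=D50):
        """XYZ of a neutral grey step (a* = b* = 0): with fx = fy = fz,
        X/Xn = Y/Yn = Z/Zn, i.e. the result is a scale of white."""
        ratio = f_inv((L_star + 16.0) / 116.0)
        return tuple(ratio * w for w in white)

    print(neutral_xyz(50.0))  # mid grey, roughly 18.4% of white
    ```

    This is why a single-curve profile cannot reintroduce a color cast through its TRCs: the tonal response varies, but the chromaticity of every grey step is pinned to white.
    
    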

    After a calibration and a measurement report like yours, it is very useful to draw a grey gradient in your preferred image editor and look for these issues. Some image editors perform better than others because they cause fewer rounding errors, use high bit-depth output paths, or do their own in-app dithering.
    If you don’t care about those artifacts, you can forget this paragraph.
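    If you’d rather generate the test gradient yourself than draw one, a simple way is to write a 16-bit greyscale ramp to a file and open it in the editor. A minimal sketch using the PGM format (stdlib only; filename and dimensions are arbitrary; note a PGM carries no embedded profile, so the editor decides how to interpret it):

    ```python
    import struct

    WIDTH, HEIGHT = 1024, 128  # arbitrary ramp size

    def write_grey_ramp(path):
        """Write a horizontal black-to-white ramp as a 16-bit binary PGM."""
        maxval = 65535
        # One row: each column's grey level, big-endian 16-bit per PGM spec.
        row = b"".join(struct.pack(">H", x * maxval // (WIDTH - 1))
                       for x in range(WIDTH))
        with open(path, "wb") as f:
            f.write(f"P5 {WIDTH} {HEIGHT} {maxval}\n".encode("ascii"))
            f.write(row * HEIGHT)

    write_grey_ramp("grey_ramp.pgm")
    ```

    Viewing the file at 100% zoom in a color-managed editor and in a non-color-managed viewer, side by side, is a quick way to separate calibration artifacts from profile artifacts.
    
    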

    ***

    P.S.:
    You are using different colorimeter corrections for different devices and different profiles, so it is difficult to make a direct comparison.
    IMHO, I would not trust that CCMX if it was created with a 10 nm graphic-arts spectrophotometer. Use a CCSS file created with that spectrophotometer for both of your colorimeters. One CCSS (just one) + each colorimeter’s firmware data (the “colorimeter observer” measured at the factory) = more comparable results.
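    For context on what a CCMX actually is: a 3×3 matrix mapping the colorimeter’s raw XYZ reading to the reference spectrometer’s XYZ for one specific display. A sketch with made-up matrix values (real CCMX coefficients come from measuring the same patches with both instruments on the same screen):

    ```python
    # Illustrative CCMX-style correction matrix (values are invented).
    CCMX = [
        [1.02, -0.01, 0.00],
        [0.01,  0.99, 0.00],
        [0.00,  0.02, 1.05],
    ]

    def apply_ccmx(xyz, m=CCMX):
        """Corrected XYZ = M · raw XYZ (plain matrix-vector product)."""
        return tuple(sum(m[r][c] * xyz[c] for c in range(3))
                     for r in range(3))

    print(apply_ccmx((0.50, 0.52, 0.55)))
    ```

    Because the matrix bakes in both the instrument’s filter errors and the display’s spectral characteristics, a CCMX is device-pair specific, whereas a CCSS stores the display’s spectral samples and lets each colorimeter’s own factory observer data do the correction, which is why one CCSS can serve both colorimeters.
    
    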

    #11432

    Никита Горских
    Participant

    Thank you! Very thorough explanation. I am going to process this for a little while now 🙂

    P.S. The i1 instruments do not belong to me, and I no longer have access to them. I bought the ColorMunki Display to try to correct some color tint and clipping in very high tones (close to white) that were not measured for the report. Strangely, in the end the problem could only be solved by playing with the contrast control on the display, hence the somewhat lower CR with my last calibration. Nevertheless, for now the results look good to my eye. I may return to this topic later, though.

    Further input on the topic would also be appreciated!

    Thanks!


Display Calibration and Characterization powered by ArgyllCMS