Powers of Display Profiles


Viewing 6 posts - 1 through 6 (of 6 total)
  • #25888

    Wire
    Participant

    The documentation for DisplayCal is very clear in making a distinction between Calibration, bringing a display into conformance with a performance standard, and Characterization, an ICC profile which accurately describes the display’s behavior.

What are the limits of an ICC profile in overcoming aspects of display non-compliance?

First let’s get rid of the trivially absurd case: obviously a profile can’t overcome a display that is disconnected 🙂

    Or let’s say my display is incredibly low gamut (saturation turned all the way down), but I only use it for monochrome images.

Keep this thinking going and it leads to a heap paradox: how far from nominal does a display have to get before it becomes problematic?

Let’s say I had a display with a continuous response but odd color geometry. Hypothetically, suppose it shifted all color along some contour, say a 10-degree Munsell hue rotation (I know this is far from typical, but I’m at a loss to contrive a better example). Can ICC rendering overcome this and place image color back on the proper axis? Or say I grossly adjust the OSD hue control: can an ICC display profile overcome that?
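To make the hypothetical concrete, here is a minimal sketch (not anything DisplayCal does, just the thought experiment above in numbers) of what such a fault looks like: a constant hue-angle rotation in CIELAB that leaves lightness and chroma untouched. The function name and the 10-degree figure are purely illustrative.

```python
import math

def rotate_hue_lab(a, b, degrees):
    """Rotate a CIELAB (a*, b*) pair around the neutral axis.

    A stand-in for the hypothetical display fault: every colour
    shifted by a constant hue angle while lightness and chroma
    stay put.
    """
    c = math.hypot(a, b)                           # chroma C*
    h = math.atan2(b, a) + math.radians(degrees)   # hue angle h, shifted
    return c * math.cos(h), c * math.sin(h)

# a saturated reddish colour nudged 10 degrees around the hue circle
a2, b2 = rotate_hue_lab(60.0, 20.0, 10.0)
```

Note the operation is linear in (a*, b*) but decidedly non-linear once mapped back through Lab to device RGB, which is where the matrix-profile question starts to bite.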

Or here’s maybe a more legitimate scenario:

How far from an image’s color space does the display’s color have to be to cause the ICC pipeline to improperly render the image?

I did an easy experiment on macOS, where I run two displays side by side. One display is in a custom mode: hand-set D65 white, gamma 2.2 set by DisplayCal in the VCGT. On the other I chose the display’s built-in DCI-P3 Cinema mode (D60, gamma 2.6) together with a standard DCI-P3 D60 gamma 2.6 profile.

Switching one display from my custom mode to DCI-P3 causes an alarming, seemingly unbearable shift in response. But once the OS is informed with the DCI-P3 profile, everything comes around. With brightness normalized between the two (viewed in a bright daylight room), they become effectively identical. The whitepoint difference is minimally noticeable but not threatening to creative intent. In other words, these displays could be used side by side in very different modes, with ICC rendering (happily) really working and helping normalize the view.

I’d guess that many users, seeing these two very different display calibrations side by side, would feel they sacrifice nothing to the difference, so long as each display is properly characterized and ICC rendered.

So in terms of the heap paradox, what are the critical display performance parameters that ICC can’t help with, or that thwart it?

For example, as we are limited to matrix profiles by much software, a linear display RGB response seems pretty important. But how non-linear does it have to get before it’s a big problem? I often read about banding problems and about using XYZ LUT profiles made from high patch counts, but the remedies too often sound like sacrificing chickens, and some educators observe that the remedy itself becomes a poison. I’ll skip the details, but the lore is out there.

How do you talk in technical terms about the importance of calibration tolerances for such disparities, and avoid nonsense?

I came across a post here from a few weeks ago where a guy traded in his very high-quality and totally capable Dell for a BenQ because he couldn’t figure out whether he could get the Dell aligned correctly for YouTube video work. Having worked carefully with the little brothers of his Dell, and finding mine to deliver superb results, I felt bad reading it: he admitted that after hours of study he just threw money at a problem these tools are supposed to help solve, and in the end he had no reason to believe his alternative was any improvement! And he was encouraged to do so by a very reputable list member.

Part of the struggle is just garbled thinking. As an aside, I’ve seen internet forum discussions about Ethernet jack wiring standards go on for a hundred posts over whether to use T-568A or T-568B, with all sorts of mythical juju anecdotes attached. Yet somehow almost everyone ignores that in either case pins 1, 2, 3, 4… connect to pins 1, 2, 3, 4…; it’s only the wire sheath color assignments that vary. And for some reason now lost to the mists of Avalon, in either case the industry didn’t bother to spec the assignments in an easy-to-remember sequence, but mixed them up in the craziest manner possible that still retains a core logic.

Read up on BT.1886 and you will get a sense of what I’m driving at here.
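For anyone who hasn’t read it: the Rec. ITU-R BT.1886 reference EOTF fits in a few lines, and the measured black level enters the curve directly, which is much of the point. A sketch, with the white and black luminances as free parameters:

```python
def bt1886_eotf(v, lw=100.0, lb=0.1):
    """Rec. ITU-R BT.1886 reference EOTF (signal -> luminance).

    v  : normalised video signal, 0..1
    lw : screen luminance at white, cd/m^2
    lb : screen luminance at black, cd/m^2 (so the display's
         contrast ratio shapes the transfer function itself)
    """
    gamma = 2.4
    lw_r = lw ** (1 / gamma)
    lb_r = lb ** (1 / gamma)
    a = (lw_r - lb_r) ** gamma            # user gain
    b = lb_r / (lw_r - lb_r)              # user black lift
    return a * max(v + b, 0.0) ** gamma

# the same mid-grey signal lands on different luminances depending
# on the black level the curve is anchored to
mid_deep    = bt1886_eotf(0.5, lb=0.01)   # high-contrast panel
mid_shallow = bt1886_eotf(0.5, lb=0.5)    # low-contrast panel
```

By construction the curve passes exactly through (0, Lb) and (1, Lw), so two correctly calibrated displays with different contrast ratios legitimately render the same code value differently.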

In the USA we have some adages: “A little is good, more is better, too much is just right” and “Never underestimate the power of advertising TVs on TV.”

I invite discussion in this way because much more often than not, dialog on display performance mixes the concerns of calibration and characterization. It is often claimed that a display with a 1000:1 contrast ratio is completely unsuitable for some work, or that photography is very limited by the sRGB gamut, etc.; conventional wisdoms which I have come to regard as horseshit.

In the bigger picture (haha), my thinking heads to questions about industry directions, like super-wide gamut and HDR. In the best sense, I see that we are now crossing a new technology threshold to much higher levels of display performance, where metamerism is well understood, bright room conditions are no longer a hindrance to excellent rendering, and high resolution gets the display closer to a window in every sense. What’s important about this in the age of Neuralink™?

This also means that stuff that seemed exciting 10 years ago is now mature, cheap and very usable. I can’t say enough good things about how DisplayCal / Argyll have helped me get the most out of my kit. And it has me questioning some of the vernacular and tropes. When ICC tools work, it’s pretty great! But as things are ever further refined, it seems the differences are driving people mad.

*** Based on my last 6 months of working with this stuff every day, the one tidbit of advice I will offer newcomers struggling to get an alignment that makes sense: start with a quality colorimeter before changing anything else! Then keep your display configuration simple. ICC tools are supposed to do the heavy lifting, no?

    /wire

    #25911

    Vincent
    Participant

Matrix vs XYZ LUT is a macOS limitation, not a limitation of ICC-based systems. Same for some apps.
A single-curve TRC in the profile is a macOS limitation, not a generic one. Same for the Photoshop issues when it cannot use temporal dithering or a 10-bit path because the computer lacks the OpenGL driver feature that lets an app send 10-bit data to the driver (the driver may dither, or output n-bit data to the display, or both).
So nothing in this list is an “ICC-based system limitation”. Other apps can deal with all of it (like C1, LR, ACR, madVR…), and dithering is the magic word that makes things work as intended without rounding errors.

What to do when you cannot cover some portion of a colorspace is not an ICC-based system limitation, it’s a HW limitation (if we count internal gamut emulation in a display as HW too, i.e. outside the ICC-based system’s scope). It is solved by something called rendering intent: “cut” (colorimetric) vs “deform” (perceptual).

Some video apps are not color managed, so that guy’s Dell can be a reliable device, but if it cannot behave INTERNALLY like some arbitrary colorspace (gamut boundaries too, not only white & gamma) it cannot be used WITH THAT APP to show proper colors. Here we have first a software limitation, a huge one (not color managed), then a second HW limitation. But 1 comes before 2; 2 is only needed because of the lack of 1.

So, more or less, fast answers to your questions:
1st limit is the HW limitation on uniformity, since ICC supposes a uniform pixel response across the screen.
2nd limit is a poor implementation of an ICC-based system: THAT implementation, not the whole system as concept or standard.
3rd limit is apps not using it, or adding their own sources of rounding errors & truncation.
4th limit is HW lacking internal programmable calibration when apps are not color managed… and this is not an ICC-based system limitation.
5th limit is “vendor special taste” when dealing with “perceptual” intents; a good example may be HDR content mapped to a limited “upwards” WOLED TV which is out of gamut.

The only limitation of an ICC-based system as a “concept” is that it lacks uniformity mapping across the screen. All the others are specific implementation limitations imposed by a HW, OS or app vendor… or “special taste” in the perceptual tables a vendor chooses to build a LUT3D.

    #25940

    Wire
    Participant

Matrix vs XYZ LUT is a macOS limitation, not a limitation of ICC-based systems. Same for some apps.

My comment was made in full awareness of this. XYZ LUTs help when the device has considerable non-linearity or channel cross-over, e.g., printing inks and papers. But good displays don’t have this behavior, making the computational cost of a LUT profile unnecessary. There has to be an idea, somewhere in the back of the industry’s mind, that a display can be perfected for most intents and purposes, because unlike a printer the device is completely self-contained. The implications of device simulation and creative intents are a whole other matter, which is why the industry has fractured between ICC approaches and vendor-specific cube LUTs. Yes, the underlying principles are the same.
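For anyone following along, the entire model a matrix/TRC display profile encodes really is tiny: one tone curve per channel feeding a single 3x3 primaries matrix. A sketch, using the sRGB/D65 primaries matrix and a plain 2.2 power curve as stand-ins for the measured data a real profile would store:

```python
import numpy as np

# sRGB primaries / D65 white, linear RGB -> CIE XYZ
RGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def display_to_xyz(rgb, gamma=2.2):
    """Predict XYZ from device RGB (each 0..1) under the matrix model."""
    linear = np.asarray(rgb, dtype=float) ** gamma   # undo the tone curve
    return RGB_TO_XYZ @ linear                       # additive mix of primaries

white = display_to_xyz([1.0, 1.0, 1.0])   # lands on the D65 white point
```

Three curves and nine numbers suffice only when the channels are independent and additive; a display with cross-channel behavior breaks the model, which is exactly when an XYZ LUT earns its keep.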

A single-curve TRC in the profile is a macOS limitation, not a generic one. Same for the Photoshop issues when it cannot use temporal dithering or a 10-bit path because the computer lacks the OpenGL driver feature that lets an app send 10-bit data to the driver (the driver may dither, or output n-bit data to the display, or both).

I am not sure why you made the above point about Photoshop in this context. In a gesture of goodwill and non-sequiturs, I will include a quote from Graeme Gill that appeared on the ArgyllCMS list:

    tong chen wrote:
    > Hi developer, currently I have a 10bit  display, but my gpu only supports
    > 8bit. If I switch to a gpu that supports 10bit in the future, do I need to
    > recalibrate the display? Or is there a difference between the calibration
    > done under 8bit and 10bit?

    Hi,
    in theory there may be a difference due to the bit depth, but in
    practice it’s unlikely to be noticeable. The fact is that the repeatability
    of your display and any instrument is not of the same order as 8 bit or
    10 bit, so the quantization difference is literally lost in the noise.
    (And this is assuming that you are using an actual 10 bit capable display
    and are connecting it with a 10 bit capable interface.)
Any practical sort of calibration or profiling isn’t running through
    all bit codes – it would take far too long – so there’s typically nothing
    lost in sampling codes that are common between 8 bit and 10 bit.
    (And presumably you are using the dispread -Z parameter to properly
    use just 8 bit quantized test values).
    The only exception would be if your display is particularly non-linear at some point in its gamut, and where the extra 2 bits might really have some influence. But the calibration system is going to have trouble anyway, so the result isn’t likely to be that nice, leading to the conclusion that if you are after robust image quality, use a display that is reasonably linear thru-out its gamut.

So nothing in this list is an “ICC-based system limitation”. Other apps can deal with all of it (like C1, LR, ACR, madVR…), and dithering is the magic word that makes things work as intended without rounding errors.

I think there is a misunderstanding… My original question, more simply stated, is: how far out of normal tolerances does an 8-bit RGB display have to be to make an ICC approach unsuitable? I’m only half joking when I observe that if it’s unplugged, there’s no hope. And clearly for a display that’s monochrome, no hope 🙂 But as you approach commodity display conventions, what aspects really require an XYZ LUT profile just to bring display response to an industry standard? I was proclaiming how great it is that I can run two very different display personalities side by side on my Mac and they work very well together. Yes, we all get tripped up noticing differences. Eventually you come to realize that’s natural, because everything IS different. And yes, I can see the whitepoint difference between DCI-P3 Cinema D60 and sRGB D65 on these two Dell UP2716Ds I use, but once the Mac’s ICC regime has harmonized them, the promise of ICC (even under matrix profiles) appears fulfilled. They behave very compatibly.

So let’s come back around to the guy who took the advice to replace his Dell UP2716D with a BenQ. The Dell is well behaved. Why couldn’t an ICC profile get him what he needed? Why did he have to replace his display? His complaint was something like “…things seem reddish.” Maybe I missed something, but I don’t see how this led to replacing the whole display.

And as for any need to give the display a personality for a specific video workflow, say strict Rec. 709: he could have used an i1 Display Pro with Dell’s DUCCS (which is i1 Profiler underneath) to load a user slot with precisely the personality he needed, with precise gamut and tonal response and a null system profile. He can give it any personality he wants.

    What he needed was an i1 Display Pro, not a new display.

Then this recommendation of needing at least a 2000:1 contrast ratio got inserted. Maybe I wasn’t reading carefully enough, but in the end I don’t think anything got figured out.

What to do when you cannot cover some portion of a colorspace is not an ICC-based system limitation, it’s a HW limitation (if we count internal gamut emulation in a display as HW too, i.e. outside the ICC-based system’s scope). It is solved by something called rendering intent: “cut” (colorimetric) vs “deform” (perceptual).

Some video apps are not color managed, so that guy’s Dell can be a reliable device, but if it cannot behave INTERNALLY like some arbitrary colorspace (gamut boundaries too, not only white & gamma) it cannot be used WITH THAT APP to show proper colors. Here we have first a software limitation, a huge one (not color managed), then a second HW limitation. But 1 comes before 2; 2 is only needed because of the lack of 1.

The whole point of the Dell’s in-unit LUTs is to give it a whitepoint, tonal response and gamut that agree with any standard you can plug into Dell’s i1 Profiler package, and that package allows many options. I’ve tried it and it really works. Plus the Dell has more than enough gamut, and a 1200:1 contrast ratio is sufficient for many uses. The difference between 1200:1 and 2000:1 is less than one stop.
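The stop figure is just arithmetic. Assuming both displays are set to the same white luminance, the contrast ratios differ only in black level:

```python
import math

# Same white luminance on both displays, so the luminance gap
# between the two blacks, in photographic stops, is:
stops = math.log2(2000 / 1200)   # ~0.74 stops
```

Whether raw stops are the right yardstick for differences this close to black is, of course, a separate question.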

So, more or less, fast answers to your questions:
1st limit is the HW limitation on uniformity, since ICC supposes a uniform pixel response across the screen.

If all this comes down to Dell panel quality control, there is amazingly little evidence around to show what a terrible problem it is. I’ve owned 6 Dell IPS displays in the last 15 years and none has had a problem with this. I certainly have seen such problems, but they are uncommon; they are a panel-sourcing issue that affects all vendors who use IPS. The right thing to do is exchange the defective unit.

2nd limit is a poor implementation of an ICC-based system: THAT implementation, not the whole system as concept or standard.
3rd limit is apps not using it, or adding their own sources of rounding errors & truncation.
4th limit is HW lacking internal programmable calibration when apps are not color managed… and this is not an ICC-based system limitation.
5th limit is “vendor special taste” when dealing with “perceptual” intents; a good example may be HDR content mapped to a limited “upwards” WOLED TV which is out of gamut.

I understand. All of that supports a more general point, which is that this stuff is so complicated it drives people crazy.

The only limitation of an ICC-based system as a “concept” is that it lacks uniformity mapping across the screen. All the others are specific implementation limitations imposed by a HW, OS or app vendor… or “special taste” in the perceptual tables a vendor chooses to build a LUT3D.

Thank you for your answer. You’ve touched on some good points. What I want to know is which common features of display behavior are intractable to ICC. Certainly uniformity compensation is one. Let’s keep in mind that when these standards were first put together, CPU speeds were measured in tens of megahertz, and embedded microcontrollers were very simple, supporting only small-integer arithmetic. In a CRT, screen uniformity was handled by yoke geometry and convergence/purity magnets; the idea of compensating for display uniformity computationally was a wild dream. Today we have displays with local dimming that attempt contrast effects anyone with at least one eye can see look terrible. Why you would want that sort of feature inside the ICC regime is not obvious to me, and I have never come across such thinking.

Again, what common aspects of display performance can a system-level ICC approach not handle? And further, what necessitates an XYZ LUT profile when simply bringing a display into alignment with a consumer standard like Rec. 709?


    #25941

    Vincent
    Participant

You have a lot of misconceptions, as I read your post.

- What I wrote and what you asked Graeme are totally unrelated…
You asked whether there is a difference between sampling a few patches in 8-bit or 10-bit to make a profile… and there is not.
I wrote about the color difference between one grey and its next neutral neighbor after it goes through the whole color management pipeline. That is where consecutive rounding errors arise that translate into banding/posterization, caused by GPU LUTs (and their lack of dithering) or by truncation in the app/color management engine.

Some GPUs cause this kind of truncation, others not (dithering); some apps cause this kind of rounding error, others not (full 10-bit end to end… or dithering again). And they are visible.
And of course it is not an ICC-based system limitation as a concept… just poor implementations.
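The truncation-vs-dithering point is easy to demonstrate with a toy sketch. The curve below is a made-up calibration tweak, not any real GPU’s LUT; the idea is just to push an 8-bit grey ramp through a 1D curve once with plain truncation and once with random dither added before quantization:

```python
import numpy as np

rng = np.random.default_rng(0)

# An 8-bit grey ramp pushed through a 1D calibration curve, roughly
# the way a GPU LUT would apply it.  Hypothetical gamma tweak:
ramp = np.arange(256)

def curve(x):
    return 255.0 * (x / 255.0) ** (2.2 / 2.4)

truncated = np.floor(curve(ramp)).astype(np.uint8)              # no dithering
dithered  = np.floor(curve(ramp) + rng.random(256)).astype(np.uint8)

# Plain truncation collapses neighbouring input codes onto the same
# output code (visible as bands); dither trades that for noise whose
# average tracks the curve exactly.
lost_codes = 256 - np.unique(truncated).size
```

Spatial or temporal dithering does the same trick across pixels or frames instead of along a ramp, which is why it hides the rounding errors of even an 8-bit path.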

- Regarding QC… it’s not just QC. The same % tolerances will be more visible in wide gamuts, because colors and their next “+1” neighbors are further from each other.
You can have bought tens of Dell or LG sRGB-like displays and not suffered white/grey uniformity issues (I’ve excluded black screen bleeding)… but as you go wide gamut with an AdobeRGB+P3 backlight, even applying the same “percent tolerances” in QC, color distance becomes greater between neighboring colors on the same “X-bit mesh” covering a colorspace.
Also, the bigger the screen, the higher the chance of an issue.
So a:
- big (27-32″)
- wide-gamut
- low-cost (vs. same-featured “reference” competitors)
display is MORE prone to uniformity issues with the same factory “assembly QC” as the older models. As screens get bigger and backlights go to wider colorspaces, QC needs to be tighter than it was with 21-24″ sRGB-like displays.

Of course QC may also have slipped in recent years, so there are more and more issues, but just by going wide gamut and bigger, median uniformity gets worse at the same QC.

- Also, another correction to what you wrote: humans do not perceive light differences linearly (2x, 3x, 4x), so going from 1000:1 to 2000:1 may matter much more than “half the brightness in black” suggests. For the actual “human” difference you cannot use “stops”; you can check an L* table at some arbitrary white level, or check DICOM tables at a certain contrast window (the bigger one) up to the same reference. That will be the “actual” difference in human vision, not stops. That’s why L* was made and DICOM became a standard. I did not do the math for your example of 1200:1 vs 2000:1; you can use it as an exercise. Choose a reasonable white level (120-160 cd/m²), calculate where those two blacks land in L*, in DICOM, the “human color distance” between them, and between each and a “dark-black” reference in ambient (using several values), etc.
That would be a valid way to address this issue, not what I read above.
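Taking up the L* half of that exercise with the standard CIE 1976 formula (ambient flare ignored): since a contrast ratio fixes the black’s luminance relative to white, the absolute white level actually drops out of the L* calculation.

```python
def cielab_lstar(y, yn=1.0):
    """CIE 1976 L* of a relative luminance y against white yn."""
    t = y / yn
    if t > (6 / 29) ** 3:
        return 116.0 * t ** (1.0 / 3.0) - 16.0
    return 903.3 * t          # linear toe near black

# Both displays at the same white; only the blacks differ.
l_1200 = cielab_lstar(1 / 1200)   # ~0.75 L*
l_2000 = cielab_lstar(1 / 2000)   # ~0.45 L*
```

That puts the two blacks roughly 0.3 L* apart before any ambient flare is added on top of both.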

    #25945

    Wire
    Participant

    I must apologize Vincent, I’m stretching the limits of the forum.

That post to Graeme I included was not mine! It is a random coincidence that I copied/pasted it, because it has a specific, though oblique, bearing on the topic… I thought I was being clear, but obviously I’m not. So let me try to regroup and follow up with another message.

    #25946

    Vincent
    Participant

That post to Graeme I included was not mine!

    My fault then.

Regarding the main topic, the only color correction outside the capabilities of an ICC-based system is display uniformity.
I do not see other potential issues that cannot be covered by ICC, although particular implementations of such a system may have their own issues.
Well, there is one, about absolute/relative intents and chromatic adaptation, that Graeme covered on his site:
https://www.argyllcms.com/doc/ArgyllCMS_arts_tag.html
But IMHO the severity, if any, is very limited if ICC implementations just use Bradford.



Display Calibration and Characterization powered by ArgyllCMS