Calibration Setting: Tone Curve Selection

  • #36919

    Wire

    What is the functional difference between:
    Tone Curve > Gamma 2.2 [1.8–2.x]
    and
    Tone Curve > Custom [1.8–2.x]

Can I use Custom to make a profile whose calibration (vcgt) differs from the tone curve used to do the characterization?

In macOS this becomes desirable because almost all graphics are subject to the total profile for the display, which means that sRGB/709 content is rendered literally according to the content TRC, without the assumption of the CRT EOTF and dim surround. This leads to a discrepancy with custom display profiles under macOS: content is rendered slightly too light. Although ultimately this may be a matter of taste, the key point here is that a “correct” alignment on macOS from DisplayCal literally pulls response slightly up and away from industry norms for viewing the de facto standards of sRGB, 709, and Display-P3.

This is a conundrum for a couple of big reasons:

1) Because the 1.1 correction described in the DisplayCal documentation for “Tone Curve / Gamma” assumes a Windows / Linux style of bias in decoding this standard content, through an implicit application of the vcgt to content that is otherwise presumed to be in sRGB chromaticities. To put it more simply, there’s a gap between the GPU cal and an internal OS predisposition to sRGB, unless an app brings something to the party.

2) On macOS overall, and in color-managed Windows / Linux apps, adding a rendering intent via the cal to apply the historical 1.1 decode correction in the display introduces this correction as a rendering error for content in colorspaces other than sRGB/709.
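To put rough numbers on that 1.1 correction, here is a quick Python sketch (my own arithmetic, using the commonly cited approximations, not anything from the DisplayCal docs): sRGB encoding is effectively a ~1/2.2 power; a video-style 2.4 decode then yields an end-to-end exponent of about 1.09, darkening mid-tones relative to a literal 1:1 rendering.

```python
# Rough numbers behind the ~1.1 end-to-end gamma for sRGB/709 viewing.
# Illustration only; the exponents are the usual approximations.

def srgb_encode(lin: float) -> float:
    """sRGB OETF (piecewise; effective exponent roughly 1/2.2)."""
    return 12.92 * lin if lin <= 0.0031308 else 1.055 * lin ** (1 / 2.4) - 0.055

mid_grey = 0.18  # scene-linear 18% grey

signal = srgb_encode(mid_grey)

# Literal decode: the display honours the content TRC exactly (what a
# CMM does when it maps sRGB content 1:1 into the display profile).
literal = signal ** 2.2

# Video-style decode: CRT-class 2.4 power, as assumed by 709 mastering.
video = signal ** 2.4

print(f"encoded signal       : {signal:.3f}")
print(f"decoded at gamma 2.2 : {literal:.3f}")    # ~0.18, 1:1 luminance
print(f"decoded at gamma 2.4 : {video:.3f}")      # darker mid-tones
print(f"end-to-end exponent  : {2.4 / 2.2:.2f}")  # ~1.09, the '1.1' correction
```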

    I am experimenting with creating display profiles on macOS in multiple steps:
    A) profile under 2.2 or L*
    B) profile again with 2.4
C) extract the cal from B and insert it into A, via ArgyllCMS iccvcgt (a rough sketch of this pipeline follows below)
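For concreteness, here is roughly how I would script steps A–C with the ArgyllCMS command-line tools. This is a sketch under my own assumptions: the dispcal/targen/dispread/colprof flags are from my reading of the man pages, and the iccvcgt invocation in particular is a guess, so verify everything locally.

```python
# Sketch of the A/B/C experiment using the ArgyllCMS CLI tools.
# Flags are from the man pages as I understand them -- verify before
# use; the iccvcgt arguments in particular are an assumption.
import subprocess

def run(*args: str) -> None:
    """Echo and run one command, stopping on failure."""
    print(">", " ".join(args))
    subprocess.run(args, check=True)

# A) calibrate to gamma 2.2, measure patches through that cal, build profile A
run("dispcal", "-v", "-g2.2", "calA")             # -> calA.cal
run("targen", "-v", "-d3", "patchesA")            # -> patchesA.ti1
run("dispread", "-v", "-k", "calA.cal", "patchesA")  # -> patchesA.ti3
run("colprof", "-v", "-qm", "-as", "patchesA")    # -> patchesA.icm

# B) repeat the calibration with a gamma 2.4 target
run("dispcal", "-v", "-g2.4", "calB")             # -> calB.cal

# C) graft B's calibration into A's characterization, so the installed
#    profile describes the 2.2 response while the vcgt drives 2.4.
#    (iccvcgt arguments are a guess -- check its usage output.)
run("iccvcgt", "-i", "calB.cal", "patchesA.icm")
```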

I am of course aware that making the calibration deviate from the characterization is bad form from a pure accuracy point of view, but the DisplayCal documentation explains that this is a reasonable approach for dealing with the implicit 1.1 gamma correction rendering intent built into sRGB and video. There is an unstated bias here toward the Windows / Linux trait that, unless an image is directly accommodated by color management in a specific app, Windows / Linux tend to render under the assumption of sRGB/709 chromaticities and the effective TRC of the display.

As a total aside, to me it seems notable that DCI-P3 for cinema and theater chose a TRC of 2.6 end-to-end, presumably to better provision signal towards shadows under the assumption of a completely controlled, dark viewing environment. For users of color-managed SW on PCs, a high-gamma alignment could help work around banding issues due to display limits, but only if the full display pipeline from renderer through display is aligned as such; IOW, the final decode needs to be done in the display itself. This naturally doesn’t work well for content subject to Windows / Linux implicit sRGB rendering assumptions, because the images will look wrong (too dark).

    So to return to my question:

Does the distinction of DisplayCal Tone Curve Gamma vs. Custom allow the cal to deviate from the characterization? Or is it just a UI oddity?

    #36923

    Vincent

Looks like the second. It’s easy to test: just look at the log files and the parameters passed to displaycal and colprof in both cases.

    #36950

    Wire

    Thx

Reading the colprof man page, there’s a -d option for viewing conditions (print or display CIECAM02 gamut mapping) as follows:

    -d            mt – Monitor in typical work environment
    mb – Monitor in bright work environment
    md – Monitor in darkened work environment
    jm – Projector in dim environment
    jd – Projector in dark environment
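If I understand the man page correctly, an invocation would look something like the sketch below. This is only my guess at how the pieces fit together: -S supplies the source colorspace that the gamut mapping is built for, -d the destination viewing conditions, and I believe a cLUT profile type is needed to hold the mapped perceptual tables. I haven’t verified these options in combination.

```python
# Sketch: rebuild a display profile from existing measurements with
# CIECAM02 gamut mapping for a darkened work environment. Assumes the
# colprof options as documented; verify against the man page.
import subprocess

subprocess.run(
    ["colprof", "-v", "-qm",
     "-al",              # Lab cLUT type, so perceptual tables can exist
     "-S", "sRGB.icm",   # source colorspace the mapping is built for
     "-dmd",             # md = monitor in darkened work environment
     "mydisplay"],       # basename of the existing .ti3 measurements
    check=True,
)
```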

    I’m totally ignorant of CIECAM02 so can’t have a serious conversation without more study. Any hints are most welcome.

It seems that maybe what I describe as ‘rendering intent’ in my messages, which I achieve by fudging vcgt gamma, is more comprehensively handled for the whole gamut volume inside the profile via gamut mapping?

This leads to questions about the limits of the -d option per profile type and so on. Can it work with shaper-matrix display profiles? I suppose this is a question for the Argyll forum, but I’m not at all versed in this topic.

I imagine DisplayCal’s “ambient” light measurement feature might work by considering several luminance ranges and picking one of these switches under the covers, which would end up including a TRC contour for the viewing conditions, as well as stretching response for a better fit than my fudge.

In support of my fudge, I will report that it makes DisplayCal alignments agree much more closely with Apple OOB alignments, which maybe is just stupid confirmation bias toward a wrong way of thinking… But using my Mac fudge, my display looks really good: not just per a measured standard of accuracy, it feels right, whereas without it everything looks slightly flat.

This pertains to the first question I ever submitted to these forums, about why Apple’s apparently well-aligned out-of-box (OOB) response was slightly more contrasty than a DisplayCal alignment. That discussion immediately veered into questions about the viability of my old DTP-94, which was fair game; I have no regrets at all about buying an i1 Display Plus, and gratitude for the help. But after a couple of years of wandering around this situation, all my investigations point to a distinction: Apple bakes a slight gamma bump into its desktop / laptop alignments to resolve the end-to-end assumption of sRGB / 709 video. My take on the phone is that Apple does not do this, possibly because it’s likely to be used in very bright ambient conditions. Anyway, the DTP-94 can in fact handle WCG and produces good alignments within the limits of more primitive spectral corrections (CCMX, which doesn’t belong to my instrument). As to how to trust white, clearly the i1 Display can deliver. But even with an i1 Display and a CCSS correction, it’s too easy to miss the mark. The variables are considerable and tricky going in. Only through experience have I gained a sense of what a good alignment looks like — and I’m probably still off the mark!

So I really feel for new users who hope to get a good alignment but don’t know what it looks like and get tripped up by many complex edge cases which seem trivial to experienced calibrators. I would never argue that this technology is overly democratized, but I am not issuing fan service when I claim that Apple does an amazingly good job of color OOB across all its offerings. As to why they persist with a severe glitch in LUT profile handling, I suspect this got reviewed and comes down to a performance limit in the graphics of older systems. They decided not to fix it because it was effectively impossible to fix, and it almost never matters to their market. But this doesn’t excuse the fact that macOS just ingests a valid display profile and barfs out completely wrong graphics response. They could at least issue a warning.

    Back on topic, re comparing Argyll options under the hood to DCal GUI tonal response choices: I’d really like to think more about what the right way to use the tools is, and think less about all the edge cases and lore… Garrr

    #36951

    Vincent

If you want a fake, inaccurate profile like Apple’s, it’s easy. Use the Apple profile baked from the EDID (which, for native gamut primaries, may be accurate for mid to high end displays), then embed the DisplayCAL profile’s VCGT using the new ArgyllCMS tool “iccvcgt”.

The current DisplayCAL synth profile editor allows custom RGB primaries (which can be copied from a custom DisplayCAL profile, matrix info) but AFAIK does not allow a custom curve from a text file. Maybe this is an upgrade you can ask Erkan for once he finishes the port.

Anyway, that “Apple way” won’t be accurate, because the TRC is what it is (measured), in the same way that the DisplayCAL profile idealization choosing “single gamma matrix” idealizes the TRC, not only in grey but in L* too.
Another option would be a DisplayCAL upgrade (ask Erkan) to add “Apple TRC + matrix” to the set of profile types as another idealized profile.

AFAIK these are the only ways to get a custom VCGT with Apple’s fake TRC.

    #37033

    Wire

Your idea of canning (making a config preset, putting the options into a can) for “Apple TRC” makes sense to me given that the way I’ve been describing my experience is Apple-centric. But I don’t think calling this an Apple-oriented tweak is a good idea.

    There’s a larger issue, which is that certain content spaces are overloaded with end-to-end assumptions about decode that cannot be handled in any one-size-fits-all calibration.

The sRGB spec is a content spec, not a display spec. It assumes the display cal is going to impose a correction at final decode. It’s too late to do anything about this, because the standard derives from a history of video in which the correction was built into the camera (content), and the display just operated according to its physics (native decoding). An sRGB display calibration with an sRGB TRC is basically wrong. But the way ICC CM works, there’s no easy way to account for this, because the display’s cal needs to travel with the content space, and display profiling solutions never overtly deal with this. If you run color managed and open a ProPhotoRGB or AdobeRGB document in Photoshop, the correct cal needs to agree with the content space, unlike sRGB and 709. This becomes a nightmare with DCI, because Cinema P3 assumes a literal luminance translation from the source into the display, while Display P3 injects the sRGB TRC and therefore its baggage. It looks to me like a huge mess!

In a similar vein, BT.1886 cannot be thought of as one alignment: it calls out the observation that the effective TRC in video display is, by dint of history, a function of both black level and power response, where display power response is known to vary across devices, and both it and black level are locally set, partly against an idealized camera (the PLUGE) and partly in accordance with local viewing conditions in the production studio. If the idea of “artistic sign-off” in production means anything, it’s subject to these variables. BT.1886 is designed to make sure this isn’t forgotten during an era of epic shift in display physics, given that at the time the video standards were laid down, decades more work was on the horizon for CRT, and solid-state display wasn’t even a pipe dream because the transistor was still being perfected!

Meanwhile, ICC for display seems not to have fully considered these factors, I assume because they are so video-centric. The idea of an implicit gamma 1.1–1.2 decode inflection at the display looks to me distinct from the ICC gamut mapping issues that arise in print. So are the newer hazards of non-linear display response typical of poorly designed and aligned LCD displays, which may have oddly lumpy response and hard clipping, especially after the user has tried to “correct” them to meet some video standard, such as by goofing around with RGB GAIN/OFFSET and HUE/SAT controls, which are often nothing more than skeuomorphism.

I believe the reason this doesn’t get more attention in calibration circles is basic laziness and misunderstanding: the inaccuracy that comes along with literal sRGB display calibration, or the conformance of sRGB content to a display space under the most literal TRC conversion—both of which are fundamentally inaccurate—is either not noticed by many users, disregarded, or reasonably chalked up to a matter of taste given viewing conditions that are by nature local.

If you look at consumer electronics, you see everywhere mismatches between the features and configurations of different vendors. And in production, the most common adaptation is to standardize gear against a common reference product. For example, everything HDTVtest Vincent reports assumes that a Sony broadcast unit is the reference. In another shop it might be Flanders or Panasonic. In cinema, Kodak and Sony-Panavision provide such de facto standards based on patterns of consumption. The whole point of DICOM is to end-run this situation. Meanwhile, UHD TV is making a bad situation even worse for the ICC-oriented approach.

I’ve maybe gone too far astray above, but only to build the point that adding a vendor-specific preset to DisplayCal sounds wrong, when the actual issue is that the display user needs to determine and select a suitable cal for his work based on both the kind of content he is editing and the historical assumptions of the editing environment.

The suggestion in the DisplayCal documentation to calibrate to 2.4 actually can never work on a Mac, because the display profile naturally accommodates the cal, and by default everything goes through the display profile. On the Mac, the video player may use a rendering library which itself includes a correction for 709, so the recursion becomes a mess.

These issues are everywhere in display tech, and getting worse, as can be seen in browser discrepancies, continually recurring bugs in the color management of embedded content, and the growing popularity of user-configured renderers to ensure a proofing view is not despoiled by OS / ICC assumptions about alignment.

Moreover, once all other features of the media pipeline are tuned for a content standard, you still have a concern for a local correction based on display tech and ambient conditions.

As for the claims of “fake” and “accuracy”, my question to you is: fake and accurate according to which content and rendering assumptions?!

    Ultimately, it looks like a matter of taste, so an advance to DisplayCal would be letting the user express his taste in a sane way that increases awareness of the tradeoffs without either fan-service or blame.

Detaching the cal from the characterization in a limited way seems possibly appropriate. I saw an old thread on the Argyll forum where I think Florian argued that no good would come of this and rejected the idea of iccvcgt. Today, this program is not included in the Argyll Mac binaries that come down with DisplayCal; I had to install Argyll myself using MacPorts to get access to it.

Long story short, I’m not on board with the vernacular of a “fake Apple alignment”, because it’s needlessly partisan.

    Will keep thinking about how to approach this.

Re the new DisplayCal maintainer: I’m happy to hear about this.

    Vincent, thank you for all the support you provide on these forums, you’re great!

    #37042

    Vincent

There are a lot of wrong assumptions in your text… and that is not new.

This is an Apple issue. Apple EDID profiles are fake, and they are fake on purpose due to the limitations of Apple’s color management engine. If you need to use a fake Apple ICC, then calibration apps should offer you that kind of idealization, in the same way they offer you a pure power gamma curve as the profile TRC.

    A profile is a device response description. Period. It has to capture actual device response with the best detail it can.

Dealing with limited contrast, out-of-gamut colors and such issues is not the responsibility of an ICC profile… it is a task for the color management engine and the rendering intent it is using. And rendering intents are not limited to 4 types; for example, Adobe’s relative colorimetric with BPC applies a perceptual correction to out-of-gamut near-black colors.

The issue here is the computational simplifications in color management engines, for example Adobe’s ACE on a common GPU.
If you use an extremely detailed ICC profile, an XYZ LUT cube/table with 3 independent TRCs per channel, it captures in great detail lots of irregularities, including small coloration in the grey ramp that in display colorspace (no color management) is almost invisible to the naked eye.
Due to brute truncations to 8 bit in the ACE + GPU output pipeline, this leads to color banding in the greyscale when ACE tries to correct the TRC to be neutral.
THIS IS NOT AN ISSUE OF THE ICC PROFILE (unless it is not accurate). This is an issue of the color management engine + GPU output pipeline.
Since we cannot alter or fix this for Adobe Ai or In, because we cannot use 10-bit OpenGL surface drawing like in Ps for some GPU drivers… we choose to FAKE our display profiles so that the LIMITED FUNCTIONALITY of the color management engine does not make such brute truncations. We choose an idealized (= less accurate) profile, like matrix + single curve + BPC… and if the display is well behaved, this idealized (= less accurate) profile can describe display behavior under some low dE, so we can use it. But the limitation is not in the ICC structure, it’s in the app that uses it.
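A toy Python sketch of that truncation effect (illustrative arithmetic only, not ACE’s internals): apply a mild per-channel TRC correction to a grey ramp and truncate to 8 bits, and input levels start to merge, which is the banding.

```python
# Toy model of greyscale banding from an 8-bit truncation in the
# engine + GPU output path. Not ACE internals; just the arithmetic.
import numpy as np

levels = np.arange(256)

# Per-channel correction a CMM might apply to neutralize a slightly
# irregular grey ramp (here: a mild gamma tweak, 2.2 -> 2.3).
corrected = 255.0 * (levels / 255.0) ** (2.3 / 2.2)

# Brute truncation to 8 bits, as in a GPU path with no dithering.
out8 = corrected.astype(np.uint8)

unique = len(np.unique(out8))
print(f"distinct output levels: {unique} of 256")  # < 256 -> visible banding

# Zero-difference neighbours show where two input greys collapse into one.
merged = int(np.sum(np.diff(out8.astype(int)) == 0))
print(f"merged neighbouring steps: {merged}")
```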

In the same way, Apple’s color management engine is extremely limited. Their target market is casual and values a fluid desktop over all. It is very likely GPU-assisted, with all the simplifications they can get. DUE TO THIS LIMITATION, ON THE ENGINE SIDE, they chose to prioritize the out-of-the-box experience for casual users, so a fake TRC and a matrix description with EDID primaries was their natural choice.

Whatever correction you need to apply due to a mismatch between an ideal profile for CONTENT (RGB 0 => L* 0) and the actual device response (like a display profile) should be on the engine side, in the (tweaked or not) rendering intent. It is not partisan, it’s a fact. It is stupid to argue on Apple’s side for their wrongdoings.
This applies to Adobe ACE too, in the 3-TRC example. But since we cannot change the engine, we choose to “fake” (idealize) profiles to overcome this ENGINE limitation.
And the MS ICM engine must have its faults too, although nobody cares about it.

So again, since this is an Apple-only issue, you can wait until Apple corrects it (unlikely, since they have a lot of other uncorrected issues in their engine) or fake the device ICC so that all the faults in the engine implementation go unnoticed or are minimized.

    #37048

    Wire

    Due to my rambling style I’ve led your thinking away from my primary observation:

The issue I’m facing is that the far-and-away most common content encodings, sRGB and 709 (and Display P3), are not intended to be viewed with a 1:1 luminance translation from the content to the display. Due to the history of video, the display is expected to attenuate the rendering under a dim-to-dark surround viewing assumption, which is a very common viewing condition, although phones and very bright TVs have changed this lately.

On the Mac, or in any CMM, when you display sRGB under ICC, the engine renders the content into the display space without the video decode assumption that’s part of the legacy of the sRGB (and broadcast video) standards. My understanding is that the need for this end-point correction is just a fact of history. Unfortunately, with a 1:1 luminance translation this renders sRGB content too bright (inaccurate, to be flippant, but as I keep saying it’s somewhat a matter of taste).

I know two places I can influence rendering to add the decoding gamma correction expected by sRGB: 1) in the vcgt, where I can set the cal to introduce the correction behind the back of the CMM (a sketch of this follows below); 2) by providing a hint to colprof which invokes a CIECAM02 rendering intent. I don’t know anything about how the latter works.
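For (1), the fudge is numerically simple. A minimal sketch, assuming the cal is a per-channel 256-entry lookup normalized to 0–1 (the names and shapes here are mine, not DisplayCal’s):

```python
# Sketch: fold an extra ~1.1 decode exponent into an existing vcgt ramp.
# Assumes the cal is a per-channel 256-entry lookup normalized to 0..1.
import numpy as np

EXTRA = 2.4 / 2.2   # ~1.09, the historical sRGB/709 decode correction

def fudge_vcgt(vcgt: np.ndarray, extra: float = EXTRA) -> np.ndarray:
    """Compose input**extra ahead of the existing calibration curve.

    If the original cal produced a gamma-2.2 response, the result
    behaves like gamma 2.2 * extra (~2.4), while the installed profile
    still characterizes the display as 2.2 -- the 'fudge'.
    """
    x = np.linspace(0.0, 1.0, len(vcgt))
    shifted = x ** extra                # darken mid-tones
    return np.interp(shifted, x, vcgt)  # re-sample the original cal

# Example: start from an identity cal (a linear ramp per channel).
identity = np.linspace(0.0, 1.0, 256)
fudged = fudge_vcgt(identity)
print(f"mid-tone entry moves {identity[128]:.3f} -> {fudged[128]:.3f}")
```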

My simple experiment was to do (1), observe the effect, and subjectively judge its merits. My opinion is that it makes my display look better for sRGB content, and renders the entire Mac UI more in line with Apple’s OOB look.

But I admit this cannot be a general approach to “improved” response, because tweaking the cal also affects rendering from content spaces for which a 1:1 luminance translation is expected, such as AdobeRGB, ECIRGB, and ProPhotoRGB.

However, lucky for me, accuracy on this point is partly a matter of taste! 🙂 Because the final rendering must of necessity be about the viewing conditions for the display. This is why TVs historically have “picture” and “brightness” controls.

My point about Apple’s OOB alignment is merely the observation that Apple has taken the need for an end-decode correction into account: their display profiles advertise sRGB to the CMM, and their cals add something close to the video decode correction required for standard display of sRGB.

    I’ve not tried to quantify the precise parameters of Apple’s alignments against the standard, and I think it’s not very important to me to understand exactly what they do.

    I’m more interested in the conundrum that ICC tech can’t deal with the long-standing implications that the display decode needs to vary with the content space because some spaces expect the display to provide an implicit correction and others don’t.

I see this as a more general problem of what I term, in my own made-up vernacular, “rendering intent” (as opposed to CMM rendering intent), because it so far seems to me that ICC + whatever CMM have not accommodated this dependency on the receiver (display) to provide the final attenuation. MAYBE I AM WRONG, but my story actually fits well with Windows color management, where that OS is historically passive WRT the total sRGB assumption: everything is assumed to be sRGB, and the display, which historically was a repurposed video monitor, is assumed to operate on its own (adding the final CRT power response gamma correction). Linux copied Windows.

So my issue, so to speak, is properly an Apple-related issue; I cannot argue with you here. But the reason is that Apple CMs the whole experience. My issue is actually introduced by me (!), the user, because by default DisplayCal creates a cal that’s incognizant of the decode correction needed for accurate rendering of sRGB! But what else can it do, as there’s no obvious general solution?

Florian explicitly acknowledges this in the documentation by writing (paraphrase) ‘Don’t assume that the proper alignment for sRGB / 709 is sRGB tonal response’. I’ll leave it to the diligent reader to verify this. But he also implicitly describes this from a Windows / Linux POV: on those systems the UI tends to let the display’s tonal response have the final needed effect on decode, assuming an implicit sRGB regime. This cannot work on the Mac, but that’s simply an omission. And on Windows / Linux, once you are considering a specific application, that’s out of scope of the DCal documentation. So he’s completely correct in the writeup, but there’s more to the story that the user has to figure out.

    Conclusion: It’s not obvious how to handle this in a general way. It’s a messy situation. I’m posting just to learn and encourage dialog. My kluge of inserting a final decode gamma correction via VCGT is working well for my very narrow self-serving use cases 🙂 Maybe others will experiment and add thoughts.

    #37058

    Vincent

My point about Apple’s OOB alignment is merely the observation that Apple has taken the need for an end-decode correction into account: their display profiles advertise sRGB to the CMM, and their cals add something close to the video decode correction required for standard display of sRGB.

In the wrong way (Apple), as usual. It is the responsibility of the engine + rendering intent, not the profile’s. Profiles describe devices. We only idealize profiles when the CMM engine fails or performs slowly.

    I’m more interested in the conundrum that ICC tech can’t deal with the long-standing implications that the display decode needs to vary with the content space because some spaces expect the display to provide an implicit correction and others don’t.

No: it is not about profiles, as I have said several times.

So my issue, so to speak, is properly an Apple-related issue; I cannot argue with you here. But the reason is that Apple CMs the whole experience. My issue is actually introduced by me (!), the user, because by default DisplayCal creates a cal that’s incognizant of the decode correction needed for accurate rendering of sRGB! But what else can it do, as there’s no obvious general solution?

You are wrong again. It is not the profile’s responsibility. Regarding calibration, the actual ArgyllCMS TRC (!= the profile’s; I mean the actual, measured one) obtained after calibration AFAIK prioritizes neutral grey and tonal separation over strict TRC value in the lower end.

In a color managed environment this does not matter at all, so it is not an issue with calibration. It is an issue with a faulty color management engine.

A faulty color management engine (because Apple’s preference is its casual customers) is different from no color management at all on the desktop.

Florian explicitly acknowledges this in the documentation by writing (paraphrase) ‘Don’t assume that the proper alignment for sRGB / 709 is sRGB tonal response’. I’ll leave it to the diligent reader to verify this. But he also implicitly describes this from a Windows / Linux POV: on those systems the UI tends to let the display’s tonal response have the final needed effect on decode, assuming an implicit sRGB regime. This cannot work on the Mac, but that’s simply an omission. And on Windows / Linux, once you are considering a specific application, that’s out of scope of the DCal documentation. So he’s completely correct in the writeup, but there’s more to the story that the user has to figure out.

    I think he meant what I wrote in the previous quote.

Conclusion: It’s not obvious how to handle this in a general way. It’s a messy situation. I’m posting just to learn and encourage dialog. My kluge of inserting a final decode gamma correction via VCGT is working well for my very narrow self-serving use cases 🙂 Maybe others will experiment and add thoughts.

It is obvious where to do this if you want to do it the right way: CMM + rendering intents.
But since they (Apple) are unlikely to change it, this leads to “FAKE” profiles like the ones Apple makes from the EDID. They made the mistake; they “hack” the profile rules (make an inaccurate profile) to solve their own issue and keep a fluid desktop with a simplified CMM.

    #37060

    Vincent

Of course others do the fake/simplified profiles too. And we do it in DisplayCAL, as users, on purpose.
For example, Eizo and others know the precision limitations in Adobe & other software, so since they make displays that behave in a very good way, they make idealized profiles (akin to matrix + simple curve) in their calibration software. “They put the band-aid on before the scratch.”

But this does not absolve the color management engine’s bugs & issues. This is our way of avoiding the issues by making “fake” profiles, so we can move on and work with our devices.

So it’s silly to justify Apple’s wrongdoings. Their auto-generated profiles are FAKE, so since Apple won’t change its faulty engine, you have to fake your own custom profiles if you wish the same response as the fake, inaccurate ones auto-generated by Apple.
Maybe a binary hex editor and copy & paste of your desired TRC would do the job, since it seems that no calibration package allows you to “idealize” display ICC profiles in that way.
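Something like this Python sketch could do that binary edit for the simple case where the profile’s TRC tags are single-gamma ‘curv’ entries. The offsets follow the ICC v2 layout as I understand it, but treat it as a starting point, not a tested tool.

```python
# Sketch: overwrite single-gamma TRC tags in an ICC profile in place.
# Handles only the simple case: rTRC/gTRC/bTRC stored as 'curv' with
# count == 1 (a u8Fixed8 gamma value). ICC v2 layout assumed; untested.
import struct

def set_gamma(path: str, gamma: float) -> None:
    with open(path, "r+b") as f:
        data = f.read()
        # Tag table: uint32 count at byte 128, then 12-byte entries
        # of (signature, offset, size), all big-endian.
        (tag_count,) = struct.unpack_from(">I", data, 128)
        for i in range(tag_count):
            sig, offset, size = struct.unpack_from(">4sII", data, 132 + 12 * i)
            if sig not in (b"rTRC", b"gTRC", b"bTRC"):
                continue
            type_sig, _reserved, count = struct.unpack_from(">4sII", data, offset)
            if type_sig != b"curv" or count != 1:
                raise ValueError(f"{sig!r}: not a single-gamma curve")
            # The single entry is a u8Fixed8Number: value = gamma * 256.
            f.seek(offset + 12)
            f.write(struct.pack(">H", round(gamma * 256)))

# e.g. set_gamma("profileA.icc", 2.4)  # hypothetical file name
```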

    #37065

    Wire

    I agree that the proper place to do the correction is in the CMM.

    Do you know of any CMM that actually does this?

It’s not an option for me, but I also would not go so far as to blast Apple for not including it, as follows: the proper place for the correction is not necessarily the best place for it from an engineering POV:

• Why expensively compute what you can get for free from the physics of the device? Historically the correction circuitry was put in the camera for precisely this reason: one camera with a cost bump at the point of production, where the cost is easily borne, serving millions of TV sets, whose native physics are sold on price.

• The history of the correction is literally an end-device, per-R-G-B-channel correction, with its attendant effect of an apparent increase in saturation as well as contrast. The vcgt is already there and operates for free. Tweaking it to achieve this correction is precisely what it was intended to do, and it produces precisely the right effect.

• The correction itself is nearly a fudge: its perceptual value is real, but arguably purely subjective, and possibly unwanted in some (bright) environments. I mention this only because it’s about design conundrums in engineering, and so about ways of thinking, not just pragmatics. History has already spoken and we live with it, but we can imagine explanations in the context of history to open doors to thoughts of the future.

• Computing today is 10,000x cheaper than it was when video signaling was invented, but each step in the arc of progress was taken from the one before it. You can argue that Apple’s CMM is incomplete, but the CMM is a funny module, because once it was adopted, so much content value transits over its behavior that it can’t be easily changed without big risks. For example, Adobe After Effects (the program) is known to be unable to evolve to use the GPU, because differences in GPU calculations lead to different output, which wrecks workflows with a lot of “customer value” stored in previous work. An analogy to this conundrum is web browsers’ default rendering into display space after ICC had already been embraced: expensive site designs based on Adobe Flash matched color numerically between page layout and embedded media. When browsers started rendering sRGB, it no longer matched the color from media plugins. This led to a lot of costly support calls. You could justifiably argue Adobe (the user) is wrong! But they’re a big rich business with account bottom lines partly funding the web revolution. And so on. History is full of these “compromises.” My point is that changing a CMM is a big deal to its vast installed base!

• You and I are powerless to affect the CMM. The profile is our point of control.

    So I seriously can’t imagine how you could be any more dumb and useless in your arguments. Honestly—and I’m not putting you down—you’re just dumb and wrong in your points.

    #37066

    Vincent

    I agree that the proper place to do the correction is in the CMM.

    […]

    So I seriously can’t imagine how you could be any more dumb and useless in your arguments. Honestly—and I’m not putting you down—you’re just dumb and wrong in your points.

Apple fanboy on a rampage. Nothing new: every error they make to overcome their design limitations (speed for a fluid desktop, since the core of their customers are casual and not oriented to graphic arts) is seen by these people as miraculous p**p fallen from the heavens to the earthly world of colours.

But the truth is those issues are not the ICC profile’s responsibility because, again, an ICC profile is just a description of how a device behaves.
You do not seem to understand or accept where the flaw is… because, as an Apple fanboy, recognizing this flaw is an act of apostasy.
Meanwhile, other people know (even by trial & error) CMMs’ limitations, like Adobe’s (the 3-TRC issue & the no-10-bit-OpenGL truncation). So in these situations we choose ON PURPOSE to FAKE display response, as long as these idealized profiles’ error to the actual response falls under X dE, but we know that they are fake and WHY we make them fake (or we should; IDK about you).
OTOH you believe this miraculous p**p fallen from Cupertino’s heavens is flawless and for your own sake.

    Quite different approach to the problem.
