Some bugs and quirks I found…

  • #9816

    Stephan
    Participant

    Hello there,

    I have made it a hobby to try and calibrate any random piece of crap using my i1 DisplayPro (and being a sysadmin for a few dozen office folks, I come across my fair share, generally on anything from XP to Windows 7 machines). I have been using DisplayCAL for a while with generally good results, but as fellow i1d3 users might know, the white point might as well be determined by rolling some dice (or at the very least, it is rather higher than it should be), plus I wasn’t sure about the accuracy of imported correction files, so I obtained a trusty old i1 Pro recently (for plain budget reasons). That gave rise to some frustration, not all of which is down to the instrument:

    1. (unrelated) If calibration/profiling ever crashes due to running out of memory after having taken all the measurements (which it seems to do about 2 times out of 3 on a crappy Vista notebook with just 2 gigs of RAM, and even systems with 4 gigs have been gasping for breath a bit lately), there is no way of recreating the exact same result using the “create profile from measurements” function. You generally end up with a profile where none of the LUT compensation curves reach 255, even if you had specified using the original brightness, and you have to retake the entire measurement. FAIL.
    2. When setting up the measurement window, the OK button is not selected by default. Now imagine having an i1 Pro on your screen: the instrument is quite long and generally covers that area, so I have to try and hit the button blind, with about a 50/50 hit rate. Annoying.
    3. I’m not sure whether it’s because I’m using adaptive mode, but the instrument quite regularly seems to be initialized twice within a short period of time. That in turn means I also have to take off the i1 Pro and put it on the calibration tile twice, which is aggravating. If that’s just how it is, oh well, but if there is anything to be done about it, this would be appreciated. Oh, I just noticed there’s an advanced option “allow skipping spectrometer calibration”, that sounds interesting for sure.
    4. Could the “interactive screen adjustment” dialog make it a bit more obvious when the instrument is still being initialized? Like bright red font or something? I’m always catching myself trying to click “start measurement” before it’s ready. The contrast between active and inactive buttons also is too low there IMHO. Consider consulting WCAG and the like.
    5. What’s screwed up if, when trying to create a colorimeter correction, I can take the measurement fine, but confirming the dialog gives me this error message (attached) and I have to retrieve the CCSS file from the temp directory manually? It’s only a single machine doing this, using DisplayCAL 3.3.5 and ArgyllCMS 2.0.0 x64 on Win7 x64 (I’ve had another where the same combination works fine). I suspect some old remnants from dispcalGUI, but the “Reset settings” option did nothing, at least. If nothing helps, I’ll try uninstalling / reinstalling.
    6. Speaking of that dialog, what sort of display technology is “VPA”? 😉 I only know PVA and MVA and derivatives…
    7. The fancy gamut thingy in HTML reports does not display properly in SeaMonkey, though it does in Firefox. Not sure why. The former used to have some JS issues, but those were solved AFAIK.

    Oh, and you know what I would find super handy? Some way of tweaking an existing profile iteratively from a set of verification measurements. In my experience the white point may still deviate from the desired color temp curve especially when a fair bit of correction was needed (and without any monitor controls, e.g. on a notebook, that means you’re screwed), and having dealt with my fair share of crummy displays, I would like to be able to iron out their quirks a bit better. Two examples of crummy displays are attached, sharing their native love for blues, where calibration results in overly low levels on bright blue tones instead, and I would hate not to have a decently accurate sky (one of them is the monitor my dad uses to look at his photos, and he takes a lot of them). (The same issue also occurs on Samsung 156HT / Dell E6520 and LG 156WH4 / E6530. Man, those Samsung TN panels have some terrible vertical viewing angle dependency, but that’s just an aside.) But maybe ArgyllCMS just can’t do any better, I don’t know.

    To my disappointment, I also found that a CCSS correction, while it does improve accuracy a fair bit in some cases (the old GX710 CCFL job with ~60% sRGB coverage was one of the more drastic ones), will not do anything about the instrument’s white point skew. You have to use a matrix (CCMX) correction instead, which mostly gets it sorted.

    Feel free to point out any instances of PEBKAC; I love tinkering with stuff, but I’m far from an expert, and color management is pretty daunting. My idea of what the various kinds of profiles (XYZ, inverse, 3DLUT) can do and when best to use them is still rather hazy, to say the least, though I guess that’s true for most people. Is XYZLUT + MTX fine for browsers, XnView and MPC-HC (Enhanced Video Renderer)? I noticed that video folks seem to like their 3DLUTs. Will applications take LUT corrections into account and detect them reliably, even when using the DisplayCAL profile loader? Things seem to look fine (if not entirely banding-free) in Firefox on an E6520, so I guess so.

    #9882

    Florian Höch
    Administrator

    Hi,

    (unrelated) If calibration/profiling ever crashes due to running out of memory after having taken all the measurements (which it seems to do about 2 times out of 3 on a crappy Vista notebook with just 2 gigs of RAM

    That should not happen if the system is stable. Low memory is no concern unless you restrict or even disable the operating system swap file. 1.5 GB of RAM is enough.

    When setting up the measurement window, the OK button is not selected by default.

    It is, but wxPython has had a few bugs in the past, and it seems the focus needs to move from the window to the contained panel for the contained controls to be focusable (the measure button is the default).
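
    For illustration, here is a minimal wxPython sketch of the kind of focus handling involved (illustrative only, not DisplayCAL’s actual code; the widget names are made up):

    ```python
    import wx

    class MeasureFrame(wx.Frame):
        """Toy measurement-window dialog illustrating the focus quirk described above."""
        def __init__(self):
            super().__init__(None, title="Measurement area")
            panel = wx.Panel(self)
            sizer = wx.BoxSizer(wx.VERTICAL)
            measure = wx.Button(panel, wx.ID_OK, "Start measurement")
            measure.SetDefault()   # Enter should activate this button...
            sizer.Add(measure, 0, wx.ALL, 10)
            panel.SetSizer(sizer)
            panel.SetFocus()       # ...but only once focus has moved into the panel
            self.Show()

    if __name__ == "__main__":
        app = wx.App(False)
        MeasureFrame()
        app.MainLoop()
    ```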

    I’m not sure whether it’s because I’m using adaptive mode, but the instrument quite regularly seems to be initialized twice within a short period of time. That in turn means I also have to take off the i1 Pro and put it on the calibration tile twice, which is aggravating.

    This happens due to automatic output levels detection. You can turn that off in the advanced options. For colorimeters it is not problematic though, just for spectrometers in case “Allow skipping of spectrometer calibration” is not enabled (but you shouldn’t use a spectrometer for prolonged measurement sessions anyway if you can avoid it, as low light accuracy and drift are a real problem).

    What’s screwed up if, when trying to create a colorimeter correction, I can take the measurement fine, but confirming the dialog gives me this error message (attached)

    Hmm, can’t reproduce this.

    Speaking of that dialog, what sort of display technology is “VPA”? 😉 I only know PVA and MVA and derivatives


    It matches X-Rite nomenclature from their EDR files for technical reasons.

    Oh, and you know what I would find super handy? Some way of tweaking an existing profile iteratively from a set of verification measurements.

    I see little point in that. If the underlying device response isn’t stable, then no amount of tweaking will be able to fix it. Display devices are too linear for any of that to have a discernible impact.

    In my experience the white point may still deviate from the desired color temp curve especially when a fair bit of correction was needed

    In terms of delta E? (Correlated) color temperature is just useful for determining the reference CIE values you’re aiming for (and that’s how it is used during calibration).
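
    To make that concrete, here is a small Python sketch of one way a correlated color temperature can be turned into reference CIE values, using the standard CIE daylight-locus approximation (whether the daylight or blackbody locus applies depends on the chosen target, so treat this as an illustration rather than DisplayCAL’s exact code):

    ```python
    def daylight_xy(cct):
        """CIE daylight-locus chromaticity for a correlated color temperature in Kelvin
        (approximation valid for roughly 4000-25000 K)."""
        t = float(cct)
        if t <= 7000:
            x = 0.244063 + 0.09911e3 / t + 2.9678e6 / t ** 2 - 4.6070e9 / t ** 3
        else:
            x = 0.237040 + 0.24748e3 / t + 1.9018e6 / t ** 2 - 2.0064e9 / t ** 3
        y = -3.000 * x ** 2 + 2.870 * x - 0.275
        return x, y

    x, y = daylight_xy(6500)
    X, Y, Z = x / y, 1.0, (1 - x - y) / y   # reference XYZ, Y normalized to 1
    print(round(x, 4), round(y, 4))         # ~0.3128, 0.3292, i.e. close to D65
    ```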

    where calibration results in overly low levels on bright blue tones instead

    That’s to be expected if calibration has to reduce the blue channel. It limits the gamut. The solution is to calibrate to the native whitepoint (or something close to it) instead.
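
    As a toy illustration (all numbers invented): if hitting the target white requires scaling the blue channel down to, say, 88% of full drive, the calibration curve tops out there and the brightest blues can no longer be reproduced.

    ```python
    # Hypothetical figures only: the blue calibration curve is capped at 0.88 of
    # full drive to reach the target white point, so full-scale blue input no
    # longer produces full-scale output.
    blue_gain = 0.88
    requested = [0.25, 0.50, 0.75, 1.00]               # normalized blue input levels
    calibrated = [round(v * blue_gain, 2) for v in requested]
    print(calibrated)                                  # [0.22, 0.44, 0.66, 0.88]
    ```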

    To my disappointment, I also found that a CCSS correction, while it does improve accuracy a fair bit in some cases (the old GX710 CCFL job with ~60% sRGB coverage was one of the more drastic ones), will not do anything about the instrument’s white point skew.

    Sounds weird. The main point of any form of colorimeter correction is to make the instrument better match a more accurate (reference) instrument, and it has the biggest impact on neutrals. It is imaginable though (and in my experience common) that a matrix correction will provide a better match, because the CCSS relies on the accuracy of the instrument filter spectral curve data that is stored in the instrument (which may or may not accurately represent the actual instrument response). The benefit of a CCSS correction is mainly that it can be used with any i1D3 (on a display with similar panel technology).
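
    For reference, a matrix (CCMX) correction conceptually boils down to a 3×3 matrix applied to the colorimeter’s XYZ readings; here is a minimal sketch with made-up matrix values (a real CCMX is derived from paired colorimeter and reference-instrument readings of the same patches on the same display):

    ```python
    import numpy as np

    # Made-up 3x3 correction matrix, for illustration only.
    ccmx = np.array([
        [1.02, -0.01, 0.00],
        [0.01,  0.99, 0.01],
        [0.00,  0.02, 1.05],
    ])

    raw_xyz = np.array([95.0, 100.0, 108.0])   # colorimeter reading of a white patch
    corrected_xyz = ccmx @ raw_xyz             # reading mapped toward the reference instrument
    print(corrected_xyz)
    ```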

    Is XYZLUT + MTX fine for browsers, XnView and MPC-HC (Enhanced Video Renderer)?

    Yes, although the only browser that does color management somewhat consistently is Firefox (with gfx.color_management.enablev4 set to true, as well as gfx.color_management.mode set to 1).

    Will applications take LUT corrections into account and detect them reliably, even when using the DisplayCAL profile loader?

    The point of the calibration curves (which are basically part of pre-profiling display adjustment, just via the video card gamma table hardware) is that applications need not know, care about, or interact with them. The profile loader ensures that the calibration will be re-applied should another program, system event, or graphics driver quirk reset the video card gamma tables.
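
    As an aside, the video card gamma tables in question can be inspected directly; here is a rough, Windows-only Python/ctypes sketch (illustrative only, not the profile loader’s actual code):

    ```python
    import ctypes
    from ctypes import wintypes

    # Read the current video card gamma ramp via the standard GDI calls.
    user32 = ctypes.windll.user32
    gdi32 = ctypes.windll.gdi32
    user32.GetDC.restype = wintypes.HDC
    gdi32.GetDeviceGammaRamp.argtypes = [wintypes.HDC, ctypes.c_void_p]

    hdc = user32.GetDC(None)                  # device context for the primary display
    ramp = (wintypes.WORD * (3 * 256))()      # 3 channels x 256 entries, 16 bit each
    if gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
        red = list(ramp[0:256])
        print("first/last red entries:", red[0], red[-1])   # a linear ramp ends near 65535
    user32.ReleaseDC(None, hdc)
    ```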

    #9883

    Florian Höch
    Administrator

    What’s screwed up if, when trying to create a colorimeter correction, I can take the measurement fine, but confirming the dialog gives me this error message (attached)

    Hmm, can’t reproduce this.

    Alright, figured it out. This problem only exists on Windows, and only if the path of either the reference or colorimeter measurement file contains a directory that starts with a number, e.g. C:\Users\1User\AppData\Roaming\DisplayCAL\storage: the \1 then gets interpreted as a backslash escape referring to a regular expression group number. This is easily solved by escaping the backslash.
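
    A minimal Python sketch of the failure mode (the pattern and strings here are placeholders, not the actual DisplayCAL code):

    ```python
    import re

    # A Windows path used as a re.sub() replacement string: the "\1" inside it is
    # parsed as a group reference instead of a literal backslash plus digit.
    path = r"C:\Users\1User\AppData\Roaming\DisplayCAL\storage\measurement.ti3"

    try:
        re.sub("PLACEHOLDER", path, "some text containing PLACEHOLDER")
    except re.error as exc:
        print("re.sub failed:", exc)   # invalid group reference

    # The fix: escape the backslashes so the replacement is taken literally.
    safe_path = path.replace("\\", "\\\\")
    print(re.sub("PLACEHOLDER", safe_path, "some text containing PLACEHOLDER"))
    ```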

    #9889

    Stephan
    Participant

    Interestingly enough, it’s not exactly that, but definitely something to do with the path. I renamed the “i1 Pro &…” directory to no avail, moved it to My Documents or directly into my user directory with equally little success, but dropping it right under D:\ made things work. Methinks it doesn’t like my user name on that system, or rather its post-8.3ness… or perhaps the customary tilde in its 8.3 representation for some reason (why is it being resolved to 8.3 anyway?). I guess I’ll try playing Waltzing Matilda for it next. 😉

    At least that would explain why it’s a Windows-only issue – other systems don’t concern themselves with this 8.3 business. Maybe there are some subtleties to be aware of when resolving paths. I vaguely remember something to this effect occasionally tripping up applications ported from the *IX world due to some unique eccentricities in system functions.

    PS – I hope you had some great (and well-deserved) holidays! 🙂

    #9891

    Stephan
    Participant

    Now for part deux

    Hi,

    (unrelated) If calibration/profiling ever crashes due to running out of memory after having taken all the measurements (which it seems to do about 2 times out of 3 on a crappy Vista notebook with just 2 gigs of RAM

    That should not happen if the system is stable. Low memory is no concern unless you restrict or even disable the operating system swap file. 1.5 GB of RAM is enough.

    I have little reason to believe that the system isn’t stable; it’s been used only about once a month for the last few years. That said, I haven’t run a memory test on it in a long time, the CMOS battery seems to have run flat lately, and if I tackle that I might as well do something about the old 11g WLAN and limited RAM, not to mention the slow hard drive… assuming all that is even worth it.

    Anyway, it’s 32-bit Vista, the system where no application seems to have much luck temporarily disabling the screensaver (DisplayCAL included). You always have to do that by hand. (Not sure whether Aero makes any difference, it’s generally off in this case.)

    I’m not sure whether it’s because I’m using adaptive mode, but the instrument quite regularly seems to be initialized twice within a short period of time. That in turn means I also have to take off the i1 Pro and put it on the calibration tile twice, which is aggravating.

    This happens due to automatic output levels detection. You can turn that off in the advanced options. For colorimeters it is not problematic though, just for spectrometers in case “Allow skipping of spectrometer calibration” is not enabled (but you shouldn’t use a spectrometer for prolonged measurement sessions anyway if you can avoid it, as low light accuracy and drift are a real problem).

    I know… the i1 Pro in particular seems to be rather notorious for this, and the Pro 2 is supposed to be much improved. I figured I’d mainly be using mine for making colorimeter corrections, which seems to work out quite well. I sort of doubt the accuracy of its brightness reading though; it’s about 10% lower than the i1 DisplayPro’s, and I can’t quite believe that this one should be that far out. The instrument is 11 years old at this point and could quite possibly use a recalibration.

    Speaking of that dialog, what sort of display technology is “VPA”? 😉 I only know PVA and MVA and derivatives


    It matches X-Rite nomenclature from their EDR files for technical reasons.

    That’s potentially confusing though. I’ve never, ever seen that sort of terminology anywhere else, and I bought my first TFT with a PVA panel way back in April 2003. (It still lives, actually, though calibration was direly needed. Monitors with proper color management seem to age a lot more gracefully, as proven by an Eizo L795 that basically needs little more than increasing color temperature to hit an acceptable white point again. I’ve seen gamma drift quite severely when warming up though, starting at 2.48 and ultimately reaching 2.26-ish. I had noticed that my old monitor was quite dark initially, too, but always blamed it solely on aged CCFLs.)

    where calibration results in overly low levels on bright blue tones instead

    That’s to be expected if calibration has to reduce the blue channel. It limits the gamut. The solution is to calibrate to the native whitepoint (or something close to it) instead.

    The fun part is that it still happens when the white point actually isn’t so high but the panel really jacks up the blue channel internally. Take the FP992 I attached earlier, for example – it’s set to R=48, G=46, B=50 just to hit a (real) 5850-5900 K white point as the CCFLs have seen better days (stupid design with fixed backlight), so native is rather lower than that (~5500-ish K or so). Yet blue is being pulled way down.

    I’ve seen that a lot in notebook screens, where I can only guess native CCFL white point was lowish to begin with, and the goal was bumping that closer to 6500 K. Unfortunately this seems to have remained in place for first-generation LED-backlit panels (2010-2012-ish), possibly even longer, which is massively stupid as their white point seems to exceed 7000 K more often than not. Reviews for these panels generally state Delta Es in the 11-12 area (vs. D65 Gamma 2.2). It’s not like TN panels didn’t have any other issues to contend with in the first place…

    It is imaginable though (and in my experience common) that a matrix correction will provide a better match, because the CCSS relies on the accuracy of the instrument filter spectral curve data that is stored in the instrument (which may or may not accurately represent the actual instrument response).

    …and in the case of the i1 DisplayPro, evidence heavily points towards rather limited accuracy of said internal curve (the color temp seems to come out quite consistently higher in any comparison), not to mention non-negligible sample variation.
