DisplayCAL madVR preset / Windows 7 Photo Viewer / general questions

    #5056

    Hannes
    Participant

    @Florian

    First of all I have to thank you for this great piece of software; it made my life a bit more (rightly) colorful 😉
    At the moment I have a calibrated desktop (1D LUT), color-managed browsing in Firefox using an XYZ LUT profile, and perfectly calibrated video playback through madVR with a 3DLUT.
    All of this is possible thanks to this wonderful tool. Many thanks.
    I have some questions / remarks / suggestions concerning the calibration/profiling process:

    1. I used the madVR 3DLUT preset in DisplayCAL for managing my setup. The 3DLUT worked great, but the profile didn't turn out as intended and gave horrible picture quality.
    I found out that the "use low quality PCS to device table" option was the culprit. Of course it was my fault for not reading the instructions carefully enough, but I still think disabling this option would be a more reasonable default.
    I suspect more people will use the madVR preset to generate a 3DLUT along with an installed ICC profile than will use a 3DLUT alone. Disabling the option would mean a longer calculation time (unnecessary for 3DLUT-only use), but it wouldn't harm the process.
    In my case it was harmful 😉

    2. I found out that the Windows 7 photo viewer doesn't support XYZ LUT profiles (I used an XYZ LUT + swapped matrix profile, so I could see that the wrong matrix was being used). As an expert on this topic, do you know a freeware photo viewer that works correctly with LUT-based color profiles?
    The net is full of discussions about ICC v2 and v4 (I know that ArgyllCMS and DisplayCAL only use v2), but there isn't much discussion about LUT profile support. Maybe you could give me a hint about that…

    3. I use a colorimeter (ColorMunki Display) and calibrate/profile a wide-gamut TFT monitor in contact mode. I let the display warm up for at least 2 hours. The colorimeter is stored at room temperature (20 °C). Now two different approaches:
    setting A: the colorimeter is placed on the monitor during the warm-up time
    setting B: the colorimeter is NOT placed on the monitor during the warm-up time

    When making a measurement report afterwards, setting B delivers a gamma curve that is shifted in parallel relative to the one from setting A. Setting A delivers an average gamma of e.g. 2.3; setting B then delivers the same gamma curve shifted down in parallel to an average gamma of about 2.18.
    That is not very much, but considering the whole debate about whether the perfect gamma is 2.2 or 2.4, it is already about half of that range we're talking about.
    Of course I use the same setting for both the calibration process itself and the measurement report afterwards, and I always use setting A.
    But I also have a projector, which of course I have to calibrate in non-contact mode.
    So here is a question that can maybe be answered from practical experience: what should be considered the "right" value? Or, asked another way: what is the normal operating temperature for a colorimeter? Contact mode probably leads to temperatures 5–10 °C higher than non-contact mode.
    Assuming a difference of 5–10 °C, it is remarkable that this already leads to gamma shifts of more than 0.1. Or, put differently: the temperature compensation of my ColorMunki Display is worse than I assumed 😉

    4. A general approach question:
    Would there be some benefit in a different pre-calibration approach than the one you suggest? The interactive adjustment before the actual calibration concentrates on the white point, so that ideally 100 IRE has 0 dE.
    What if I took the whole grayscale into account and aimed for the lowest average dE? I mean, like looking at the RGB levels graph in HCFR and setting the monitor's RGB gains so that I get a low AVERAGE dE rather than a low dE at 100 IRE only (see the small sketch at the end of this post). My display doesn't behave very uniformly, so when I fix 100% white I get worse results for the rest of the scale.
    On the other hand, I observed this: when I calibrate and profile with a perfect 100 IRE beforehand, I have no white clipping after calibrating/profiling; with a NOT perfect 100 IRE I get white clipping (flashing bars above 235 with the AVS HD test disc)…
    What's your opinion on that?
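    As an aside, the "average dE over the grayscale vs. dE at 100% white only" idea can be made concrete with a small sketch. The Lab numbers below are made up, the targets are simplified to a perfectly neutral gray at the same lightness, and real tools use more elaborate dE formulas and white-point handling, so this only illustrates the trade-off:

    ```python
    import numpy as np

    def delta_e_76(lab1, lab2):
        """Plain CIE76 color difference between two Lab triplets."""
        return float(np.linalg.norm(np.asarray(lab1, float) - np.asarray(lab2, float)))

    # Made-up grayscale measurements (L*, a*, b*) at 10%..100% stimulus.
    measured = [
        (10.2, 0.4, -1.1), (20.5, 0.9, -0.8), (30.9, 1.2, -0.5),
        (41.0, 1.5, -0.2), (50.8, 1.7,  0.1), (60.5, 1.4,  0.3),
        (70.3, 1.0,  0.6), (80.1, 0.6,  0.8), (90.0, 0.3,  0.9),
        (100.0, 0.2,  0.5),
    ]
    # Simplified targets: a perfectly neutral gray at the same lightness.
    targets = [(L, 0.0, 0.0) for L, _, _ in measured]

    errors = [delta_e_76(m, t) for m, t in zip(measured, targets)]
    print(f"dE at 100% white:              {errors[-1]:.2f}")
    print(f"average dE over the grayscale: {np.mean(errors):.2f}")
    # Gains tuned for the lowest average error can leave 100% white slightly off,
    # which is exactly the trade-off raised in point 4.
    ```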


    #5062

    Florian Höch
    Administrator

    I used the madVR 3DLUT preset in DisplayCAL for managing my setup. The 3DLUT worked great, but the profile didn't turn out as intended and gave horrible picture quality.
    I found out that the "use low quality PCS to device table" option was the culprit.

    How did you install the profile? DisplayCAL will not let you install a profile with low quality B2A tables and will automatically offer to create high quality tables instead (to get the profile installation dialog when you used the madVR preset, you have to select your actual display device under “Display & instrument”).

    I found out that the Windows 7 photo viewer doesn't support XYZ LUT profiles

    True. Windows itself doesn’t fully support display color management unfortunately, and in Windows 10 (probably also 8/8.1), the photo app doesn’t even have the rudimentary support that was there in Windows 7. I would advise switching to a different image viewer (personally I use XnView MP, but you need to enable color management in its options and it doesn’t have multi-monitor color management support).

    When making a measurement report afterwards, setting B delivers a gamma curve that is shifted in parallel relative to the one from setting A. Setting A delivers an average gamma of e.g. 2.3; setting B then delivers the same gamma curve shifted down in parallel to an average gamma of about 2.18.

    That’s a curious difference, and the only thing I can assume is that it’s due to pressure on the panel (from the instrument) or possibly a “hot spot” under the area the instrument covers, because the ColorMunki Display (like most colorimeters) isn’t really susceptible to being affected by heat (at least not in the quantity that monitors produce).

    The interactive adjustment before the actual calibration concentrates on the white point, so that ideally 100 IRE has 0 dE.

    No, the whitepoint sets the target, and then the whole grayscale is made to match said target.

    On the other hand, I observed this: when I calibrate and profile with a perfect 100 IRE beforehand, I have no white clipping after calibrating/profiling; with a NOT perfect 100 IRE I get white clipping (flashing bars above 235 with the AVS HD test disc)…
    What's your opinion on that?

    That’s to be expected: When you adjust the whitepoint to match the target, the target white is guaranteed to lie within gamut. If you don’t, the target white may be out-of-gamut with respect to the actual white of the display. The way to avoid clipping with a 3D LUT in that case would be to either use “Absolute colorimetric with whitepoint scaling” (the default) which scales the white down so it lies within gamut, or to simply ignore the target white and use the actual display white by choosing the “relative colorimetric” rendering intent.
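    A rough way to picture the "whitepoint scaling" part of that intent: convert the target white into the display's RGB space; if any channel would have to exceed 1.0, the target is out of gamut, and scaling the whole white down by that excess brings it back inside instead of clipping. The sketch below uses the standard sRGB matrix purely as a stand-in for the display's forward model; it illustrates the idea only and is not DisplayCAL's actual implementation.

    ```python
    import numpy as np

    # Standard sRGB XYZ -> linear RGB matrix, used here only as a stand-in for
    # "the display's forward model"; a real profile would supply its own.
    XYZ_TO_RGB = np.array([
        [ 3.2406, -1.5372, -0.4986],
        [-0.9689,  1.8758,  0.0415],
        [ 0.0557, -0.2040,  1.0570],
    ])

    def scale_white_into_gamut(target_white_xyz):
        """Return the target white scaled down (if needed) so it fits inside the gamut."""
        xyz = np.asarray(target_white_xyz, dtype=float)
        rgb = XYZ_TO_RGB @ xyz              # what the display would have to produce
        scale = min(1.0, 1.0 / rgb.max())   # any channel above 1.0 means out of gamut
        return xyz * scale, scale

    # Example: a target white that asks for ~3% more than this display can deliver.
    target = np.array([0.9505, 1.0000, 1.0890]) * 1.03
    scaled, factor = scale_white_into_gamut(target)
    print(f"white scaled by {factor:.3f} to stay inside the gamut")
    ```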

    #5074

    Hannes
    Participant

    How did you install the profile? DisplayCAL will not let you install a profile with low quality B2A tables and will automatically offer to create high quality tables instead (to get the profile installation dialog when you used the madVR preset, you have to select your actual display device under “Display & instrument”).

    Oops, so I bypassed the intended workflow… I used the madVR preset, selected madVR as the display device, changed the calibration tone curve and started the whole process. At the end I was asked to install the 3DLUT, which I confirmed, and I wondered why I wasn't asked to install the profile. Logically (at least in my world 😉) I went to DisplayCAL's storage folder, took the ICC profile I found there, installed it with Windows' color management, and finally selected it with DisplayCAL's profile loader. The mountain didn't come to the prophet, so the prophet went to the mountain… 🙂

    I would advise switching to a different image viewer (personally I use XnView MP, but you need to enable color management in its options and it doesn’t have multi-monitor color management support).

    Many thanks for that hint. I've installed the viewer and found the corresponding options; everything seems to work!

    That’s a curious difference, and the only thing I can assume is that it’s due to pressure on the panel (from the instrument) or possibly a “hot spot” under the area the instrument covers, because the ColorMunki Display (like most colorimeters) isn’t really susceptible to being affected by heat (at least not in the quantity that monitors produce).

    Ah, I hadn't thought in that direction, but it sounds absolutely logical to me. When pressing on the monitor's surface with a finger, there obviously is a large change in brightness in that area.

    My monitor doesn't stand absolutely vertical, so there is a little pressure from the colorimeter during measurement. Maybe I will rest the colorimeter on some books and keep it a few millimeters away from the monitor to eliminate these effects.

    And when measuring my projector there is of course no such effect.

    That’s to be expected: When you adjust the whitepoint to match the target, the target white is guaranteed to lie within gamut. If you don’t, the target white may be out-of-gamut with respect to the actual white of the display. The way to avoid clipping with a 3D LUT in that case would be to either use “Absolute colorimetric with whitepoint scaling” (the default) which scales the white down so it lies within gamut, or to simply ignore the target white and use the actual display white by choosing the “relative colorimetric” rendering intent.

    Interesting. In both cases (one with the RGB gains set for perfect white, the other for balanced dE) I didn't change the defaults, so “absolute colorimetric with whitepoint scaling” was used… With the 3DLUT from the perfect-white pre-calibration no bars above 235 were visible, while with the 3DLUT from the other case bars up to around 249 were visible…

    What is also interesting to me: both 3DLUTs show clipping in the AVS “Color clipping” test clip, while without a 3DLUT no clipping is visible. With the 3DLUT from the correct-white pre-calibration I get color clipping in this test video (red up to 239, green up to 241, blue up to 251) but NO clipping in the “White clipping” test video. There seem to be some interactions here I don't quite understand…

    #5216

    Hannes
    Participant

    @florian

    Many thanks for your help so far. At the moment I am calibrating my DLP projector and have a few questions:

    1. For a 3DLUT, what is the best choice for the rendering intent in your opinion, when, like in my case, the gamut triangle is tilted? I mean: For certain colors the gamut of my projector is too small, for other colors it is too big. Maybe perceptual?
    2. I'm creating an Excel sheet at the moment to visualize different gamma curves. I want to plot several theoretical gamma curves together with my measured curve and then play around with the input/output offset. I have already implemented curves for 100% input offset (Rec. 1886) and 100% output offset, and everything seems to work when I enter black level, white level and the desired gamma (I use relative gamma).

    Now a quote from your online documentation: “A subtlety is to provide a split between how much of the offset is accounted for as input to the ideal response curve, and how much is accounted for at the output, where the degree is 0.0 accounts for it all as input offset, and 100% accounts for all of it as output offset.” It is good to have the possibility in DisplayCAL to mix the two curves, but I am not sure how the mix is achieved. I have compared my 100% input and 100% output values with HCFR's and they are correct. Then I tried the following to get the mix: I simply weighted the two “extreme” curves, so 30% output offset means in my calculation: mix curve = 0.3 * (100% output offset curve) + 0.7 * (100% input offset curve). When I compare the results with HCFR, I get different numbers.

    I am asking you because you also use the concept of mixing the two curves in DisplayCAL, and zoyd from HCFR isn't very active at the moment. I don't use HCFR for calibration (I use DisplayCAL for that 🙂 ); I only use it for checking results and for experiments. I'm aware that HCFR works with the percentage of input offset while you work with the percentage of output offset; that's not the problem. My question is whether you use the same “mix concept” as HCFR, and what that concept is. I don't understand what “how much of the offset is accounted for as input to the ideal response curve, and how much is accounted for at the output” means, if NOT weighting the two extreme curves accordingly.

    #5257

    Florian Höch
    Administrator

    For a 3DLUT, what is the best choice for the rendering intent in your opinion, when, like in my case, the gamut triangle is tilted? I mean: For certain colors the gamut of my projector is too small, for other colors it is too big. Maybe perceptual?

    When the gamut sizes and shapes are not too dissimilar, my general recommendation would be one of the colorimetric intents (“abs. col. with whitepoint scaling” or “rel. col.”). Perceptual may be worth a try if you really need gamut compression due to larger differences between gamuts.

    But I am not sure how the mix is achieved. […] Then I tried the following to get the mix: I simply weighted the two “extreme” curves, so 30% output offset means in my calculation: mix curve = 0.3 * (100% output offset curve) + 0.7 * (100% input offset curve)

    The output offset determines how much of the black level is accounted for at the input and output of the function. You can look at the Argyll or HCFR source to see how it’s implemented.
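    For reference, one plausible way to express such a split as a formula is sketched below; this is an assumption for illustration, not a transcription of the Argyll or HCFR code. A fraction of the black level is added after the power curve as output offset, and the remainder is folded in as a BT.1886-style input offset. Note that dispcal can additionally treat the entered gamma as an effective gamma anchored at 50% stimulus, so the numbers will not match exactly; the main point is that such a split is generally not the same curve as a linear blend of the two extremes.

    ```python
    import numpy as np

    def gamma_curve(x, gamma=2.4, w=100.0, b=0.1, out_frac=1.0):
        """
        Target luminance for a normalized stimulus x in [0, 1].

        out_frac = 1.0 -> the whole black level is added at the output:
                          Y = b + (w - b) * x**gamma
        out_frac = 0.0 -> the whole black level is an input offset (BT.1886-style):
                          Y = w * ((1 - (b/w)**(1/gamma)) * x + (b/w)**(1/gamma))**gamma
        Values in between split the black level between the two mechanisms.
        """
        x = np.asarray(x, dtype=float)
        b_out = out_frac * b                  # part of the black level added after the curve
        span = w - b_out
        b_in = (b - b_out) / span             # remaining black level, folded in as input offset
        seed = b_in ** (1.0 / gamma)
        return b_out + span * ((1.0 - seed) * x + seed) ** gamma

    x = np.linspace(0.0, 1.0, 11)
    for f in (0.0, 0.3, 1.0):
        print(f"output offset {f:.0%}:", np.round(gamma_curve(x, out_frac=f), 3))
    ```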

    #5298

    Hannes
    Participant

    When the gamut sizes and shapes are not too dissimilar, my general recommendation would be one of the colorimetric intents (“abs. col. with whitepoint scaling” or “rel. col.”). Perceptual may be worth a try if you really need gamut compression due to larger differences between gamuts.

    OK, I will try that. 3DLUT creation doesn't take much time (once calibration and profiling are done), so maybe I will create different 3DLUTs and compare the results. I was asking out of theoretical interest.

    The output offset determines how much of the black level is accounted for at the input and output of the function. You can look at the Argyll or HCFR source to see how it’s implemented.

    Hmpf 😉 I had hoped there would be an easy mathematical answer with fairly simple formulas. I can do some C/C++/C# coding, but I'm not a real programmer, so reading source code I didn't write myself is not easy for me. I took the Argyll code and found the relevant part in \spectro\dispcal.c, probably lines 372 – 469, but I didn't really understand how it works. Maybe the HCFR code is clearer; I haven't looked there yet. It seems much more complicated than simply weighting the two extreme gamma curves (the 100% input and 100% output offset curves) according to the entered output offset percentage.

    For my setup I can surely achieve the right gamma curve with some trial and error, but my idea was to make the process a little more understandable with some visualization: do a grayscale measurement, enter the numbers into an Excel sheet, enter different gammas and output offset percentages, and draw the resulting gamma curves. That way you would see, BEFORE calibration and profiling, the gamma curve that will be applied. Such an Excel sheet could lead more directly to the desired curve, whereas having only numbers to enter (effective gamma and output offset %) before the calibration process can turn into an iterative procedure with several repetitions. Visual support from Excel could help home in on the “right” gamma curve (especially for the low grays / blacks). My Excel sheet is ready for that, but only for the 100% curves…
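    For what it's worth, roughly the same visualization can be done in a few lines of Python instead of Excel. The measured values below are made up and the target curves reuse the simple offset-split formula from the earlier sketch, so treat it as an illustration only:

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    W, B, GAMMA = 100.0, 0.10, 2.2   # hypothetical white/black level (cd/m2) and target gamma

    def target_curve(x, out_frac):
        """Target luminance, splitting the black level between input and output offset."""
        b_out = out_frac * B
        seed = ((B - b_out) / (W - b_out)) ** (1.0 / GAMMA)
        return b_out + (W - b_out) * ((1.0 - seed) * x + seed) ** GAMMA

    # Hypothetical measured grayscale: stimulus 0..1 vs. luminance in cd/m2.
    stim = np.linspace(0.0, 1.0, 11)
    measured = np.array([0.10, 0.55, 2.2, 5.6, 11.0, 19.0, 30.0, 44.0, 61.0, 80.0, 100.0])

    x = np.linspace(0.0, 1.0, 256)
    for frac in (0.0, 0.3, 1.0):
        plt.plot(x, target_curve(x, frac), label=f"target, {frac:.0%} output offset")
    plt.plot(stim, measured, "o", label="measured")
    plt.xlabel("stimulus")
    plt.ylabel("luminance (cd/m²)")
    plt.legend()
    plt.show()
    ```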

    OK, now you could say that I should stop whining and learn to read someone else's source code 😉
