Dell U3223QE / Gamut Coverage measurement differs wildly in Linux vs Mac

  • #140594

    Ariel
    Participant

    Trying to profile my new Dell U3223QE monitor with DisplayCAL, mainly for Linux/darktable use.

    Per the specs, this monitor has a wide-gamut panel covering 98% of DCI-P3 and 100% of sRGB.

    Using a Calibrite ColorChecker Display Plus (new), with the “Spectral: LCD PFS Phosphor WLED Family” correction. Settings: 6500K, 120 cd/m², Calibration Speed = High.

    The Linux PC has an NVIDIA 1650 PCIe card and connects to the monitor via DisplayPort. I am using the NVIDIA proprietary drivers (535).

    Running Ubuntu 22.04, using DisplayCAL via Flatpak (3.8.9.3), I get this:

    GAMUT_coverage(srgb) 0.9654
    GAMUT_coverage(dci-p3) 0.8665
    GAMUT_coverage(adobe-rgb) 0.7835
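
    (For context, these coverage figures are the fraction of each reference gamut that the display’s measured gamut covers. A rough way to see the idea is to intersect the primaries’ triangles in CIE xy chromaticity space; the Python sketch below does that, assuming the third-party shapely package and made-up display primaries. It is only a 2D approximation (DisplayCAL/ArgyllCMS compute coverage from the full 3D gamut), so real numbers will differ somewhat.)

    # Rough 2D approximation of gamut coverage from the primaries in CIE xy chromaticity space.
    # Assumes the third-party shapely package; the "display" primaries are invented example
    # values, not actual measurements of this monitor.
    from shapely.geometry import Polygon

    srgb    = Polygon([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)])
    dci_p3  = Polygon([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)])
    display = Polygon([(0.672, 0.316), (0.272, 0.662), (0.151, 0.055)])  # hypothetical

    for name, ref in (("srgb", srgb), ("dci-p3", dci_p3)):
        coverage = display.intersection(ref).area / ref.area
        print(f"GAMUT_coverage({name}) ~ {coverage:.4f}")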

    I tried running DisplayCAL with GNOME’s color management on and off, I tried building DisplayCAL 3.9.11 locally (successfully), and I tried upgrading ArgyllCMS to 2.3.1 (Ubuntu 22.04’s default is 2.2.1). I tried High, Medium and Low speeds. No matter what, DCI-P3 coverage comes out in the 86-87% range and sRGB in the 95-96% range.

    I found this very weird, so I installed DisplayCAL 3.9.11 on my MacBook via brew (Sonoma 14.3, MBP 2020, connected to the monitor via USB-C). I used the exact same correction file and ran it several times, at High and Medium speeds.

    Every time, on the Mac, DisplayCAL gives me DCI-P3 coverage in the 97.1-97.5% range and 100% sRGB. Pretty much as expected.

    In both scenarios the monitor is set to Custom Color, and I successfully adjust color temperature and brightness in the OSD, ending up with the same values (brightness and R/G/B levels) on Linux and Mac.

    In Linux, the nvidia-settings tool shows Color Space: RGB, Color Range: Full, and dithering disabled. Nothing appears to be wrong.

    Question: am I doing anything fundamentally wrong in my Linux setup? I’ve run out of ideas on what to try :/


    #140599

    Vincent
    Participant

    It looks like either:

    -you were using the sRGB/Rec. 709 mode / OSD preset on the Linux PC, while on the MacBook you were using the Standard/Custom Color mode, or something like that,

    or

    -your Linux distribution has some kind of EDID-derived sRGB simulation running on top of the desktop. I don’t know the Ubuntu configuration.

    #140604

    Ariel
    Participant

    @Vincent that was close. I was indeed disabling color management (in GNOME Settings) for the monitor prior to running DisplayCAL, and I thought that was enough (as I could see the wallpaper’s colors change when toggling the switch).

    I found by chance that I needed to restart the system after disabling color management. If I run DisplayCAL after the restart, I get what I was expecting:

    GAMUT_coverage(dci-p3) 0.9683
    GAMUT_coverage(srgb) 0.9999
    GAMUT_coverage(adobe-rgb) 0.8743

    Very close to the results I got with the Mac on the same screen.

    To confirm the issue, I re-enabled color management, restarted the machine, ran DisplayCAL, and the problem reappeared. Did the reverse and the problem was gone.

    Lesson learned: system color management needs to be disabled before running DisplayCAL, and a restart is needed after disabling it.

    Also confirmed that the Flatpak version works fine (as does the new Python 3 version, 3.9.x). The only issue with the Flatpak is that it can’t install the resulting profile; you have to extract it from the folder where it’s stored and manually add it in GNOME Settings. Surely a Flatpak permission issue.
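
    In case it helps anyone else, here is a minimal sketch of the manual workaround, assuming you have already located the .icc file inside the Flatpak’s storage directory (the source path below is just a placeholder): copying it into colord’s per-user profile directory makes it selectable in GNOME Settings → Color. Registering it with colord’s command-line client (colormgr import-profile) should work as well, if I’m not mistaken.

    # Minimal sketch: copy the generated profile to where colord / GNOME Settings can find it.
    # The source path is a placeholder; locate the real .icc inside the Flatpak's data folder.
    import shutil
    from pathlib import Path

    src = Path("/path/to/U3223QE_profile.icc")          # placeholder path
    dst_dir = Path.home() / ".local" / "share" / "icc"  # per-user directory scanned by colord
    dst_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst_dir / src.name)
    # The profile should then appear under GNOME Settings -> Color -> Add profile,
    # where it can be assigned to the display and enabled.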

    FWIW, I also tried enabling dithering on the NVIDIA card (using nvidia-settings, dynamic 2×2); enabling/disabling it didn’t make a difference in the calibration results. I read, though, that dithering helps prevent banding issues post-calibration, so I left it enabled.

    #140607

    Vincent
    Participant

    Lesson learned: system color management needs to be disabled before running DisplayCAL, and a restart is needed after disabling it.

    That points to “wrong color management” on that Linux distribution, although it would more accurately be called sRGB simulation.

    You can try to test it by displaying color-managed P3 patches, for example by running a basic profile verification with no simulation profile:

    -If that feature labeled “Color management” in that Ubuntu distribution is actually system-wide sRGB simulation, verification will go wrong: the lower a*b* plot will show desaturation (white holes at ~P3 positions, colored dots at ~sRGB positions). That means it is not a color-managed desktop, it is a simulation of sRGB.

    -Otherwise, if verification shows that all is OK and the lower a*b* plot looks P3-like for both white holes and colored dots, it means the desktop is color managed but “untagged RGB patches” get rendered as sRGB. If that happens, it may be interesting to send your findings to Graeme Gill on the ArgyllCMS mailing list, because it may be correctable in some way. For example, the macOS desktop is color managed, but ArgyllCMS color patches can be sent to the display non-color-managed.

    #140620

    Ariel
    Participant

    Good point! I will try the verification soon.

    Indeed, the default color management profile in Ubuntu is sRGB derived from the EDID; it’s the “automatic” profile in the attachment. My take is that the toggle does not take effect for windows that are already open (DisplayCAL in my countless tests), but I need to prove this. In any case, a restart after the setting change is the safest route.
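
    (For anyone who wants to check which profile colord currently has assigned to the display, e.g. whether it is still the EDID-derived automatic one, colord’s command-line client can list it. A small sketch, assuming the colormgr tool is installed, simply wrapping the call from Python:)

    # Sketch: ask colord which profiles are assigned to each display device.
    # Requires the colord client tools (the colormgr command).
    import subprocess

    result = subprocess.run(["colormgr", "get-devices"],
                            capture_output=True, text=True, check=True)
    print(result.stdout)  # display devices should be listed along with their assigned profiles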

    BTW, FWIW, I also profiled the monitor using my 10-year-old ColorMunki Photo, and the result was very close to the new ColorChecker Display Plus (using Calibrite’s generic PFS Phosphor Family correction): about 1% smaller volume with the spectrometer, and a similarly smaller gamut. Then for fun I created a correction matrix (CCMX) for the ColorChecker Display using the spectro, and that got even closer to the first run with the generic correction, but still with slightly smaller volume. I assume that the “best profile” would be the one with the largest volume & gamut.
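
    (For anyone unfamiliar with what the correction matrix does: a CCMX is essentially a 3×3 matrix that maps the colorimeter’s raw XYZ readings onto the XYZ values the reference spectrometer measured on this particular display. A conceptual sketch with invented numbers, not values from my instruments:)

    # Conceptual sketch of applying a CCMX-style correction: corrected = M @ raw.
    # The matrix and the reading are invented example values, for illustration only.
    import numpy as np

    ccmx = np.array([
        [ 1.012, -0.008,  0.003],
        [ 0.004,  0.996, -0.002],
        [-0.001,  0.005,  1.021],
    ])

    raw_xyz = np.array([95.3, 100.0, 108.9])   # colorimeter's uncorrected XYZ reading
    corrected_xyz = ccmx @ raw_xyz             # approximates what the reference spectro would read
    print(corrected_xyz)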

    #140622

    Old Man
    Participant

    No, the best profile is the one that is most accurate, so probably the one where you created your own spectro-based correction for your colorimeter. Always verify, too.

    #140631

    Ariel
    Participant

    Finally, I completed the verification checks for both profiles, using the Large testchart (328 patches):

    Colorimeter with CCMX generated with colormunki photo spectro

    Criteria | Nominal | Recommended | # | Actual | Result
    Measured vs. assumed target whitepoint ΔE*00 | <= 2 | <= 1 | | 0.39 | OK ✔✔
    Measured vs. display profile whitepoint ΔE*00 | | <= 1 | | 0.05 |
    Average ΔE*00 | <= 1.5 | <= 1 | | 0.2 | OK ✔✔
    Maximum ΔE*00 | <= 4 | <= 3 | 023 | 1.02 | OK ✔✔

    ✔ Nominal tolerance passed
    ✔ Recommended tolerance passed

    ————————————————————–
    Colorimeter with generic PFS Phosphor CCSS

    Criteria | Nominal | Recommended | # | Actual | Result
    Measured vs. assumed target whitepoint ΔE*00 | <= 2 | <= 1 | | 1.14 | OK ✔
    Measured vs. display profile whitepoint ΔE*00 | | <= 1 | | 1.55 |
    Average ΔE*00 | <= 1.5 | <= 1 | | 0.23 | OK ✔✔
    Maximum ΔE*00 | <= 4 | <= 3 | 297 | 0.85 | OK ✔✔

    ✔ Nominal tolerance passed

    ————————————————-

    So they were pretty close, but the colorimeter profile created using the CCMX made with my 8-year-old ColorMunki Photo spectro passed with a better average ΔE and met both the nominal and recommended tolerances. The one with the generic CCSS did not meet the recommended tolerance.
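
    (As a side note, individual ΔE*00 values like the ones in these reports can be cross-checked outside DisplayCAL. A small sketch, assuming the third-party colour-science Python package; the Lab values are made-up examples, not taken from my reports:)

    # Sanity-check a single dE*00 value, assuming the colour-science package
    # (pip install colour-science). The Lab values below are invented examples.
    import numpy as np
    import colour

    lab_measured  = np.array([65.20, 18.40, -12.10])  # hypothetical measured patch
    lab_predicted = np.array([65.05, 18.10, -12.45])  # hypothetical profile prediction

    de00 = colour.delta_E(lab_measured, lab_predicted, method="CIE 2000")
    print(f"dE*00 = {de00:.2f}")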

    P.S. In order to do the calibration, I had to disable color management in the OS and restart the machine. Doing the verification without the restart gave totally wrong results (nothing passed).

    So now I know which ICC to use. Thanks for all the advice!

    #140646

    Vincent
    Participant

    The one with the smoothest grey in a non-color-managed environment. Test visually.
    I assume that the CCMX was made at high-res 3nm resolution.
    Also, instead of using the generic CCSS correction, you can make your own CCSS & compare, but it is likely to end in the same result. Test visually with a non-color-managed gradient, or cross-test the white point to check whether both whitepoints are very close.

    #140671

    Ariel
    Participant

    The ColorMunki Photo does not appear to work in HiRes mode with DisplayCAL, unfortunately; there seems to be a known issue:

    https://hub.displaycal.net/forums/topic/colormunki-photo-cannot-read-luminance-correctly/

    With HiRes (adaptive or not) active, I cannot even pass the first calibration step. I’m guessing that using normal resolution impacts the CCSS generation more than the CCMX. In any case, I got very good results in the verification (using the CCMX correction).
