dispcalGUI 3.x resets prior calibration on launch (nvidia + ubuntu)


Viewing 15 posts - 1 through 15 (of 17 total)
  • #1242

    finknottle SourceForge
    Member
    • Offline

    System: Thinkpad W530
    Graphics: K2000M
    OS: Ubuntu 14.04

On an Ubuntu machine with an NVIDIA card and the proprietary drivers, new versions of dispcalGUI reset the calibration to default when the application is launched, i.e. the monitor goes back to the uncalibrated state, which I guess is a result of the video card’s LUTs being reset. This didn’t happen with the 2.6 version, which is what I’m using now. I’ve noticed this behaviour with a few other applications too, most notably nvidia-settings, which does the same thing. (Related thread: http://www.nvnews.net/vbulletin/showthread.php?t=125017). However, this works fine with 2.6, which could hopefully help in finding a fix.

    #1243

    Florian Höch
    Administrator
    • Offline

If DCG fails to restore the calibration after checking video LUT access, this should appear in the log. The nvidia-settings issue is a long-standing known problem of the proprietary nVidia driver, and I’ve not found a work-around other than disabling it from running automatically (delete or move the accompanying *.desktop file), remembering to restore calibration (e.g. by running dispcalGUI-apply-profiles) after running it manually, or switching to an OSS nVidia driver like Nouveau.

    #1244

    finknottle SourceForge
    Member
    • Offline

I gave nvidia-settings just as an example, and it looks like you are aware of that one already. The fix for it is understandably outside the scope of this project. (Another application where I noticed this problem is pipelight when it tries to play full-screen videos.)

What I was trying to highlight was that between 2.6 and 3, something changed in the way DCG launches, and it also (unintentionally?) resets the calibration. I’ll be happy to collect logs and compare 2.6 with 3. Is there a quick how-to for getting the relevant logs?

    • This reply was modified on 2015-07-08 02:31:15 by finknottle.
    #1245

    Florian Höch
    Administrator
    • Offline

The log is stored under ~/.local/share/dispcalGUI/logs/dispcalGUI.log and can also be accessed via the “Tools” menu.

There was a change in 3.x regarding the way the videoLUT is handled after checking access to it at program launch (the quick “flash” you should be seeing). In 2.6 and below, the current display profile’s calibration was loaded after checking videoLUT access, which could mask problems with calibration loading and had other issues, e.g. if you had previously loaded a calibration independent of any display profile, it was replaced. In 3.x, the exact videoLUT contents are restored after checking access, so the videoLUT before and after the check should be exactly the same.
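The difference can be sketched with plain Python lists (a hypothetical illustration only; the real application reads and writes the graphics card’s gamma tables through the driver, not lists):

```python
# Illustration of the 2.6 vs. 3.x behavior described above, with made-up values.
profile_cal = [0.0, 0.5, 1.0]      # calibration from the installed display profile
independent_cal = [0.0, 0.4, 0.9]  # a calibration loaded independently of any profile

videolut = list(independent_cal)   # current contents of the video card LUT
snapshot = list(videolut)          # 3.x: snapshot the exact contents first
videolut = [0.0, 1.0, 0.5]         # ...the access check scribbles on the LUT...

# 2.6 behavior: reload the profile's calibration -> the independent cal is lost.
videolut_26 = list(profile_cal)

# 3.x behavior: restore the exact snapshot -> LUT is identical to before the check.
videolut_3x = list(snapshot)

print(videolut_3x == independent_cal)  # True: 3.x preserves whatever was loaded
print(videolut_26 == independent_cal)  # False: 2.6 silently replaced it
```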

    #1246

    finknottle SourceForge
    Member
    • Offline

    So here’s the log from the latest 3.x

    20:27:56,030 dispcalGUI.pyw 3.0.3.0 2015-07-06T23:04:10.092612Z
    20:27:56,033 Linux #47~14.04.1-Ubuntu SMP Fri Apr 10 17:49:16 UTC 2015 (Ubuntu 14.04 trusty)
    20:27:56,033 Python 2.7.6 (default, Jun 22 2015, 17:58:13)
    20:27:56,033 [GCC 4.8.2]
    20:27:56,033 wxPython 2.8.12.1 (gtk2-unicode)
    20:27:56,034 Encoding: UTF-8
    20:27:56,034 File system encoding: UTF-8
    20:27:56,034 Starting up…
    20:27:56,034 Audio module: pygame 1.9.1release
    20:27:56,035 /home/uname/bin/Argyll_V1.7.0/bin
    20:27:56,035 Argyll CMS 1.7.0
    20:27:56,035 Verify: ‘test.cal’ IS loaded (discrepancy 0.0%)
    20:27:56,035 Dispwin: Error – Expect 256 data sets in file ‘current.cal’
    20:27:56,036 Initializing GUI…
    20:27:56,036
    20:27:56,036 …ok.
    20:27:56,036 Ready.
    20:27:56,333 Setting up scripting host at 127.0.0.1:15411
    20:27:56,380 Check for application update…
    20:27:56,714 dispcalGUI is up-to-date.

    And here’s 2.6

    dispcalGUI.pyw 2.6.0.0 2014-11-15T21:17:46.813138Z
    Linux #47~14.04.1-Ubuntu SMP Fri Apr 10 17:49:16 UTC 2015 (Ubuntu 14.04 trusty)
    Python 2.7.6 (default, Jun 22 2015, 17:58:13)
    [GCC 4.8.2]
    wxPython 2.8.12.1 (gtk2-unicode)
    Encoding: UTF-8
    File system encoding: UTF-8
    Starting up…
    /home/uname/bin/Argyll_V1.7.0/bin
    Argyll CMS 1.7.0
    Verify: ‘test.cal’ IS loaded (discrepancy 0.0%)
    Initializing GUI…

    …ok.
    Ready.
    Setting up scripting host at 127.0.0.1:15411
    Check for application update…
    An update is available: 3.0.3.0

The “Dispwin: Error – Expect 256 data sets in file ‘current.cal’” seems to be the main difference. Any idea what is causing this?

Unrelated: it seems like zeroinstall doesn’t fetch Argyll anymore? I’m using the latest 1.7 from the website right now, in case that’s relevant.

    #1247

    Florian Höch
    Administrator
    • Offline

The “Dispwin: Error – Expect 256 data sets in file ‘current.cal’” seems to be the main difference. Any idea what is causing this?

Hmm, I can’t reproduce this on any of my Linux systems, so it might be driver- or hardware-related. Annoyingly, I actually worked around a similar problem on Mac OS X, but as I didn’t observe the behavior anywhere else, the workaround currently doesn’t cover Linux, which is a bummer.

    Can you run

    dispwin -s current.cal

    and attach the cal file? Thanks.
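Incidentally, the saved file is plain text: Argyll .cal files use the CGATS format, so the entry count that dispwin checks can be read straight from the NUMBER_OF_SETS keyword. A minimal sketch (the sample header and the simplistic parsing are illustrative, not a full CGATS parser):

```python
# Read the per-channel entry count from an Argyll .cal file (CGATS text format).
# A 256-entry file loads fine; more entries trigger the dispwin error quoted above.
def cal_entry_count(text):
    for line in text.splitlines():
        if line.startswith("NUMBER_OF_SETS"):
            return int(line.split()[1])
    return None  # keyword not found

# Abbreviated, hypothetical .cal contents for illustration:
sample = """CAL
DEVICE_CLASS "DISPLAY"
NUMBER_OF_FIELDS 4
BEGIN_DATA_FORMAT
RGB_I RGB_R RGB_G RGB_B
END_DATA_FORMAT
NUMBER_OF_SETS 2048
BEGIN_DATA
0.0 0.0 0.0 0.0
END_DATA
"""
print(cal_entry_count(sample))  # 2048
```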

    #1248

    Florian Höch
    Administrator
    • Offline

Unrelated: it seems like zeroinstall doesn’t fetch Argyll anymore?

The Argyll CMS funding situation is not too great atm, unfortunately, so I cannot add it back to the feed in good conscience, as I fear this may negatively affect the donations Argyll CMS receives directly (because people would never have to go to the Argyll website, and I’m not sure DCG donations would make up for the shortfall). If the situation improves, I’ll reconsider adding it back to the feed.

    I’m using the latest 1.7 from the website right now in case that’s relevant

That’s absolutely fine. The previous 1.6.3 feed just linked to the original binaries from argyllcms.com.

    #1249

    finknottle SourceForge
    Member
    • Offline

    File attached.

Do you have any Linux system with the NVIDIA drivers? I suspect they might have something to do with this.

    #1251

    Florian Höch
    Administrator
    • Offline

Do you have any Linux system with the NVIDIA drivers?

Yes, my Ubuntu system. What graphics card do you have? It seems from the cal file that it has 2048 videoLUT entries per channel, which would indicate 11-bit videoLUTs (2^11 = 2048).

    #1252

    finknottle SourceForge
    Member
    • Offline

Thanks for following up. I have a Quadro K2000M in a ThinkPad W530. I’m sorry, I am a novice at this. What’s the implication of 11-bit LUTs?

    #1253

    Florian Höch
    Administrator
    • Offline

Thanks for following up. I have a Quadro K2000M in a ThinkPad W530

OK, that would explain it: the Quadro cards are known for having > 8 bit videoLUTs. I was unaware that the graphics driver exposes this, though (at least under Linux), because on other systems (e.g. Windows) the APIs still return 256 entries, just with higher precision.

I’ve worked around this now in the same way as under Mac OS X, i.e. when saving, the videoLUT will be interpolated down to 256 entries so that dispwin doesn’t complain when loading it back in. I’ve added DCG 3.0.3.1 to the Linux 0install feed (testing) so you can try it out. You should see a message “VideoLUT has 2048 entries, interpolating to 256” in the logs.

What’s the implication of 11-bit LUTs?

It’s actually good for calibration, as greater-than-8-bit LUTs mean higher precision. But it also depends on the display and connection, so the full benefit can only be had with a greater-than-8-bit capable display over a DisplayPort connection.

    • This reply was modified on 2015-07-09 15:29:58 by fhoech.
    #1254

    finknottle SourceForge
    Member
    • Offline

Thanks! I just tested the latest testing release, and it works. I see the message “VideoLUT has 2048 entries, interpolating to 256” in the logs. To understand this better, was the error caused by a mismatch between written and read values? So was it always writing 256 entries, but when it tried to read, it got 2048? Has DCG always worked in 8 bits?

Btw, I do have a 10-bit monitor, which is described as 8-bit + FRC (10-bit), i.e. it sort of simulates 10 bit. Would it benefit from this card? Or not right now, since everything is 8 bits in DCG/Argyll?

    #1255

    Florian Höch
    Administrator
    • Offline

    Thanks! I just tested the latest testing release, and it works.

    Good, thanks for testing.

I see the message “VideoLUT has 2048 entries, interpolating to 256” in the logs. To understand this better, was the error caused by a mismatch between written and read values? So was it always writing 256 entries, but when it tried to read, it got 2048? Has DCG always worked in 8 bits?

dispwin always writes floats when saving the calibration; it was the mismatch in the number of entries in the cal file when reading it back in that caused the problem. I.e., the current version of dispwin can only load cal files with 256 entries, and I worked around that limitation by interpolating down to 256 entries when the cal file is written. So it only indirectly has to do with the bit depth of the videoLUT.
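The workaround boils down to a linear downsampling interpolation per channel. A rough sketch of the idea (not the actual dispcalGUI code):

```python
def downsample_lut(values, target=256):
    """Linearly interpolate a per-channel videoLUT down to `target` entries."""
    n = len(values)
    if n == target:
        return list(values)
    out = []
    for i in range(target):
        # Map position i in the target ramp onto the source ramp.
        pos = i * (n - 1) / (target - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(values[lo] * (1 - frac) + values[hi] * frac)
    return out

# A 2048-entry linear ramp downsamples to a 256-entry linear ramp.
lut_2048 = [i / 2047.0 for i in range(2048)]
lut_256 = downsample_lut(lut_2048)
print(len(lut_256))             # 256
print(lut_256[0], lut_256[-1])  # 0.0 1.0
```

The endpoints are preserved exactly, which matters for a calibration curve (black and white points must not shift when the LUT is resampled).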

Btw, I do have a 10-bit monitor, which is described as 8-bit + FRC (10-bit), i.e. it sort of simulates 10 bit. Would it benefit from this card?

Even a lower bit depth monitor should benefit somewhat; in that case, dithering in the graphics card will smooth out quantization errors. I think under Linux the nVidia control panel even lets you choose a dithering method (or turn it off, although I wouldn’t recommend that).

    #1256

    finknottle SourceForge
    Member
    • Offline

    Thanks once again. I’m glad it’s fixed.

    #1257

    eladrin SourceForge
    Member
    • Offline

I am using the same laptop under Windows 8.1, which has a wide gamut screen. When I run an uncalibrated report, the log says “Effective Video LUT entry depth seems to be 8 bits”. Isn’t it supposed to say 11 bits, since it is a Quadro card? Have I done something wrong that prevents me from using the card’s full potential in calibration? Thanks. (I’ve disabled Optimus in the BIOS and am running only on the discrete card.)

    • This reply was modified on 2015-10-03 01:18:39 by eladrin.

