For testing, can I install i1Profiler and DisplayCAL together, also with the loader?


  • #31554

    Prapan Chulapinyo
    Participant
    • Offline

    I’m new to calibration. I’ve just ordered an i1Display Pro Plus and it’s on the way. After a whole day of research I’ve decided to use DisplayCAL as my main calibration software instead of i1Profiler, because it’s more accurate and has more advanced settings. However, for testing/comparison purposes I have some questions:

    Can DisplayCAL and i1Profiler both be installed on the same Windows 10 machine, and if so, how? That way I can make a profile with each program and compare the results, at least for now. Later I’ll uninstall one and keep only the other.

    I plan to use the DisplayCAL profile loader to load the ICC profile and LUT. Can it also load the one created by i1Profiler? For example, once I finish calibrating with i1Profiler I’d load that profile with the DisplayCAL loader, then calibrate with DisplayCAL and load that one, so I can compare the results from both programs and see how they look.

    When using the DisplayCAL loader I should uncheck “Use Windows display calibration” in color management, right? Then, do I have to uncheck or change anything in the NVIDIA Control Panel, or just leave it alone? I use a GeForce GTX 1080.


    #31557

    Kuba Trybowski
    Participant
    • Offline

    I have the same colorimeter.

    If you are new to display calibration, use CC Profiler (formerly i1 Profiler):

    https://calibrite.com/pl/software-downloads/?noredirect=pl-PL

    This channel contains lots of user-friendly X-Rite/Calibrite tutorials (you can ask additional questions in the comments):

    https://www.youtube.com/channel/UCWEYAVNm0jI5az9MjR9SoHw

    DisplayCAL sure is great, but it’s an advanced tool aimed at power users whose particular needs can’t be met by simpler apps. As a beginner you’re better off sticking with CC Profiler.

    Regarding potential conflicts between CC Profiler/i1 Profiler and DisplayCAL, a few times I had the following problem:

    Sometimes my OLED display would display black incorrectly, like a cheap LCD screen. Each time I solved it by reloading the .icm profile.

    #31558

    Prapan Chulapinyo
    Participant
    • Offline

    Well, thanks for the answer. I’ve already read a lot lately. I’ve watched the tutorials on that channel about how to use it, and tutorials on how to use DisplayCAL as well, so it’s not a big deal, actually. That’s why I’d like to install both side by side for comparison.

    How about the NVIDIA Control Panel? Do I need to turn something off there?

    #31559

    Kuba Trybowski
    Participant
    • Offline

    As far as I know, there’s no need to touch Nvidia’s control panel during the calibration process.

    Just remember to disable any mechanism that automatically changes the brightness and/or color temperature of your display (auto-dimming, a night mode, True Tone).

    Could you please tell me the model of your display?

    #31560

    Prapan Chulapinyo
    Participant
    • Offline

    My displays are 2 × Dell U2417H, plus a ThinkPad X250 laptop.

    I have some more questions about NVIDIA Control Panel settings:

    1. I’ve read somewhere that under “Change resolution” I should switch from “Use default color settings” to “Use NVIDIA color settings”, and make sure “Output color format” is RGB (my working space is sRGB) and “Output dynamic range” is Full, as in the first picture. Is that correct?

    2. Under “Adjust desktop color settings”, is it correct to just leave everything at the defaults, as in the second picture? What about “Override to reference mode”? The NVIDIA documentation says:

    Override to reference mode: Select this check box to override the current color processing and force the reference mode. When selected, color adjustments from the OS or other adjustment applications are ignored and rendered pixels are processed through the GPU pipeline in 1:1 pixel matching mode.

    So it’s better to leave it unchecked, correct?

    3. Also, somewhere on a forum it said that under “Adjust video color settings” I should choose “With the video player settings” to leave the colors untouched, as in the third picture. Correct?

    Any comments would be appreciated.

    Attachments:
    #31567

    Kuba Trybowski
    Participant
    • Offline

    Reportedly, you should reset all the color settings in NVIDIA’s panel to the default values:

    https://photo.stackexchange.com/questions/108573/how-to-properly-calibrate-a-monitor

    #31568

    Prapan Chulapinyo
    Participant
    • Offline

    Thanks. I’ve already reset all the color settings. But I’m still curious about those additional settings in 1), 2) and 3).

    #31569

    Prapan Chulapinyo
    Participant
    • Offline

    What bit depth should I set for my GTX 1080 card? The documentation says:

    Bitdepth. Some graphics drivers may internally quantize the video card gamma table values to a lower bitdepth than the nominal 16 bits per channel that are encoded in the video card gamma table tag of DisplayCAL-generated profiles. If this quantization is done using integer truncating instead of rounding, this may pronounce banding. In that case, you can let the profile loader quantize to the target bitdepth by using rounding, which may produce a smoother result.

    Anyway, I’m not sure whether I should set it to 8 bit or the maximum 16 bit. What will happen if I set 16 bit but my card only accepts 8 bit?

    I’ve seen something in the NVIDIA Control Panel about an 8-bit depth, but I don’t know whether it applies here or not.

    Attachments:
    #31591

    Vincent
    Participant
    • Offline

    Can DisplayCAL and i1Profiler both be installed on the same Windows 10 machine, and if so, how? That way I can make a profile with each program and compare the results, at least for now. Later I’ll uninstall one and keep only the other.

    I plan to use the DisplayCAL profile loader to load the ICC profile and LUT. Can it also load the one created by i1Profiler? For example, once I finish calibrating with i1Profiler I’d load that profile with the DisplayCAL loader, then calibrate with DisplayCAL and load that one, so I can compare the results from both programs and see how they look.

    When using the DisplayCAL loader I should uncheck “Use Windows display calibration” in color management, right? Then, do I have to uncheck or change anything in the NVIDIA Control Panel, or just leave it alone? I use a GeForce GTX 1080.

    Remove the X-Rite LUT-loading app and the X-Rite tray app (Task Manager, Startup tab). Reboot. X-Rite’s loader software is useless.

    Then create a profile with DisplayCAL and install the DisplayCAL profile loader. It can load i1Profiler ICCs too.

    You can use i1Profiler to create profiles too, but during the calibration stage it will use X-Rite’s LUT loader, which causes banding until you reboot, even if you have a good card for calibration (which you don’t).
    Anyway, after creating a profile with X-Rite’s software and rebooting, you can use DisplayCAL’s high-bit-depth LUT loader to load ICC profiles created with other software.

    Also check the thread about dithering on GPU LUTs; it was a sticky thread. It may improve banding issues on NVIDIA cards.

    #31634

    Prapan Chulapinyo
    Participant
    • Offline

    Will try that. For now I’m testing an i1Profiler matrix profile for a few days to get a taste of it. Later I’ll maybe try a table profile, then compare with DisplayCAL.

    Anyway, after creating a profile with X-Rite’s software and rebooting, you can use DisplayCAL’s high-bit-depth LUT loader to load ICC profiles created with other software.

    I still have no idea about the NVIDIA setting in the picture above; that bit-depth setting maxes out at 8 bit. However, the DisplayCAL loader seems to be set to 16 bit by default. Is that going to be a problem? What should I set each bit depth (NVIDIA, DisplayCAL) to?

    #31635

    Vincent
    Participant
    • Offline

    Will try that. For now I’m testing an i1Profiler matrix profile for a few days to get a taste of it. Later I’ll maybe try a table profile, then compare with DisplayCAL.

    In DisplayCAL’s profiling tab:

    X-Rite matrix = DisplayCAL single curve or gamma (I don’t remember which) + matrix + (black point compensation, I don’t remember)

    X-Rite table = DisplayCAL XYZ LUT (3 × TRC + a table mapping the calibrated colorspace like a 3D mesh)

    Also, X-Rite stores D50 as the PCS white point plus a chromatic adaptation (CHAD) matrix, and X-Rite’s calibration patches (calibration, not profiling) cannot be customized => troublesome displays with a bad uncalibrated grey response and color tints in the grey scale can be corrected better with DisplayCAL at slow speed.

    Also, Firefox seems not to like X-Rite table profiles; maybe matrix profiles will be rejected too. It’s easy to spot on a wide-gamut display, since color management will be turned off if the profile is not supported.
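    To make the matrix vs. table distinction above a bit more concrete, here is a minimal Python sketch of what a matrix profile does. This is not code from i1Profiler or DisplayCAL, and the sRGB/D65 numbers are only example values: the encoded RGB value goes through a per-channel tone curve and then a 3×3 matrix to reach XYZ, while a table (XYZ LUT) profile looks the value up in a measured 3D grid instead.

# Rough sketch of a matrix-based display profile (illustration only).
# The matrix below is the standard sRGB/D65 primaries matrix, used purely
# as example numbers; a real profile stores the measured primaries.
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def matrix_profile_rgb_to_xyz(rgb, gamma=2.2):
    """Per-channel tone curve (TRC), then a 3x3 primaries matrix."""
    linear = [c ** gamma for c in rgb]  # TRC: encoded value -> linear light
    return [sum(m * c for m, c in zip(row, linear)) for row in SRGB_TO_XYZ]

# A table (XYZ LUT) profile would instead look the RGB triplet up in a 3D
# grid of measured XYZ values and interpolate, so it can also describe
# non-additive display behaviour that a single matrix cannot capture.
print(matrix_profile_rgb_to_xyz((1.0, 1.0, 1.0)))  # white -> roughly D65 XYZ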

    Anyway, after creating a profile with X-Rite’s software and rebooting, you can use DisplayCAL’s high-bit-depth LUT loader to load ICC profiles created with other software.

    I still have no idea about the NVIDIA setting in the picture above; that bit-depth setting maxes out at 8 bit. However, the DisplayCAL loader seems to be set to 16 bit by default. Is that going to be a problem? What should I set each bit depth (NVIDIA, DisplayCAL) to?

    Read the thread about the registry hack for dithering on NVIDIA cards.

    #31646

    Prapan Chulapinyo
    Participant
    • Offline

    X-Rite matrix = DisplayCAL single curve or gamma (I don’t remember which) + matrix + (black point compensation, I don’t remember)

    X-Rite table = DisplayCAL XYZ LUT (3 × TRC + a table mapping the calibrated colorspace like a 3D mesh)

    Also, X-Rite stores D50 as the PCS white point plus a chromatic adaptation (CHAD) matrix, and X-Rite’s calibration patches (calibration, not profiling) cannot be customized => troublesome displays with a bad uncalibrated grey response and color tints in the grey scale can be corrected better with DisplayCAL at slow speed.

    Thanks! Will try that and come back later.

    Read the thread about the registry hack for dithering on NVIDIA cards.

    So that means NVIDIA, at least on my system, defaults to a maximum of 8 bit, and if I need more I need the registry hack, right?

    Then if the DisplayCAL tray loader is set to 16 bit but my card is set to 8 bit, what will happen?

    #31651

    Vincent
    Participant
    • Offline

    Then if the DisplayCAL tray loader is set to X bit but my card is set to 8 bit, what will happen?

    If dithering is on, you get smooth gradients. If there is no dithering you’ll get banding on non-color-managed gradients (and color-managed ones too). It is independent of the DisplayCAL configuration; it’s a limitation of your card.
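    As a small illustration of why dithering matters here, consider the toy Python sketch below. It is not what the NVIDIA driver actually does (real GPUs use spatial/temporal dithering in hardware, not random noise), but it shows how dithering preserves the fractional part of a calibrated value that plain 8-bit quantization would throw away.

import random

def apply_vcgt(v, gamma=1.1):
    """A toy calibration curve standing in for the video card gamma table."""
    return v ** gamma

def to_8bit(v, dither=False):
    """Quantize a 0..1 value to an 8-bit code, optionally adding dither noise."""
    if dither:
        v += (random.random() - 0.5) / 255.0
    return max(0, min(255, round(v * 255)))

# Render one calibrated mid-grey over many "pixels": without dithering every
# pixel lands on the same 8-bit code, so the corrected value is rounded off
# (visible as banding in gradients); with dithering the average over
# neighbouring pixels approximates the ideal value.
target = apply_vcgt(100 / 255)
plain = [to_8bit(target) for _ in range(10000)]
dithered = [to_8bit(target, dither=True) for _ in range(10000)]
print("ideal value   :", target * 255)
print("no dither mean:", sum(plain) / len(plain))
print("dithered mean :", sum(dithered) / len(dithered))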

    #31673

    Gianluca.M.
    Participant
    • Offline

    Hello.
    Maybe he’s referring to the “Output color depth” option found in the NVIDIA Control Panel under the “Use NVIDIA color settings” section.
    That option refers to your monitor’s panel technology. In your case you have to set 8 bit, because your monitor’s panel has a bit depth of 8 bits (6-bit + 2-bit FRC).

    The “bit depth” option that you find in the DisplayCAL profile loader is another thing; I think it is better to leave it set to 16 bit.
    This is what is reported on the DisplayCAL home page under “Profile loader (Windows)”, “Bit depth”:
    “Some graphics drivers may internally quantize the video card gamma table values to a lower bitdepth than the nominal 16 bits per channel that are encoded in the video card gamma table tag of DisplayCAL-generated profiles. If this quantization is done using integer truncating instead of rounding, this may pronounce banding. In that case, you can let the profile loader quantize to the target bitdepth by using rounding, which may produce a smoother result”
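    To illustrate what that quoted paragraph means in practice, here is a minimal Python sketch (not DisplayCAL’s actual implementation) of the difference between truncating and rounding a 16-bit gamma table entry down to 8 bits:

def quantize_vcgt(value_16bit, target_bits=8, mode="round"):
    """Reduce a 16-bit video card gamma table entry to a lower bit depth."""
    shift = 16 - target_bits
    if mode == "truncate":
        return value_16bit >> shift  # always rounds down, can skip output codes
    rounded = (value_16bit + (1 << (shift - 1))) >> shift  # round to nearest
    return min(rounded, (1 << target_bits) - 1)

v = 0x01FF  # a 16-bit entry just below an 8-bit code boundary (511 of 65535)
print(quantize_vcgt(v, mode="truncate"))  # 1 -> almost a full step of error
print(quantize_vcgt(v, mode="round"))     # 2 -> nearest 8-bit code, smoother ramp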

    Sorry for my bad English.

    #31761

    Prapan Chulapinyo
    Participant
    • Offline

    Maybe he’s referring to the “Output color depth” option found in the NVIDIA Control Panel under the “Use NVIDIA color settings” section.
    That option refers to your monitor’s panel technology. In your case you have to set 8 bit, because your monitor’s panel has a bit depth of 8 bits (6-bit + 2-bit FRC).

    Yes, exactly. So the one in the NVIDIA Control Panel is the output to my display, which supports 8 bit. But the one in the DisplayCAL profile loader is for what gets sent to the video card gamma table, up to 16 bit, and I should leave that value at 16 bit, right? So that means I should test with 16 bit and, if gradients look strangely non-smooth, try reducing it. So far I’ve tried 16 bit and it’s a little bit smoother; with 8 bit it’s slightly less smooth, but almost unnoticeable.

    If dithering is on, you get smooth gradients. If there is no dithering you’ll get banding on non-color-managed gradients (and color-managed ones too). It is independent of the DisplayCAL configuration; it’s a limitation of your card.

    I haven’t tried the dithering registry hack on the card yet. I’ve read that post already. It seems hit or miss, meaning sometimes it works and sometimes it doesn’t? What has your experience with it been? Is it worth it?
