I made a tool for applying 3D LUTs to the Windows desktop

  • This topic has 179 replies, 32 voices, and was last updated 1 week ago by speedy.
Viewing 15 posts - 166 through 180 (of 180 total)
  • Author
    Posts
  • #34779

    Draganche
    Participant
    • Offline

    No need to compile anything! Just click the big “Download latest release” link, that contains everything you need.

Gosh, I hadn't even seen that there, sorry.

By the way, importing the .cube file into it gives me washed-out colors compared to the DisplayCAL profile loader.
I disabled DisplayCAL before installing the .cube LUT in DWM LUT.

    Did you set the gamma properly?

Are these the correct settings for the cal and LUT gamma?

Attachments: (login required to view)
    #34783

    S Simeonov
    Participant
    • Offline

    Use the 3dlut maker, not this one. For rendering intent use “relative colorimetric”.

    #34784

    Draganche
    Participant
    • Offline

    Use the 3dlut maker, not this one. For rendering intent use “relative colorimetric”.

Thanks, but I can't seem to find the 3D LUT maker… I know of 3D LUT Creator, but that's not it.

    #34787

    AL2420
    Participant
    • Offline

Please note that it does not recognize .CUBE (uppercase) as the 3D LUT extension. Otherwise…

Cannot thank you enough for this! I have an XPS laptop with a wide-gamut OLED I'd given up on. Your tool was easy to install, and suddenly the screen looks incredible as sRGB. Unbelievable. Thank you so much!

    #34868

    speedy
    Participant
    • Offline

I’ve noticed that dwm_lut ties the LUT to the DisplayPort connector instead of to each unique monitor, which is causing problems for me since I dock my laptop at multiple docks. This means I need to go in and manually change the SDR LUT every time I dock my computer at a different desk (where I’ve already calibrated that monitor).

    Would it be possible to change dwm_lut so that it can uniquely identify each display by serial number or something like that?

I assume this would then also fix another problem I’ve experienced: the LUT doesn’t load if I switch video ports on my computer. For example, if I’m using DisplayPort #1 and I re-plug the monitor into DisplayPort #2, the LUT doesn’t load. And if I then load the LUT on DisplayPort #2, it won’t load again when I switch back to DisplayPort #1. I assume this is also related to dwm_lut being tied to the “Connector”.

    Thanks!

    P.S. I’ve submitted a bug report here too: https://github.com/ledoge/dwm_lut/issues/18
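For context on what identifying a display “by serial number” would involve: the stable identifiers live in the monitor’s EDID block, which Windows caches in the registry under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY. The following is a minimal, purely illustrative sketch (not dwm_lut code) of pulling the numeric serial and the optional serial-string descriptor out of a raw 128-byte EDID base block:

```python
import struct

def edid_serial(edid: bytes):
    """Extract identifiers from a 128-byte EDID base block.

    Returns (numeric_serial, serial_string); serial_string is the
    optional 0xFF display-descriptor text, or None if absent.
    """
    if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID base block")
    # Bytes 12-15: 32-bit little-endian serial number (may be zero).
    numeric_serial = struct.unpack_from("<I", edid, 12)[0]
    serial_string = None
    # Four 18-byte descriptor blocks start at offset 54.
    for off in range(54, 126, 18):
        d = edid[off:off + 18]
        # Display descriptor: first two bytes zero, tag 0xFF = serial string.
        if d[0] == 0 and d[1] == 0 and d[3] == 0xFF:
            text = d[5:18].split(b"\x0a")[0]  # text is LF-terminated
            serial_string = text.decode("ascii", "ignore").strip()
    return numeric_serial, serial_string
```

Note that some panels ship a zero or shared serial, so a robust key would likely combine vendor ID, product code, and serial.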

    #34896

    dogelition
    Participant
    • Offline

Please note that it does not recognize .CUBE (uppercase) as the 3D LUT extension

    What software is generating Cube LUTs with that uppercase extension? The format specification states that the extension must be .cube, so that seems like a bug in whatever program that file came from…
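For reference, the Cube format itself is plain text: a LUT_3D_SIZE keyword followed by size³ rows of RGB values. As a rough illustration of why tolerating the uppercase extension is a small fix, here is a minimal, hypothetical .cube reader (not dwm_lut’s actual parser):

```python
import os

def read_cube(path: str):
    """Minimal .cube reader: returns (size, flat list of RGB triples).

    Accepts the extension in any case, since some tools emit .CUBE.
    """
    if os.path.splitext(path)[1].lower() != ".cube":
        raise ValueError("expected a .cube file")
    size, data = None, []
    with open(path) as f:
        for line in f:
            line = line.split("#", 1)[0].strip()  # drop comments and blanks
            if not line:
                continue
            if line.upper().startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            elif line[0].isdigit() or line[0] in "+-.":
                data.append(tuple(float(v) for v in line.split()))
            # TITLE / DOMAIN_MIN / DOMAIN_MAX keywords are ignored here.
    if size is None or len(data) != size ** 3:
        raise ValueError("malformed .cube: missing size or wrong entry count")
    return size, data
```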

    #35180

    bootleg
    Participant
    • Offline

Has anyone been able to make a 3D LUT from an ICC, apply it with DWM, and verify it with DisplayCAL? My verifications are all over the place.

I create an ICC/ICM with D65/2.2 and then have to select sRGB during the 3D LUT creation. I’ve done a ton of reading to try to understand this, but the proper way to verify the profile when I do this and apply the correction to the display with DWM LUT is still escaping me. I don’t have any specific source material in mind; I’m just trying to throw together a “good for everything” calibration that is easy on my eyes.

    Original profiles I made were 6500K, 2.2, 100cd.

    • This reply was modified 1 month, 1 week ago by bootleg.
    #35182

    Vincent
    Participant
    • Offline

If you use DWMLUT and want to verify such a LUT3D, just load the LUT3D into DWMLUT and set the LUT3D’s source colorspace (whatever you are simulating in the LUT3D) as the display profile in the OS. Then verify the default profile in DisplayCAL with no device link or simulation profile; it will take the one selected as default in the OS.

    #35187

    bootleg
    Participant
    • Offline

Shouldn’t including the VCGT parameters in the LUT creation prevent me from needing to load an ICC in conjunction with the LUT file?

Or do I just have no clue what I’m doing at this point?

It seems like at each step I am worried about applying a double correction somewhere, and I don’t know what settings to use for the LUT creation or the proper way to implement it.

    • This reply was modified 1 month, 1 week ago by bootleg.
    #35189

    dogelition
    Participant
    • Offline

    You only need to set an ICC profile in Windows if you’re targeting something other than sRGB/Rec.709 with your 3D LUT. If not, just make sure the default sRGB profile is selected. And, as you noted, you need to make sure to include the VCGT data in the 3D LUT.

    For creating the 3D LUT, just create a profile of your display first as you normally would (and don’t install it when prompted!), and then use the 3D LUT tab. Set Rec709 as the source color space, whatever tone curve you want to target (this is independent of the tone curve you selected under the “Calibration” tab as the VCGT target), Absolute colorimetric with white point scaling as the intent, and 65^3 .cube as the format.

    For verification, since you’re currently having issues with that I would suggest first just verifying using the profile instead of the 3D LUT. Select Rec709 as the simulation profile, set tone curve to the same settings you used for the 3D LUT, and don’t enable “Use simulation profile as display profile”.  If this measurement report shows high errors, there’s something wrong with your profile.

    Then, use dwm_lut to load the 3D LUT, enable “Use simulation profile as display profile”, and create another measurement report. That one should look very similar to the previously generated one.
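For intuition about what that 65^3 .cube actually does at display time: each pixel’s RGB value is treated as a coordinate into the cube, and the output color is trilinearly interpolated from the eight surrounding grid points. A conceptual NumPy sketch of that lookup (not dwm_lut’s actual shader) is:

```python
import numpy as np

def apply_lut3d(pixels: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply a 3D LUT (shape [N, N, N, 3], indexed [r, g, b]) to float
    RGB pixels in [0, 1] using trilinear interpolation."""
    n = lut.shape[0]
    scaled = np.clip(pixels, 0.0, 1.0) * (n - 1)
    lo = np.floor(scaled).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    frac = scaled - lo
    out = np.zeros_like(pixels)
    # Accumulate the weighted contributions of the 8 cell corners.
    for corner in range(8):
        idx = [(hi if corner >> c & 1 else lo)[..., c] for c in range(3)]
        w = np.prod(
            [frac[..., c] if corner >> c & 1 else 1 - frac[..., c]
             for c in range(3)],
            axis=0,
        )
        out += w[..., None] * lut[idx[0], idx[1], idx[2]]
    return out
```

With an identity cube (each grid point mapping to its own coordinate), this returns the input unchanged, which is also a handy sanity check when debugging a LUT pipeline.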

    #35191

    Vincent
    Participant
    • Offline

Shouldn’t including the VCGT parameters in the LUT creation prevent me from needing to load an ICC in conjunction with the LUT file?

    Color managed apps will need an ICC to identify monitor behavior. Once you load a LUT3D in DWMLUT that behavior will/should be the colorspace you chose as “source colorspace”.

Or do I just have no clue what I’m doing at this point?

It seems like at each step I am worried about applying a double correction somewhere, and I don’t know what settings to use for the LUT creation or the proper way to implement it.

If you set the colorspace you want to simulate with the LUT3D as the default display profile, it is mostly a simple matrix profile with an idealized TRC and no VCGT calibration at all. For example, the sRGB profile.

This way you can have DWMLUT running and Photoshop working as intended at the same time.

(If you only want to play games or use other non-color-managed apps, as if you had an sRGB screen, you can skip this explanation.)

• This reply was modified 1 month ago by Vincent.
    #35194

    bootleg
    Participant
    • Offline

    You only need to set an ICC profile in Windows if you’re targeting something other than sRGB/Rec.709 with your 3D LUT. If not, just make sure the default sRGB profile is selected. And, as you noted, you need to make sure to include the VCGT data in the 3D LUT.

    For creating the 3D LUT, just create a profile of your display first as you normally would (and don’t install it when prompted!), and then use the 3D LUT tab. Set Rec709 as the source color space, whatever tone curve you want to target (this is independent of the tone curve you selected under the “Calibration” tab as the VCGT target), Absolute colorimetric with white point scaling as the intent, and 65^3 .cube as the format.

    For verification, since you’re currently having issues with that I would suggest first just verifying using the profile instead of the 3D LUT. Select Rec709 as the simulation profile, set tone curve to the same settings you used for the 3D LUT, and don’t enable “Use simulation profile as display profile”.  If this measurement report shows high errors, there’s something wrong with your profile.

    Then, use dwm_lut to load the 3D LUT, enable “Use simulation profile as display profile”, and create another measurement report. That one should look very similar to the previously generated one.

So I do need to manually install the sRGB profile into Windows in most cases, then. Any problem with using the DisplayCAL loader?

I was confused, as I assumed that if the VCGT was included with the 3D LUT, I would technically not need an ICC profile set in Windows/the DisplayCAL loader to get the intended calibration changes.

What doesn’t make sense to me is that if I apply the ICC by itself with the DisplayCAL loader and run a blind verification (without a simulation profile), I get great results. If I remove the ICC (as you put it, hit “don’t install”), set up the 3D LUT with DWM LUT, and run the same blind verification, I get bad numbers in the blue and black ranges. I assume the 3D LUT still requires a (default/base) ICC profile to be installed to properly create the sRGB colorspace I am trying to get, but I thought that with VCGT added to the LUT I would be accidentally double-correcting somehow.

Sorry for the confusion on my end; most of the content I am reading online has very conflicting information, depending on the experience of the author. It just seems that having to apply a simulation profile means that what I am actually seeing on the display isn’t right.

Also, do we have a rough idea of the GPU hit/overhead of using this DWM method to apply the LUT? I have tried to benchmark it, even though I am not sure of the correctness of the calibration, and can’t really get any useful numbers.

    Thanks all for the replies.

    #35195

    dogelition
    Participant
    • Offline

    You need to assign the default sRGB profile (or no profile – same thing) in the Windows color management settings to effectively disable color management for applications. Not a profile you generated, just the default one that comes with Windows, so that applications think your monitor is perfectly sRGB to make sure there is no color management happening other than the 3D LUT.

    You always need to have the “Simulation profile” checkbox enabled, otherwise (I think) you’re just verifying the display profile against itself, when in reality you want to verify how well your monitor can display Rec.709/sRGB either using your profile or using no profile (“Use simulation profile as display profile” off and on, respectively).

    Please upload the measurement reports you generated so that I can take a look at them.

    #35196

    bootleg
    Participant
    • Offline

    You need to assign the default sRGB profile (or no profile – same thing) in the Windows color management settings to effectively disable color management for applications. Not a profile you generated, just the default one that comes with Windows, so that applications think your monitor is perfectly sRGB to make sure there is no color management happening other than the 3D LUT.

    You always need to have the “Simulation profile” checkbox enabled, otherwise (I think) you’re just verifying the display profile against itself, when in reality you want to verify how well your monitor can display Rec.709/sRGB either using your profile or using no profile (“Use simulation profile as display profile” off and on, respectively).

    Please upload the measurement reports you generated so that I can take a look at them.

So my desktop image changes whether I apply an ICM/ICC with the DisplayCAL loader or a 3D LUT with DWM LUT, so I assume in both cases the VCGT is being applied to the GPU. If I’m supposed to apply the default sRGB ICC in Windows, why does the loader apply the profile I loaded into Windows and not the default one, and why does verification not require a simulation profile when not using a 3D LUT? If I am just doing a “dumb” verification of what the display panel is putting out, I assume that should include my corrections towards sRGB and whatever tone curve I set. Having to then set another simulation profile makes no sense to me, even after reading the documentation over and over. I guess there is something I am not grasping.

When I just use an ICM and the DisplayCAL loader (no 3D LUT, no DWM), I just hit “Install Profile” up by the settings, let the loader handle everything, and forget about it. Visually my desktop gamma changes, and I see the generated profile show up in Windows. When I verify, I just leave it as <Current>, select the correct display, tell the verification to use however many patches I want, and hit Verify. Windows has the generated ICM applied by the DisplayCAL loader, and I have no idea what the loader is or isn’t doing with the association while the DisplayCAL program is running.

    I get the following results:

    No matter what weird thing I do when trying to apply the 3D LUT w/ dwm, using the simulation profile or not, having the sRGB ICC applied to windows or not, having the gamma in verification tab set to the curve I created the profile with, or left unmodified, I get something like this:

I guess I have no idea what settings I should be using, how these profiles are actually being applied, or what DisplayCAL is doing to the associations when the program is or isn’t running. I’m very confused at this point; I assumed that using a 3D LUT and DWM would allow me to get the same calibration as the ICM, but applied across non-managed programs.

    #35455

    speedy
    Participant
    • Offline

Is anyone else running into an issue with dwm_lut not working on the latest Windows 11 21H2 (build 22000.706) update?

    I submitted an issue for this here: https://github.com/ledoge/dwm_lut/issues/26



Display Calibration and Characterization powered by ArgyllCMS