I made a tool for applying 3D LUTs to the Windows desktop


Viewing 15 posts - 121 through 135 (of 326 total)
#34342

    MW
    Participant
    • Offline

    So you are using novideo_sRGB and DWMLUT at the same time?

    #34343

    Vincent
    Participant
    • Offline

    So you are using novideo_sRGB and DWMLUT at the same time?

    No, my fault, I misunderstood your previous message

I have used DWMLUT (standalone, no AMD driver or novideo_srgb) as a way to simulate HW calibration to the native gamut for photo editing: “a wide-gamut monitor simulating an idealized version of itself with DWMLUT, then that idealized profile is used as the display profile in the OS”.
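For anyone trying to picture that workflow, here is a rough numpy sketch of how such a LUT3D could be built. Everything in it is hypothetical: DisplayCAL’s LUT3D maker works from the full measured ICC profile, while this simplifies the real display to a matrix plus per-channel gamma just to show the data flow.

import numpy as np

N = 65                                     # 65x65x65 grid, the size DWMLUT's .cube LUTs use

# Hypothetical RGB->XYZ matrix for the display's own (measured) primaries
M_primaries = np.array([[0.60, 0.21, 0.15],
                        [0.31, 0.62, 0.07],
                        [0.02, 0.10, 0.95]])
gamma_ideal = 2.2                          # the idealized, perfectly smooth TRC
gamma_real = np.array([2.05, 2.25, 2.35])  # hypothetical measured per-channel gammas

def ideal_to_device(rgb):
    # XYZ the idealized display would emit for this RGB
    xyz = M_primaries @ (rgb ** gamma_ideal)
    # device values the real panel needs to emit the same XYZ
    # (same primaries by construction; a real measured LUT profile would also
    #  capture non-additivity and grey tint, which a matrix model cannot)
    lin = np.linalg.solve(M_primaries, xyz)
    return np.clip(lin, 0.0, 1.0) ** (1.0 / gamma_real)

grid = np.linspace(0.0, 1.0, N)
lut3d = np.array([[[ideal_to_device(np.array([r, g, b]))   # slow but clear
                    for b in grid] for g in grid] for r in grid])
# lut3d, written out as a .cube file, is what DWMLUT would apply; the idealized
# matrix+gamma profile is what gets assigned as the display profile in the OS.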

    #34357

    SirMaster
    Participant
    • Offline

    OK this is all confusing me haha.

    Let me ask this another way…

Does novideo_srgb, when using a high-quality ICC profile made with DisplayCAL, produce a better calibrated result than what I normally get from the standard VCGT calibration that comes from loading the ICC profile with the Windows calibration loader or the DisplayCAL Profile Loader?

    Because that already works to apply VCGT to all applications.

I just want to understand: should I be switching from DisplayCAL Profile Loader calibration to novideo_srgb if I want the best calibration of my display for gaming and video watching with no performance hit?

I now understand it won’t be as good as DWM_LUT, but that’s likely OK as the color of my monitor is more or less linear.

    Thank you in advance for your insights.

    #34359

    Vincent
    Participant
    • Offline

Standard VCGT calibration cannot simulate colorspaces like sRGB; it just corrects grey (see the sketch below).

    So if you want to play games on any monitor:

NVIDIA: novideo_sRGB
AMD: VCGT (DisplayCAL) + sRGB simulation in the AMD control panel
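To illustrate the difference, a minimal numpy sketch with made-up numbers (not code from any of these tools): a VCGT is applied per channel, so it can never move the primaries, while a colorspace simulation has to linearize the signal, mix the channels through a 3x3 matrix and re-encode.

import numpy as np

def vcgt(rgb, gamma_fix=(0.95, 1.00, 1.05)):
    # 3x1D LUT / VCGT: each channel gets its own curve, channels never mix
    return np.array([c ** g for c, g in zip(rgb, gamma_fix)])

def srgb_clamp(rgb, srgb_to_native, gamma=2.2):
    # Matrix-based colorspace simulation: decode to linear light,
    # mix channels with a 3x3 matrix, re-encode for the panel
    lin = np.asarray(rgb, dtype=float) ** gamma
    return np.clip(srgb_to_native @ lin, 0.0, 1.0) ** (1.0 / gamma)

# Hypothetical sRGB-to-native matrix for a wider-than-sRGB panel (made-up numbers)
M = np.array([[0.84, 0.12, 0.04],
              [0.05, 0.90, 0.05],
              [0.02, 0.06, 0.92]])

print(vcgt([1.0, 0.0, 0.0]))           # [1, 0, 0] -- the red primary is untouched
print(srgb_clamp([1.0, 0.0, 0.0], M))  # red gets desaturated in device space

The first form is all a VCGT can express; the second is what novideo_sRGB or the driver’s sRGB emulation does on top of it.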

    #34361

    SirMaster
    Participant
    • Offline

I guess I question why I need to simulate a colorspace.  My display is natively sRGB, or at least Rec. 709.

    I understand novideo_srgb was designed for fixing wide color gamut, but I don’t have that issue so I don’t need that fix.

Assuming my monitor has proper Rec. 709 primaries and tracks color reasonably (but the grayscale has a tint and the gamma is not flat), what does novideo_srgb offer me over VCGT?

VCGT is a 3x1D LUT as far as I understand, and DWM_LUT is a 3D LUT.  Besides the ability to simulate a colorspace (which I don’t believe I need), what does novideo_srgb do over a 3x1D LUT?

    #34363

    MW
    Participant
    • Offline

I’m using novideo_srgb to make my sRGB display comply with its spec, because many don’t despite what the spec sheets say.

I don’t believe anyone can really tell you what a tool is used for. Run it and see what it does for you. Wide-gamut displays with non-color-managed apps are a big issue, so I get why the developer makes it a selling point.

VCGT is a 3x1D LUT as far as I understand, and DWM_LUT is a 3D LUT.  Besides the ability to simulate a colorspace (which I don’t believe I need), what does novideo_srgb do over a 3x1D LUT?

Over the DisplayCAL VCGT, mainly the flexibility of changing settings on the fly. That includes the option of setting a single curve (+ matrix) system-wide depending on the profile you load. Maybe novideo_srgb is also more reliable than the VCGT due to being closer to the hardware; certain games and apps resetting the VCGT is an old issue. I don’t know for sure whether banding is better controlled by novideo_srgb. The DisplayCAL loader extends the bit depth of the VCGT, however some versions of the NVIDIA drivers made the bit depth revert to 8 bit when the system woke from sleep.

    #34365

    SirMaster
    Participant
    • Offline

    True, I should just try it out and take verification measurements to see how each performs.

I will probably switch just because, like you said, it’s more reliable being in the GPU like that.  Games won’t reset it like they sometimes do the VCGT, and fullscreen-exclusive games don’t usually work with the VCGT either, as far as I understand.

    #34384

    S Simeonov
    Participant
    • Offline

Can someone tell me how to create a 3D LUT for madVR from a previous calibration without the picture getting too dark? I have to disable DWM LUT to get the correct picture in madVR…

    #34386

    MW
    Participant
    • Offline

Can someone tell me how to create a 3D LUT for madVR from a previous calibration without the picture getting too dark? I have to disable DWM LUT to get the correct picture in madVR…

    That only happened to me when I selected the wrong icm file in the folder.

    #34399

    zunderholz
    Participant
    • Offline

I’m using novideo_srgb because it honestly doesn’t seem to have any downsides that I’ve run into for casual use. It doesn’t break G-SYNC or anything. I’m using it to clamp my wide-gamut display and calibrate to sRGB, because the sRGB mode on my monitor doesn’t let you adjust a lot of things like brightness, and it’s not properly clamped anyway. The results with novideo_srgb are very good. Maybe not good enough for professional use, but perceptually it’s very good. I verified my results with HCFR and the majority of my dE values are less than 1, some over 1, a few over 2, but none at 3 or higher. It seems to work correctly in every application/game that I’ve tried.

    #34473

    kekyoin
    Participant
    • Offline

Novideo_srgb just clamps your gamut to sRGB at the video card level using your monitor’s built-in color primaries. It’s very handy for monitors that lack proper sRGB emulation, but it doesn’t really make your colors super accurate. Its accuracy is good enough for most use cases and I’m using it myself, but if you need perfect accuracy, and/or you work in more than just sRGB, then it’s not for you.

    Whereas dwm_lut allows for nearly perfect color accuracy in any color space that you set it up for.

According to the readme, novideo_srgb supports ICC profiles and full LUT-XYZ-LUT calibration though.

No. It “accepts” an XYZ LUT and transforms it into an idealized LUT-matrix-LUT, like an Eizo CS or UltraSharp HW cal. This assumes a very linear response in the display, without volume correction in the middle values. It just mixes the primaries to simulate an idealized colorspace (like sRGB or another), trusting that once grey is calibrated with a 1D LUT, the display behavior can be predicted with a matrix and a TRC.

The true equivalent of a transformation from a source colorspace X to the colorspace described by an XYZ-LUT ICC is a LUT3D, but you need shaders (DWMLUT) to run that… while the simpler LUT-matrix-LUT simulation is loaded into the dedicated HW in the GPU for this task, on both AMD (driver) and NVIDIA (novideo_srgb).

    So right now the best way to go for both gaming and color critical work would still be a mix of:

• ReShade with a 3D LUT applied for games.
• DisplayCAL Profile Loader for Lightroom and other color-managed apps to read the profile and display colors correctly.

Because from what I see, if you use DWMLUT exclusively, it provides global correction but incurs a noticeable performance loss in games, while novideo_sRGB has no performance loss, but its accuracy is not as good as either DWMLUT or a 3D LUT in games?

    #34477

    Vincent
    Participant
    • Offline

Novideo_srgb just clamps your gamut to sRGB at the video card level using your monitor’s built-in color primaries. It’s very handy for monitors that lack proper sRGB emulation, but it doesn’t really make your colors super accurate. Its accuracy is good enough for most use cases and I’m using it myself, but if you need perfect accuracy, and/or you work in more than just sRGB, then it’s not for you.

    Whereas dwm_lut allows for nearly perfect color accuracy in any color space that you set it up for.

According to the readme, novideo_srgb supports ICC profiles and full LUT-XYZ-LUT calibration though.

No. It “accepts” an XYZ LUT and transforms it into an idealized LUT-matrix-LUT, like an Eizo CS or UltraSharp HW cal. This assumes a very linear response in the display, without volume correction in the middle values. It just mixes the primaries to simulate an idealized colorspace (like sRGB or another), trusting that once grey is calibrated with a 1D LUT, the display behavior can be predicted with a matrix and a TRC.

The true equivalent of a transformation from a source colorspace X to the colorspace described by an XYZ-LUT ICC is a LUT3D, but you need shaders (DWMLUT) to run that… while the simpler LUT-matrix-LUT simulation is loaded into the dedicated HW in the GPU for this task, on both AMD (driver) and NVIDIA (novideo_srgb).

    So right now the best way to go for both gaming and color critical work would still be a mix of:

• ReShade with a 3D LUT applied for games.
• DisplayCAL Profile Loader for Lightroom and other color-managed apps to read the profile and display colors correctly.

Because from what I see, if you use DWMLUT exclusively, it provides global correction but incurs a noticeable performance loss in games, while novideo_sRGB has no performance loss, but its accuracy is not as good as either DWMLUT or a 3D LUT in games?

A well-behaved display may need just novideo_sRGB, since a LUT-matrix-LUT is equivalent to the HW calibration in UltraSharps or Eizo CS monitors… IF the display is well behaved.
For LR and other color-managed apps you can use DWMLUT too, as explained in other threads or messages: just simulate an idealized native gamut of the display and assign that synthetic idealized profile as the display profile in the OS.
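For reference, a small numpy sketch of the two pipelines being contrasted here (illustrative only, not code from novideo_sRGB or DWMLUT): the LUT-matrix-LUT that GPU/monitor HW calibration applies, and the LUT3D that needs a shader pass.

import numpy as np

def lut_matrix_lut(rgb, in_curve, matrix, out_curve):
    # What novideo_sRGB / HW-calibration style pipelines implement:
    # 1D decode -> 3x3 channel mix -> 1D re-encode.
    # Only valid if the display is additive and well behaved.
    lin = in_curve(np.asarray(rgb, dtype=float))
    lin = matrix @ lin
    return out_curve(np.clip(lin, 0.0, 1.0))

def lut3d_apply(rgb, lut3d):
    # What DWMLUT evaluates per pixel with a shader. Real implementations
    # interpolate (trilinear/tetrahedral); nearest-neighbour shows the idea.
    n = lut3d.shape[0]
    idx = np.clip(np.rint(np.asarray(rgb) * (n - 1)).astype(int), 0, n - 1)
    return lut3d[idx[0], idx[1], idx[2]]

The first form can only express 1D curves plus one channel-mixing matrix, which is why it is essentially free in hardware but assumes a well-behaved display; the 3D table can encode any per-colour correction, which is why it needs DWMLUT’s shader pass and costs some performance.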

    #34487

    Евгений
    Participant
    • Offline

    @dogeliton

I wanted to ask whether it is possible to implement novideo_srgb for AMD video cards? The problem with AMD’s native implementation is that you can’t give it a calibrated profile for your monitor (as I understand novideo_srgb can), and if the display reports incorrect EDID data, the result will be bad (in my case, very bad, because the EDID reports 140% sRGB while the actual coverage is about 115%).
I used dwmlut and I really like the result, but it reduces performance, which is detrimental on a mid-budget 2016 video card.
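As an illustration of why the reported primaries matter so much for this kind of clamp (made-up chromaticities below, not this monitor’s real values): the driver builds its matrix from whatever primaries it is given, so an EDID that claims a much wider gamut than the panel really has produces an over-aggressive clamp. novideo_srgb can instead take the primaries from a measured ICC profile, which is the difference being asked about here.

import numpy as np

def rgb_to_xyz(xy_r, xy_g, xy_b, xy_w):
    # Build an RGB->XYZ matrix from primary + white chromaticities (CIE xy)
    def xyz(xy):
        x, y = xy
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    P = np.column_stack([xyz(xy_r), xyz(xy_g), xyz(xy_b)])
    S = np.linalg.solve(P, xyz(xy_w))   # scale primaries so they sum to white
    return P * S

white = (0.3127, 0.3290)  # D65
M_srgb     = rgb_to_xyz((0.64, 0.33), (0.30, 0.60), (0.15, 0.06), white)
M_measured = rgb_to_xyz((0.67, 0.32), (0.27, 0.63), (0.15, 0.05), white)  # made-up "actual" gamut
M_edid     = rgb_to_xyz((0.70, 0.29), (0.22, 0.70), (0.14, 0.04), white)  # made-up exaggerated EDID

# sRGB-to-native clamp matrix, applied to linear RGB: inv(native) @ sRGB
print(np.linalg.inv(M_measured) @ M_srgb)  # the clamp the panel actually needs
print(np.linalg.inv(M_edid) @ M_srgb)      # clamp built from the exaggerated EDID: too strong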

    #34488

    Евгений
    Participant
    • Offline

I’m not sure, but as I understand it, it’s enough to trick the driver and give it the necessary data.

    #34490

    Евгений
    Participant
    • Offline

I found that you wrote about replacing the EDID with the help of CRU, but I can’t embed the data from the ICM into the EDID. I tried one of the editing programs, but it just doesn’t work, and I probably won’t be able to do it right.
Just in case, I’ll leave the EDID here in .bin and .dat form, plus the ICM.

I had to change the file extensions, otherwise they couldn’t be uploaded; the actual format is indicated at the end of each file name.

    Attachments: