8 bit vs 10 bit monitor – what’s the practical difference for color


Viewing 3 posts - 16 through 18 (of 18 total)
  • Author
    Posts
  • #31098

    Vincent
    Participant

    Go to the DisplayCAL folder and open the synthetic profile editor. Make a synth profile with the same white point and the same red, green and blue primary coordinates (the illuminant-relative xyY data under profile info in DisplayCAL) and the same nominal gamma. Usually you want to enable the “infinite contrast” tick (black point compensation) on both profiles.
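For intuition, the synthetic profile described above is essentially a matrix profile determined entirely by the white point and primary chromaticities. A rough sketch of how those xyY numbers fix the profile's RGB→XYZ matrix (illustrative only, not DisplayCAL's actual code; sRGB-like values used as example inputs):

```python
import numpy as np

def xy_to_XYZ(x, y, Y=1.0):
    """Convert a chromaticity (x, y) plus luminance Y to XYZ."""
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

def rgb_to_xyz_matrix(r_xy, g_xy, b_xy, w_xy):
    """Build the 3x3 RGB->XYZ matrix of a matrix profile from the
    primary and white chromaticities (the xyY data shown in
    DisplayCAL's profile information)."""
    # Unscaled XYZ columns for each primary
    M = np.column_stack([xy_to_XYZ(*r_xy), xy_to_XYZ(*g_xy), xy_to_XYZ(*b_xy)])
    # Scale each column so RGB = (1, 1, 1) maps exactly to the white point
    S = np.linalg.solve(M, xy_to_XYZ(*w_xy))
    return M * S  # broadcasting multiplies column j by S[j]

# Example with sRGB primaries and D65 white (illustrative values only)
M = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60), (0.15, 0.06),
                      (0.3127, 0.3290))
```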

    Then make a LUT3D with that synth profile as the source colorspace, targeting your DisplayCAL profile with VCGT calibration.
    The resulting LUT3D is close to a monitor with HW calibration calibrated to its native gamut, hence perfect for PS, LR or other color-managed apps.
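A 3D LUT like the one built here is just a dense grid of output RGB triplets that gets interpolated at lookup time. A minimal sketch of that lookup (illustrative, assuming trilinear interpolation; not DWM LUT's actual implementation):

```python
import numpy as np

def apply_lut3d(lut, rgb):
    """Look up one RGB triplet in an (N, N, N, 3) 3D LUT with
    trilinear interpolation between the surrounding grid nodes."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.minimum(pos.astype(int), n - 2)  # lower grid corner per axis
    f = pos - lo                             # fractional position in the cell
    out = np.zeros(3)
    # Blend the 8 corners of the surrounding grid cell
    for corner in range(8):
        idx = [(corner >> a) & 1 for a in range(3)]
        w = np.prod([f[a] if idx[a] else 1.0 - f[a] for a in range(3)])
        out += w * lut[lo[0] + idx[0], lo[1] + idx[1], lo[2] + idx[2]]
    return out
```

An identity LUT (each node storing its own coordinates) passes colors through unchanged, which is a handy sanity check before loading a real LUT.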

    Assign the synth profile as the default display profile in the OS (Control Panel, Color Management, Devices tab). Open DWM LUT and load the LUT3D. This way you can get no banding even with Intel iGPUs, unless the VCGT to be applied is too extreme to be simulated with 65 nodes per color ramp.
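The 65-node limitation can be sketched numerically: resampling a VCGT curve onto a 65-point grid with linear interpolation between nodes (an assumption made here for illustration) leaves only a tiny residual for a smooth gamma tweak, but a much larger one for a sharply kinked correction such as a near-black clip:

```python
import numpy as np

def vcgt_lut_error(vcgt, nodes=65, samples=4096):
    """Worst-case error from representing a 1D curve on a grid of
    `nodes` points with linear interpolation between them."""
    grid = np.linspace(0.0, 1.0, nodes)
    x = np.linspace(0.0, 1.0, samples)
    approx = np.interp(x, grid, vcgt(grid))  # piecewise-linear through nodes
    return np.max(np.abs(approx - vcgt(x)))

# A smooth gamma curve is captured almost exactly by 65 nodes...
smooth = vcgt_lut_error(lambda v: v ** 2.2)
# ...while a kinked near-black clip leaves a visibly larger residual
kinked = vcgt_lut_error(lambda v: np.clip((v - 0.02) / 0.98, 0.0, 1.0))
```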

    Games will look oversaturated (native gamut), but for PS or LR it is like you had an Eizo CS with HW calibration and an idealized ICC profile (matrix, 1xTRC) that minimizes banding caused by the color management app.

    OK, I think I did it, but I cannot be sure whether I understood all the steps required.

    Man, if this combination of DWMLUT and DisplayCAL can make my wide color gamut monitor show proper colors on Windows in different apps and games, then this is GOLD!  I could easily pay some money for a comprehensive guide on what to do and why, or for further development of DisplayCAL to do the proper things automatically for me.

    From what I understand I will need to switch different profiles (OS + DWM LUT) for when I use Photo apps and for when I run games or browsers. This could be further automated. Same with generation of the synthetic profile from the ICM profile.

    Contact LeDoge here https://hub.displaycal.net/forums/topic/i-made-a-tool-for-applying-3d-luts-to-the-windows-desktop and make app suggestions. Since it requires admin rights, I’m not sure whether the loader can be automated like the DisplayCAL loader on startup. Perhaps install a DWM LUT core app as a service, with a tray app / GUI that calls it on startup; I don’t know, ask him.

    #31102

    provanguard
    Participant

    In the 3D LUT Maker app, what exactly does the option called “Apply calibration (vcgt)” do?

    Does it include some specific data in the generated 3D LUT file, or does it apply some settings to the graphics card as part of the process?

    If it is the second, how can I switch between different already-generated 3D LUTs without running this tool to apply calibration and, as a side effect, creating a new LUT every time?

    #31103

    Vincent
    Participant

    The first one. It modifies the “default” translation between the source and target profiles to take into account the grey errors corrected by the 1D LUT “VCGT” (the usual calibration).

    That is the reason you cannot apply it twice: if it goes into the GPU, it cannot be in the LUT3D; if it is in the LUT3D, it cannot be applied as system-level GPU calibration.
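In other words, the per-channel calibration curves get composed into the LUT's output values so the correction is applied exactly once. A toy sketch of that "bake it in" idea (a hypothetical helper, not DisplayCAL's code):

```python
import numpy as np

def bake_vcgt_into_lut3d(lut3d, vcgt):
    """Compose per-channel VCGT curves after a 3D LUT, so the grey
    calibration lives inside the LUT itself instead of the GPU.

    lut3d: array of shape (N, N, N, 3), input RGB -> output RGB.
    vcgt:  array of shape (M, 3), per-channel 1D curves on [0, 1].
    """
    grid = np.linspace(0.0, 1.0, vcgt.shape[0])
    out = lut3d.copy()
    for c in range(3):  # run each output channel through its 1D curve
        out[..., c] = np.interp(lut3d[..., c], grid, vcgt[:, c])
    return out
```

Applying such a baked LUT and then also loading the same VCGT into the GPU would run the curves twice, which is exactly the double application warned about above.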



Display Calibration and Characterization powered by ArgyllCMS