Calibrate an already calibrated monitor. Parameters to reset?


Viewing 4 posts - 16 through 19 (of 19 total)
  • #24773

    Enrico Gugliotta
    Participant

    All this to ask one question: if I calibrate my monitor using the Standard mode (in which I can’t adjust the gain of the RGB channels), can DisplayCAL compensate for the slight color shift? Or will the calibration come out wrong?

    If you set a white point other than native and you cannot hit the target exactly, the remaining white point difference will be corrected in the GPU. The maximum output of one or two channels will be limited.

    Locked OSD modes are like a limited display (iMac, laptop): the white point is corrected in the GPU LUT.
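To make "correcting the white point in the GPU LUT" concrete, here is a minimal Python sketch (an illustration of the principle, not DisplayCAL's or any driver's actual code): shifting white toward the target by scaling one channel's LUT down, which is exactly why that channel's maximum output ends up limited.

```python
# Illustrative sketch: a one-channel identity LUT, scaled so the channel
# never exceeds `gain`. Scaling e.g. blue down to 0.92 warms the white
# point, at the cost of losing the top of that channel's range.
def scaled_lut(entries=256, gain=0.92):
    """Linear LUT for one channel whose peak output is `gain` instead of 1.0."""
    return [gain * i / (entries - 1) for i in range(entries)]

blue = scaled_lut(gain=0.92)
print(max(blue))  # peak blue output is now 0.92, not 1.0
```

The same correction done via the monitor's own RGB gain controls would leave the GPU LUT untouched, which is the point of the discussion that follows.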

    Thanks. So if the white point is corrected in the GPU LUT, why go through all the hassle of achieving a white point by adjusting the RGB gains inside the monitor?

    Not all GPU LUTs are equal. They do not have the same bit depth, some do not use dithering, and some behave differently depending on whether the link to the display is 8- or 10-bit while others do not (because of dithering).

    Setting them in the OSD lowers the chance of posterization/banding. The same goes for gamma settings in the OSD.
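A rough Python sketch of why LUT bit depth causes this banding (illustrative only, not any driver's real code): pushing a gamma tweak through an 8-bit LUT merges neighbouring input levels into the same output code, and those merged levels show on screen as flat bands, while a 10-bit LUT keeps them distinct.

```python
# Count how many distinct output codes survive when a 1/gamma correction
# curve for 256 input levels is quantized to a LUT of `lut_bits` entries.
def quantize(x, bits):
    levels = (1 << bits) - 1
    return round(x * levels) / levels

def unique_outputs(gamma=2.2, lut_bits=8, entries=256):
    return len({quantize((i / (entries - 1)) ** (1 / gamma), lut_bits)
                for i in range(entries)})

print(unique_outputs(lut_bits=8))   # fewer than 256 distinct levels -> banding
print(unique_outputs(lut_bits=10))  # all 256 inputs stay distinct
```

Doing the same correction in the monitor's OSD (or in a higher-precision LUT with dithering) avoids collapsing those levels in the first place.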

    Ok, your answer knocked me out. I never imagined that GPU LUTs could differ like that. My head hurts.

    However, I apparently found the solution to my problem. When I enable the blue light filter option in the OSD (I usually use it in the morning, to relax my eyes while writing emails etc.), my monitor automatically switches to Standard mode and stays there after I deactivate the filter. So if I want to use the Darkroom mode with which I calibrated my monitor, I can’t use the blue light filter option anymore.

    #24777

    Vincent
    Participant

    Unfortunately they are not. For example, unless Intel has solved this in their latest models, Intel iGPUs are limited to 8-bit LUT entries and no dithering, so if you calibrate through the iGPU LUT on a laptop, banding is unavoidable.
    On the other hand, AMD cards have 10-bit+ LUTs and dither their output, so even through an 8-bit DVI connection you can get a grey ramp visually equivalent to hardware calibration (band-free).
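A small Python sketch of how dithering achieves this over an 8-bit link (an illustration of the principle, not AMD's implementation): a tone that falls between two 8-bit codes is sent by alternating the two codes, and averaged over frames/pixels the eye perceives the in-between tone.

```python
import random

def dithered_quantize(level, bits=8, samples=10000, seed=0):
    """Average of many 8-bit quantizations of `level` with 1-step uniform dither."""
    rng = random.Random(seed)
    levels = (1 << bits) - 1
    step = 1 / levels
    total = 0.0
    for _ in range(samples):
        # add uniform noise of one quantization step before rounding,
        # so the output alternates between the two adjacent codes
        noisy = level + (rng.random() - 0.5) * step
        total += round(max(0.0, min(1.0, noisy)) * levels) / levels
    return total / samples

target = 0.5002                   # falls between two 8-bit codes
hard = round(target * 255) / 255  # plain 8-bit quantization
print(abs(hard - target))                       # fixed error -> a visible band
print(abs(dithered_quantize(target) - target))  # far smaller on average
```

Real GPUs do this spatially and/or temporally in hardware; the averaging here stands in for what the eye does when looking at the dithered pattern.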

    #24778

    Enrico Gugliotta
    Participant

    Understood.

    On my PC I have a GTX 1060, which can run at 10 bit with my monitor.

    #24779

    Vincent
    Participant

    Yes (for fullscreen and OpenGL applications, with the Studio driver) and no (for everything else), and only if the display accepts such a high-bitdepth link (most don’t).
    Since your GTX lacks reliable dithering at the LUT output, band-free calibration is not guaranteed. There are lots of threads about this: some people have managed to enable dithering via the registry, others find it hit and miss. Search the forum if you wish to know more.

    If 10-bit OpenGL on the desktop (which is almost limited to Photoshop) is not a requirement, a cheap gaming AMD card is the safest choice for band-free GPU calibration (although on Linux you need the proprietary driver).



Display Calibration and Characterization powered by ArgyllCMS