Hi. Need help with my LG 27GL850 B


#32765

    mat a
    Participant

    Hi

1. To use the wide gamut, do I need to use 10-bit colours in the Nvidia Control Panel?

2. I read a thread where Florian mentioned that some monitors that use 8 bits + FRC (which mine does) do not suffer from the flickering induced by the FRC. Do you know if it is worth leaving 10-bit colours on for my monitor? What am I meant to look for in the dark shades? Is there any test for it?

3. How do I calibrate in sRGB (and not use the wide gamut)?

Do I need to use the sRGB preset on the monitor? If so, do I then use the ability to adjust R, G and B, or do I use the default Game mode and select sRGB in DisplayCAL when I calibrate?

4. Finally, if I use sRGB, do I need to use 8-bit or 10-bit colours?

    Thank you

    #32768

    Vincent
    Participant

    1. No

2. Yes, if a color-managed app does not use dithering but can output 10-bit. Example: Photoshop (actually… it’s the main/only example).
It is not needed for LR or Capture One (they do dither), or for other apps without dithering and without 10-bit output (other apps from the Adobe suite, GIMP, Affinity, On1, DxO… almost all of them).

3. sRGB OSD preset + white point + gamma correction in the GPU
OR
LeDoge’s DWM LUT

4. Same answer as 2.

    #32806

    mat a
    Participant

    Thank you for your reply.

    About question 2:

Is there no benefit for gaming or for minimising banding?

3: By gamma correction in the GPU, do you mean in the DisplayCAL options (2.2 / sRGB)?

Do you know what correction I need to use for my Nano IPS monitor? What is the difference between the correction presets and the user-uploaded corrections?

I realise that it is a lot of questions and I apologise for it, but you have been extremely helpful and I appreciate it.

    #32812

    Vincent
    Participant

2. No, banding appears because of Nvidia’s faulty HW: LUTs are applied without dithering by default. You can try the Nvidia registry hack for dithering.
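A minimal sketch of why dithering hides banding (a simplified one-channel model, not anything specific to Nvidia hardware): quantizing a smooth dark gradient to 8 bits without dithering leaves long flat runs (visible bands), while adding about one LSB of noise before rounding makes the local average track the true value.

```python
import numpy as np

# A smooth dark ramp, the kind of shadow gradient where banding is most visible.
ramp = np.linspace(0.00, 0.05, 4096)           # values in 0..1
levels = 255                                    # 8-bit output

# Plain rounding: long flat runs -> distinct bands.
plain = np.round(ramp * levels) / levels

# Simple random dither: add ~1 LSB of noise before rounding.
rng = np.random.default_rng(0)
dithered = np.round(ramp * levels + rng.uniform(-0.5, 0.5, ramp.size)) / levels

# Compare how well local averages track the true gradient.
block = 64
true_blocks  = ramp.reshape(-1, block).mean(axis=1)
plain_blocks = plain.reshape(-1, block).mean(axis=1)
dith_blocks  = dithered.reshape(-1, block).mean(axis=1)
print("local error without dither:", np.abs(plain_blocks - true_blocks).max())
print("local error with dither:   ", np.abs(dith_blocks - true_blocks).max())
```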

3. I mean the VCGT calibration stored in the ICC profile, which makes grey neutral relative to its white and tracks the suggested target TRC (2.2/sRGB).
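A minimal sketch of the idea behind such a GPU calibration curve, under a deliberately simplified assumption (one channel, native response modelled as gamma 2.4, no white point adjustment; real curves come from the colorimeter readings DisplayCAL takes): invert the measured grey response so that the combined LUT + display output tracks the 2.2 target.

```python
import numpy as np

inputs = np.linspace(0, 1, 33)                 # device input levels 0..1
measured_Y = inputs ** 2.4                     # simulated measured grey luminance
target_Y = inputs ** 2.2                       # desired response (gamma 2.2)

# For each level, find the device value whose measured luminance equals the
# target luminance: invert the measured curve by interpolation. This per-channel
# curve is the kind of data a vcgt tag holds.
lut = np.interp(target_Y, measured_Y, inputs)

# Check: input -> LUT -> display should now follow the 2.2 target.
calibrated_Y = np.interp(lut, inputs, measured_Y)
print("max deviation from 2.2 target:", np.abs(calibrated_Y - target_Y).max())
```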

Colorimeters are not a perfect match to the CIE standard observer, hence their measurements need to be corrected.
A universal correction (i1d3, Spyders…) for RGB displays is to measure WRGB with a reference device (“no correction needed, or you believe that to be true”) and then measure with your device. This is not portable from me to you (CCMX).
A correction available to the i1d3 is to store spectral sensitivities at the factory (in the i1d3 firmware) and then provide a spectral power distribution of a display (CCSS) so a custom correction can be computed on the fly. These CCSS are portable (from me to you, as long as we have the same display model or a very close one) and can be measured with lab-grade devices (1nm) or tools more aimed at printer profiling (i1Pro family, 3nm, 10nm).
Also, there is a chance that user-provided CCSS were not measured at native gamut but in an sRGB emulation mode, making them useless. That’s the reason you can “plot” a CCSS in DisplayCAL to inspect it.
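A minimal sketch of how a CCMX-style matrix correction can be derived (the XYZ numbers below are made up for illustration, not real readings): measure W, R, G, B with the reference instrument and with the colorimeter, then fit the 3x3 matrix that maps colorimeter XYZ to reference XYZ. The result is only meaningful for that colorimeter/display pair, which is why CCMX files are not portable.

```python
import numpy as np

# Rows are XYZ of White, Red, Green, Blue (illustrative values only).
reference_XYZ = np.array([
    [95.0, 100.0, 108.0],   # W
    [42.0,  21.0,   1.7],   # R
    [36.0,  72.0,  11.0],   # G
    [17.0,   7.0,  95.0],   # B
])
colorimeter_XYZ = np.array([
    [93.5, 101.2, 104.9],   # W  (slightly off: filter mismatch vs CIE observer)
    [43.1,  20.4,   1.9],   # R
    [35.2,  73.5,  10.4],   # G
    [16.4,   7.3,  92.8],   # B
])

# Least-squares fit of M such that colorimeter_XYZ @ M.T ~= reference_XYZ.
M, *_ = np.linalg.lstsq(colorimeter_XYZ, reference_XYZ, rcond=None)
M = M.T
print("correction matrix:\n", np.round(M, 4))

# Later readings from the same colorimeter on the same display are corrected as:
corrected = colorimeter_XYZ @ M.T
print("residual after correction:", np.abs(corrected - reference_XYZ).max())
```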


    #32847

    mat a
    Participant

    Thank you



