LG 31MU97 strange calibration…


Viewing 15 posts - 16 through 30 (of 90 total)
  • #28394

    Алексей Коробов
    Participant
    • Offline

Look at the attachment. Generally it is better to set the monitor’s native gamma. The monitor menu may confuse you with “level 3” or “gamma 1”, but the default mode is actually gamma 2.2 in most cases. Sometimes you need another setting in the monitor’s menu; check its manual. I think that using DisplayCAL together with True Color Pro isn’t a good idea. Hardware calibration (i.e. using the monitor’s internal LUT) is primarily targeted at emulating standard profiles (sRGB, AdobeRGB, Rec.709 etc.), so non-color-managed software will show standard images/video correctly. It should also produce clean color gradients, while external calibration usually makes them dirty. But I’ve met bad gradients after TCP calibration on a 27UL850-W, so check this on your display.
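A quick way to see why the menu labels matter: a pure 2.2 power curve and the piecewise sRGB transfer function are similar but not identical, diverging most in the shadows. A small illustrative sketch (values are not from any monitor’s firmware):

```python
# Compare a pure 2.2 power curve with the piecewise sRGB transfer
# function. They track each other closely in the mid-tones but
# diverge in the shadows, which is why a "gamma 2.2" preset and an
# "sRGB" preset are not interchangeable.

def gamma22_encode(linear):
    """Encode a linear [0..1] value with a pure 2.2 power curve."""
    return linear ** (1 / 2.2)

def srgb_encode(linear):
    """Encode a linear [0..1] value with the piecewise sRGB curve."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

for lin in (0.001, 0.02, 0.18, 0.5):
    print(f"{lin:5.3f}  g2.2={gamma22_encode(lin):.4f}  sRGB={srgb_encode(lin):.4f}")
```

At 0.001 linear the two encodings differ by more than a factor of three, so the wrong preset visibly shifts shadow tones.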

    Attachments:
    #28398

    Vincent
    Participant
    • Offline

I think that using DisplayCAL together with True Color Pro isn’t a good idea.

If TCP fails because it cannot measure the backlight properly on wide gamuts (and that is a FACT), or because it fails to correct grey due to a severely color-tinted uncalibrated base state and a low number of grey calibration patches (not profiling, calibration), or it fails to achieve the white it aimed for… then mixing them is the only choice:

    -to correct white
    -to correct grey color issues

Hardware calibration (i.e. using the monitor’s internal LUT) is primarily targeted at emulating standard profiles (sRGB, AdobeRGB, Rec.709 etc.), so non-color-managed software will show standard images/video correctly.

No, that is false. It holds only if the image/video content colorspace matches the calibration target colorspace and that calibration is accurate.
Calibrate to sRGB (sRGB gamut, 2.2 gamma) and all non-color-managed software showing content not encoded in the sRGB colorspace will look off.

It should also produce clean color gradients, while external calibration usually makes them dirty. But I’ve met bad gradients after TCP calibration on a 27UL850-W, so check this on your display.

Yes, calibration in the GPU causes banding with iGPUs and nvidias (unless other tricks are applied). Other GPU vendors don’t have that problem (AMD), or you may have luck with the nvidia dithering trick.

But that is the main reason to test TCP results numerically and see how far off they are (white & grey color, across the range) before trying to correct them in a general-purpose desktop profile (with GPU calibration). For Resolve-only use this “GPU calibration” step is not needed, since the LUT3D can try to solve everything at once.
That test is still missing, so we do not know whether he needed to correct grey & white for general-purpose use via the GPU.

If the 27UL850-W is sRGB-like, TCP should have no measurement issues, since it has a WLED colorimeter correction. It should work fine.
The problem is with wide gamuts: no WLED PFS correction for P3 displays, no GB-LED correction for Niklas’ monitor. All are “corrected” by the RGB-LED correction meant for the very old HP DreamColor. Depending on colorimeter firmware, and on how close the manufacturer’s i1d3 is to the standard observer, you may end up with a slightly pink or slightly green whitepoint. If after HW calibration the gains are locked and you want the proper white in a desktop calibration, for all apps, not just Resolve, then GPU calibration on top of the TCP HW cal is the only choice. On an AMD this results in smooth gradients without banding.

    #28402

    Алексей Коробов
    Participant
    • Offline

I think that using DisplayCAL together with True Color Pro isn’t a good idea.

If TCP fails because it cannot measure the backlight properly on wide gamuts (and that is a FACT), or because it fails to correct grey due to a severely color-tinted uncalibrated base state and a low number of grey calibration patches (not profiling, calibration), or it fails to achieve the white it aimed for… then mixing them is the only choice

But why don’t we use GPU calibration and profiling alone here? You can always set the display to a user preset or some standard space. Some displays (BenQ SW240) offer a color space choice without a native gamut choice (some displays support Rec.2020, which actually means native gamut with a specific gamma). DisplayCAL does a perfect job here for the typical AdobeRGB preset; I’ve done this for the SW240.

Vincent, could you say why it is important to make the colorimeter correction for the display’s native color gamut? Does this apply to both matrix and spectral corrections? You can always compute XYZ weights from a spectral correction for any RGB proportion, am I right?

Hardware calibration (i.e. using the monitor’s internal LUT) is primarily targeted at emulating standard profiles (sRGB, AdobeRGB, Rec.709 etc.), so non-color-managed software will show standard images/video correctly.

No, that is false. It holds only if the image/video content colorspace matches the calibration target colorspace and that calibration is accurate.

Right, this is one of the ways used in commercial workflows. A good one for talented artists who are not computer-savvy.

It should also produce clean color gradients, while external calibration usually makes them dirty. But I’ve met bad gradients after TCP calibration on a 27UL850-W, so check this on your display.

Yes, calibration in the GPU causes banding with iGPUs and nvidias (unless other tricks are applied). Other GPU vendors don’t have that problem (AMD), or you may have luck with the nvidia dithering trick.

I should test this issue with nVidia, but generally it’s a mistake. Color is corrected in 2 general steps: calibration (better to say linearization) in the video card’s 1D LUT (a final GPU stage for every signal output), and the second part of the ICC, which can take different forms (TRC, 3D LUT…) and is applied by the CMS for color-managed applications. I always check a BW gradient in PS, and I see that the calibration and the 2nd part contribute differently on different displays. Sometimes tints come from the CMS only. You can use RGB Monitor soft proofing in PS to switch off the CMS. I usually encounter Intel integrated graphics, and my PC has integrated AMD Vega 8 graphics, but few PCs produce clean gradients with ICC profiles, at least with 8 bit/ch. output.
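The first correction stage described here, the video card’s per-channel 1D LUT (the “vcgt” curve an ICC profile can carry), can be sketched like this; the gamma values are illustrative, not from a real profile:

```python
# Sketch of a video-card 1D calibration LUT: a per-channel curve
# that bends the display's native response toward the calibration
# target. Gamma values here are made up for illustration.

def make_calibration_lut(size, native_gamma, target_gamma):
    """1D LUT that bends a native gamma response to a target gamma."""
    exponent = target_gamma / native_gamma
    return [(i / (size - 1)) ** exponent for i in range(size)]

def apply_lut(value, lut):
    """Apply a 1D LUT to a [0..1] value with linear interpolation."""
    pos = value * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# Bend an assumed native ~2.4 display response toward a 2.2 target:
lut = make_calibration_lut(256, native_gamma=2.4, target_gamma=2.2)
v = 0.5
corrected = apply_lut(v, lut)
simulated_display = corrected ** 2.4   # what the panel then does
print(simulated_display, v ** 2.2)     # effective response ~ gamma 2.2
```

The second ICC stage (TRC or 3D LUT applied by the CMS) sits on top of this, which is why the two stages can contribute differently to visible tints.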

    #28416

    Vincent
    Participant
    • Offline

I think that using DisplayCAL together with True Color Pro isn’t a good idea.

If TCP fails because it cannot measure the backlight properly on wide gamuts (and that is a FACT), or because it fails to correct grey due to a severely color-tinted uncalibrated base state and a low number of grey calibration patches (not profiling, calibration), or it fails to achieve the white it aimed for… then mixing them is the only choice

But why don’t we use GPU calibration and profiling alone here? You can always set the display to a user preset or some standard space. Some displays (BenQ SW240) offer a color space choice without a native gamut choice (some displays support Rec.2020, which actually means native gamut with a specific gamma). DisplayCAL does a perfect job here for the typical AdobeRGB preset; I’ve done this for the SW240.

If the factory gamut emulation is accurate, yes. But you have to test first.
Same for whitepoint: maybe the factory white is far off while TCP is only mildly off (so you can live with it, or correct it further).

Test first, then make a choice based on data. The approach you describe may be the best for your display, once you have tested.

Vincent, could you say why it is important to make the colorimeter correction for the display’s native color gamut? Does this apply to both matrix and spectral corrections? You can always compute XYZ weights from a spectral correction for any RGB proportion, am I right?

Yes, all colors should be a linear combination of (or close to) the native gamut spectral power distributions of R, G & B (excluding WOLED), so the correction must be measured at native gamut.
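This linearity argument can be checked with toy numbers: if a color’s spectrum is a weighted sum of the native R, G and B spectra, its XYZ is the same weighted sum of the primaries’ XYZ, so a correction measured at native gamut covers any mix of those primaries. All spectra below are made up:

```python
# Toy check of the additivity argument: made-up primary spectra and
# color matching functions on a coarse 5-band grid (not the real
# CIE observer).

CMF = {  # toy color matching functions
    'x': [0.2, 0.1, 0.3, 0.8, 0.3],
    'y': [0.0, 0.3, 0.9, 0.6, 0.1],
    'z': [1.0, 0.5, 0.1, 0.0, 0.0],
}
SPD = {  # toy primary spectral power distributions
    'R': [0.0, 0.0, 0.1, 0.9, 0.4],
    'G': [0.1, 0.5, 0.8, 0.2, 0.0],
    'B': [0.9, 0.6, 0.1, 0.0, 0.0],
}

def xyz_of_spd(spd):
    """Integrate a spectrum against the (toy) observer."""
    return tuple(sum(c * s for c, s in zip(CMF[ch], spd)) for ch in 'xyz')

def mix_spd(r, g, b):
    """Spectrum emitted when driving the primaries at (r, g, b)."""
    return [r * pr + g * pg + b * pb
            for pr, pg, pb in zip(SPD['R'], SPD['G'], SPD['B'])]

rgb = (0.7, 0.4, 0.2)
direct = xyz_of_spd(mix_spd(*rgb))             # measure the mixed spectrum
per_primary = [xyz_of_spd(SPD[p]) for p in 'RGB']
combined = tuple(sum(w * prim[i] for w, prim in zip(rgb, per_primary))
                 for i in range(3))            # weighted sum of primary XYZ
print(direct)
print(combined)                                # identical (up to float error)
```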

Some other user’s sRGB configuration could differ from yours, because it may not be exactly sRGB or your factory sRGB preset may be off.

     

It should also produce clean color gradients, while external calibration usually makes them dirty. But I’ve met bad gradients after TCP calibration on a 27UL850-W, so check this on your display.

Yes, calibration in the GPU causes banding with iGPUs and nvidias (unless other tricks are applied). Other GPU vendors don’t have that problem (AMD), or you may have luck with the nvidia dithering trick.

I should test this issue with nVidia, but generally it’s a mistake. Color is corrected in 2 general steps: calibration (better to say linearization) in the video card’s 1D LUT (a final GPU stage for every signal output), and the second part of the ICC, which can take different forms (TRC, 3D LUT…) and is applied by the CMS for color-managed applications. I always check a BW gradient in PS, and I see that the calibration and the 2nd part contribute differently on different displays. Sometimes tints come from the CMS only. You can use RGB Monitor soft proofing in PS to switch off the CMS. I usually encounter Intel integrated graphics, and my PC has integrated AMD Vega 8 graphics, but few PCs produce clean gradients with ICC profiles, at least with 8 bit/ch. output.

Calibration (the grey-ramp calibration) needs to be tested without color management: MS Paint, or Photoshop set to not color manage.

Banding induced by color management at limited precision is solved by the app if it uses dithering (ACR, LR, C1). If the Paint gradient is clean & smooth (calibration = no banding) and you open an 8-bit smooth sRGB gradient in PS and it shows banding with “colors”, open it in ACR within Photoshop; it should be gone in that view. It’s Photoshop’s fault. You can also work around it in PS using its 10-bit OpenGL output to the driver (the driver may try to send 10-bit data to the display, or dither silently).
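Why dithering hides banding can be shown with a toy quantizer: rounding a smooth ramp to few levels produces flat steps, while adding half a step of noise before rounding keeps the average output tracking the input. Illustrative only; real apps dither at 8 bits, not 17 levels:

```python
import random

# Quantize a smooth gradient with and without a simple uniform
# dither. Without dither the output collapses to a handful of flat
# bands; with dither the same few codes average out to the original
# value, so no visible steps.

def quantize(values, levels):
    """Round each value in [0..1] to the nearest of `levels` steps."""
    return [round(v * (levels - 1)) / (levels - 1) for v in values]

def quantize_dithered(values, levels, rng):
    """Add +-half-step uniform noise before rounding (simple dither)."""
    step = 1.0 / (levels - 1)
    out = []
    for v in values:
        noisy = min(max(v + rng.uniform(-step / 2, step / 2), 0.0), 1.0)
        out.append(round(noisy * (levels - 1)) / (levels - 1))
    return out

ramp = [i / 999 for i in range(1000)]          # smooth input gradient
plain = quantize(ramp, 17)                     # 17 flat bands
dith = quantize_dithered(ramp, 17, random.Random(0))

print(len(set(plain)), "distinct output codes")        # 17
print(sum(dith) / len(dith) - sum(ramp) / len(ramp))   # mean error ~ 0
```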

Choosing single curve + matrix profiles makes this issue less visible in apps that cannot use dithering, like Ai/In. If your display behaves OK and you are worried about this, use that kind of profile.

    #28445

    Googloiss
    Participant
    • Offline

Hi all! I followed Vincent’s posts when I tried to handle calibration with my equipment two years ago (LG 31MU97-z, i1 Display Pro, Windows 10, nvidia 1070, DeckLink Mini Monitor 4K). I found no one to walk me through it, so I went simple with only the hardware calibration (which is now called LG Calibration Studio). But now (also in view of an important project’s color grading, for which I’ll use DaVinci for the first time) I would like to have the situation under control, with the best quality and accuracy I can get out of my equipment.

So basically, I think I understand what to do, but let me recap (also in relation to what I want to achieve). First of all I need 3 calibrations for 3 different scenarios:

    1. when I want to color grade in premiere (my primary editing software), for small and fast projects
    2. when I want to color grade in davinci, for larger projects
    3. when I want to step-up the game in 10bit’s field using my blackmagic decklink mini monitor 4K

Among the three, with the last one I’m not sure how the DeckLink fits into the conversation, because it bypasses Windows (and thus also DisplayCAL), so when I switch to it only the hardware calibration is in effect… If I calibrate with the LG software using native gamut, how can I get Rec.709 or sRGB through the DeckLink? I don’t know if there’s a way to solve that from within DaVinci.

    As for the other two scenarios, I have to:

1. Calibrate with the LG software using native gamut, D65, 100 cd/m², gamma 2.2 – why gamma 2.2 and not 2.4?
2. Calibrate with DisplayCAL for Windows using the GB-r-LED/RG Phosphor correction, sRGB gamut (I currently work for the web), D65, 100 cd/m² – Some doubts: should I check white and black level drift compensation? How many patches should I set? Where is the gamma setting and which one should I set? I’ve always seen that the sRGB and Rec.709 gamuts have the same RGB coordinate values, so why choose one over the other? What changes would I see if I put a Rec.709 video online? I know Rec.709 is for television work, but what if I have to release a work for both TV and web? Which gamut “wins” over the other in that case?
3. Calibrate with DisplayCAL for DaVinci using the same settings as above (?)

    There’s a lot of things I have to clear in my mind, I know! Thank you!



    #28447

    Vincent
    Participant
    • Offline

Among the three, with the last one I’m not sure how the DeckLink fits into the conversation, because it bypasses Windows (and thus also DisplayCAL), so when I switch to it only the hardware calibration is in effect… If I calibrate with the LG software using native gamut, how can I get Rec.709 or sRGB through the DeckLink? I don’t know if there’s a way to solve that from within DaVinci.

The DisplayCAL FAQ for Resolve covers that.

    As for the other two scenarios, I have to:

1. Calibrate with the LG software using native gamut, D65, 100 cd/m², gamma 2.2 – why gamma 2.2 and not 2.4?
2. Calibrate with DisplayCAL for Windows using the GB-r-LED/RG Phosphor correction, sRGB gamut (I currently work for the web), D65, 100 cd/m² – Some doubts: should I check white and black level drift compensation? How many patches should I set? Where is the gamma setting and which one should I set? I’ve always seen that the sRGB and Rec.709 gamuts have the same RGB coordinate values, so why choose one over the other? What changes would I see if I put a Rec.709 video online? I know Rec.709 is for television work, but what if I have to release a work for both TV and web? Which gamut “wins” over the other in that case?
3. Calibrate with DisplayCAL for DaVinci using the same settings as above (?)

    There’s a lot of things I have to clear in my mind, I know! Thank you!

Either HW cal or a DisplayCAL native-gamut calibration, plus its ICC.

    -Premiere 2020, color managed, reads OS display ICC

    -Resolve, LUT3D from X colorspace (rec709 g2.4) to display colorspace (ICC)

If using macOS, you’ll need a simple profile for the OS and usually a more detailed one (XYZ LUT) for making the Resolve LUT3D; both will share the same GPU calibration (which includes the “no GPU calibration” case if using TCP or the new LG HW cal software).
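The Resolve LUT3D step can be sketched structurally. This hypothetical example reduces the gamut conversion to identity and handles only a tone-response difference (Rec.709 gamma 2.4 source to an assumed 2.2-calibrated display); a real DisplayCAL LUT3D also carries the gamut mapping from the profile:

```python
# Structural sketch of building a 3D LUT: for every grid node,
# decode the source encoding, convert colorspaces in linear light,
# and re-encode with the display's response. Here the colorspace
# conversion is identity and the display is assumed to be a pure
# 2.2 power curve -- hypothetical simplifications.

SIZE = 17  # common LUT3D grid size (real LUTs are often 33 or 65)

def transform(r, g, b):
    lin = [c ** 2.4 for c in (r, g, b)]       # decode Rec.709 gamma 2.4
    # (gamut conversion matrix would be applied here, in linear light)
    return [c ** (1 / 2.2) for c in lin]      # re-encode for the display

lut3d = []
for bi in range(SIZE):
    for gi in range(SIZE):
        for ri in range(SIZE):
            r, g, b = (i / (SIZE - 1) for i in (ri, gi, bi))
            lut3d.append(transform(r, g, b))

print(len(lut3d))  # 17**3 = 4913 nodes
```

Resolve then interpolates between these nodes for every pixel of the viewer/clean feed, which is why a single LUT3D can replace the desktop profile for that output.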

    #28450

    Googloiss
    Participant
    • Offline

Ok, for the DeckLink side I’ll check the DisplayCAL FAQ, but for the second part I don’t understand what you’re suggesting I do.

    #28452

    Vincent
    Participant
    • Offline

The same. One HW cal at native gamut, or one native-gamut GPU calibration from DisplayCAL (depending on your choice), fits all your needs.

    Premiere is color managed.

    For Resolve make LUT3D.

    #28454

    Googloiss
    Participant
    • Offline

Ok ok, but after the HW cal at native gamut I have to do the finishing sRGB DisplayCAL calibration, right? That’s where I’d use the GB-r-LED correction you mentioned.

    #28456

    Vincent
    Participant
    • Offline

    No, I did not say that. Read what I wrote, it’s tiresome to repeat.

At the end of the process I described you end up with a profile and a native gamut calibration. Depending on your choices it can be: HW cal alone (+ICC), GPU cal alone (and the ICC that stores it), or a GPU calibration correcting the LG software’s results if needed (and the ICC that stores it).

    Once you have that, read my previous post.

    #28458

    Googloiss
    Participant
    • Offline

I’m sorry to bother you; I’m Italian, and I’m trying to understand the whole process step by step, but I still can’t. Now I’m more confused than before, honestly.

    Niklas, what did you do then with your LG monitor?

    #28461

    Алексей Коробов
    Participant
    • Offline

I’m trying to understand the whole process step by step, but I still can’t. Now I’m more confused than before, honestly.

A color management tutorial can probably answer all of your questions. )) You can call me on Skype tomorrow evening (see attachment), but not now; it’s 1:30 AM in the Urals.

    #28517

    Niklas Ladberg
    Participant
    • Offline

Hi everyone! I get better and better results, but this seems very hard to know for sure 🙂 (I work with filming, editing and grading of commercials and corporate films, but when you read these threads you feel like a beginner regarding calibration, though I have tried for years on and off.)

Short question: is there a good way to correct the white balance setting by eye (please send a link to a step-by-step guide if possible)?

The long question/report of my mission: I managed to get my best result yet, with almost 100% sRGB and Adobe RGB coverage on this LG monitor.

The way I did it (trying to follow Vincent’s recommendations to the best of my ability): True Color Pro calibration with native gamut, gamma 2.2, 6500K.

Then DisplayCAL (slowest/best calibration) set to aim for 6500K, 2.2 gamma, sRGB (since I saw a calibration tutorial that recommended that in this step, because it is easier to get the colors right without making Adobe RGB worse; and yes, I got better results than before and a higher score on Adobe RGB). The colors look pretty good. If I compare to out-of-the-box Retina displays and an iPhone 11 Pro (with a tweaked white balance setting; the iPhone is a bit warm at the standard setting), then the LG screen is a bit warmer and a bit more saturated.

In DaVinci Resolve, if I change the white balance by -210K, the white is whiter than the slightly warmer image I get after all calibrations (I have 6500K full-spectrum lights in the room). I also tried to make a LUT3D for Resolve with source Rec.709 colorspace and 2.4 gamma in DisplayCAL. I can not add that LUT to the clean feed monitor (I do not have a Blackmagic display card), but I did try to add it as a timeline LUT. It changes the calibration quite a lot, which is strange, but the report (attached) gives pretty much the same great results for sRGB and Adobe RGB. I again compared to uncalibrated Retina displays and the iPhone 11 Pro, and the timeline LUT seems to go too far (but in the right direction), yet it does not make up for the lack of a perfect white. But at about 45% of the LUT plus the -210K change, the colours are pretty close to the iPhone. Still very hard to really KNOW what is right, but to my knowledge the iPhone does have great colour reproduction; not that it’s perfect, but still, white should be white. 🙂 So still confused. Easy-to-understand input is always appreciated!

    Attachments:
    #28522

    Vincent
    Participant
    • Offline

MacBooks out of the box are not D65; they are more likely to be 6700-6800K and <3 dE from the daylight curve of whites. They look white, but cool.

If you want a match to a Retina display, measure a MacBook P3 display with your i1d3 and the WLED PFS correction for Macs. Then move to your TCP HW calibration and in DisplayCAL set as white target:
    -measured white for Retina
    or
    -Closest daylight white for Retina
    or
    -Visual whitepoint editor

Then, when making the LUT3D, choose “Relative Colorimetric”.

But ONLY IF you want to match a Retina display. I would say match D65; don’t care about Retina’s white.
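For reference, the “closest daylight white” option relates to the CIE daylight locus, which gives a chromaticity for each correlated color temperature via a published polynomial (note that D65 corresponds to roughly 6504 K, not exactly 6500 K):

```python
# CIE daylight locus: chromaticity (x, y) as a function of CCT,
# using the standard published polynomial approximation.

def daylight_xy(cct):
    """Daylight-locus chromaticity for 4000 K <= CCT <= 25000 K."""
    t = float(cct)
    if 4000 <= t <= 7000:
        x = (-4.6070e9 / t**3 + 2.9678e6 / t**2
             + 0.09911e3 / t + 0.244063)
    elif 7000 < t <= 25000:
        x = (-2.0064e9 / t**3 + 1.9018e6 / t**2
             + 0.24748e3 / t + 0.237040)
    else:
        raise ValueError("CCT outside daylight-locus range")
    y = -3.000 * x * x + 2.870 * x - 0.275
    return x, y

print(daylight_xy(6504))  # close to D65 (0.3127, 0.3290)
```

Projecting a measured white onto this curve is what “closest daylight white” means in practice: you keep the warm/cool position but drop any pink/green offset.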

    #28527

    Niklas Ladberg
    Participant
    • Offline

Thank you Vincent, that is great info! I do not want to match Apple Retina; I want to match true colours and white balance, so I assume I should trust the measurement (and just see Apple’s image as one of the many looks my clients may experience). Does the measurement report (attached to my previous post) seem good to you?

I guess, based on your previous comment, that a correctly calibrated D65 Adobe RGB monitor should (have a slightly warmer white and) be a bit more vivid/saturated than an out-of-the-box Apple Retina display?



Display Calibration and Characterization powered by ArgyllCMS