Various Monitor Questions


Viewing 9 posts - 1 through 9 (of 9 total)
  • #37116

    JensR
    Participant
    • Offline

    Hi,

    I have to replace my HP LP2475w and am considering an EIZO CS2731 as replacement. For the past couple years I have used DisplayCal in combination with Spyder/ColorMunki to get my colors (sort of) right. I’m doing a lot more photo retouching & grading these days and am therefore considering spending the extra cash on an Eizo. However even after reading through much of this forum, I still have a few questions left — if someone could help me out figuring this out, it would be much appreciated:

    1. Can I run the Eizo in 10-bit on my Radeon XT 5500 (or does this require workstation GPU/drivers or NVIDIA card)?
    2. I’m mainly running Photoshop, Capture One, Resolve & Foundry Modo (all on Windows).  As far as I understand only Photoshop supports 10-Bit displays, will the other application benefit from a 10-bit display?
    3. How much of a (practical) need is there to run DisplayCal on top/after hardware calibration?
    4. Does the cable connection (HDMI/Displayport/etc) matter for image quality?

    thanks, Jens

    #37122

    Vincent
    Participant
    • Offline

    Hi,

    I have to replace my HP LP2475w and am considering an EIZO CS2731 as replacement. For the past couple years I have used DisplayCal in combination with Spyder/ColorMunki to get my colors (sort of) right. I’m doing a lot more photo retouching & grading these days and am therefore considering spending the extra cash on an Eizo. However even after reading through much of this forum, I still have a few questions left — if someone could help me out figuring this out, it would be much appreciated:

    1. Can I run the Eizo in 10-bit on my Radeon XT 5500 (or does this require workstation GPU/drivers or NVIDIA card)?

    If you mean using the OpenGL 10-bit hook in PS and other tools, you’ll need a pro GPU model, a current gamer NVIDIA model (maybe with “Studio” drivers) or, AFAIK, a current AMD gamer model too.

    If you mean the RDNA XT 5500 itself, it should be “yes”, but you’ll have to test. Check the AMD control panel for the 10-bit setting (Google it).

    1. I’m mainly running Photoshop, Capture One, Resolve & Foundry Modo (all on Windows).  As far as I understand only Photoshop supports 10-Bit displays, will the other application benefit from a 10-bit display?

    Capture One does not need 10-bit; it uses dithered output. 10-bit is not used in PS to get 1024 tones of grey, it’s there to avoid the brute truncation in PS’s output (poor implementation). Capture One, Lightroom and Adobe ACR use dithered output, so they do not need 10-bit: pristine 16-bit gradients even over an 8-bit DVI connection.

    IDK if Resolve uses dithered output or not.

    Adobe Illustrator and InDesign do not use 10-bit.

    Also, a “10-bit display” just means “I can accept a 10-bit signal, then translate it to whatever bit depth my panel has”.

    1. How much of a (practical) need is there to run DisplayCal on top/after hardware calibration?

    a) Verify the Color Navigator calibration. Remember that you need to “hack it” with an i1d3 colorimeter, because Eizo has not bundled the proper spectral correction.
    Also, that model uses a WLED PFS backlight, so a ColorMunki Photo won’t measure it properly at 10nm resolution with Color Navigator, and a Spyder is an inaccurate device… so add an i1 Display Pro to your cart.
    A proper correction for CN and the CS2731 can be forged by you (see Midas’ post about the CS2731) or downloaded from Lift Gamma Gain in a thread with the same title.

    b) With other monitors (not the CS2731), if HW calibration fails to make grey neutral (Dell, BenQ… etc.), you can use DisplayCAL on top to fix it.

    c) Create a detailed profile to make a LUT3D for Resolve and such.

    So using an Eizo ColorEdge you can skip b). There is no need to run DisplayCAL calibration on top of HW cal, but a) and c) are useful (and neither needs a GPU calibration loaded on top of the HW cal).

    1. Does the cable connection (HDMI/Displayport/etc) matter for image quality?

    thanks, Jens

    You need to read the CS manual regarding YCbCr modes for HDMI, if you need HDMI. For RGB 4:4:4 at 8 or 10-bit there should be no difference.
    I do not remember exactly, but 4:4:4 at 10-bit may have been exclusive to RGB mode. Full details are on the Lift Gamma Gain forums. Maybe I am wrong, I do not remember the details.

    • This reply was modified 4 days, 14 hours ago by Vincent.


    #37125

    JensR
    Participant
    • Offline

    Hi Vincent,

    thanks a lot for the detailed explanation!!! 🙂

    1. Can I run the Eizo in 10-bit on my Radeon XT 5500 (or does this require workstation GPU/drivers or NVIDIA card)?

    If you mean using the OpenGL 10-bit hook in PS and other tools, you’ll need a pro GPU model, a current gamer NVIDIA model (maybe with “Studio” drivers) or, AFAIK, a current AMD gamer model too.

    If you mean the RDNA XT 5500 itself, it should be “yes”, but you’ll have to test. Check the AMD control panel for the 10-bit setting (Google it).

    Excuse my sloppiness: the card is properly called RX 5500 XT. The specs say it supports 10-bit and HDR, and it’s the RDNA 1.0 architecture.

    I have the AMD control panel switch for 10-bit displays visible, but not sure what that gets me. I googled this before and every source I found had conflicting info, some saying 10-bit was only available in DirectX, which would be no use in Photoshop. How do I check for sure? Do I need to call AMD tech support?

    I don’t think I have proper understanding of what 10-bit capability means, but from what I gleaned it would eliminate banding visible in 16-bit files (in Adobe PS), which I encounter quite often and find incredibly annoying. Also I am always worried the banding is in the file, not just visible in my display, so it would be a benefit being able to tell my clients the problem is with their monitor, not the file I delivered.

    Capture One does not need 10-bit; it uses dithered output. 10-bit is not used in PS to get 1024 tones of grey, it’s there to avoid the brute truncation in PS’s output (poor implementation). Capture One, Lightroom and Adobe ACR use dithered output, so they do not need 10-bit: pristine 16-bit gradients even over an 8-bit DVI connection.

    If dithering gives the same (or comparable) result in applications like C1, then I don’t see the benefit of having a 10-bit display (at least for my use case).

    a) Verify the Color Navigator calibration. Remember that you need to “hack it” with an i1d3 colorimeter, because Eizo has not bundled the proper spectral correction.
    Also, that model uses a WLED PFS backlight, so a ColorMunki Photo won’t measure it properly at 10nm resolution with Color Navigator, and a Spyder is an inaccurate device… so add an i1 Display Pro to your cart.

    Oh boy, looks like I need to read up on that. Hoped the “hacky” stuff could be avoided. I do have a Colormunki DISPLAY though, not Photo — I thought this is the same hardware as i1d3? Why do you recommend the i1d3? Is the EX4 no good? A bit strange Eizo recommends them for use with their screens.

    c) Create a detailed profile to make a LUT3D for Resolve and such.

    Didn’t manage to wrap my head around color management in Resolve yet — but I will look it up. Thanks again for the detailed answers — that furthered my understanding of the issue quite a bit!


    #37126

    Vincent
    Participant
    • Offline

    Hi Vincent,

    thanks a lot for the detailed explanation!!! 🙂

    1. Can I run the Eizo in 10-bit on my Radeon XT 5500 (or does this require workstation GPU/drivers or NVIDIA card)?

    If you mean using the OpenGL 10-bit hook in PS and other tools, you’ll need a pro GPU model, a current gamer NVIDIA model (maybe with “Studio” drivers) or, AFAIK, a current AMD gamer model too.

    If you mean the RDNA XT 5500 itself, it should be “yes”, but you’ll have to test. Check the AMD control panel for the 10-bit setting (Google it).

    Excuse my sloppiness: the card is properly called RX 5500 XT. The specs say it supports 10-bit and HDR, and it’s the RDNA 1.0 architecture.

    I have the AMD control panel switch for 10-bit displays visible, but not sure what that gets me. I googled this before and every source I found had conflicting info, some saying 10-bit was only available in DirectX, which would be no use in Photoshop. How do I check for sure? Do I need to call AMD tech support?

    I don’t think I have proper understanding of what 10-bit capability means, but from what I gleaned it would eliminate banding visible in 16-bit files (in Adobe PS), which I encounter quite often and find incredibly annoying. Also I am always worried the banding is in the file, not just visible in my display, so it would be a benefit being able to tell my clients the problem is with their monitor, not the file I delivered.

    The banding appears at the “XXX” step of this pipeline:
    File content in colorspace A -> color management to colorspace B (= display) -> XXX Photoshop to GPU driver XXX -> driver -> GPU LUT -> output (+dither) -> cable -> display input -> display calibration HW (+dither) -> panel (+dither)

    You do not need 10-bit end to end. What you need is for Photoshop to send the color-managed image to the driver without truncation. That’s where the GPU driver’s OpenGL “hook” with 10-bit surface drawing enters the scene. If that truncation is avoided by a “bigger hook” of 10 bits per channel, the next steps in the pipeline could dither down to 8-bit over a DVI cable and a “16-bit test ramp.psd” would render smoothly.

    OTOH, Capture One, the LR Develop module and Adobe ACR dither at the XXX step themselves, so no 10-bit driver is needed.

    So:
    – To test that the 10-bit OpenGL driver is working: select 10-bit in the AMD control panel, go to PS Preferences > Performance > Advanced, enable 30-bit and restart PS. Open a 10-bit test ramp (you can find one via Google) and view it at 100% zoom.

    – To test that no 10-bit is actually needed: with the above test failing (for example because 30-bit is disabled), so there are bands, go to Filter and open “Adobe Camera Raw”. Do you see the bands? No (unless there is something weird in your system). That’s dithering.
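The truncation-vs-dithering difference Vincent describes can be sketched numerically. This is only an illustration (plain NumPy with random noise dither and a made-up gradient range, not Photoshop’s actual pipeline): truncating a smooth ramp to 8-bit collapses it into a staircase of flat bands, while adding sub-LSB noise before quantizing keeps the local average smooth.

```python
import numpy as np

# A smooth high-precision gradient over a narrow tonal range --
# narrow ranges are exactly where 8-bit truncation shows bands.
ramp = np.linspace(0.40, 0.45, 4096)

# Brute truncation to 8-bit: many input values collapse onto the
# same output code, producing a staircase of visible bands.
truncated = np.floor(ramp * 255).astype(np.uint8)

# Dithered quantization: add sub-LSB random noise before rounding.
# Each pixel is still 8-bit, but the local average tracks the
# original smooth ramp instead of the staircase.
rng = np.random.default_rng(0)
dithered = np.floor(ramp * 255 + rng.random(ramp.size)).astype(np.uint8)

print("distinct 8-bit codes, truncated:", np.unique(truncated).size)
print("distinct 8-bit codes, dithered: ", np.unique(dithered).size)

# Compare the local mean of each quantized ramp to the original:
# dithering preserves the smooth ramp far better than truncation.
kernel = np.ones(64) / 64
smooth = np.convolve(ramp, kernel, "valid")
err_trunc = np.abs(np.convolve(truncated / 255, kernel, "valid") - smooth).max()
err_dith = np.abs(np.convolve(dithered / 255, kernel, "valid") - smooth).max()
print(f"max local-mean error, truncated: {err_trunc:.5f}")
print(f"max local-mean error, dithered:  {err_dith:.5f}")
```

The local-mean error of the dithered version comes out far smaller, which is why a dithered 8-bit connection can render a 16-bit gradient without visible bands.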

    Capture One does not need 10-bit; it uses dithered output. 10-bit is not used in PS to get 1024 tones of grey, it’s there to avoid the brute truncation in PS’s output (poor implementation). Capture One, Lightroom and Adobe ACR use dithered output, so they do not need 10-bit: pristine 16-bit gradients even over an 8-bit DVI connection.

    If dithering gives the same (or comparable) result in applications like C1, then I don’t see the benefit of having a 10-bit display (at least for my use case).

    But all of them are 10-bit anyway XDDD
    I mean: AdobeRGB+P3 gamut, HW cal, very good color uniformity, reliability… all of these displays include 10-bit too as part of their feature set.

    a) Verify the Color Navigator calibration. Remember that you need to “hack it” with an i1d3 colorimeter, because Eizo has not bundled the proper spectral correction.
    Also, that model uses a WLED PFS backlight, so a ColorMunki Photo won’t measure it properly at 10nm resolution with Color Navigator, and a Spyder is an inaccurate device… so add an i1 Display Pro to your cart.

    Oh boy, looks like I need to read up on that. Hoped the “hacky” stuff could be avoided. I do have a Colormunki DISPLAY though, not Photo — I thought this is the same hardware as i1d3?

    CN and other software using the X-Rite SDK are locked so that they do not offer HW calibration with the ColorMunki Display / i1Display Studio. You need the Pro… that’s part of X-Rite’s business model.
    LeDoge wrote a hack to fake an i1d3 Pro, but I have not tested it:

    https://github.com/ledoge/i1d3_hook

    Why do you recommend the i1d3?

    Because it is the only choice among sub-$1000 colorimeters: fast, reliable, and able to be corrected in a distributed way (EDR/CCSS colorimeter corrections).

    Is the EX4 no good? A bit strange Eizo recommends them for use with their screens.

    It’s a rebranded SpyderX. Eizo made an agreement to sell them, but nobody cares, since they are inaccurate (Spyder4/5) and unable to be corrected in a distributed way (4/5/X).

    All colorimeters can be corrected by a custom matrix made with a true reference device (like a high-end spectrophotometer), which tells the colorimeter: “these are the true RGBW coordinates of this display, make yourself measure that” (a matrix). This is an oversimplification, but I hope you get it.

    But only the i1d3 family can use firmware data (the spectral sensitivities of its filters, recorded at the factory) to be corrected in a distributed way, using a spectral power distribution of the display tech/type (stored in EDR or CCSS files).
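As a sketch of the matrix-correction idea above (illustrative only: the XYZ numbers are made up, and real tools such as DisplayCAL fit the matrix from actual measurements), you fit a 3x3 matrix that maps the colorimeter’s readings of R, G, B and W patches onto the reference spectrophotometer’s readings, then apply it to every later reading:

```python
import numpy as np

# Hypothetical XYZ readings of red, green, blue and white patches
# on one display (rows = patches, columns = X, Y, Z).
# "meter" is what the colorimeter reports; "reference" is what a
# high-end spectrophotometer measured on the same patches.
meter = np.array([
    [41.2,  21.3,   1.9],   # red
    [35.8,  71.5,  11.2],   # green
    [18.9,   8.1,  95.4],   # blue
    [95.9, 100.9, 108.5],   # white
])
reference = np.array([
    [41.9,  21.8,   2.0],
    [35.2,  70.9,  11.6],
    [18.4,   7.6,  94.8],
    [95.5, 100.3, 108.4],
])

# Fit the 3x3 correction matrix M so that meter @ M ~= reference
# (least squares: 4 patches, 3 unknowns per XYZ column).
M, *_ = np.linalg.lstsq(meter, reference, rcond=None)

# Correct a fresh reading by multiplying with the fitted matrix.
raw = np.array([50.0, 52.0, 47.0])
corrected = raw @ M
print("correction matrix:\n", M)
print("corrected XYZ:", corrected)
```

The EDR/CCSS approach Vincent mentions goes one step further: instead of shipping one fixed matrix, a correction matrix is computed per unit from the filter sensitivities stored in each i1d3’s firmware plus the display’s spectral power distribution.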

    CN has no EDR for WLED PFS; it uses a GB-LED EDR as a stand-in (as for the CS2730 or CS240), or a generic matrix that is not customized to each i1d3, hence errors in white point.
    Read Midas’ thread here about the CS2731.

    So yes, a hack is needed. That hack is just replacing the GB-LED EDR (RG_Phosphor family EDR) with a modified version that contains the WLED PFS spectral power distribution. That’s all, copy & paste a file.

    c) Create a detailed profile to make a LUT3D for Resolve and such.

    Didn’t manage to wrap my head around color management in Resolve yet — but I will look it up. Thanks again for the detailed answers — that furthered my understanding of the issue quite a bit!

    This means using DisplayCAL to measure your display again and make a profile (without GPU calibration) that describes the display’s behavior as a 3D mesh.
    With that info you can make a LUT3D to transform your display into other displays. For example: calibrate the CS2731 to full native gamut, D65, gamma 2.2; profile the display (store that 3D mesh in an ICC file) => compute a LUT3D that makes the display behave like a Rec709 gamma 2.4 display in Resolve.
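The end product of that workflow is just a 3D lookup table file Resolve can load. As a rough sketch of the format (a toy gamma-only transform; a real profile-derived LUT3D from DisplayCAL would also remap the gamut), here is a minimal .cube writer:

```python
# Minimal Resolve-compatible .cube writer. Toy transform only:
# it remaps tone response (gamma 2.4 source -> gamma 2.2 display)
# and ignores the gamut remap a real profile-derived LUT3D has.
SIZE = 17  # 17x17x17 grid, a common size for .cube files

def transform(r, g, b):
    # Decode gamma 2.4, re-encode for a gamma 2.2 display.
    return tuple(c ** (2.4 / 2.2) for c in (r, g, b))

lines = ['TITLE "rec709_g24_to_g22"', f"LUT_3D_SIZE {SIZE}"]
# .cube ordering: the red index varies fastest, then green, then blue.
for b in range(SIZE):
    for g in range(SIZE):
        for r in range(SIZE):
            rr, gg, bb = transform(r / (SIZE - 1), g / (SIZE - 1), b / (SIZE - 1))
            lines.append(f"{rr:.6f} {gg:.6f} {bb:.6f}")

with open("rec709_g24_to_g22.cube", "w") as f:
    f.write("\n".join(lines) + "\n")
print("wrote", len(lines) - 2, "LUT entries")
```

DisplayCAL generates these for you from the measured profile; the sketch is only meant to demystify what the exported file contains.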

    • This reply was modified 4 days, 11 hours ago by Vincent.
    #37129

    JensR
    Participant
    • Offline

    Hi Vincent,

    thank you again for taking the time writing such a detailed explanation.

    But all of them are 10-bit anyway XDDD
    I mean: AdobeRGB+P3 gamut, HW cal, very good color uniformity, reliability… all of these displays include 10-bit too as part of their feature set.

    Ah, sorry — I didn’t know that.

    Read Midas’ thread here about the CS2731.

    Thanks, I did do that (anyone else looking for it, it is this one: https://hub.displaycal.net/forums/topic/confusion-calibrating-cs2731/ ). Also read the corresponding LGG thread (here: https://www.liftgammagain.com/forum/index.php?threads/eizo-cs2731-calibration-issues.16585/)

    Why do you recommend the i1d3?

    Because it is the only choice among sub-$1000 colorimeters: fast, reliable, and able to be corrected in a distributed way (EDR/CCSS colorimeter corrections).

    It seems X-rite discontinued the product. I’m guessing the Calibrite ColorChecker Display Pro (CCDIS3) is what I should buy?

    So yes, a hack is needed. That hack is just replacing the GB-LED EDR (RG_Phosphor family EDR) with a modified version that contains the WLED PFS spectral power distribution. That’s all, copy & paste a file.

    Ok, so if I understand correctly I can use the EDR & CCSS for CS2731 by Stuart from the LGG thread with Calibrite colorimeter in CN7 and I will be all set. (or maybe try my luck with ColorMunki & DLL injection + EDR & CCSS)

    Thanks again!

    #37131

    EP98
    Participant
    • Offline

    It seems X-rite discontinued the product. I’m guessing the Calibrite ColorChecker Display Pro (CCDIS3) is what I should buy?

    Calibrite is the same exact meter as the i1D3, just a re-brand. Hardware is exactly the same

    #37133

    JensR
    Participant
    • Offline

    It seems X-rite discontinued the product. I’m guessing the Calibrite ColorChecker Display Pro (CCDIS3) is what I should buy?

    Calibrite is the same exact meter as the i1D3, just a re-brand. Hardware is exactly the same

    Yeah, I was able to figure that out. Still a bit confused about the models: Display, Pro and Plus. If I understand correctly the Display is not for HW calibration — I want to avoid running into the same snag as with the ColorMunki. 😬

    #37134

    Vincent
    Participant
    • Offline

     

    So yes, a hack is needed. That hack is just replacing the GB-LED EDR (RG_Phosphor family EDR) with a modified version that contains the WLED PFS spectral power distribution. That’s all, copy & paste a file.

    Ok, so if I understand correctly I can use the EDR & CCSS for CS2731 by Stuart from the LGG thread with Calibrite colorimeter in CN7 and I will be all set. (or maybe try my luck with ColorMunki & DLL injection + EDR & CCSS)

    Yes. Then verify the results with Stuart’s CCSS (although it should behave very close to the HP Z24x G2 WLED PFS CCSS correction bundled with DisplayCAL for the i1d3).

    If you are using a CS2731, even without CN HW calibration grey should be neutral out of the box… so just using a ColorMunki Display in DisplayCAL with the OSD RGB gains to set D65 white on a native gamut configuration, then using DisplayCAL to create a single curve + matrix + black point compensation profile WITHOUT calibration (just profiling), should do the job for PS and other color-managed tools.

    But I would at least try the i1d3 hook + CN + custom EDR (it’s free, you own it).

    #37137

    Vincent
    Participant
    • Offline

    It seems X-rite discontinued the product. I’m guessing the Calibrite ColorChecker Display Pro (CCDIS3) is what I should buy?

    Calibrite is the same exact meter as the i1D3, just a re-brand. Hardware is exactly the same

    Yeah, I was able to figure that out. Still a bit confused about the models: Display, Pro and Plus.

    It’s the same i1d3 Pro, but the Plus is certified to accurately measure very high HDR brightness. AFAIK the “Plus” model is identified by an i1d3 Pro OEM unlock code in the X-Rite SDK (and alternative software like ArgyllCMS).

    If I understand correctly the Display is not for HW calibration — I want to avoid running into the same snag as with the ColorMunki. 😬

    Yes, it is the same old ColorMunki Display.



Display Calibration and Characterization powered by ArgyllCMS