DisplayCAL for Wacom Cintiq
This topic has 1 reply, 2 voices, and was last updated 4 years ago by Vincent.
2020-04-04 at 2:19 #24075
Currently, I use the X-Rite i1 Display Pro to calibrate my two 10-bit Dell monitors. I can’t say I’m a big fan of that X-Rite/Dell system.
I just bought a Wacom Cintiq 24 Pro (also 10-bit). X-Rite has a separate i1 Display Pro hardware/software combination for the Cintiq. (I can’t use the i1 Display Pro I purchased for my Dells to run the X-Rite software for the Cintiq.) Rather than buying a second Display Pro, I’m thinking of using DisplayCAL with my current Display Pro for the Cintiq. I hear good things about DisplayCAL. However, the prevailing opinion seems to be that 10-bit hardware calibration to the monitor LUT is preferable to 8-bit software calibration. I haven’t been able to find any comparisons online. Any advice?
2020-04-05 at 14:53 #24092
GPU calibration is not limited to 8 bits. AMD GPUs, for example, have had high-bit-depth LUTs with dithering for some 15 years, so even over an old 8-bit DVI connection from GPU to display they show no banding, and the result is visually equivalent to hardware calibration. With laptops and Intel iGPUs you are very likely limited to an 8-bit LUT with no dithering, so you may see the truncation/rounding errors typical of GPU calibration.
It is not the 10-bit panel or the 10-bit GPU-to-display connection that gives you smooth gradients; it is a more-than-X-bit LUT (for the calibration correction) plus dithering that grants you a smooth X-bit gradient. That is what you will find inside your Dells. Hence, if the GPU can provide the same level of detail for the applied corrections (although in the GPU this is limited to the gray axis, not gamut emulation), it is visually equivalent.
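As a toy illustration of why dithering makes an 8-bit pipeline look like a higher-bit-depth one (this is a sketch of the general technique, not anyone’s actual driver code; the gray level chosen is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# A calibration-corrected gray level that falls between two 8-bit codes.
target = 100.4 / 255
pixels = np.full(100_000, target)

# Plain rounding: every pixel snaps to the same 8-bit code, so the whole
# patch is uniformly wrong by 0.4 LSB -- neighbouring patches with
# different residuals show up as visible bands.
rounded = np.round(pixels * 255) / 255

# Dithering: add sub-LSB noise before rounding. Individual pixels land on
# code 100 or 101, but the spatial average recovers the in-between level,
# which the eye perceives as a smooth, band-free tone.
noise = rng.random(pixels.shape) - 0.5
dithered = np.round(pixels * 255 + noise) / 255

print(f"target level : {target * 255:.2f}")
print(f"rounded mean : {rounded.mean() * 255:.2f}")
print(f"dithered mean: {dithered.mean() * 255:.2f}")
```

The rounded patch sits exactly on code 100, while the dithered patch averages back to roughly 100.4, even though every individual pixel is still an 8-bit value.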
On top of that, there are color-management computations, and rounding errors in those operations may also render bands. 10-bit output is not needed to avoid that, although it is one of the commercial solutions (10-bit OpenGL + Photoshop + supported hardware). Another solution, used in commercial software like Lr/ACR or Capture One, is dithering: band-free gradients even over an 8-bit connection, provided the display shows no banding without color management, as in the AMD case above.

First of all, try to find out as accurately as you can which backlight technology that Wacom uses; I don’t know it. For an sRGB display, use the White LED (IPS) correction in DisplayCAL. If the Wacom is wide gamut, you could try to find out which correction the Wacom/X-Rite software uses, even if it is not the most accurate one; it is very likely the RG phosphor (GB-r LED) correction bundled with DisplayCAL.
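To see why color-management math alone can create bands, one can push a full 8-bit ramp through a tone remap done entirely in 8-bit precision (the 2.2 → 2.4 gamma remap here is a hypothetical stand-in for a profile conversion, not any specific application’s pipeline):

```python
import numpy as np

# Every 8-bit input code, remapped by a tone adjustment and re-quantized
# to 8 bits -- the kind of step a color-managed app performs per channel.
codes = np.arange(256)
out = np.round(255 * (codes / 255) ** (2.4 / 2.2)).astype(int)

# Merged codes produce flat bands; skipped codes produce visible steps
# between neighbouring bands.
print("distinct output codes:", len(np.unique(out)))  # fewer than 256
skipped = sorted(set(range(256)) - set(out.tolist()))
print("first skipped codes  :", skipped[:10])
```

Fewer than 256 distinct codes survive: some inputs collapse together in the shadows and some output codes are skipped in the highlights, which is exactly the truncation banding described above. Dithering before the final rounding, or carrying the intermediate math at higher precision, avoids it.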
I don’t know which factory OSD presets that Cintiq has; choose the one closest to your calibration target, or a “custom” OSD mode with access to the internal RGB gains if such a mode exists. Then proceed with DisplayCAL calibration as on any other screen.
If you want a match to your Dell monitors, you may need to use the visual whitepoint editor.