8 bit vs 10 bit monitor – what’s the practical difference for color


    #31052

    provanguard

    After getting a new monitor a few days ago – an ASUS PG329Q (10-bit + gaming) – I started my journey of calibrating a wide-gamut monitor (previously I had only done this for 8-bit) and trying to understand what owning a 10-bit monitor really means.

    I have run numerous calibrations with DisplayCAL, using both the 8 bpc and 10 bpc settings in the NVIDIA control panel. I am a bit surprised that there is a quite negligible difference in the results – at least to my understanding – the percentages of coverage of sRGB, Adobe RGB and DCI-P3 are nearly identical at 8 and 10 bpc. Is this expected, or am I doing something wrong?

    My results after calibration are at best like this for gamut coverage:

    • sRGB: 99.6%
    • Adobe RGB: 99.4%
    • DCI-P3: 92.4%

    Gamut volume is at 180%, 124% and 128% respectively. ASUS advertises 160% of sRGB colors and 96% of DCI-P3.

    Does having a 10-bit monitor make any difference to the calibration result numbers? Should I be using separate ICM profiles for calibration at 8 and 10 bpc, depending on which mode I am running? (This changes with the monitor's refresh rate: so far only 60 Hz works with 10 bpc, while I run games at 165 Hz.)

    • This topic was modified 2 years, 9 months ago by provanguard.
    #31059

    Vincent

    After getting a new monitor a few days ago – an ASUS PG329Q (10-bit + gaming) – I started my journey of calibrating a wide-gamut monitor (previously I had only done this for 8-bit) and trying to understand what owning a 10-bit monitor really means.

    I have run numerous calibrations with DisplayCAL, using both the 8 bpc and 10 bpc settings in the NVIDIA control panel. I am a bit surprised that there is a quite negligible difference in the results – at least to my understanding – the percentages of coverage of sRGB, Adobe RGB and DCI-P3 are nearly identical at 8 and 10 bpc. Is this expected, or am I doing something wrong?

    Expected. Coverage is determined by the LED backlight's spectral power distribution, not by the panel's bit depth.

    My results after calibration are at best like this for gamut coverage:

    • sRGB: 99.6%
    • Adobe RGB: 99.4%
    • DCI-P3: 92.4%

    Gamut volume is at 180%, 124% and 128% respectively. ASUS advertises 160% of sRGB colors and 96% of DCI-P3.

    Does having a 10-bit monitor make any difference to the calibration result numbers? Should I be using separate ICM profiles for calibration at 8 and 10 bpc, depending on which mode I am running? (This changes with the monitor's refresh rate: so far only 60 Hz works with 10 bpc, while I run games at 165 Hz.)

    With dithered output, whether at the application level or at the GPU hardware output, there is no real difference.

    Without dithering, if the monitor accepts 10-bit input – even if the panel is 8-bit – you can work around an application's lack of dithering by letting the monitor handle the rounding error instead of the application, because an application that goes down from 10 to 8 bits without dithering before sending to the GPU is likely to cause truncation errors.

    People, and vendors too, usually conflate “accepts 10-bit input” with “10-bit input at the panel” and with “true 10-bit panel”. They are different things, depending on who has the responsibility to truncate: the application, the monitor hardware, or the monitor panel… although if done properly the results are interchangeable within an SDR contrast window (256 steps can cover that kind of window with dithering).
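
    To make the truncation point concrete, here is a minimal sketch (Python + NumPy; my own illustration, not anything DisplayCAL ships) of plain 10-to-8-bit truncation versus dithered quantization of a smooth ramp:

        import numpy as np

        ramp10 = np.linspace(0.0, 1023.0, 4096)      # smooth 10-bit gradient

        # Plain truncation, 10 -> 8 bit: four input codes collapse into one
        # output code, so the gradient turns into flat plateaus (banding).
        truncated = np.floor(ramp10 / 4.0) * 4.0

        # Randomized rounding, a stand-in for spatial/temporal dithering: add
        # sub-LSB noise before rounding so the quantization error averages out.
        rng = np.random.default_rng(0)
        dithered = np.clip(np.floor(ramp10 / 4.0 + rng.random(ramp10.size)), 0, 255) * 4.0

        # Local averages: the dithered signal tracks the original 10-bit ramp,
        # while the truncated one sits a consistent step away inside each plateau.
        kernel = np.ones(64) / 64.0
        print(np.abs(np.convolve(truncated - ramp10, kernel, "valid")).max())  # ~2 codes
        print(np.abs(np.convolve(dithered - ramp10, kernel, "valid")).max())   # near 0

    Whoever quantizes last – the app, the GPU or the monitor – decides whether the error is spread out like this or left as hard steps.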

    • This reply was modified 2 years, 9 months ago by Vincent.
    #31061

    Алексей Коробов

    Simply draw a grey (black-to-white) gradient in Photoshop; you’ll see it.

    #31063

    provanguard

    From your comment I understand that there can be three different cases of monitor hardware: “accepts 10-bit input”, “10-bit input at panel”, and “true 10-bit panel”.

    For my model, the only things I can tell are:

    • ASUS says it has: “Display Colors : 1073.7M (10 bit)”
    • The monitor shows in its UI whether it is getting an 8 or 10 bpc signal
    • NVIDIA drivers are able to output 10 bpc to the monitor at certain refresh rates

    The specs can be found here: https://rog.asus.com/monitors/32-to-34-inches/rog-swift-pg329q-model/spec

    However, my question was more general, about any 10-bit monitor vs 8-bit. My limited understanding of the topic, and your comment, suggest that there is no difference in color quality (?). Please correct me if I am wrong here.

    #31068

    Vincent

    Simply draw a grey (black-to-white) gradient in Photoshop; you’ll see it.

    Because of wrong truncation in PS, not because it was needed.

    From your comment I understand that there can be three different cases of monitor hardware: “accepts 10-bit input”, “10-bit input at panel”, and “true 10-bit panel”.

    For my model, the only things I can tell are:

    • ASUS says it has: “Display Colors : 1073.7M (10 bit)”
    • The monitor shows in its UI whether it is getting an 8 or 10 bpc signal
    • NVIDIA drivers are able to output 10 bpc to the monitor at certain refresh rates

    The specs can be found here: https://rog.asus.com/monitors/32-to-34-inches/rog-swift-pg329q-model/spec

    It is not a partition; a device can have one or all three features. ASUS’s claim is at least “accepts 10-bit input”, which is what Photoshop’s poor implementation requires to avoid truncation on 16-bit gradients, since PS is not capable of dithering. LR and C1 do dither, and they neither need nor use 10-bit.

    The funny part is that photographers do not need it (10-bit output), and the designers and illustrators who are likely to work with synthetic gradients cannot use it, because Adobe (AFAIK) has no 10-bit output or dithered output for them.

    However, my question was more general, about any 10-bit monitor vs 8-bit. My limited understanding of the topic, and your comment, suggest that there is no difference in color quality (?). Please correct me if I am wrong here.

    For an SDR contrast window, no, there is not – unless an app has poor output to screen, like GIMP, PS, Ai, In…

    #31071

    Алексей Коробов

    Because of wrong truncation in PS, not because it was needed.

    Switching to Microsoft ICM you get somewhat cleaner grey, but tints are still visible. You also see them in browsers and viewers; this is a general problem in Windows. But plugged into a MacBook and calibrated, an 8-bit display shows clean grey. Here is the thing, Vincent: we may talk about theory and tech aspects, but 10-bit gives a practical advantage to Windows users for now.

    Opinion: I show this effect to photographers and describe how to check gradient purity – switch off ICC usage entirely, in two steps: flush the vcgt in Profile Loader and use the “Monitor RGB” proof. It is important for B&W and mixed studio shots, and for commercial design and design over photos (popular in product photography) as well. It makes no real difference for real-world colour photos.
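
    If you want such a test gradient outside Photoshop, here is a small sketch (my own; it assumes NumPy and Pillow are installed, and the output file name is made up) that writes a 16-bit grey ramp you can inspect at 1:1 with ICC on and off:

        import numpy as np
        from PIL import Image

        w, h = 2048, 256
        ramp = np.linspace(0, 65535, w).astype(np.uint16)   # one smooth 16-bit row
        img = np.tile(ramp, (h, 1))                         # identical rows: a pure ramp
        Image.fromarray(img, mode="I;16").save("grey_ramp_16bit.png")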

    #31072

    Vincent

    Because of wrong truncation in PS, not because it was needed.

    Switching to Microsoft ICM you get somewhat cleaner grey, but tints are still visible. You also see them in browsers and viewers; this is a general problem in Windows. But plugged into a MacBook and calibrated, an 8-bit display shows clean grey. Here is the thing, Vincent: we may talk about theory and tech aspects, but 10-bit gives a practical advantage to Windows users for now.

    No, that is false: it is not the monitor, it is color management that causes the banding. Do it with dithering and there is no banding (ACR, LR, C1, DWM LUT, madVR…), unless GPU calibration causes it. That last one is NOT Windows-related; it is related to the GPU hardware. AMD can dither the 1D LUT output, even over DVI connections; other vendors may fail (Intel) or be hit & miss (NVIDIA registry hack; there was a thread about it here in this forum).

    Also, your own post is proof that you are wrong. An 8-bit MacBook can render smooth gradients in PS because Apple provides an OpenGL driver that has a “server hook” at 10-bit for the client app (PS); the driver then does whatever it wants – dither down to 8, or send 10 bpc if the chain allows it. The key is that Photoshop’s poor truncation behaviour is avoided.
    On that MacBook, dithering is running in the background. There are several posts on LuLa of Andrew Rodney’s MacBook with an 8-bit pixel format being band-free in PS, because the OpenGL driver has that hook as if it were 10-bit.

    Actually, if every GPU vendor provided that hook, every display, even over an 8-bit DVI link, could show band-free color-managed gradients. NVIDIA provides it for gamer GPUs (Studio driver), although the 1D LUT can be problematic; newer AMDs can enable it and have also dithered the 1D LUT for 10 years or more. For Intel, short of Apple’s custom drivers, there is no 10-bit hook for apps and no 1D LUT dithering (but you can use DWM LUT on Windows).

    Also, color management with 3xTRC profiles in an app that uses 8-bit rounding, like Firefox, is prone to that kind of “color” banding, instead of the typical grey-step banding you get with 1xTRC.

    None of this is monitor-related at all; it is just software and GPU hardware limitations.
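
    To illustrate that 3xTRC point, a sketch (my own; the per-channel gammas are hypothetical): each channel gets a slightly different correction curve, so after 8-bit rounding R, G and B no longer agree and a neutral grey picks up a tint:

        import numpy as np

        grey = np.arange(256, dtype=float)           # neutral ramp, R = G = B

        # Hypothetical per-channel corrections, as a 3xTRC profile might apply.
        gammas = {"R": 1.02, "G": 1.00, "B": 0.97}
        corrected = {ch: np.round(255.0 * (grey / 255.0) ** g)   # 8-bit rounding
                     for ch, g in gammas.items()}

        # Wherever the rounded channels disagree, the "grey" pixel is tinted.
        tinted = (corrected["R"] != corrected["G"]) | (corrected["B"] != corrected["G"])
        print(f"{int(tinted.sum())} of 256 grey levels come out non-neutral")

    With a single shared curve (1xTRC) all three channels round identically, so the error stays neutral: grey steps instead of colored ones.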

    #31075

    provanguard

    I appreciate your deep input on the topic.

    Could you (both) suggest your recommended monitor specs for photo editing and viewing, primarily on Windows? Apps: Capture One, DxO PhotoLab, Affinity Photo. GPU: NVIDIA RTX 3080.

    Size: 32 inch (my current target)
    Panel technology:
    Panel color depth:
    Backlight technology:
    Color space:
    etc…

    Knowing these parameters from the experts, I will try to combine them with gaming specs. I know this will almost certainly mean some compromises, but I would like to get at least 80% of the way there in both respects.

    #31077

    Алексей Коробов

    OK, Vincent, but what do you think about signal-type synchronization? I don’t mean composite, YCbCr, etc., but the effect of the data type and the resulting adaptation of the hardware workflow. For example, I have already described the flaw of weak blacks at some horizontal frequencies. Could a similar thing happen with vcgt (2provanguard: video card gamma table) dithering on/off/level? My experience tells me that 10-bit displays really do draw better grey in Photoshop, and this happens even with NVIDIA cards, though 10-bit displays are rare items here.

    Hm, why do you call it dithering? I know the thread with the NVIDIA hack, but is the effect described anywhere for programmers?

    2provanguard: 32″ displays are rare birds in my practice; mostly they are MVA panels or consumer-grade IPS. I can only recommend finding at least two good tests of a given model; the clearest testing bench for graphics is prad.de.

    The ASUS gamer displays I have met are totally ugly toys. They make graphics displays too, but those are also strange; do not trust their calibration and quality. MSI makes somewhat better monitors, but one MSI notebook had a terrible color flaw in “pro” software (RGB palette dropout). You should simply understand that gaming is aggressive show business, so gaming hardware manufacturers won’t care about natural vision. I have even met a top NVIDIA gaming card without vcgt on one of its outputs. Note that games don’t use ICC profiles, since they slow down computations; video editors usually work with LUTs instead of ICC profiles. NVIDIA’s bad vcgt may also be the flip side of high speed.

    Some expensive displays probably combine: a reasonably fast panel (you know that better than I do); wide gamut (full coverage of some standard profiles); correct RGB primaries with well-separated RGB spectra, for better color stability under different lighting (too difficult for novices); good uniformity (delta C < 1.5 over the square part of the screen); high enough contrast (>1200:1 for IPS, though video editing needs more; MVA has up to 5500:1); and smooth gradients (check by eye). Also check for color-to-brightness stability (avoid monitors whose colour jumps as brightness changes).

    #31079

    Vincent

    OK, Vincent, but what do you think about signal-type synchronization? I don’t mean composite, YCbCr, etc., but the effect of the data type and the resulting adaptation of the hardware workflow. For example, I have already described the flaw of weak blacks at some horizontal frequencies. Could a similar thing happen with vcgt (2provanguard: video card gamma table) dithering on/off/level? My experience tells me that 10-bit displays really do draw better grey in Photoshop, and this happens even with NVIDIA cards, though 10-bit displays are rare items here.

    It is not display-related, as I said. It is down to the whole chain:

    Photoshop:

    processing (GPU basic vs. accelerated) -> truncation to the driver interface -> OpenGL vendor driver -> (1) LUT -> output (dither / no dither) -> physical connection -> display input (8/10) -> monitor HW calibration / factory calibration / calibration via OSD -> dithering to panel input -> panel input (8/10) -> (optional dither) -> actual panel bits.

    LR/ACR in Photoshop / C1:

    processing -> truncation with temporal dithering -> Windows composition at 8-bit -> (1)

    GIMP / InDesign / Ai / Firefox / others:

    processing -> truncation to the Windows composition interface -> (1)

    If the display shows no banding when not color-managed, then color-managed banding is ONLY caused by the steps before (1).

    Photoshop chose to do it the “expensive way”, requiring a 10-bit hook in OpenGL and a 10-bit end-to-end pipeline, because of GPU vendors’ needs; before that “casual 10-bit driver” came to gamer GeForces and Radeons, people had to pay for Quadros and FirePros for tasks that do not require such high bit depth end to end (other tasks do, but not SDR photo work).

    Apple knows the trick, and with the RGB8888 pixel format (I do not remember the exact name) they provide a hook for 10-bit input, although they will truncate, with temporal dithering, on the GPU, outside PS’s scope.

    For some other tools (ACR/LR), Adobe chose to do it the RIGHT way: dithering the processing output down to whatever bit depth the Windows composition has. Zero banding, provided the monitor shows no banding when not color-managed.

    For yet other tools, Adobe chose to do nothing: truncate to the 8-bit Windows composition – Illustrator/InDesign – which is a shame, because synthetic gradients are common tools there.

    If you have issues with PS, it is:
    - poor driver implementation (basic GPU mode at 8-bit seems to cause fewer issues, since no simplified color management is done by the GPU)
    - poor PS implementation (open the same image with the Adobe Camera Raw filter in PS… the banding is gone in 16-bit images)

    and none of this is related to a 10-bit end-to-end advantage within an SDR contrast window.

    Hm, why do you call it dithering? I know the thread with the NVIDIA hack, but is the effect described anywhere for programmers?

    Temporal dithering. It works automatically on AMD cards (related to the 1D LUT output) and in ACR/LR/C1. It is done by default; the user needs to do nothing.
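
    As a minimal sketch of the idea (my own illustration): to show a 10-bit value over an 8-bit link, temporal dithering alternates between the two nearest 8-bit codes across frames, so that the time average lands on the 10-bit target:

        import numpy as np

        target10 = 513                     # 10-bit code to display; 513 / 4 = 128.25
        lo, frac = divmod(target10, 4)     # neighbouring 8-bit codes 128 and 129

        rng = np.random.default_rng(1)
        frames = lo + (rng.random(240) < frac / 4.0)   # 240 frames, ~25% show code 129

        print(frames.mean() * 4)           # ≈ 513: the eye integrates over frames

    Real GPUs use carefully designed spatio-temporal patterns rather than plain random noise, but the averaging principle is the same.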


    @provanguard

    As Alexei said: IPS/VA and good uniformity. Since you have a newer GPU model, you can get a monitor with 10-bit input (whatever panel sits behind it), so that in Photoshop alone you can get rid of the color management simplifications done in that app (truncation of processing output to the driver interface with no temporal dithering).

    Also, since you want a gamer display: those new 165 Hz 27″ QHD or UHD models are usually P3 displays, and some of them have no gamut emulation capabilities, so outside games everything will look “wrong”, oversaturated… but you can look at @LeDoge’s DWM LUT app; it works like a charm.
    Look carefully at the monitor manual to spot the gamer models that DO NOT have an sRGB mode, or check reviews to see whether that OSD mode is locked at high brightness.
    Anyway, DWM LUT works fine if the display lacks an sRGB mode.

    • This reply was modified 2 years, 9 months ago by Vincent.
    #31081

    provanguard

    Hi @Vincent, I downloaded the app. It looks very promising!

    How do I do the following in DisplayCAL: “Use DisplayCAL or similar to generate the 65x65x65 .cube LUT files you want to apply”?

    Does using this app effectively replace the use of ICC profiles (in some situations)?

    #31082

    Vincent

    An ICC profile with GPU calibration and DWM LUT “can be” mutually exclusive, depending on the VCGT.

    Use DisplayCAL and calibrate the display at its native gamut to your desired white. Install the profile, etc.

    Then, if you want a LUT3D for DWM LUT:
    - open the 3D LUT maker app in the DisplayCAL folder
    - source profile: the colorspace to simulate
    - target colorspace: the display’s colorspace
    - create the LUT3D

    If you wish to calibrate grey using DWM LUT – because your card doesn’t dither, doesn’t do it properly, or simply because you want to – “apply VCGT” to the LUT3D when you create it. BUT if you do this, you can’t have the DisplayCAL profile as the display profile in the OS. Assign as default profile in PS the ICC of the colorspace to be simulated.

    It’s like a Resolve LUT3D for GUI monitors: you have to choose who is going to calibrate grey, the 1D LUT in GPU hardware or the LUT3D in software.

    For non-color-managed apps, if you rely on the ICC for grey calibration, there is no need to change it in the OS; the LUT3D won’t have the VCGT applied.
    But for color-managed apps things will look bad, mostly desaturated.
    If you want a full native-gamut LUT3D to an idealized native-gamut colorspace, look at the DWM LUT thread here, where it is explained. The concept is easy: make a synth profile that represents your idealized display. Assign it as the OS default display profile. PS & LR & Firefox will color-manage to that profile, but calibration is done through DWM LUT, not through the 1D GPU LUT.
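
    For reference, the .cube files DWM LUT loads are plain text, so it is easy to see what DisplayCAL’s 3D LUT maker produces. A sketch (my own; the file name is made up) writing the identity 65x65x65 LUT, which changes nothing – the real tool fills the same grid with its gamut mapping:

        # One "R G B" triplet per line, red index varying fastest (.cube convention).
        N = 65  # 65 nodes per color ramp, as DWM LUT expects

        with open("identity65.cube", "w") as f:
            f.write(f"LUT_3D_SIZE {N}\n")
            for b in range(N):                       # blue index varies slowest
                for g in range(N):
                    for r in range(N):               # red index varies fastest
                        f.write(f"{r/(N-1):.6f} {g/(N-1):.6f} {b/(N-1):.6f}\n")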

    • This reply was modified 2 years, 9 months ago by Vincent.
    #31091

    provanguard

    Use DisplayCAL and calibrate the display at its native gamut to your desired white. Install the profile, etc.

    Check!

    Then, if you want a LUT3D for DWM LUT:
    - open the 3D LUT maker app in the DisplayCAL folder
    - source profile: the colorspace to simulate
    - target colorspace: the display’s colorspace
    - create the LUT3D

    Source Profile: sRGB IEC61966-2.1 (equivalent…)

    Tone curve: Gamma 2.2, Relative, black output offset: 100%

    Destination Profile: the profile I created using DisplayCAL for my monitor (D65, gamma 2.2)

    Apply Calibration (vcgt): UNCHECKED (Is this correct?)

    The rest are default values.

    If you wish to calibrate grey using DWM LUT – because your card doesn’t dither, doesn’t do it properly, or simply because you want to – “apply VCGT” to the LUT3D when you create it. BUT if you do this, you can’t have the DisplayCAL profile as the display profile in the OS.

    Well, I do have the profile created using DisplayCAL loaded and active, so I did not use “apply VCGT”.

    Assign as default profile in PS the ICC of the colorspace to be simulated.

    Is this only specific to Photoshop?

    For non-color-managed apps, if you rely on the ICC for grey calibration, there is no need to change it in the OS; the LUT3D won’t have the VCGT applied.

    From here forward I am a bit lost. How can I rely on the ICC if an app is not color managed?

    But for color-managed apps things will look bad, mostly desaturated.

    I have noticed that with my settings (the DisplayCAL-produced ICC loaded), Capture One and DxO PhotoLab get desaturated when I activate the 3D LUT. Burning reds go to plain reds, and then to brown with the 3D LUT enabled.

    If you want a full native-gamut LUT3D to an idealized native-gamut colorspace, look at the DWM LUT thread here, where it is explained. The concept is easy: make a synth profile that represents your idealized display. Assign it as the OS default display profile. PS & LR & Firefox will color-manage to that profile, but calibration is done through DWM LUT, not through the 1D GPU LUT.

    I would like to try that, but I would need noob-level instructions, please 🙂

    #31092

    Vincent

    Apply Calibration (vcgt): UNCHECKED (Is this correct?)

    The rest are default values.

    If you wish to calibrate grey using DWM LUT – because your card doesn’t dither, doesn’t do it properly, or simply because you want to – “apply VCGT” to the LUT3D when you create it. BUT if you do this, you can’t have the DisplayCAL profile as the display profile in the OS.

    Well, I do have the profile created using DisplayCAL loaded and active, so I did not use “apply VCGT”.

    If your GPU causes banding you can check “apply VCGT”, and assign as the default display profile a synth version without the VCGT. Explained below.

    Assign as default profile in PS the ICC of the colorspace to be simulated.

    Is this only specific to Photoshop?

    I meant “OS”, not “PS” – Control Panel \ Color Management; PS was a typo :D. That way you do not mess with Photoshop’s color options if you do not know what you are doing.

    For non-color-managed apps, if you rely on the ICC for grey calibration, there is no need to change it in the OS; the LUT3D won’t have the VCGT applied.

    From here forward I am a bit lost. How can I rely on the ICC if an app is not color managed?

    The VCGT is the grey calibration, embedded into the DisplayCAL ICC and loaded into the GPU; since the GPU applies it to everything on screen, even non-color-managed apps get the grey calibration.

    If you want a full native-gamut LUT3D to an idealized native-gamut colorspace, look at the DWM LUT thread here, where it is explained. The concept is easy: make a synth profile that represents your idealized display. Assign it as the OS default display profile. PS & LR & Firefox will color-manage to that profile, but calibration is done through DWM LUT, not through the 1D GPU LUT.

    I would like to try that, but I would need noob-level instructions, please 🙂

    Go to the DisplayCAL folder and open the synth profile editor. Make a synth profile with the same white and the same red, green and blue primary coordinates (the illuminant-relative xyY data in DisplayCAL’s profile info) and the same nominal gamma. Usually you want to play with the “infinite contrast” tick (black point compensation) on both profiles.

    Then make a LUT3D with that synth profile as the source colorspace, targeting your DisplayCAL profile, with the VCGT calibration applied.
    The resulting LUT3D is close to a monitor with hardware calibration calibrated to native gamut, hence perfect for PS, LR or other color-managed apps.

    Assign the synth profile as the default display profile in the OS (Control Panel, Color Management, Devices tab). Open DWM LUT and load the LUT3D. This way you can get no banding even with Intel iGPUs, unless the VCGT to be applied is too extreme to be simulated with 65 nodes per color ramp.

    Games will look oversaturated (native gamut), but for PS or LR it is as if you had an Eizo CS with hardware calibration and an idealized ICC profile (matrix, 1xTRC) that minimizes the banding caused BY the color management in the app.
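
    To see what such an idealized matrix profile actually encodes, here is a standard-colorimetry sketch (my own, not DisplayCAL code) deriving the RGB-to-XYZ matrix from primary chromaticities and the white point; the example numbers are P3 primaries with a D65 white, standing in for your display’s measured native-gamut data:

        import numpy as np

        def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
            # Each primary's XYZ at unit luminance, from its xy chromaticity.
            def xyz(xy):
                x, y = xy
                return np.array([x / y, 1.0, (1.0 - x - y) / y])
            M = np.column_stack([xyz(xy_r), xyz(xy_g), xyz(xy_b)])
            # Scale the columns so R = G = B = 1 lands exactly on the white point.
            s = np.linalg.solve(M, xyz(xy_w))
            return M * s

        M = rgb_to_xyz_matrix((0.680, 0.320), (0.265, 0.690), (0.150, 0.060),
                              (0.3127, 0.3290))
        print(M @ np.ones(3))   # ~[0.9505, 1.0, 1.0891], the XYZ of D65 white

    This matrix plus the nominal gamma (the single TRC) is essentially all the synth profile contains, which is why color-managed apps handle it so cleanly.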

    • This reply was modified 2 years, 9 months ago by Vincent.
    #31094

    provanguard

    Go to the DisplayCAL folder and open the synth profile editor. Make a synth profile with the same white and the same red, green and blue primary coordinates (the illuminant-relative xyY data in DisplayCAL’s profile info) and the same nominal gamma. Usually you want to play with the “infinite contrast” tick (black point compensation) on both profiles.

    Then make a LUT3D with that synth profile as the source colorspace, targeting your DisplayCAL profile, with the VCGT calibration applied.
    The resulting LUT3D is close to a monitor with hardware calibration calibrated to native gamut, hence perfect for PS, LR or other color-managed apps.

    Assign the synth profile as the default display profile in the OS (Control Panel, Color Management, Devices tab). Open DWM LUT and load the LUT3D. This way you can get no banding even with Intel iGPUs, unless the VCGT to be applied is too extreme to be simulated with 65 nodes per color ramp.

    Games will look oversaturated (native gamut), but for PS or LR it is as if you had an Eizo CS with hardware calibration and an idealized ICC profile (matrix, 1xTRC) that minimizes the banding caused BY the color management in the app.

    OK, I think I did it, but I cannot be sure I understood all the steps correctly.

    Man, if this combination of DWM LUT and DisplayCAL can make my wide-gamut monitor show proper colors on Windows in different apps and games, then this is GOLD! I could easily pay some money for a comprehensive guide on what to do and why, or for further development of DisplayCAL to do the proper things automatically for me.

    From what I understand, I will need to switch between profiles (OS + DWM LUT) when I use photo apps versus when I run games or browsers. This could be further automated, as could generation of the synthetic profile from the ICM profile.

