Is there a difference between GPU side dithering, and Sending 10bpc?


Viewing 10 posts - 1 through 10 (of 10 total)
#29735

    Adnan Cesko
    Participant

Hi, does anyone know if there is a difference between GPU-side dithering and sending 10 bpc on 8-bit+FRC displays, so that FRC is used internally in the display instead?

    #29737

    Vincent
    Participant

The difference is that with the GPU using high-bit-depth LUTs plus dithering to 8 bits or more, there is no truncation of the calibration. Without dithering, calibration truncation may be a problem.
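The truncation-versus-dithering difference can be sketched numerically. The calibration curve, patch size, and noise model below are illustrative assumptions, not DisplayCAL's actual pipeline; the sketch only shows why adding roughly one LSB of noise before rounding preserves the intended average level while plain rounding does not:

```python
import numpy as np

rng = np.random.default_rng(0)

# 8-bit grey ramp: 256 levels, 1000 pixels per level (flat patches).
levels = np.arange(256, dtype=np.float64)
patch = np.repeat(levels[:, None], 1000, axis=1)

# Hypothetical VCGT-style correction, computed at high precision.
target = 255.0 * (patch / 255.0) ** (2.2 / 2.4)

# (a) Round straight to 8 bits: every pixel of a patch lands on the
# same (possibly wrong) code, so the quantization error stays visible.
truncated = np.round(target)

# (b) Add ~1 LSB of noise first, then round: the error is spread over
# neighbouring pixels and each patch averages to the intended value.
dithered = np.round(target + rng.uniform(-0.5, 0.5, target.shape))

err_rounded = np.abs(truncated.mean(axis=1) - target[:, 0]).mean()
err_dithered = np.abs(dithered.mean(axis=1) - target[:, 0]).mean()

print(f"mean patch error, no dither:   {err_rounded:.3f} codes")
print(f"mean patch error, with dither: {err_dithered:.3f} codes")
```

On average the dithered patches land much closer to the intended high-precision values than the plainly rounded ones, which is the "no truncation of calibration" effect described above.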

    #29738

    Adnan Cesko
    Participant

The difference is that with the GPU using high-bit-depth LUTs plus dithering to 8 bits or more, there is no truncation of the calibration. Without dithering, calibration truncation may be a problem.

Hmm, I don’t seem to see a difference between enabling driver dithering and enabling 10 bpc for my 8+2 display; both look smooth.

    #29739

    Vincent
    Participant

It depends on the correction stored in the VCGT tag. An almost-linear VCGT is not the same as one that shifts a channel. If you want to avoid truncation on every typical calibration and bpc, dithering is the solution, and it is akin to HW calibration for grey.

    #29740

    Adnan Cesko
    Participant

Hmm, I think sending 10 bpc does make use of dithering; DisplayCAL at least reports a 10-bit LUT.

    #29743

    Vincent
    Participant

Hmm, I think sending 10 bpc does make use of dithering,

It depends on the card manufacturer. AMD cards dither at all bpcs.

DisplayCAL at least reports a 10-bit LUT.

It could be a high-bit-depth LUT with a 10-bit link, or a high-bit-depth LUT with dithering and an 8-bit link to the display.

Anyway, you should have no banding. Dithering is better because it is independent of the bit depth of the link from GPU to display (it can be band-free over 8-bit DVI, or with HDMI/DP displays that only accept an 8-bit signal at that resolution and refresh rate).

    #29748

    Adnan Cesko
    Participant

Hmm, I think the link is 10-bit when sending 10 bpc, and then the display does its thing and uses that information to do dithering/FRC.

It’s at least much more reliable than GPU-side dithering, since I’m using an NVIDIA GPU, which has a buggy dithering implementation that requires a restart/logout to check whether it has been applied correctly.

Due to bandwidth limits, though, I won’t be able to use the full 165 Hz refresh rate of the display I bought just for gaming, but I managed to create a custom resolution with reduced blanking which allows for 144 Hz and a 10-bit link.
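The bandwidth trade-off behind this custom resolution can be estimated with quick arithmetic. The 2560x1440 resolution, the aggressively trimmed blanking totals, and the DisplayPort HBR2 figure below are assumptions for illustration only; the actual panel and link were not stated in the thread:

```python
# Rough uncompressed-link budget for a few candidate video modes.

def data_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Uncompressed video data rate over the link, in Gbit/s."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

HBR2_GBPS = 17.28              # DP 1.2, 4 lanes, after 8b/10b coding
h_total, v_total = 2720, 1445  # assumed 2560x1440 with trimmed blanking

for hz in (165, 144):
    for bpc in (8, 10):
        rate = data_rate_gbps(h_total, v_total, hz, 3 * bpc)  # RGB
        fits = "fits" if rate <= HBR2_GBPS else "exceeds"
        print(f"{hz} Hz @ {bpc} bpc: ~{rate:.2f} Gbit/s ({fits} HBR2)")
```

Since 10 bpc RGB needs 25% more bandwidth than 8 bpc (30 vs. 24 bits per pixel), trimming the blanking interval can be just enough to let a 10-bit mode fit at a slightly lower refresh rate.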

    #29751

    Vincent
    Participant

Hmm, I think the link is 10-bit when sending 10 bpc, and then the display does its thing and uses that information to do dithering/FRC.

For your card vendor, yes; for others, no.

It’s at least much more reliable than GPU-side dithering, since I’m using an NVIDIA GPU, which has a buggy dithering implementation that requires a restart/logout to check whether it has been applied correctly.

Yes, it’s an issue on NVIDIA GPUs.

Due to bandwidth limits, though, I won’t be able to use the full 165 Hz refresh rate of the display I bought just for gaming, but I managed to create a custom resolution with reduced blanking which allows for 144 Hz and a 10-bit link.

    #29752

    Adnan Cesko
    Participant

If I may ask, do you know what the bit-depth option for dithering in the NVIDIA driver actually does? It has options for 6/8/10-bit dithering, but selecting 10-bit dithering seems to cause bands on my displays while 8-bit does not.

    #29753

    Vincent
    Participant

I don’t know; there was a sticky thread regarding NVIDIA Windows registry options. Try there.
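One plausible explanation for the banding reported with the 10-bit dither setting is a mismatch between the dither target depth and the panel's 8-bit input: noise sized for a 10-bit grid is mostly smaller than one 8-bit step, so the display's own quantization throws it away. The sketch below demonstrates that mechanism only; the curve and noise model are illustrative assumptions, not NVIDIA's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# A smooth calibrated ramp: 256 levels, 1000 pixels per level.
levels = np.arange(256, dtype=np.float64)
target = 255.0 * (np.repeat(levels[:, None], 1000, axis=1) / 255.0) ** (2.2 / 2.4)

def dither_quantize(x, bits):
    """Quantize 0..255 values to a `bits`-deep grid with ~1 LSB of noise."""
    step = 255.0 / (2 ** bits - 1)
    q = np.round(x / step + rng.uniform(-0.5, 0.5, x.shape)) * step
    return np.clip(q, 0.0, 255.0)

def panel_8bit(x):
    """The display's input quantizer: an 8-bit panel rounds away anything finer."""
    return np.round(x)

err_8 = np.abs(panel_8bit(dither_quantize(target, 8)).mean(axis=1) - target[:, 0]).mean()
err_10 = np.abs(panel_8bit(dither_quantize(target, 10)).mean(axis=1) - target[:, 0]).mean()

print(f"dither aimed at 8 bits:  mean error {err_8:.3f} codes")
print(f"dither aimed at 10 bits: mean error {err_10:.3f} codes")
```

Dither aimed at the panel's true 8-bit depth survives its quantizer and averages out correctly, while dither aimed at 10 bits is largely erased by the 8-bit rounding, leaving most of the quantization error (and thus the bands) in place.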



Display Calibration and Characterization powered by ArgyllCMS