Banding and bits per channel

    #4501

    Omelette
    Participant

    Maybe this is not the best place to ask this question… I’m sorry, I don’t know where else to ask.

    I read somewhere that it’s possible to reduce the banding produced by the calibration curves on NVIDIA GeForce (consumer-level) cards by using a DisplayPort cable and enabling the “10 bpc” driver setting.

    I would really appreciate knowing whether replacing my current DVI cable with a DisplayPort cable could help improve the software calibration. My setup is a Dell U2412M (6-bit + A-FRC) and an NVIDIA GeForce GTX 1060.
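
    To illustrate my understanding of the problem, here is a minimal sketch (assuming Python with NumPy; the gamma-tweak curve is a made-up stand-in, not the actual calibration curve) that counts how many of the 256 framebuffer levels survive quantization of the video LUT output at 8 bpc versus 10 bpc:

    ```python
    import numpy as np

    levels = np.arange(256) / 255.0  # the 256 source levels of an 8-bit framebuffer
    curve = levels ** (2.2 / 2.4)    # hypothetical calibration curve (gamma tweak)

    for bits in (8, 10):
        max_code = 2 ** bits - 1
        codes = np.round(curve * max_code).astype(int)
        print(f"{bits} bpc LUT output: {len(np.unique(codes))} distinct codes for 256 inputs")
    ```

    At 8 bpc some adjacent input levels collapse onto the same output code (and others skip codes), which shows up as banding in gradients; at 10 bpc every level stays distinct.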

    #4502

    Florian Höch
    Administrator

    There are methods to test whether your graphics card + monitor combination is 10-bit capable. See http://forum.doom9.org/showthread.php?t=172128
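
    As a rough complement to the tests in that thread, here is a minimal sketch (assuming Python with NumPy and Pillow; the ramp range and file name are arbitrary choices) that writes a shallow 16-bit grayscale ramp. Viewed at 100% zoom in a viewer that can pass 10-bit data to the display, it should look smooth on a working 10 bpc pipeline and visibly stepped on an 8 bpc one:

    ```python
    import numpy as np
    from PIL import Image

    WIDTH, HEIGHT = 1024, 256

    # Ramp over ~4% of the luminance range: roughly 10 distinct steps at
    # 8 bpc (clearly visible banding) but roughly 40 at 10 bpc (near smooth).
    ramp = np.linspace(0.48, 0.52, WIDTH)
    row = np.round(ramp * 65535).astype(np.uint16)
    img = np.tile(row, (HEIGHT, 1))

    Image.fromarray(img, mode="I;16").save("ramp_test.png")
    ```

    Note that the viewer and OS compositor also have to support 10-bit output end to end; otherwise the test only exercises the 8 bpc path.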
