Correct bitdepth setting for GTX 1080 Ti?


    #13639

    bitfidelity
    Participant

    I looked at the following documentation but am still not sure if the default bitdepth setting (16) in the profile loader is correct for my GTX 1080 Ti.

    Bitdepth. Some graphics drivers may internally quantize the video card gamma table values to a lower bitdepth than the nominal 16 bits per channel that are encoded in the video card gamma table tag of DisplayCAL-generated profiles. If this quantization is done using integer truncation instead of rounding, it may accentuate banding. In that case, you can let the profile loader quantize to the target bitdepth using rounding, which may produce a smoother result.
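
    To make sure I understand the truncation vs. rounding part, here is a rough sketch of what I think that quantization looks like (illustrative Python, not DisplayCAL's actual code; the function and values are just my own example):

        import numpy as np

        def quantize(values, bits, mode="round"):
            # Quantize nominal 16-bit gamma table values (0..65535) to a
            # lower target bitdepth, then re-expand to 16 bits.
            levels = (1 << bits) - 1
            scaled = values / 65535.0 * levels
            if mode == "truncate":
                q = np.floor(scaled)        # integer truncation (biases downward)
            else:
                q = np.floor(scaled + 0.5)  # rounding (nearest step)
            return (q / levels * 65535.0).round().astype(np.uint16)

        ramp = np.arange(0, 1024, 128, dtype=np.uint16)  # near-black 16-bit values
        print(quantize(ramp, 8, "truncate"))  # [0 0 0 257 257 514 514 771]
        print(quantize(ramp, 8, "round"))     # [0 0 257 257 514 514 771 771]

    If I read the documentation right, truncation pushes every value down to the step below, while rounding picks the nearest step, which is why the loader's rounding option can look smoother.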

    Does anyone know what the actual bitdepth of the 1080 Ti is? I know the card lets you output at 12 bpc; does that mean the bitdepth setting should be set to 12? I should also mention that my display (LG OLED EG9100) uses an 8-bit panel, though I have no idea whether that matters here.

    #13644

    Florian Höch
    Administrator

    Does anyone know what the actual bitdepth of the 1080 Ti is?

    This depends on the connection (HDMI/DisplayPort, etc.), the graphics driver settings (output bitdepth in the nVidia control panel), the connected display (note that the panel bitdepth is often not indicative of the effective display bitdepth, because most panels employ some form of dithering), as well as graphics driver bugs.

    Unfortunately, high bitdepth output with nVidia graphics cards under Windows is very much hit and miss. My own system (Win 10 Pro, GTX 1070 connected to a Philips TV over HDMI) does support higher-than-8-bit output, but only until I put it into standby or hibernation, at which point it loses the higher output bitdepth (banding appears on test gradients) until I restart the system. YMMV.
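
    If you want to check for this on your own system, a simple grayscale ramp is enough as a test gradient. Here is a minimal sketch that writes one to a PNG (assuming Python with NumPy and Pillow; this is not part of DisplayCAL):

        import numpy as np
        from PIL import Image

        # Horizontal grayscale ramp, 0..255 across the full width. With
        # calibration loaded, a gamma table quantized to 8 bits without
        # dithering tends to show visible banding on such a ramp, while a
        # higher effective bitdepth keeps it smooth.
        width, height = 1024, 256
        ramp = np.linspace(0, 255, width).astype(np.uint8)
        img = np.tile(ramp, (height, 1))
        Image.fromarray(img, mode="L").save("gradient_test.png")

    View the image full screen with your calibration loaded; visible steps in the ramp suggest the output is effectively limited to 8 bits.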

    #15817

    Enterprise24
    Participant

    Does anyone know what the actual bitdepth of the 1080 Ti is?

    This depends on the connection (HDMI/DisplayPort, etc.), the graphics driver settings (output bitdepth in the nVidia control panel), the connected display (note that the panel bitdepth is often not indicative of the effective display bitdepth, because most panels employ some form of dithering), as well as graphics driver bugs.

    Unfortunately, high bitdepth output with nVidia graphics cards under Windows is very much hit and miss. My own system (Win 10 Pro, GTX 1070 connected to a Philips TV over HDMI) does support higher-than-8-bit output, but only until I put it into standby or hibernation, at which point it loses the higher output bitdepth (banding appears on test gradients) until I restart the system. YMMV.

    Hello Florian, I greatly appreciate your forums; I have learned a lot here and signed up just to ask a question. Did you ever figure out how to deal with this Nvidia issue? If my monitor or the system wakes from sleep, it loses the higher bit depth. Turning the PC on is also hit and miss: sometimes it comes up with 16 bits, but if I am unlucky it falls back to 8 bits. The only workaround I know is to sign out and sign back in, which still depends on luck; if three sign-out/sign-in cycles don't work, the last resort is a restart, which again depends on luck.
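
    One thing that may be worth trying before a full sign-out is re-applying the calibration from the installed profile, for example via ArgyllCMS's dispwin (a sketch, assuming dispwin is on the PATH; this reloads the video card gamma table, but I cannot say whether it also restores the higher output bitdepth):

        import subprocess

        # "dispwin -L" loads the calibration from the currently installed
        # display profile. This re-applies the gamma table; whether the
        # driver then outputs at the higher bitdepth again is not guaranteed.
        subprocess.run(["dispwin", "-L"], check=True)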
