8-bit display accepting 10-bit input?


Viewing 3 posts - 1 through 3 (of 3 total)
  • Author
    Posts
  • #29324

    You Too
    Participant
    • Offline

    Hi again, I thought this deserved its own thread because I found out something very interesting.

    I recently made a thread about i1 Display Pro and my monitor.
    I have an Acer XV240YP. It has a 144 Hz IPS panel by Panda which, as far as I know, is 6-bit + 2-bit FRC, unless something has changed since the earlier panel batches.
    It’s factory overclocked to 165 Hz by default, but I’d rather go for longevity, so I’m using the standard 144 Hz; the difference isn’t visible anyway.

    Now, the thing is, I’ve had problems with banding since I calibrated my monitor, as I mentioned in my last thread.
    I have an Nvidia GTX 1060 graphics card, for which the solution was to enable the registry hack for 8-bit temporal dithering.
    Sadly, the driver resets the dithering on startup, so I set up a scheduled task with a .bat file and a registry entry, but even that didn’t always work.

    Now the weird thing was, during my experimenting with settings and all this, I found that I was actually able to select 10-bit output instead of 8-bit in the Nvidia control panel. What happened was that even with no dithering on the driver side, all the banding was gone! I didn’t think an “8-bit” monitor would accept a 10-bit output, and it seems the monitor handles the dithering itself or something.
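    To see why dithering hides banding, here is a toy NumPy simulation (not Nvidia’s or the monitor’s actual algorithm, just the general idea): hard-rounding a smooth ramp to 6 bits produces visible steps, while adding random noise before rounding and averaging over many “frames” — which is roughly what temporal dithering/FRC does over time, with your eye doing the averaging — recovers the in-between shades.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Smooth grayscale ramp, normalized to [0, 1]
    ramp = np.linspace(0.0, 1.0, 1024)

    def quantize(signal, bits, frames=1):
        """Quantize to 2**bits levels. frames > 1 approximates temporal
        dithering by averaging many noisily-rounded frames."""
        levels = 2 ** bits - 1
        s = signal * levels
        if frames == 1:
            out = np.round(s)                      # hard quantization: banding
        else:
            out = np.mean(
                [np.round(s + rng.uniform(-0.5, 0.5, s.shape))
                 for _ in range(frames)],          # noisy rounding per frame
                axis=0,
            )
        return np.clip(out, 0, levels) / levels

    plain = quantize(ramp, 6)                  # 6-bit rounding: 64 hard bands
    dithered = quantize(ramp, 6, frames=256)   # dithered: much closer to the ramp

    print(len(np.unique(plain)))               # 64 distinct output levels
    print(np.abs(plain - ramp).max())          # worst-case banding error
    print(np.abs(dithered - ramp).max())       # noticeably smaller error
    ```

    The dithered version has a far smaller average error than the hard-rounded one, which is why a 6-bit+FRC panel can look smooth even though it only has 64 native levels per channel.
    
    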

    What do you guys know about this? I couldn’t find anything when searching around. Only that people say you need a 10-bit panel to use 10-bit output in the Nvidia control panel, but my monitor shows it works anyway, at least with my panel. I’m using DisplayPort 1.2, by the way.

    • This topic was modified 3 years, 1 month ago by You Too.


    #29332

    You Too
    Participant
    • Offline

    Can’t edit the first post so making another one:

    I tried to let Windows put the monitor on standby to check if this was a driver or monitor-related thing. After waking the monitor up there was banding. This makes me think the “10-bit mode” in the Nvidia control panel is just a way for the driver to force dithering without actually putting it in the registry. The driver probably knows my monitor is 8-bit.

    #29336

    Vincent
    Participant
    • Offline

    Only that people say you need a 10-bit panel to use 10-bit in the Nvidia control panel but my monitor proves it works anyway, at least with my panel.

    No, the 10-bit setting in the NV panel (or 10-bit end-to-end for some OpenGL apps) relates to the link between GPU and monitor: whether the display accepts a 10-bit signal or not. The panel can be 6-bit+FRC, 8-bit, 8-bit+FRC, or true 10-bit. The driver does not care about that because it cannot know such information, only the link from GPU to monitor.

    Can’t edit the first post so making another one:

    I tried to let Windows put the monitor on standby to check if this was a driver or monitor-related thing. After waking the monitor up there was banding. This makes me think the “10-bit mode” in the Nvidia control panel is just a way for the driver to force dithering without actually putting it in the registry. The driver probably knows my monitor is 8-bit.

    Interesting, but “the driver probably knows my monitor is 8-bit” applies only to the link to the display, not to the panel.



Display Calibration and Characterization powered by ArgyllCMS