Color calibration on Nvidia GPU


  • #2194

    Victor Wolansky
    Participant
    • Offline

Hi! One of the selling points of Eizo is that they can do a better correction because their hardware works internally at 16 bits, while GPUs do it only at 8 bits. I think that is probably not true anymore, but I can’t find much information on the internet about it. Do you know at what bit precision the LUTs loaded on current NVIDIA chips are processed? Because maybe there is no reason anymore to have the corrections processed by separate hardware when you have a dual-monitor configuration like me, and you could just do it all in the GPU.

    Thanks!

    #2197

    Florian Höch
    Administrator
    • Offline

Do you know at what bit precision the LUTs loaded on current NVIDIA chips are processed?

    It depends on the specific card, the type of connection, and maybe also driver settings. You’ll have to do some experimenting to find out if the card can output anything higher than 8 bits. Usually this at least requires a DisplayPort or HDMI >= 1.3 connection.

    #2198

    Victor Wolansky
    Participant
    • Offline

I’m not talking about the output. I know I can output 10 bits to the Eizos; I’m talking about the internal precision for calculating the color calibration. They say that their LUTs are 16 bits and the ones the GPUs use are 8 bits. Hope that makes sense. I’m using a Quadro M6000 and a GTX 980 Titan.

    #2234

    p.dada
    Participant
    • Offline

Last time I checked, only AMD consumer-grade cards had 10-bit internal LUTs. Your GTX most probably has only an 8-bit internal LUT. That Quadro should do the job, though. In any case, you can check the GPU LUT bit depth yourself from within DisplayCAL (Tools -> Report on uncalibrated display device). Please post the results for that GTX 980.
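The idea behind such a check is simple: load finely spaced test values into the video card's gamma table, observe what actually comes out, and see how coarsely the values were quantized. Here is a hypothetical sketch of that inference in Python (this is an illustration of the principle, not DisplayCAL's actual code; `quantize` stands in for the card's LUT):

```python
def quantize(value, bits):
    """Simulate an n-bit LUT: round a 0..1 value to n-bit precision."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

def estimate_lut_bits(lut, max_bits=16):
    """Return the smallest bit depth whose quantization grid reproduces
    the LUT's output for a ramp of 16-bit test values."""
    test_values = [i / 65535 for i in range(0, 65536, 257)]
    for bits in range(1, max_bits + 1):
        if all(abs(lut(v) - quantize(v, bits)) < 1e-9 for v in test_values):
            return bits
    return max_bits

# An 8-bit LUT is detected as such:
print(estimate_lut_bits(lambda v: quantize(v, 8)))   # -> 8
print(estimate_lut_bits(lambda v: quantize(v, 10)))  # -> 10
```

In practice the "read back" step is a measurement with a colorimeter rather than a function call, which is why the report takes a few patches to run.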

    #2237

    Victor Wolansky
    Participant
    • Offline

Thanks. I will take a look tonight. I know for a fact that this 980 can output 10 bits via DisplayPort, but that might not be related to the internal LUT processing. Will post what I find.

    #2238

    p.dada
    Participant
    • Offline

In that case, please check both DVI @ 8-bit and DisplayPort @ 10-bit. Do not use madVR (madTPG) for the test patches, as its dithering algorithm raises the effective internal processing bit depth to 11 bits, even if the GPU only does 8-bit processing.
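The reason dithering skews such a test: adding sub-LSB noise before quantizing, then averaging over many frames, reproduces values that sit between the quantization steps, so the measured precision exceeds the LUT's native bit depth. A minimal sketch of that effect (the uniform ±0.5 LSB noise is an assumption for illustration; madVR's actual algorithm differs):

```python
import random

def quantize_8bit(v):
    """Round a 0..1 value to 8-bit precision."""
    return round(v * 255) / 255

target = 100.4 / 255            # a level 8 bits cannot represent exactly

plain = quantize_8bit(target)   # always snaps to 100/255, error ~0.4 LSB

# Temporal dithering: add random sub-LSB noise each "frame", then average.
random.seed(0)
frames = [quantize_8bit(target + (random.random() - 0.5) / 255)
          for _ in range(10000)]
dithered = sum(frames) / len(frames)   # lands much closer to 100.4/255
```

The eye (and an averaging measurement) integrates over frames the same way, which is why a dithered 8-bit pipeline can measure like an 11-bit one.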

    #2348

    Victor Wolansky
    Participant
    • Offline

But how would I know from the test what bit depth the GPU’s internal processing uses? Not sure I get that.

    #2350

    p.dada
    Participant
    • Offline

    Do a report on uncalibrated display and the program will tell you the video card’s LUT bit depth.

    #2387

    Victor Wolansky
    Participant
    • Offline

    Do a report on uncalibrated display and the program will tell you the video card’s LUT bit depth.

    Done that, but I do not see that information anywhere.

Although when I finish I get this report that says something about a 16-bit table of 3 channels, as you can see in the attached image, but then it says 256 entries per channel? Isn’t that just 8 bits?

Then it says something about device to PCS, with 3 channels and 2048 entries per channel.

What is PCS?

Then after that it says PCS to device, 3 channels, 4096 entries per channel, and the output table 256?

    It is a bit confusing.
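For what it’s worth, those numbers can be decoded: PCS is the ICC Profile Connection Space, the device-independent intermediate space (CIEXYZ or CIELAB) that the device-to-PCS and PCS-to-device tables convert through. An entry count of N per channel corresponds to a log2(N)-bit table index, while the “16 bits” describes the precision of each stored value, not the number of entries. A quick sketch of the entries-to-bits relation:

```python
import math

# Entry counts from the profile report map to index bit depths:
# a table with 2^n entries per channel is addressed by n bits.
for entries in (256, 2048, 4096):
    bits = int(math.log2(entries))
    print(f"{entries} entries per channel -> {bits}-bit index")
# 256 -> 8-bit, 2048 -> 11-bit, 4096 -> 12-bit
```

So a 256-entry table with 16-bit values is addressed at 8-bit precision but stores each output at 16-bit precision.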

    Attachments:
    #2391

    (ex)Deejjjaaaa/AlterEgo
    Participant
    • Offline

What are people on the NVIDIA forums saying? I mean NVIDIA’s own people…

    #2392

    p.dada
    Participant
    • Offline

    Do a report on uncalibrated display and the program will tell you the video card’s LUT bit depth.

    Done that, but I do not see that information anywhere.

Although when I finish I get this report that says something about a 16-bit table of 3 channels, as you can see in the attached image, but then it says 256 entries per channel? Isn’t that just 8 bits?

    That’s not the log window. That’s the profile info window. Go to “Tools” menu and click on “Show log window”. After you do the report on uncalibrated display, it should tell you the LUT bit-depth.


Display Calibration and Characterization powered by ArgyllCMS