Color calibration on Nvidia GPU
This topic has 10 replies, 4 voices, and was last updated 8 years ago by p.dada.
2016-03-06 at 0:26 #2194
Hi! One of the selling points of Eizo is that they claim to do a better correction because their hardware works internally at 16 bits, while GPUs do it only at 8 bits. I think that is probably not true anymore, but I can't find much information on the internet about it. Do you know at what bit precision the LUTs loaded on current NVIDIA chips are processed? Because maybe there is no longer any reason to have the corrections processed by different hardware when you have a dual monitor configuration like mine, and you could just do it all in the GPU.
Thanks!
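To make the precision question concrete, here is a minimal Python sketch (illustrative only, not any vendor's actual pipeline) of what happens when two gamma-style corrections are chained through an 8-bit versus a 16-bit intermediate stage:

```python
def roundtrip(levels, gamma=2.2, mid_bits=8):
    """Apply a gamma curve, quantize to mid_bits, then undo the curve.

    Mathematically the two curves cancel exactly, so any error left
    over comes purely from the intermediate quantization step.
    """
    steps = (1 << mid_bits) - 1
    out = []
    for v in levels:
        x = v / 255.0
        y = x ** (1.0 / gamma)          # first correction
        y = round(y * steps) / steps    # intermediate LUT precision
        z = y ** gamma                  # second correction undoes the first
        out.append(round(z * 255))      # final 8-bit output
    return out

ramp = list(range(256))
err8 = max(abs(a - b) for a, b in zip(roundtrip(ramp, mid_bits=8), ramp))
err16 = max(abs(a - b) for a, b in zip(roundtrip(ramp, mid_bits=16), ramp))

# With a 16-bit intermediate the ramp survives untouched; with an
# 8-bit intermediate some levels shift, which is what shows up as banding.
print("max error @ 8-bit:", err8, " @ 16-bit:", err16)
```

This is the essence of the Eizo argument: the more corrections get stacked at low internal precision, the more distinct levels get lost.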
2016-03-06 at 1:03 #2197
Do you know at what bit precision the LUTs loaded on current NVIDIA chips are processed?
It depends on the specific card, the type of connection, and maybe also driver settings. You’ll have to do some experimenting to find out if the card can output anything higher than 8 bits. Usually this at least requires a DisplayPort or HDMI >= 1.3 connection.
2016-03-06 at 1:23 #2198
I'm not talking about the output; I know I can output 10 bits to the Eizos. I'm talking about the internal precision for calculating the color calibration. They say that their LUTs are 16 bits and the ones that GPUs use are 8 bits. Hope it makes sense. I'm using a Quadro M6000 and a GTX 980 Titan.
2016-03-07 at 19:15 #2234
Last time I checked, only AMD consumer grade cards had 10-bit internal LUTs. Your GTX most probably has only an 8-bit internal LUT. That Quadro should do the job, though. In any case, you can check the GPU LUT bit depth yourself from within DisplayCAL (Tools -> Report on uncalibrated display device). Please post the results for that GTX 980.
2016-03-07 at 19:39 #2237
Thanks. I will take a look tonight. I know for a fact that this 980 can output 10 bits via DisplayPort, but that might not be related to the internal LUT processing. Will post what I find.
2016-03-07 at 20:11 #2238
In that case, please check both DVI @ 8-bit and DisplayPort @ 10-bit. Do not use madVR (madTPG) for the test patches, as its dithering algorithm elevates the internal processing bit depth to 11 bits, even if the GPU only does 8-bit processing.
This reply was modified 8 years, 1 month ago by p.dada.
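To illustrate why dithering matters here, a toy Python simulation (a crude stand-in, not madVR's actual algorithm): dithering trades per-sample accuracy for average accuracy, so an 8-bit signal can carry sub-LSB information.

```python
import random

def quantize(x, dither=False):
    """Simulate 8-bit quantization of a value in [0, 255],
    optionally adding uniform random dither before rounding."""
    if dither:
        x = x + random.uniform(-0.5, 0.5)
    return max(0, min(255, round(x)))

random.seed(0)
target = 100.3               # a level between two 8-bit steps
n = 100_000
avg_plain = sum(quantize(target) for _ in range(n)) / n
avg_dither = sum(quantize(target, dither=True) for _ in range(n)) / n

# Without dither every sample snaps to 100; with dither the samples
# alternate between 100 and 101, so their average approaches 100.3.
print(avg_plain, round(avg_dither, 2))
```

That averaging effect is why a dithered test patch can behave like it has more than 8 bits of precision, which would mask the GPU's true LUT depth in a measurement.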
2016-03-21 at 6:30 #2348
But how would I know from testing what bit depth the GPU's internal processing uses? Not sure I get that.
2016-03-21 at 7:45 #2350
Do a report on uncalibrated display and the program will tell you the video card's LUT bit depth.
2016-03-28 at 5:48 #2387
Do a report on uncalibrated display and the program will tell you the video card's LUT bit depth.
Done that, but I do not see that information anywhere.
Although when I finish I get this report that says something about a 16-bit table with 3 channels, as you can see in the attached image, but then it says 256 entries per channel? Isn't that just 8 bits?
Then it says something about device to PCS, with 3 channels and 2048 entries per channel. What is PCS?
Then after that it says PCS to device, 3 channels, 4096 entries per channel, and the output table 256?
It is a bit confusing.
Attachments:
2016-03-28 at 16:54 #2391
What are people on the NVIDIA forums saying? I mean NVIDIA's own people…
2016-03-28 at 17:12 #2392
Do a report on uncalibrated display and the program will tell you the video card's LUT bit depth.
Done that, but I do not see that information anywhere.
Although when I finish I get this report that says something about a 16-bit table with 3 channels, as you can see in the attached image, but then it says 256 entries per channel? Isn't that just 8 bits?
That’s not the log window. That’s the profile info window. Go to “Tools” menu and click on “Show log window”. After you do the report on uncalibrated display, it should tell you the LUT bit-depth.