8 bit vs. 10 bit
This topic has 7 replies, 3 voices, and was last updated 7 years, 1 month ago by Florian Höch.
2017-03-04 at 18:24 #6108
Hello
Do you think I could benefit in LUT quality by getting a 10 bit video card? … My HDTV is 8 bit, so will that nullify the effect, or does a 10 bit video card still offer some improvement in image quality (banding, etc.)?
Thanks!
2017-03-05 at 12:59 #6109
There is a high chance your GPU already supports a 10-bit LUT (10 bits per color, bpc), and you do benefit from it even on an 8-bit output device, because dithering reduces banding.
I will presume that by 10-bit video card you mean cards that support 10-bit output through the whole pipeline (10 bits per channel, often marketed as 30-bit color). These are Nvidia Quadro and AMD FirePro. Such output is, AFAIK, only supported in a few applications, and I doubt you will see a significant improvement with real content.
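To see why dithering lets an 8-bit link carry the benefit of a 10-bit LUT, here is a minimal NumPy sketch (an illustration only, not how any GPU actually implements it): straight truncation of a smooth ramp produces long flat runs (bands), while adding random noise before quantisation breaks them into many small transitions that the eye averages back into a smooth gradient.

```python
import numpy as np

# A smooth ramp, expressed at 8-bit scale but with sub-level
# (10-bit-like) precision: 4096 samples from 0.0 to 255.0.
ramp = np.linspace(0.0, 255.0, 4096)

# Straight truncation to 8 bits: neighbouring samples collapse onto the
# same code value, producing long flat runs -- visible banding.
truncated = np.floor(ramp).astype(np.uint8)

# Noise dithering before quantisation: each sample rounds up or down with
# probability equal to its fractional part, so local averages keep the
# sub-level precision and the flat runs are broken up.
rng = np.random.default_rng(0)
dithered = np.floor(ramp + rng.random(ramp.size)).clip(0, 255).astype(np.uint8)

# Count level transitions along the ramp: the truncated version changes
# only 255 times (once per code value), the dithered one far more often.
steps_trunc = np.count_nonzero(np.diff(truncated.astype(int)))
steps_dith = np.count_nonzero(np.diff(dithered.astype(int)))
print(steps_trunc, steps_dith)
```

The same trade is what a GPU with a dithered video pipeline makes: per-pixel values are slightly noisy, but over any small area the average matches the high-precision target.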
2017-03-05 at 20:16 #6111
Hi there, and thanks for the info! … I have an Intel i5 processor which has its own internal graphics processing. (Intel HD Graphics 4300)
Do you know if this type of graphics supports the 10 bit we are talking about? … Yes, bpc.
p.s. I thought the whole chain had to be 10 bit for this to work… Video card, program, and display?
Thanks again!
- This reply was modified 7 years, 2 months ago by Steve Smith.
2017-03-05 at 21:52 #6115
I too use Intel's HD GPU currently, and it only supports an 8-bit LUT; there are no Intel GPU Control Panel settings to change it.
For 10 bpc (a 10-bit LUT) your best bet would be any Radeon GPU; Nvidia GTXs are a big maybe (from my experience).
2017-03-05 at 21:59 #6116
Ok, thanks! … Maybe I should get an entry-level Radeon video card to provide me with better gradations (10 bit), even though I have an 8-bit panel HDTV … I’ve been calibrating with DisplayCAL for some time now, and it would be really nice to eliminate that medium banding I usually see in my gradient ramp test images after calibrating… So it was my integrated video processor that was holding me back! … I want to ‘pull out’ the potential of my new Samsung HDTV. 🙂
Thoughts?
2017-03-06 at 12:34 #6117
it would be really nice to eliminate that medium banding I usually see in my gradient ramp test images after calibrating
Note that in a lot of the cases I’ve looked at (ICC color management capable graphics applications), the banding was introduced by limited bit-depth processing in the application itself, not by the calibration. What apps should do is process image data at greater than 8 bits (or floating point) internally, and then dither down for display.
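As a small illustration of that last point (a NumPy sketch under assumed names, not how any particular application implements it): after internal float processing, a value can land exactly between two 8-bit codes. Plain quantisation discards the half step; dithered quantisation preserves it in the spatial average.

```python
import numpy as np

def dither_to_8bit(values, rng):
    """Quantise [0, 1] float data to 8 bits with noise dithering: each
    value rounds up or down with probability equal to its fractional
    part, so averages over an area preserve the extra precision."""
    scaled = np.asarray(values) * 255.0
    return np.floor(scaled + rng.random(scaled.shape)).clip(0, 255).astype(np.uint8)

rng = np.random.default_rng(42)

# An internal high-precision result that falls exactly between two 8-bit
# codes -- e.g. the output of a float gamma or matrix operation.
target = np.full(1000, 100.5 / 255.0)

hard = (target * 255.0).astype(np.uint8)  # plain truncation: always 100
soft = dither_to_8bit(target, rng)        # a mix of 100s and 101s

print(hard.mean())  # 100.0 -- the half step is simply lost
print(soft.mean())  # close to 100.5 -- preserved in the average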
2017-03-08 at 22:12 #6131
Can DCI-P3 be calibrated properly using DisplayCAL? … Should I enable this color space on my HDTV? – (HDMI UHD Color) (HDR+) mode.
2017-03-09 at 11:27 #6142
Can DCI-P3 be calibrated properly using DisplayCAL?
Sure, just select DCI-P3 as the source colorspace on the 3D LUT tab. Note though that for HDR, the correct choice is always Rec. 2020 (which is the container format). Also, calibrating for HDR may be difficult if the TV employs power/brightness limiting (A(S)BL), in which case you can try whether white level drift compensation helps.
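For context on what the 3D LUT tab produces: a 3D LUT maps each input RGB triple (e.g. DCI-P3-encoded values) to corrected display values by interpolating in a lattice of measured points. A minimal trilinear-interpolation sketch in NumPy (the function name and the identity LUT are illustrative, not DisplayCAL code):

```python
import numpy as np

def apply_3dlut(rgb, lut):
    """Look up one RGB triple (components in [0, 1]) in a 3D LUT of shape
    (n, n, n, 3) using trilinear interpolation -- the per-pixel operation
    a video pipeline performs when a 3D LUT is loaded."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    i0 = np.floor(pos).astype(int)
    i1 = np.minimum(i0 + 1, n - 1)
    f = pos - i0
    out = np.zeros(3)
    # Blend the 8 lattice points surrounding the input colour.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1.0 - f[0]) *
                     (f[1] if dg else 1.0 - f[1]) *
                     (f[2] if db else 1.0 - f[2]))
                out += w * lut[i1[0] if dr else i0[0],
                               i1[1] if dg else i0[1],
                               i1[2] if db else i0[2]]
    return out

# An identity 17x17x17 LUT (output = input), the trivial base case; a real
# 3D LUT would instead encode e.g. a DCI-P3 -> display transform.
n = 17
grid = np.linspace(0.0, 1.0, n)
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

print(apply_3dlut([0.25, 0.5, 0.75], lut))  # ~[0.25, 0.5, 0.75]
```

Because the lattice is coarse (commonly 17 or 33 points per axis), interpolation quality and the bit depth of the pipeline feeding the LUT both matter, which ties back to the dithering discussion above.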