Nvidia drivers – Color Accuracy Mode


  • #27324

    zunderholz
    Participant
    • Offline

    There is a new nvidia driver setting called “Color Accuracy Mode.” I’m not 100% sure how it works from the in-driver description.

    It sounds like the driver may be forcing the ICC profile/LUT to stay active at all times, which would hopefully mean the entire OS is color managed, but I'm not sure.

    #27333

    Patrick1978
    Participant
    • Offline

    I was also wondering exactly what this new option does.  The description provided is a bit of word salad.

    From playing with it a bit, it seems that the color controls (when available) are independent of the vLUT now. At least, the changes I was making weren't being overwritten by DisplayCAL's profile loader like they used to be.
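    If anyone wants to check what is actually sitting in the vLUT at a given moment instead of guessing, you can read the ramp back. A minimal sketch (Python on Windows via ctypes, calling the standard GDI GetDeviceGammaRamp; the linearity tolerance is just my own guess):

    import ctypes
    from ctypes import wintypes

    user32 = ctypes.windll.user32
    gdi32 = ctypes.windll.gdi32
    user32.GetDC.restype = wintypes.HDC              # avoid handle truncation on 64-bit Python

    hdc = user32.GetDC(None)                         # device context for the whole screen
    ramp = ((wintypes.WORD * 256) * 3)()             # 3 channels x 256 16-bit entries
    if not gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
        raise OSError("GetDeviceGammaRamp failed")
    user32.ReleaseDC(None, hdc)

    # A linear (identity) ramp maps entry i to roughly i * 257 (0..65535 over 256 steps);
    # anything else means a calibration/adjustment curve is currently loaded.
    linear = all(abs(ramp[ch][i] - i * 257) <= 257
                 for ch in range(3) for i in range(256))
    print("vLUT is linear (no calibration loaded)" if linear else "vLUT holds a calibration curve")

    Run it before and after toggling the new setting (or after closing the profile loader) to see whether the calibration curve survives.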

    #27338

    Vincent
    Participant
    • Offline

    Looks like:
    Reference: EDID emulation to sRGB
    Accurate: as always, but maybe dithered LUTs?
    Enhanced: as it has always been

    If you guys have a wide-gamut display, just switch to native gamut and enable Reference; the desktop wallpaper should look sRGB-like.
    For Accurate, just do the Lagom gradient test in MS Paint with a display that does not accept 10bit input (and no dither trick in the registry).
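    If you don't want to eyeball it in MS Paint, here is a rough sketch (Python with Pillow, assuming you have it installed) that writes a 0-255 grayscale ramp you can open full screen; with 8bit output, a non-linear LUT and no dithering you should see clear steps in it:

    from PIL import Image  # Pillow

    W, H = 1920, 1080
    img = Image.new("L", (W, H))                      # 8-bit grayscale
    img.putdata([round(x * 255 / (W - 1)) for _ in range(H) for x in range(W)])
    img.save("gradient.png")                          # open full screen and look for visible steps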

    Maybe they fixed all this stuff 15 years later… it’s never too late if they actually fixed it.

    #27486

    ivad24
    Participant
    • Offline

    Hi guys, what would be the best setting in Color Accuracy Mode to pair with the DisplayCAL profile loader? With the DisplayCAL loader, I want my monitor to display the same color accuracy it had with the old NVIDIA Control Panel.

    (first photo) I used to have this on the old NVCP with "Other applications control color settings" selected, since I did not want any enhancements brought in by NVIDIA.

    (second photo) Now I have the new NVCP with the Color Accuracy Mode setting. The question is: if I don't make any changes under "3. Apply color enhancements", would it have the same effect as the old NVCP default color settings I had before?

    #27520

    LtRoyalShrimp
    Participant
    • Offline

    Googled and found this – https://www.nvidia.com/content/Control-Panel-Help/vLatest/en-us/mergedProjects/nvdsp/CS_Adjust_Color_Settings_Advanced.htm

    Color Accuracy Mode
    Displays the current color accuracy mode – reference, accurate, or enhanced. The color accuracy mode depends on the color space pipeline used for processing each rendered pixel.

    • Reference:
      • Rendered pixels are processed through the GPU pipeline in 1:1 pixel color-matching mode. This pipeline is used if there are no pixel color enhancements from the OS or from color adjustment applications such as the NVIDIA Control Panel.
    • Accurate:
      • Rendered pixels are processed using one of the following:
        • Color space conversion (CSC) or custom gamma ramp via NvAPI
        • Windows Display Color Transform (DCT) with Microsoft Hardware Calibration (MHC) ICC profile applied
    • Enhanced:
      •  Rendered pixels are processed using one of the following:
        • Color adjustments through the NVIDIA Control Panel (see the Apply Color Adjustments section)
        • DCT is on but no MHC ICC color profile is associated with it

    Override to reference mode: Select this check box to override the current color processing and force the reference mode. When selected, color adjustments from the OS or other adjustment applications are ignored and rendered pixels are processed through the GPU pipeline in 1:1 pixel matching mode.

    Apply Color Enhancements
    NOTE: If the Override to reference mode check box is selected, these controls are set to default values but are greyed out and cannot be changed.

    #27521

    Darkmatter
    Participant
    • Offline

    Interesting.

    So is this in both the Game Ready drivers and the Studio 10bit drivers?

    Why they don’t just enable 10bit as an option in the Game Ready drivers is beyond me…

    BTW, is there any game performance hit to using the Studio drivers? I’m sure there is if you have it set to 10bit, as the card has a lot more colours to deal with, but what if you set the Studio driver to 8bit?

    In fact, if it wasn’t for the greater general output of the 3000 series cards (so far) and the rendering capabilities of the 2nd gen ray trace, I’d seriously consider going AMD. They not only have 10bit by default, but if I remember correctly, they also offer 8bit + dither, which, for some totally unknown reason, Nvidia doesn’t… 0.o’

    I know AMD’s next gen cards have ray trace, but I’m assuming, just like Nvidia’s 2000x series cards, it was too big of a FPS hit to use on much. Such is the fate of most 1st gen advancements. 🙁

    #27538

    Patrick1978
    Participant
    • Offline

    Why they don’t just enable 10bit as an option in the Game Ready drivers is beyond me

    10bit opengl support was added to the game ready drivers a little over a month after it was added to the studio drivers.  Nvidia has supported 10bit output in directx for a long time.  The 10bit opengl support was important since that’s what creative apps like photoshop use.  The desktop and most applications don’t benefit from having the output set to 10 bit (unless you have HDR turned on I guess) since they don’t use either api that is required.

    As best I can tell, the only difference between the Studio and Game Ready drivers is that game fixes/improvements get pushed to the Game Ready drivers first and then to the next Studio driver, while creative application fixes/improvements get pushed to the Studio drivers first and then to the next Game Ready driver. It's just a matter of which fixes you need right away.

    #27539

    Darkmatter
    Participant
    • Offline

    I didn’t know they added that recently.

    Thanks for the heads up!

    DM

    #27540

    Vincent
    Participant
    • Offline

    Why they don’t just enable 10bit as an option in the Game Ready drivers is beyond me

    10bit opengl support was added to the game ready drivers a little over a month after it was added to the studio drivers.  Nvidia has supported 10bit output in directx for a long time.  The 10bit opengl support was important since that’s what creative apps like photoshop use.  The desktop and most applications don’t benefit from having the output set to 10 bit (unless you have HDR turned on I guess) since they don’t use either api that is required.

    Nvidia truncates output to the output bit depth with no dither, hence it is important even if you only use desktop apps that compose images in 8bit. A side effect is that if a display does not accept 10bit input you may not be able to configure 10bpc, and you'll have banding (unless you use the dither trick).
    Weird as it has always been with Nvidia and their poor color quality once you start playing with LUTs.
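    Rough illustration of the truncation problem (plain Python, the correction curve is purely hypothetical): push an 8bit gradient through a 16bit LUT and truncate back to 8bit with no dither, and some levels simply disappear:

    # Hypothetical correction curve (a mild gamma tweak), stored at 16-bit precision
    lut = [round(((i / 255) ** 1.1) * 65535) for i in range(256)]

    # 8-bit input -> LUT -> truncate back to 8 bits with no dithering
    out = [lut[i] >> 8 for i in range(256)]
    print(len(set(out)), "distinct output levels out of 256")   # fewer than 256 -> visible banding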

    OTOH, an AMD card just needs its proprietary drivers and it's banding-free as long as you use the DisplayCAL loader, no matter how many bits the HW link uses.

    ****

    Regarding the post from @LtRoyalShrimp, it seems it works the opposite of what I guessed: Reference = the old behaviour, Accurate = EDID translation.

    Reference: Rendered pixels are processed through the GPU pipeline in 1:1 pixel color-matching mode. This pipeline is used if there are no pixel color enhancements from the OS or from color adjustment applications such as the NVIDIA Control Panel.

    Accurate: Rendered pixels are processed using one of the following:
    • Color space conversion (CSC) or custom gamma ramp via NvAPI
    • Windows Display Color Transform (DCT) with Microsoft Hardware Calibration (MHC) ICC profile applied

    But if somebody has an Nvidia card with a wide-gamut display configured in a native colorspace preset, it would be better to confirm it. Just put 3 big patches in MS Paint with R 255, G 255, B 255 respectively and see how they vary as you change that NVIDIA panel setting. Then share here if you wish, so other Nvidia owners can configure their P3 multimedia displays to play Rec. 709/sRGB games as intended.
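    To save someone a minute in MS Paint, a tiny sketch (Python with Pillow again, just an assumption that you have it) that builds the three full red/green/blue patches side by side:

    from PIL import Image, ImageDraw  # Pillow

    img = Image.new("RGB", (1920, 1080))
    draw = ImageDraw.Draw(img)
    for i, colour in enumerate([(255, 0, 0), (0, 255, 0), (0, 0, 255)]):
        draw.rectangle([i * 640, 0, (i + 1) * 640 - 1, 1079], fill=colour)
    img.save("rgb_patches.png")                       # view full screen while switching NVCP modes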

    ****

    Also Merry Xmas to all DisplayCAL community.

    #27544

    Vincent
    Participant
    • Offline

    With "you'll have banding (unless dither trick)" I meant you'll have it if you modify the LUT contents from their default linear values. I think it was implicit since this is the DisplayCAL forum… but just in case.

    #27547

    Darkmatter
    Participant
    • Offline

    With "you'll have banding (unless dither trick)" I meant you'll have it if you modify the LUT contents from their default linear values. I think it was implicit since this is the DisplayCAL forum… but just in case.

    Hey Vincent! Merry Christmas!

    I do have a 10bit display, but as you said, a lot of things in a Windows environment only render in 8bit, which is usually fine since the programs where you'd care about 10bit are built to support it.

    I was curious about your comment on Nvidia and LUTs. LUTs are pretty much always non-linear, correct? Because no monitor outputs a perfectly linear curve, which is at least partly why we do corrections. So, I guess my question is, can you use Nvidia with 10bit in the drivers, with a LUT, on a true 10bit display without banding?

    I suppose my question is more for colour-critical work in programs such as PS, so I'm guessing I should be OK, but I wanted your opinion. I'd love to have no banding at all, but honestly, the amount of time I've had to play a game these last few years has been so minimal I don't know if I'd even care. lol

    I do know of the dither trick, but from what I've read (which admittedly was a while ago), it's become increasingly hard to do, or to keep activated, as MS updates Windows 10.

    Thanks for any input you can provide, and I hope you and your family are doing well! And everyone for that matter. 🙂

    BTW (and I loathe doing this… truly, I feel like I need a shower now), do you have any advice about the gobbledygook answer I got via email regarding hardware support for the i1Display Pro Plus for my monitor? I put it in General Discussion because it wasn't specifically a DisplayCAL issue.

    What would be really awesome, but unlikely, would be for ArgyllCMS to be able to use a monitor's native hardware chip to do corrections, but I'm guessing that would require a lot of code hacking. :/

    DM


    #27550

    Vincent
    Participant
    • Offline

    With "you'll have banding (unless dither trick)" I meant you'll have it if you modify the LUT contents from their default linear values. I think it was implicit since this is the DisplayCAL forum… but just in case.

    Hey Vincent! Merry Christmas!

    I do have a 10bit display, but as you said, a lot of things in a Windows environment only render in 8bit, which is usually fine since the programs where you'd care about 10bit are built to support it.

    I was curious about your comment on Nvidia and LUTs. LUTs are pretty much always non-linear, correct? Because no monitor outputs a perfectly linear curve, which is at least partly why we do corrections. So, I guess my question is, can you use Nvidia with 10bit in the drivers, with a LUT, on a true 10bit display without banding?

    Video card LUTs with no custom GPU calibration are linear in the sense of input = output; I was not talking about the actual display output.

    AFAIK 10bpc in the Nvidia panel is only enabled if you can make a link between the card and the monitor at 10bit, whether or not any app can use it. Check it, maybe I'm wrong. Also test it with DisplayCAL's "uncalibrated display report".

    I suppose my question is more for colour-critical work in programs such as PS, so I'm guessing I should be OK, but I wanted your opinion. I'd love to have no banding at all, but honestly, the amount of time I've had to play a game these last few years has been so minimal I don't know if I'd even care. lol

    I do know of the dither trick, but from what I've read (which admittedly was a while ago), it's become increasingly hard to do, or to keep activated, as MS updates Windows 10.

    PS can have banding even if calibration is perfect, whether because it is a perfect HW cal or because you have an AMD card with dither. THAT particular banding, not caused by calibration at all, is caused by rounding errors and truncation while color managing. There are several ways to get rid of it:

    -PS's 10bit OpenGL mode, drawing 30bit OpenGL surfaces over a desktop composed at 8bit, through a 30bit link between GPU and display

    -Dithering at app level, before going to the GPU LUTs: Adobe Camera Raw (yes, even in PS, which otherwise does not support it, funny guys at Adobe), Adobe LR, madVR, Capture One, etc.
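    A crude way to see what app-level dithering buys (plain Python, numbers purely illustrative): a value that 8bit cannot represent exactly is simply lost by plain rounding, but it survives in the average once noise is added before quantization, and the eye integrates that average over neighbouring pixels:

    import random

    def to8(v, dither=False):
        # quantize a 0..255 float to an integer step, optionally adding +/-0.5 of noise first
        if dither:
            v += random.uniform(-0.5, 0.5)
        return max(0, min(255, round(v)))

    target = 100.4                                    # a level 8 bits cannot represent exactly
    plain = [to8(target) for _ in range(10000)]
    dith = [to8(target, dither=True) for _ in range(10000)]
    print(sum(plain) / len(plain))                    # always 100.0 -> the .4 is gone, areas band
    print(sum(dith) / len(dith))                      # ~100.4 on average -> the eye smooths it out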

    Thanks for any input you can provide, and I hope you and your family are doing well! And everyone for that matter. 🙂

    BTW (and I loathe doing this… truly, I feel like I need a shower now), do you have any advice about the gobbledygook answer I got via email regarding hardware support for the i1Display Pro Plus for my monitor? I put it in General Discussion because it wasn't specifically a DisplayCAL issue.

    What would be really awesome, but unlikely, would be for ArgyllCMS to be able to use a monitor's native hardware chip to do corrections, but I'm guessing that would require a lot of code hacking. :/

    DM

    You'll need a monitor SDK to upload LUTs to the monitor, as HP or Dell have. There are threads about this from less than a month ago. If Asus provides no SDK, the only way would be reverse engineering the Asus app and its DLLs… and no one will do that, since the effort required can be measured in $ or euros, and with that money people will just buy an Eizo and skip all the issues.

    PS: Asus has an SDK, but AFAIK it is only provided to friendly vendors like LightSpace or Calman for integration with their tools. AFAIK there is no public SDK like Dell's or HP's.

    #27552

    Darkmatter
    Participant
    • Offline

    Hi Vincent, thanks.

    The funny thing about Asus is that they DO have support for the Pro Plus on their ultra-high-end monitors, along with a v2 of their calibration software, which is the only way to take advantage of the hardware calibration in the monitor.

    The thing is, the post I made was about the answer I got as to why they can't do that for the PA329C, and I suppose, to some extent, why you can only get the v1.x calibration software.

    It sounds like a lot of roundabout talk for, "we don't think your monitor is worth the money to add the support."

    Also, X-Rite is a friendly company to Asus, as Asus has several (very high-end) monitors that come with an X-Rite i1D Pro. Is this likely some sort of licensing issue where Asus would have to pay X-Rite to integrate the Pro Plus for the PA329C?

    In the end, I’ll probably get a Pro Plus as it’s much newer tech and more future proof. Besides, the chip in the monitor is useless if the software sucks. :/

    #27580

    benjamin
    Participant
    • Offline

    Since Nvidia released this driver feature, applying the colour profile via Windows colour management allows my profile to stay active in games that used to block it (Dying Light, Destiny 2 in fullscreen mode). Once the profile is set in Windows, it seems to stay working in all games.

    I was wondering… for a long time I have been using the DisplayCAL profile loader to automatically reapply the colour profile and keep it active. That no longer seems necessary. I can close the profile loader and my colour profile stays active.

    For us gamers, who calibrated using DisplayCAL to correct the grayscale/gamma in games and on the desktop, I'm wondering: now that Windows is able to do the job itself, is using the profile loader pointless?

    #27583

    ivad24
    Participant
    • Offline

    @benjamin

    Nope, you will still need the DisplayCAL profile loader. I just tested it with the blueish-yellowish cLUT .icm file WITHOUT the DisplayCAL loader, and it didn't change the colors of the game (Twin Mirror) I was playing; when I used the DisplayCAL loader, the blueish-yellowish ICM was displayed in the game. Windows is still unreliable.



