OLED laptop calibration possible without banding?


  • #22899

    DaveK
    Participant

I have an HP Spectre x360 laptop with a 4K OLED screen and integrated Intel plus NVIDIA graphics. Out of the box the calibration is not great: the gamma is too high, especially in HDR mode (2.5+), and inconsistent across levels. I have tried loading ICM LUT files with 16-bit precision using Windows color management, but they are apparently truncated to 8 bits and cause obvious banding, in both SDR and HDR mode. Apparently the integrated Intel graphics is responsible for applying the LUTs, and anything rendered by the NVIDIA GPU is passed through. No color controls are available in the NVIDIA control panel, and only very limited ones in the Intel control panel (not even gamma, or any gamut settings).

My question: is it possible to calibrate the display in a way that avoids the banding caused by 8-bit truncation? Can it be calibrated that way for both SDR and HDR modes (maybe with separate profiles, since applying ICM LUTs behaves differently between the two)?

Note: I created the ICM files myself (so I know they are 8->16 bit LUTs created from smooth curves), and have not yet tried to calibrate with DisplayCAL and a colorimeter. I’d like to know if it’s possible to eliminate banding before I bother buying a colorimeter, or if the hardware is limited to LUTs with 8-bit output. Even in 10-bit HDR mode, the 8->16 bit LUTs are applied with interpolation that introduces banding, yet they are still 10 bits in and at least 10 bits out. It’s as if it first truncates the entries to 8 bits, then interpolates from there – an identity ramp shows no banding, but bumping a single entry up enough to reach the next 8-bit step causes an interpolated bump in the 10-bit output, something like four steeper steps then four flat ones.
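To illustrate what I think is happening, here is a quick simulation of that truncate-then-interpolate behaviour (my own sketch based on what I observed, not actual driver code; the bumped entry and values are just examples):

# Sketch: apply a 256-entry 16-bit LUT to a 10-bit value as if the
# driver truncated each entry to 8 bits before interpolating.
def simulate(lut16, x10):
    lut8 = [v >> 8 for v in lut16]        # truncate entries to 8 bits
    pos = x10 / 1023 * 255                # position within the 256 entries
    i = int(pos)
    frac = pos - i
    lo, hi = lut8[i], lut8[min(i + 1, 255)]
    # linear interpolation between truncated entries, rescaled to 10 bits
    return round((lo + (hi - lo) * frac) * 1023 / 255)

identity = [i * 257 for i in range(256)]  # smooth 16-bit identity ramp
bumped = identity[:]
bumped[100] += 256                        # just enough to reach the next 8-bit step

for x in range(396, 408):                 # 10-bit inputs around entry 100
    print(x, simulate(identity, x), simulate(bumped, x))
# The identity ramp passes through unchanged, but the bumped LUT gives a
# few steeper 10-bit steps followed by a flat run – the pattern described.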

    I have done help chats with HP, NVIDIA, and Intel support staff, and none of them understood this well enough to answer my questions.

    #22909

    Vincent
    Participant

If the laptop GPU+CPU has enough computing power you could try madVR for videos and a compatible player like MPC-BE. DisplayCAL can take your profile as input to make a LUT3D for it. madVR should be able to use general-purpose computing on the GPU to do the calculations at high precision, then dither down to 8 bits, as an AMD card does in its hardware 1D LUTs.
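As a rough illustration of why the dither step matters (a toy model, not madVR’s actual algorithm):

import random

def quantize(v):                          # plain truncation to an 8-bit code
    return int(v * 255)

def quantize_dithered(v):                 # add sub-LSB noise before truncating
    return min(255, max(0, int(v * 255 + random.random())))

# a shallow gradient that falls between two adjacent 8-bit codes
values = [0.5 + i / 5000 for i in range(20)]

print([quantize(v) for v in values])           # long flat runs: visible bands
print([quantize_dithered(v) for v in values])  # 127/128 mixed: averages out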

For photos in LR/Capture One, just profiling the display without calibration could solve the issue, since they do dithering too, and you trust all corrections to their color management engine. I don’t know if the display is too far off to be corrected that way.

    For desktop or other apps, I’m afraid that there is no solution to banding.

    #22915

    DaveK
    Participant

Vincent wrote:

If the laptop GPU+CPU has enough computing power you could try madVR for videos and a compatible player like MPC-BE. DisplayCAL can take your profile as input to make a LUT3D for it. madVR should be able to use general-purpose computing on the GPU to do the calculations at high precision, then dither down to 8 bits, as an AMD card does in its hardware 1D LUTs.

For photos in LR/Capture One, just profiling the display without calibration could solve the issue, since they do dithering too, and you trust all corrections to their color management engine. I don’t know if the display is too far off to be corrected that way.

For desktop or other apps, I’m afraid that there is no solution to banding.

    Thanks for the response.  From what you are saying, it sounds like the hardware simply won’t do any better than I described (always truncating to 8 bits).  That’s quite disappointing, especially for a display that’s otherwise very impressive, and makes it more unforgivable for them not to have calibrated it well out of the box.  There’s obviously a layer that takes the graphics card output and modulates the OLED pixels, but I assume it wouldn’t be possible to tweak the calibration at that level.  It sure would be nice if they exposed a way to do that.  Basically the laptop display is like a monitor with absolutely no user adjustment or calibration capability whatsoever, relying on the graphics card and drivers which are insufficient.

I did find a couple of video players that could at least adjust the gamma in software, and I think one that would take a calibration, but the video I was testing with was at the limit of what it could play with no adjustments (4K HDR @ 60 fps – it had to be transcoded to H.265 since the player could hardly manage 30 fps in VP9), and any processing at all makes it drop a lot of frames. HD or non-HDR 4K @ 30 fps might work.

I figured I could use the calibration in software in Photoshop or whatever (I haven’t installed it yet), but it would be a pain to always have to use specific picture viewers to see images correctly. For example, just looking at images in a browser would still be banded, unless there’s a browser that applies the correction in software.

    #23150

    DaveK
    Participant

Following up on this – I installed Photoshop and found that it won’t make use of the 10-bit or HDR capabilities of the display. It behaves like any other 8-bit SDR application when I use HDR mode – nothing above nominal white, and 8-bit output, even with 30-bit mode selected under the advanced graphics processor settings. I also found that it will not dither the display output when applying color profiles (in HDR or SDR mode), or when displaying 16+ bit-per-channel images. So it introduces banding when doing color correction just like the hardware LUTs do, which was very disappointing.

However, I found that using 16-bit mode and adding a top layer with just the right amount of noise, using Linear Light blending at low opacity, essentially forces it to dither, and the banding disappears. If the settings are just right, it introduces no more noise than what’s necessary to interpolate between 8-bit levels. I guess that’s what I’ll have to do when editing in PS. That doesn’t help for displaying images in any other application, though; they will always be banded when color correction is applied, unless the application can load the profile and use high precision and dithering in software.
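For what it’s worth, here is a toy model of why the noise layer works – my own approximation of the Linear Light blend math, not Adobe’s implementation:

import random

def linear_light(base, blend, opacity):
    # Linear Light: base + 2*(blend - 0.5), mixed in at the layer opacity,
    # so a mid-gray noise layer just adds zero-mean noise of amplitude
    # 2*opacity to the 16-bit image.
    blended = base + 2.0 * (blend - 0.5)
    return base + opacity * (blended - base)

def to_8bit(v):                           # truncation, as the display path does
    return min(255, max(0, int(v * 255)))

gradient = [0.5 + i / 5000 for i in range(20)]   # spans less than one 8-bit step

print([to_8bit(v) for v in gradient])             # two flat runs: a band edge
print([to_8bit(linear_light(v, random.random(), 0.002)) for v in gradient])
# opacity 0.002 gives noise of about +/-0.5 of an 8-bit step: just enough
# to dither the band edge away without adding visible grain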

    #23152

    Vincent
    Participant

DaveK wrote:

Thanks for the response. From what you are saying, it sounds like the hardware simply won’t do any better than I described (always truncating to 8 bits). That’s quite disappointing, especially for a display that’s otherwise very impressive, and makes it more unforgivable for them not to have calibrated it well out of the box. There’s obviously a layer that takes the graphics card output and modulates the OLED pixels, but I assume it wouldn’t be possible to tweak the calibration at that level. It sure would be nice if they exposed a way to do that. Basically the laptop display is like a monitor with absolutely no user adjustment or calibration capability whatsoever, relying on the graphics card and drivers which are insufficient.

That is like any other laptop, with very few exceptions that allow hardware calibration using vendor tools.

DaveK wrote:

Following up on this – I installed Photoshop and found that it won’t make use of the 10-bit or HDR capabilities of the display. It behaves like any other 8-bit SDR application when I use HDR mode – nothing above nominal white, and 8-bit output, even with 30-bit mode selected under the advanced graphics processor settings.

That is like any other computer for HDR, and like any other computer and monitor for 10-bit if you do not meet Photoshop’s requirements for it: a 10-bit OpenGL driver plus a 10-bit link to the display (although a driver that accepts 10-bit input could dither down to an 8-bit link). Currently that is limited to Quadro/FirePro/GeForce (with the Studio driver) plus a 10-bit link to a display that supports 10-bit input, or newer Macs that either support 10-bit or dither in the OpenGL driver to the embedded 8-bit displays in their laptops.

DaveK wrote:

I also found that it will not dither the display output when applying color profiles (in HDR or SDR mode), or when displaying 16+ bit-per-channel images. So it introduces banding when doing color correction just like the hardware LUTs do, which was very disappointing.

That is Photoshop’s fault; see other threads. It is not related to your computer.
Lightroom (develop module), the ACR window in Photoshop, and Capture One do dither, so all is fine ***if*** GPU calibration causes no banding, as with a GPU that has more than 8 bits per LUT entry plus dithering (desktop AMD cards, and maybe others) or a monitor with reliable hardware calibration.


