OLED laptop calibration possible without banding?


Viewing 3 posts - 1 through 3 (of 3 total)
  • Author
    Posts
  • #22899

    DaveK
    Participant

    I have an HP Spectre x360 laptop with a 4K OLED screen and integrated Intel plus NVIDIA graphics. Out of the box the calibration is not great: the gamma is too high, especially in HDR mode (2.5+), and inconsistent across levels. I have tried loading ICM files containing 16-bit LUTs using Windows color management, but they are apparently truncated to 8 bits and cause obvious banding in both SDR and HDR mode. Apparently the integrated Intel graphics is responsible for applying the LUTs, and anything rendered on the NVIDIA GPU is passed through. No color controls are available in the NVIDIA control panel, and only very limited ones in the Intel control panel (not even gamma, or any gamut settings).

    My question: is it possible to calibrate the display in a way that avoids the banding caused by 8-bit truncation? And can it be calibrated for both SDR and HDR modes (maybe with separate profiles, since applying ICM LUTs behaves differently between the two)?

    Note: I created the ICM files myself (so I know they contain 8-bit-in, 16-bit-out LUTs built from smooth curves), and I have not yet tried to calibrate with DisplayCAL and a colorimeter. I’d like to know whether it’s possible to eliminate the banding before I bother buying a colorimeter, or whether the hardware is limited to LUTs with 8-bit output. Even in 10-bit HDR mode, the 8→16-bit LUTs are applied with interpolation that introduces banding, yet the pipeline is still 10 bits in and at least 10 bits out. It’s as if the driver first truncates the LUT outputs to 8 bits, then interpolates from there: an identity ramp shows no banding, but bumping a single entry up enough to cross to the next 8-bit step causes an interpolated bump in the 10-bit output, something like four steeper steps followed by four flat ones.
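    To make the suspected behavior concrete, here is a minimal simulation of the truncate-then-interpolate pipeline described above. This is a sketch of my hypothesis, not documented driver behavior: the LUT values and the bump size are illustrative.

```python
# Hypothetical model of the observed behavior: the driver quantizes the
# 16-bit LUT outputs to 8 bits *before* interpolating to 10-bit output,
# so a small per-entry bump becomes a staircase in the result.
import numpy as np

# 256-entry identity LUT with 16-bit outputs (entry i maps to i * 257)
lut16 = np.round(np.linspace(0, 65535, 256)).astype(np.uint16)
lut16_bumped = lut16.copy()
lut16_bumped[128] += 300  # enough to cross to the next 8-bit step

def apply_lut_10bit(lut16, x10):
    """Truncate LUT outputs to 8 bits, then linearly interpolate for a
    10-bit input (0..1023). Models the suspected pipeline only."""
    lut8 = (lut16 >> 8).astype(np.float64)   # truncate outputs to 8 bits
    pos = x10 / 1023 * 255                   # position within the 256-entry LUT
    i = np.minimum(pos.astype(int), 254)
    frac = pos - i
    out = lut8[i] * (1 - frac) + lut8[i + 1] * frac   # interpolate 8-bit values
    return np.round(out / 255 * 1023).astype(int)     # scale back to 10 bits

x10 = np.arange(1024)
ident = apply_lut_10bit(lut16, x10)          # identity ramp: no banding
bumped = apply_lut_10bit(lut16_bumped, x10)  # one bumped entry: local staircase

print(np.array_equal(ident, x10))
print(np.abs(bumped - x10).max())
```

    Under this model the identity ramp survives untouched, while the single bumped entry deviates by several 10-bit steps around the bump, matching the "steeper steps then flat" pattern I described.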

    I have done help chats with HP, NVIDIA, and Intel support staff, and none of them understood this well enough to answer my questions.

    #22909

    Vincent
    Participant

    If your laptop’s GPU+CPU has enough computing power you may try madVR for videos, with a compatible player like MPC-BE. DisplayCAL can take your profile as input to make a LUT3D for it. madVR should be able to use general-purpose computing on the GPU to do the calculations at high precision, then dither the result down to 8 bits, much as AMD does in its hardware 1D LUTs.
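    A minimal sketch of why dithering a high-precision correction down to 8 bits avoids the band edges that plain rounding creates (the target level here is made up for illustration):

```python
# Why dithering to 8 bits avoids banding: a flat patch whose ideal
# corrected value falls between two 8-bit codes.
import numpy as np

rng = np.random.default_rng(0)

target = 100.37   # hypothetical corrected level on the 8-bit scale
n = 100_000       # pixels in the patch

rounded = np.full(n, np.round(target))        # plain rounding: every pixel 100
dithered = np.floor(target + rng.random(n))   # 1 LSB random dither: mix of 100 and 101

print(rounded.mean())    # stuck at 100.0, a persistent 0.37 LSB error
print(dithered.mean())   # averages out near 100.37, so the eye sees the in-between level
```

    Plain rounding makes every pixel of the patch snap to the same code, so adjacent patches produce visible bands; with dithering the spatial average tracks the ideal value, which is why madVR-style processing can use an 8-bit output without banding.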

    For photos in LR/Capture One, just profiling the display without calibrating it could solve the issue, since they dither too, and you trust all corrections to their color management engine. I don’t know whether your display is too far off to be corrected that way.
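    For illustration, here is roughly what such apps do internally: convert pixels through the display profile in software rather than relying on the GPU’s LUT. This sketch uses Pillow’s ImageCms with the built-in sRGB profile on both ends just so it runs anywhere; in practice the second profile would be your DisplayCAL display profile (the file name below is hypothetical).

```python
# Software color management sketch: transform image pixels through a
# display profile on the CPU. Both ends are sRGB here for a runnable demo;
# replace the output profile with your measured display profile, e.g.
# ImageCms.getOpenProfile("my-oled.icm")  (hypothetical path).
from PIL import Image, ImageCms

srgb = ImageCms.createProfile("sRGB")
transform = ImageCms.buildTransform(srgb, srgb, "RGB", "RGB")

im = Image.new("RGB", (64, 64), (180, 90, 30))
corrected = ImageCms.applyTransform(im, transform)
print(corrected.size, corrected.mode)
```

    The key point is that the conversion happens per-pixel at the application level, so it is unaffected by the Intel driver’s 8-bit LUT truncation.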

    For desktop or other apps, I’m afraid that there is no solution to banding.

    #22915

    DaveK
    Participant


    Thanks for the response. From what you are saying, it sounds like the hardware simply won’t do any better than I described (always truncating to 8 bits). That’s quite disappointing, especially for a display that’s otherwise very impressive, and it makes it even less forgivable that they didn’t calibrate it well out of the box. There’s obviously a layer that takes the graphics card output and modulates the OLED pixels, but I assume it wouldn’t be possible to tweak the calibration at that level. It sure would be nice if they exposed a way to do that. Basically the laptop display is like a monitor with no user adjustment or calibration capability whatsoever, relying on graphics hardware and drivers that are insufficient.

    I did find a couple of video players that could at least adjust the gamma in software, and I think one that would accept a calibration, but the video I was testing with was at the limit of what the machine could play with no adjustments (4K HDR @ 60 fps; it had to be transcoded to H.265 since it could hardly manage 30 fps in VP9), and any processing at all makes it drop a lot of frames. HD, or non-HDR 4K @ 30 fps, may work.

    I figured I could apply the calibration in software in Photoshop or the like (I haven’t installed it yet), but it would be a pain to always have to use specific picture viewers to see images correctly. Just looking at images in a browser wouldn’t work, for example, unless there’s a browser that applies the correction in software.



Display Calibration and Characterization powered by ArgyllCMS