Huge inconsistency with DisplayCal

  • #19213

    Mike8040
    Participant
    • Offline

    Hello,

    Unfortunately I’m really disappointed with the results, as there is huge banding and there are color tints that weren’t there in the first place. I use an LG 34UM88C-P with an IPS panel, and a new EIZO EX3 for calibration. The LG’s stock settings are already good, but I wanted to take it further (I’m a professional 3D artist, and clients are getting picky about color proofs). No printing is involved here. I need a good gradient/contrast balance.

    The problem is: every calibration has a huge tint variation (sometimes too much yellow, sometimes too much red) and always shows banding (checked in the Windows 10 1803 native photo viewer and Photoshop CC). There is no consistency, which is a huge trust killer. For now the display’s stock configuration beats the calibration. (There is no chance of changing to an AMD graphics card, as it is inferior for 3D work, in case anyone sees the need to advise me on that.)

    What I’ve done:

    • Used a tutorial (Google’s first suggestion) to set up DisplayCAL
    • Tested two room light conditions: daylight, and a work lamp in the evening
    • Used my own DisplayCAL settings, guided by the tooltip descriptions
    • Set the Nvidia driver to 8 bit after reading the threads here
    • Restarted and retested
    • Ran a ridiculous 10-hour calibration (calibration speed “low” / profile quality high / 570 patches) – best result so far regarding color tint, but still banding

    Unfortunately the EX3 doesn’t work with hardware calibration through TCP, which I would prefer by now. DisplayCAL is ridiculously time-consuming, and you need to study the tool before you can use it… I really don’t want to go crazy over this, but at the very least the calibrations should be nearly identical to one another.

    Any idea what bothers DisplayCal here?

    The self check reports hint at a lot of bad values, while the measurement report is OK value-wise. What’s the difference anyway? Why not just ONE verification test?

    Sorry for bothering you; it’s getting frustrating now.

    #19226

    Vincent
    Participant
    • Offline

    EIZO EXs are rebranded Spyders, and up to and including the Spyder5 those have a bad reputation. If, using the default corrections available for them, white does not look white, then:

    a) Try to borrow or rent a spectrophotometer and create a custom matrix correction for each of your displays with your EX (see the sketch after this list).
    b) Try to borrow or rent an i1d3 and use the default White LED correction for common displays.
    c) Do b), and then also apply a) for that i1d3.
    d) Use the visual whitepoint editor: get close to the target with the colorimeter, then tweak visually until it looks “white” (no magenta or green tint).
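
    For illustration only, here is a minimal numpy sketch of what such a matrix correction amounts to: fit a 3×3 matrix that maps the colorimeter’s XYZ readings onto the spectrophotometer’s XYZ readings for a handful of patches. The patch values are invented placeholders, and this is not the exact math ArgyllCMS uses.

```python
# Hedged sketch: least-squares fit of a 3x3 correction matrix M so that
# M @ xyz_colorimeter ~= xyz_spectro for the same patches.
# All numbers are invented placeholders for illustration.
import numpy as np

# Rows = patches (e.g. R, G, B, white); columns = X, Y, Z
xyz_colorimeter = np.array([
    [41.2, 21.3,   1.9],
    [35.8, 71.5,  11.3],
    [18.0,  7.2,  95.0],
    [95.0, 99.9, 108.0],
])
xyz_spectro = np.array([
    [41.0, 21.0,   1.8],
    [35.2, 71.0,  11.9],
    [18.4,  7.6,  96.5],
    [94.8, 100.0, 109.2],
])

# Solve xyz_colorimeter @ M.T ~= xyz_spectro in a least-squares sense
M_T, *_ = np.linalg.lstsq(xyz_colorimeter, xyz_spectro, rcond=None)
M = M_T.T
corrected = xyz_colorimeter @ M.T   # corrected colorimeter readings
print(np.round(M, 4))
```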

    **********

    Regarding banding: banding can be caused by calibration or by color management. I think you are saying that you tested gradients with the latest Windows 10 default photo app, so banding is visible under NO color management. That is calibration banding.
    If your LG 34″ accepts 10-bit input, try configuring 10 bpc in the Nvidia panel. It should avoid the worst part of the LUT content truncation. Otherwise, there was a thread here about enabling dithering at the GPU outputs on Nvidia via the registry. Search for it and try.

    If it does not solve your issues try a better colorimeter.


    #19227

    Florian Höch
    Administrator
    • Offline

    Hi,

    I’m using a new EIZO EX3 for calibration

    That’s an OEM version of a Spyder5. Not the best device: slow, not very accurate, and it cannot read very low light levels.

    there is huge banding and color tints

    Are you using Windows 10 1903 with the latest updates? See sticky topic in this forum.

    No chance changing to AMD graphic

    AMD has reliable dithering though. nVidia is very hit and miss (at least under Windows) when it comes to that. So if you are going to use the graphics card 1D LUTs (video card gamma table, videoLUT) for calibration, as you are, nVidia may not be the best choice (at least currently).
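
    As an aside, you can read back the videoLUT that calibration loads, to see what is actually being fed to the GPU. A minimal, Windows-only Python sketch using the standard GDI gamma-ramp call (nothing DisplayCAL-specific):

```python
# Read the current video card gamma table ("videoLUT") on Windows via GDI.
# Each channel has 256 entries of 16-bit values; if the GPU output path is
# only 8 bits deep and undithered, these get truncated, which is where the
# calibration banding comes from.
import ctypes

user32, gdi32 = ctypes.windll.user32, ctypes.windll.gdi32
hdc = user32.GetDC(0)                     # device context of the primary display
ramp = (ctypes.c_ushort * (3 * 256))()    # R, G and B ramps, 256 x 16-bit each
if gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
    red = ramp[0:256]
    print("red ramp, first entries:", red[:4], "last:", red[255])
user32.ReleaseDC(0, hdc)
```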

    Used the tutorial to set it up DisplayCal (googles first suggestions)

    It’s up to you if you really want to prefer a random third party site over the official documentation’s quickstart guide (which just boils down to using defaults).

    Anyway, you are using some questionable settings, and the settings between the reports do not look consistent.

    1. Decide whether you want to use the Spyder5’s white LED measurement mode (recommended) or the “Spectral: White LED” correction. Stick to one, and don’t use them interchangeably.
    2. Turn off ambient light level adjustment.
    3. Turn off black point correction.
    4. Set calibration speed back to fast.
    5. Turn off advanced options. You do not need them.
    6. Set the testchart back to “Auto” and the profiling patches slider back to the default 175.
    7. You are now pretty much back to the recommended defaults.

    Make sure you disable any and all “dynamic” features of your monitor (dimming, dynamic contrast, energy saving etc).

    The self check reports hint at a lot of bad values, while the measurement report is OK value-wise.

    The self check report is the one that looks ok, I don’t see anything bad with it. Did you attach the right one?

    What’s the difference anyway?

    The self check report just looks up what the display profile predicts for the input values and compares that to what the simulation profile (or reference values) say. The measurement report does actual display measurements (obviously), which allows you to check profile accuracy and display drift.
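
    To put it another way (a conceptual sketch, not DisplayCAL’s actual code): both reports boil down to a delta E comparison between two sets of Lab values; they only differ in where the second set comes from.

```python
# Conceptual sketch (not DisplayCAL code). Both report types compare two sets
# of Lab values with a delta E metric; they differ in where the values come from.
import math

def delta_e_76(lab1, lab2):
    """CIE76 delta E: Euclidean distance between two Lab triplets."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Self check: profile-predicted Lab vs. simulation/reference Lab.
# No instrument is involved; only the existing display profile is consulted.
predicted_by_profile = (50.2, 0.4, -0.3)   # hypothetical values
reference            = (50.0, 0.0,  0.0)
print("self check dE:", round(delta_e_76(predicted_by_profile, reference), 2))

# Measurement report: instrument-measured Lab vs. profile-predicted Lab,
# so it also catches profile inaccuracy and display drift.
measured_on_display = (49.1, 1.2, -1.8)    # hypothetical values
print("measurement dE:", round(delta_e_76(measured_on_display, predicted_by_profile), 2))
```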

    #19228

    Mike8040
    Participant
    • Offline

    First of all, thanks for the quick and detailed responses. I’ll go through the points made.

    EIZO EXs are rebranded Spyders, and up to and including the Spyder5 those have a bad reputation. If, using the default corrections available for them, white does not look white, then:
    If it does not solve your issues, try a better colorimeter.

    I was about to order an i1Display Pro. Is it sufficiently better than the EX3? Opinions online are split on its quality.

    Regarding banding: banding can be caused by calibration or by color management. I think you are saying that you tested gradients with the latest Windows 10 default photo app, so banding is visible under NO color management. That is calibration banding.
    If your LG 34″ accepts 10-bit input, try configuring 10 bpc in the Nvidia panel. It should avoid the worst part of the LUT content truncation. Otherwise, there was a thread here about enabling dithering at the GPU outputs on Nvidia via the registry. Search for it and try.

    With the DisplayCAL profile disabled, the visible banding is gone. Photoshop and the Windows native viewer show the same issue with the DisplayCAL profiles; WITHOUT DisplayCAL color management the gradients are fine. I must be confused by your wording, as this is exactly the opposite of what you predict. I saw the registry hack, but I would consider it a last resort, not a solution.

    Are you using Windows 10 1903 with the latest updates? See sticky topic in this forum.

    (Banding checked in Win10 1803 native photo viewer and Photoshop CC).

    No chance changing to AMD graphic
    AMD has reliable dithering though. nVidia is very hit and miss (at least under Windows) when it comes to that. So if you are going to use the graphics card 1D LUTs (video card gamma table, videoLUT) for calibration, as you are, nVidia may not be the best choice (at least currently).

    Yes, I have read those topics here. Of course it’s not great and Nvidia should fix their drivers, but as I said, there is no chance of switching: the renderers I use depend solely on Nvidia cards and are optimized for them. In that segment Nvidia is at the forefront.

    It’s up to you if you really want to prefer a random third party site over the official documentation’s quickstart guide (which just boils down to using defaults).

    No luck finding that one. I’ll have another look.

    Anyway, you are using some questionable settings, and the settings between the reports do not look consistent.

    As I said, this was my 10th calibration, and towards the end I was fiddling around with different settings because every calibration was garbage. My first attempt used the defaults and gave the worst result! That was my initial thought/hope: no drastic changes needed, just calibrate, enjoy and get on with work.

    Decide whether you want to use the Spyder5’s white LED measurement mode (recommended) or the “Spectral: White LED” correction. Stick to one, and don’t use them interchangeably.

    I don’t even know where this “Spyder5’s white LED measurement mode (recommended)” is, so the app should have been responsible for this setting. I tested Auto and the White LED family. I used the settings from the 10-hour calibration in the pics I posted.

    Turn off black point correction.

    • Set calibration speed back to fast.
    • Turn off advanced options. You do not need them.
    • Set the testchart back to “Auto” and the profiling patches slider back to the default 175.
    • You are now pretty much back to the recommended defaults.

    My first attempt used the default settings and gave the worst result. I tested those default settings three times. Every calibration had a different tint, and always banding.

    Make sure you disable any and all “dynamic” features of your monitor (dimming, dynamic contrast, energy saving etc).

    Of course; these have been off from the beginning.

    The self check report is the one that looks ok, I don’t see anything bad with it. Did you attach the right one?

    My bad. I meant the report from my last 10-hour calibration, which is attached below. This profile looks 100% like it lowered my gamma for whatever reason… I don’t pretend to know what those values mean, but there is a clear BAD written in red.

    What’s the difference anyway?
    The self check report just looks up what the display profile predicts for the input values and compares that to what the simulation profile (or reference values) say. The measurement report does actual display measurements (obviously), which allows you to check profile accuracy and display drift.

    So, to get it right: the self check just compares reference values with the ones in the profile, without actually checking the values the colorimeter gets from the display?
    In the verification tab there is a dropdown for testchart/reference with dozens of entries. Which one is preferable, and is there any quick documentation on which values are out of range and should be optimized?

    #19233

    Florian Höch
    Administrator
    • Offline

    No luck finding that one. I’ll have another look.

    https://displaycal.net/

    Every calibration had a different tint, and always banding.

    In an 8-bit display system without dithering, calibration will always introduce quantization artifacts (banding). There is no way around it. If you want a banding-free result, you need more than 8 bits (e.g. 10 bit), or dithering, or no calibration at all.
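
    A rough illustration of that point, as a small Python sketch (the calibration curve below is invented, not your actual LUT):

```python
# Why an 8-bit, undithered pipeline bands: a calibration curve that lowers the
# white point and tweaks gamma, once rounded to 8 bits, maps several inputs to
# the same output level, so a smooth gradient collapses into visible steps.
import numpy as np

levels_in = np.arange(256)                                  # 8-bit gradient 0..255
lut = 255.0 * 0.93 * (levels_in / 255.0) ** (2.2 / 2.4)     # invented calibration curve
out_8bit = np.round(lut).astype(np.uint8)                   # truncated to 8 bits

print("distinct output levels:", len(np.unique(out_8bit)), "of 256")  # < 256 -> banding

# With dithering, sub-LSB noise spreads the rounding error spatially across the
# image instead of collapsing neighbouring levels into the same band.
noise = np.random.uniform(-0.5, 0.5, size=levels_in.shape)
out_dithered = np.clip(np.round(lut + noise), 0, 255).astype(np.uint8)
```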

    I don’t even know where this “Spyder5’s white LED measurement mode (recommended)” is, so the app should have been responsible for this setting

    Nope. I can see from the reports you attached that you tried different settings for measurement mode and correction. “Auto” would have never picked any of those specific ones.

    So, to get it right: the self check just compares reference values with the ones in the profile, without actually checking the values the colorimeter gets from the display?

    There are no colorimeter values from the display in that case. It uses the existing display profile.

    In the verification tab there is a dropdown for testchart/reference with dozens of entries. Which one is preferable

    That’s kind of the wrong question to ask. First you need to ask yourself: What is it that you want to verify? And then you can make a choice.

    #19241

    Mike8040
    Participant
    • Offline

    In an 8-bit display system without dithering, calibration will always introduce quantization artifacts (banding). There is no way around it. If you want a banding-free result, you need more than 8 bits (e.g. 10 bit), or dithering, or no calibration at all.

    On the graphics card I can only set 8 bit, maybe because I am using HDMI. For the display, I don’t know. From a non-scientific point of view: if there are no errors with the stock profile, and they appear after a “correction” by software, then what is the point of the correction? Wouldn’t it be wise to skip those parts of the correction altogether if they are known to produce unwanted results?
    For now I am ditching the calibration, as it is clearly a degradation. I have ordered the i1Display Pro and will compare it with their system.

    There are no colorimeter values from the display in that case. It uses the existing display profile.

    Which one? The stock display profile? It seems I can change the profile in the OS without DisplayCAL knowing it, and vice versa?

    In the verification tab there is a dropdown for testchart/reference with dozens of entries. Which one is preferable
    That’s kind of the wrong question to ask. First you need to ask yourself: What is it that you want to verify? And then you can make a choice.

    Easy as that. Is the result good or bad? Did the process improve on the prior state or harm it? Did I just waste my time, or was it worth the effort?
    In other words, which verification test in that list provides a clear, non-scientific overview of the calibration that was made?

    Just to clarify the whole point here: I’m an artist, not a techie. I have made dozens of visualizations for clients, which after many years serve as my own reference. I’ve seen them printed and on all sorts of monitors. The deviations were mostly contrast and brightness between monitors (print is a whole other story), so I know how they should look.
    But here it comes: I make three calibrations within the same day and the same ambient room light, and get one noticeably red-tinted, the next yellow-tinted, and the last screws up the gamma. So which one do I trust? Well, the one you are most familiar with. But none of them is that.

    If I can’t reproduce a calibration with only minor variations, then the calibration is just coincidence. To break it down: how can a reproducible calibration be achieved with DisplayCAL?

    It may well be that the EX3 is weak or even faulty out of the box, but from my Google research the banding problem seems to appear only with DisplayCAL.

    #19243

    Florian Höch
    Administrator
    • Offline

    Wouldn’t it be wise to skip those parts of the correction altogether if they are known to produce unwanted results?

    I hope it occurs to you that you cannot generalize from your case to everyone else. In many cases, even in 8-bit display systems, graphics card banding is not readily apparent, and people care more about accuracy. Even then, nothing stops you from skipping calibration and just creating a profile for your color-managed applications.

    Which one? The stock display profile?

    No, the one selected under “Settings” up top.

    Easy as that. Is the result good or bad?

    “Good” or “bad” do not exist without a clear specification of what they mean. A “good” result is one where the profile describes the (calibrated) display behavior accurately, and thus has a low dE (delta E, or color distance) between measured values and what the profile predicts.
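
    For reference (not specific to DisplayCAL), the simplest form of that distance, CIE76, is just the Euclidean distance in Lab space; more elaborate variants (CIE94, CIEDE2000) weight the terms to better match perception:

    $$\Delta E^*_{76} = \sqrt{(L^*_2 - L^*_1)^2 + (a^*_2 - a^*_1)^2 + (b^*_2 - b^*_1)^2}$$

    For example, Lab (50, 0, 0) vs. (50.2, 0.4, −0.3) gives ΔE ≈ 0.54, well below typical just-noticeable-difference figures (roughly 1 to 2.3, depending on the source).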

    In other words, which verification test in that list provides a clear, non-scientific overview of the calibration that was made?

    The default one (select “Extended verification chart”, disable “Simulation profile”).

    If you want to see how well (or how badly) your display conforms to a standard colorspace, select (e.g.) sRGB from the simulation profile dropdown and enable “use simulation profile as display profile”, then do a measurement report.

    So I know how they should look.

    What’s the colorspace of your work, usually? If you can answer that question, then at least you have a way of checking. But it still doesn’t mean that you know how they “should” look, only how they generally look on the devices you have available, i.e. what you are used to (subjectively), irrespective of whether that is accurate from an objective point of view.

    I make three calibrations within the same day and the same ambient room light, and get one noticeably red-tinted, the next yellow-tinted, and the last screws up the gamma.

    Either your EX3 is really broken, or you are exaggerating. Attach the respective profiles, and I will be able to tell you which it is.

    To break it down: how can a reproducible calibration be achieved with DisplayCAL?

    Easy: same settings, same device parameters, same environment, same result. Unless of course the measurement device is broken, or the display is not stable.

    from my Google research the banding problem seems to appear only with DisplayCAL

    Definitely not.
