General purpose. Should I buy a colorimeter?


  • #29541

    Millenium7
    Participant

    I’ll first start by saying I have no experience with colorimeters, and little experience with display calibration in general.

    My main question is, is it worth it for me to buy a colorimeter to calibrate my monitors?

    The situation is this: I have an MSI MPG341CQR, which is colour calibrated from the factory and quite honestly looks fantastic in SDR. Right next to an IPS it doesn’t give up anything in the colour department, and it has better contrast. But in HDR it’s absolutely garbage: massively red-shifted, to the point of white looking light pink. I contacted MSI about this and they insist the display is HDR calibrated as per spec, but either they forgot to load the calibration on mine or they are full of crap. Either way, my only option for actually using HDR is a quick mess around with QuickGamma to generate an advanced ICC profile just to correct the red shift and get white closer to actual white; there are no adjustment options in the monitor’s menus when it’s in HDR mode. The profile I made works well enough that white looks white, and the net result is that gaming in HDR is still quite good, but the colours are wrong. SDR content viewed in HDR mode looks very flat and dull, like a reference panel would.

    Hence the idea popped into my head to buy a colorimeter. The main question, however, is whether it will be of any use. Keep in mind I know nothing about this; it’s purely an idea in my head and this is a speculative post, so things like 3D LUTs are over my head. But if I bought something like the X-Rite i1 Display Studio, could I use it with my monitor in HDR to calibrate it so the image is correct? I don’t really care about SDR, as I’m very happy with the display as it is, but nothing I’ve read online specifically mentions HDR calibration.

    That’s the first question. The follow-up is: what can I do with it beyond that? Hardware calibration, for instance; how do I know whether my TV, or any of my monitors, can be hardware calibrated? Or is a colorimeter strictly for generating an ICC profile for use in Windows?


    #29571

    Vincent
    Participant

    HDR mode = “as is” in most displays.

    If that display is not actually HDR, but SDR with a near-P3 gamut and a translator from HDR (Rec2020 PQ) to panel capabilities, it’s better to make a LUT3D in SDR mode and feed that to madVR to watch HDR content. There is no way to game in HDR this way.
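
    Since “LUT3D” came up as unfamiliar: it is nothing more than a big table pairing every input RGB triple with a corrected output RGB triple. Here is a minimal, purely illustrative Python sketch that writes an identity 33³ table in the common .cube text format (madVR’s native .3dlut is a binary format, and DisplayCAL can generate it directly from measurements, so you would never write one by hand like this):

        # Write an identity 3D LUT in the .cube text format.
        # "Identity" means output == input; a real LUT3D built from
        # colorimeter measurements would hold corrected values instead.
        N = 33  # grid points per axis; 33 or 65 are typical sizes

        with open("identity.cube", "w") as f:
            f.write(f"LUT_3D_SIZE {N}\n")
            # .cube convention: red varies fastest, then green, then blue
            for b in range(N):
                for g in range(N):
                    for r in range(N):
                        f.write(f"{r/(N-1):.6f} {g/(N-1):.6f} {b/(N-1):.6f}\n")

    Colours that fall between the grid points are interpolated at playback time, which is why a 33³ table is enough to remap the whole gamut.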

    #29599

    Millenium7
    Participant

    It’s a proper HDR display: https://www.msi.com/Monitor/Optix-MPG341CQR/Specification

    But it’s like the colour temperature is completely wrong in HDR. It’s akin to going into the monitor menu and cranking red up by 30%. Otherwise it definitely does display HDR; it’s just that all options are completely locked out when running in HDR, so I can’t adjust anything on the monitor itself.

    So DisplayCAL cannot make an HDR colour profile? What does work for me at the moment is generating an advanced ICC profile from QuickGamma (software that only lets you adjust gamma values, that’s it; I take a lot of red out) and then applying it in Windows. The net result is that the profile applies across the board in every single application I’ve tried, and all HDR content is then much, much closer to having white actually look white. When I switch back to SDR the profile is seemingly ignored and everything looks as it should in SDR.

    I’m hoping that DisplayCAL could do something similar: run calibration with a colorimeter while in HDR mode, do its colour test patterns, see “yep, colours completely screwed”, and generate a profile to counter the massive red shift. Apply it and Bob’s your aunty, it should result in perfect colour, no?

    Keep in mind I’m a complete newbie at this sort of stuff, so I really have no idea how it actually works or what goes on under the hood. All I know is that I’ve had some amount of success with a gamma profile, but I want it fine-tuned to be correct across the spectrum, not just a broad gamma adjustment.

    #29605

    Vincent
    Participant

    It’s a proper HDR display: https://www.msi.com/Monitor/Optix-MPG341CQR/Specification

    It isn’t. 3000:1 contrast is SDR-class.

    But it’s like the colour temperature is completely wrong in HDR. It’s akin to going into the monitor menu and cranking red up by 30%. Otherwise it definitely does display HDR; it’s just that all options are completely locked out when running in HDR, so I can’t adjust anything on the monitor itself.

    No. It translates Rec2020 PQ content to panel capabilities, like any other display that accepts an HDR signal.

    So DisplayCAL cannot make an HDR colour profile? What does work for me at the moment is generating an advanced ICC profile from QuickGamma (software that only lets you adjust gamma values, that’s it; I take a lot of red out) and then applying it in Windows. The net result is that the profile applies across the board in every single application I’ve tried, and all HDR content is then much, much closer to having white actually look white. When I switch back to SDR the profile is seemingly ignored and everything looks as it should in SDR.

    No. It’s the monitor’s fault: the translation from Rec2020 to panel capabilities cannot be disabled.

    I’m hoping that DisplayCAL could do something similar: run calibration with a colorimeter while in HDR mode, do its colour test patterns, see “yep, colours completely screwed”, and generate a profile to counter the massive red shift. Apply it and Bob’s your aunty, it should result in perfect colour, no?

    Keep in mind I’m a complete newbie at this sort of stuff, so I really have no idea how it actually works or what goes on under the hood. All I know is that I’ve had some amount of success with a gamma profile, but I want it fine-tuned to be correct across the spectrum, not just a broad gamma adjustment.

    It could be possible IF your monitor supported what I wrote in other threads:
    -it can disable the HDR translator but keep the HDR backlight (FALD + max brightness)
    -it has a public API for uploading a LUT3D to the monitor, or the vendor has a tool to do that.

    HDR mode is “as is” unless the display has those features, although some displays (mostly TVs) meet the requirements partially (like a 10-point greyscale, so as not to “deform” the full translator cube).

    #32720

    Millenium7
    Participant

    Going to revive my thread, as I didn’t bother with HDR at all until recently; I just use it for gaming.

    Here’s the simple reality: there is way too much red shift in HDR. And yes, an ICC profile can and does offset it; as mentioned before, I’ve used QuickGamma to very roughly make a profile that simply shifts red down significantly, then saved and loaded it. In every application it achieves the net result of counteracting the enormous red shift on this monitor, but it’s still off.

    I just want to know if a colorimeter can essentially create a ‘negative’ that offsets whatever the monitor is displaying, to get it very close to what the colours should look like. I’m not chasing perfection here. If I can get HDR to look as good as this monitor does in SDR (which is factory calibrated), but with the extra colour depth to remove banding and bring out the shadow and highlight detail in a scene, then I’m happy.

    I don’t need it to go into the monitor and make perfect calibrations; I just need a very good profile. What I don’t know is whether a colorimeter and DisplayCAL will do this. All I know is that my super-duper-poor-man’s QuickGamma profile goes a hell of a long way towards making HDR actually tolerable, but it’s still washed out and incorrect, as it’s a very simple R/G/B offset. I clearly want something better than that, though I don’t need ‘designer perfect reproduction’.

    #32722

    Vincent
    Participant

    The changes you made are for grey only. Non-grey values will go through the built-in, non-modifiable LUT translation from the PQ HDR signal to panel capabilities, with that grey correction applied and nothing else.

    It’s easier to rely on a software HDR-to-SDR translation like madVR than to tweak a faulty HDR mode.

    #32734

    Millenium7
    Participant

    Mate, you’re talking in language that doesn’t make any sense to me; I don’t understand the lingo. I’m completely new to this and have never once in my life used a colorimeter or any tuning software beyond a very, very basic RGB calibration.

    Here’s the really simple question: Is a colorimeter worthwhile for what I want to accomplish?

    Basic general-purpose tuning that takes it from ‘bad’ to ‘good’. I’m not chasing ‘perfect’. I’m not a graphics designer; I’m just someone with an HDR monitor that has crap tuning when in HDR mode. SDR is perfect, and that’s what I have to compare to.

    #32735

    Vincent
    Participant

    I’m just someone with an HDR monitor that has crap tuning when in HDR mode. SDR is perfect, and that’s what I have to compare to.

    Don’t use HDR mode. Use only SDR modes and a video player that can map HDR to SDR in software. It’s the same thing your monitor does… but you can configure it.

    For example, a video player compatible with madVR, relying on an HDR-to-SDR conversion done in shaders.
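
    In rough outline, a software HDR-to-SDR mapping does something like the following toy sketch: decode the PQ signal to absolute luminance, compress it into the display’s range, and re-encode for SDR. The PQ constants are the standard SMPTE ST 2084 ones; the hard 200-nit clip and plain gamma 2.2 are simplifying assumptions, and a real tone mapper like madVR’s rolls off highlights and converts the Rec2020 gamut as well:

        import numpy as np

        # SMPTE ST 2084 (PQ) EOTF constants
        m1, m2 = 2610 / 16384, 2523 / 4096 * 128
        c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

        def pq_to_nits(e):
            """Decode 0..1 PQ-encoded values to absolute luminance (cd/m^2)."""
            p = np.power(np.clip(e, 0.0, 1.0), 1.0 / m2)
            return 10000.0 * np.power(np.maximum(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1)

        def hdr_to_sdr(rgb_pq, display_peak=200.0):
            """Toy HDR->SDR: decode PQ, hard-clip at the display's peak,
            re-encode with gamma 2.2. Gamut mapping is omitted."""
            nits = pq_to_nits(np.asarray(rgb_pq, dtype=float))
            linear = np.clip(nits / display_peak, 0.0, 1.0)
            return np.power(linear, 1.0 / 2.2)

        print(hdr_to_sdr([0.58, 0.58, 0.58]))  # PQ ~0.58 is ~203 nits -> clips to white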

    #32736

    Millenium7
    Participant

    I want to use HDR because it is a very real improvement in some areas. Say what you want about HDR400, it still definitely helps massively reduce colour banding and allows for peak brightness and the darkest darkness in the same scene. There’s a very real difference in very low-light or very high-light situations: SDR doesn’t support enough gradations to show all the detail in a very dark or very bright (or combined) scene, while turning HDR on shows details that are otherwise completely invisible and/or banded out.

    The problem is just that the calibration is off.

    For simplicity’s sake, let’s just say that my monitor adds +10% to the red channel and +2% to the green channel, and is otherwise perfect on blue.

    What I’m chasing is the ability to generate a ‘negative’ profile that offsets everything down by 10% on red and 2% on green before it is sent to the monitor. Then it all evens out, everything balances as it should, and colours are accurate. Can a colorimeter do this? If it can’t, then it’s game over. (It should be able to, because I can generate a very simple ICC profile with QuickGamma that seemingly does exactly that.)

    More to the point, if I’m going to spend the money on a colorimeter, I want it to apply this exact principle across the entire colour spectrum, so it can see “OK, it looks fine at the darkest of dark on all channels, but the red channel gets progressively worse the brighter the image is, so let me generate an ICC profile that offsets these inaccuracies through the entire brightness range of each channel so that images look correct (albeit clipped at the extreme lows/highs)”.

    Again, can a colorimeter do this? And is it worth buying one for this purpose?

    #32737

    Vincent
    Participant

    The changes you made are for grey only. Non-grey values will go through the built-in, non-modifiable LUT translation from the PQ HDR signal to panel capabilities, with that grey correction applied and nothing else.

    Already explained; if you cannot understand those concepts, you won’t be able to apply them.

    You can only correct grey when doing a GPU calibration.
    Even if you apply an HDR LUT3D with LeDoge’s DWMLUT, under the hood you are still relying on an inaccurate, fixed hardware translation from HDR to the panel’s capabilities. You can try it.

    In SDR, if you just correct grey, it is expected that the monitor behaves more or less like an additive display.
    In HDR, because of that translation under the hood, you may or may not be able to correct it properly.
    If the display is actually SDR, and a 3000:1 display IS SDR, it’s better to rely on a software HDR-signal-to-SDR-signal translation, as explained before.

    #32738

    district9prawn
    Participant

    I’ve tried DWMLUT to apply a 3D LUT to improve the HDR mode of a regular PC monitor. It does work well at resolving the oversaturated colours of the HDR mode. For the reasons Vincent mentioned you cannot expect it to be perfect, and the overall experience is better in SDR, as the HDR mode will not increase the contrast of the display without a local dimming solution.

    #32745

    Millenium7
    Participant

    Let’s take a step back for a second and assume I know absolutely nothing, and that my assumption of how this works may be completely wrong.

    I can’t change anything on the monitor itself, so that option is out; I would have to rely entirely on software.

    Now, my assumption is that if I used a colorimeter, I would stick it on the monitor and run tuning software to generate an ‘offset’ that corrects the incorrect colours.

    The software progressively steps through a lot of colours and compares what it reads to what was intended. Let’s say (I’ll use SDR RGB values for simplicity, though I actually want an HDR tuning) the computer intends to display a colour value of 128/128/128 but the colorimeter reads 156/130/128. It then knows to reduce the red value by 28 and green by 2, while blue is fine. So it does this, re-reads, finds it is now showing 128/128/128, and the job is done for that specific colour. Then it moves on to another colour: say it displays 200/0/0, finds it is actually showing 192/0/0, and bumps red up by 8 in order to show 200/0/0. Rinse and repeat across the entire colour spectrum, and it ends up creating a correction curve that isn’t linear. It’s not just a simple -28 on red across the board; it may take red away in the middle but add some at the top, and so on. It’s creating a custom map for every colour (or at least a significant number of steps, blending in between).
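
    In code, the “invert what was measured” step for one channel would look something like this toy sketch (the sample numbers are made up; a real run measures far more patches):

        import numpy as np

        # Hypothetical measurements for the red channel: value requested
        # vs. value the colorimeter actually saw (both 0..255, made up).
        requested = np.array([0.0, 64.0, 128.0, 192.0, 255.0])
        measured = np.array([0.0, 80.0, 156.0, 214.0, 255.0])  # reads too high

        # Inversion: to *see* level x on screen, *send* the level whose
        # measurement came out as x. Swapping the two arrays in np.interp
        # builds that inverse curve for all 256 levels at once.
        levels = np.arange(256)
        correction = np.interp(levels, measured, requested)

        print(round(correction[128]))  # ~104: send 104 so that 128 is shown

    Resampled to all 256 levels on each channel, that is exactly the non-linear ‘negative’ described above.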

    The net result is an ICC profile that, when applied, acts as a negative of the monitor’s inaccuracies, so that in theory it displays correct colours most of the time, though it may clip at the extreme highs/lows.

    Is this how it works, or am I completely wrong? Because if that is how it works, then I can’t understand why the answer to my question isn’t a simple “YES”, since it can then map the colours correctly (for the most part; at least well enough, and much better than a simple -28 across the board).

    Instead it seems like I’m asking a very simple question and getting ridiculously confusing answers, but maybe my frame of reference is wrong.

    #32746

    Vincent
    Participant

    Now, my assumption is that if I used a colorimeter, I would stick it on the monitor and run tuning software to generate an ‘offset’ that corrects the incorrect colours.

    Answered in the first replies. Common monitors without full hardware calibration can be calibrated in the GPU… but that calibration only corrects grey.

    Then, after grey calibration (and with that grey calibration applied), the software makes a “suit” (the ICC profile): generic M/L/XL sizing (a matrix profile)… or tailor-made (an XYZ LUT profile). Only apps with full colour management know how to use it.
    There are 3rd-party apps like DWMLUT that apply a full correction towards whatever target you want, as long as it fits inside the monitor’s colorspace; for example, a near-P3 gaming display corrected to Rec709 gamma 2.2/sRGB.

    HDR works a little differently, since there is a built-in translation inside the monitor from the HDR signal (RGB numbers encoded in Rec2020 PQ) to the limited colorspace and dynamic range of your monitor. This translation cannot be disabled (unless the display has some kind of full HDR hardware calibration inside), so 3rd-party solutions like DWMLUT are limited by it. => Use the SDR modes of the monitor and rely on software HDR-to-SDR mapping as instructed, like madVR, OR try DWMLUT with an HDR LUT3D.
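
    For reference, the “full correction” a DWMLUT-style tool applies is a LUT3D lookup on every pixel: the RGB triple indexes a 3D grid and the surrounding grid points are blended. A minimal sketch using trilinear interpolation (an illustration only; real implementations typically run on the GPU and often use tetrahedral interpolation):

        import numpy as np

        def apply_lut3d(rgb, lut):
            """Look up one RGB triple (values 0..1) in an NxNxNx3 LUT grid
            using trilinear interpolation."""
            n = lut.shape[0]
            pos = np.clip(np.asarray(rgb, dtype=float), 0, 1) * (n - 1)
            i0 = np.minimum(pos.astype(int), n - 2)   # lower grid corner
            f = pos - i0                              # fractional position in the cell
            out = np.zeros(3)
            # Blend the 8 surrounding grid points
            for dr in (0, 1):
                for dg in (0, 1):
                    for db in (0, 1):
                        w = ((f[0] if dr else 1 - f[0]) *
                             (f[1] if dg else 1 - f[1]) *
                             (f[2] if db else 1 - f[2]))
                        out += w * lut[i0[0] + dr, i0[1] + dg, i0[2] + db]
            return out

        # Identity 17^3 grid: the lookup returns its input unchanged
        g = np.linspace(0, 1, 17)
        lut = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
        print(apply_lut3d([0.5, 0.25, 0.8], lut))   # ~[0.5, 0.25, 0.8]

    Unlike a per-channel gamma ramp, this can move a mixed colour independently of grey, which is why it can fix saturation errors that a greyscale calibration cannot.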

    #32749

    Millenium7
    Participant

    So Windows cannot intercept the colour reproduction at all?

    Again, my understanding is that with any application (it shouldn’t matter which; e.g. if I wrote my own program that displays an image), the colours can be manipulated ‘before’ the image is sent to the monitor.

    Therefore it doesn’t matter what the monitor is doing internally; the net result is the same. Offset a colour before it gets to the monitor, let the monitor do whatever processing it wants, and the ‘correct’ colour is shown on screen because the ‘incorrect’ one it was fed offsets whatever the monitor is doing.

    I don’t actually understand what processing is done in the monitor. Aren’t pixel colour values sent as raw bits over the DisplayPort cable? So, again, they can be offset on the computer side.

    If I generate an ICC profile with QuickGamma, every single application then has its colours offset by what I’ve set in that ICC profile. I don’t think applications need to ‘understand’ the ICC profile themselves; I highly doubt Doom Eternal understands ICC profiles, yet if I remove all colour profiles it’s extremely red-shifted, and if I do up a quickie with QuickGamma I can correct the red shift.

    #32751

    Vincent
    Participant

    So Windows cannot intercept the colour reproduction at all?

    It does not work that way.
    The grey correction is applied globally.
    Then Windows publishes the default display ICC to apps; apps have to ask for it and run a colour management engine themselves. A colour-managed desktop like macOS does the same, except that (simplifying) the OS provides such an engine (although a partially faulty one) and asks each app which colorspace the RGB numbers it draws are in.

    Anyway, Windows does provide a way for an app to hook in globally before RGB is sent to the screen and do what you are asking for => LeDoge’s DWMLUT.

    Again, my understanding is that with any application (it shouldn’t matter which; e.g. if I wrote my own program that displays an image), the colours can be manipulated ‘before’ the image is sent to the monitor.

    By you (the application developer), relying on your own engine or an OS-provided one.

    I don’t actually understand what processing is done in the monitor. Aren’t pixel colour values sent as raw bits over the DisplayPort cable? So, again, they can be offset on the computer side.

    In SDR, and if the display has no sRGB emulation mode enabled, yes: raw RGB numbers in the MONITOR’s COLORSPACE.

    In HDR, the RGB numbers are ENCODED in Rec2020 PQ. Your monitor must translate those numbers to the closest equivalent in its reduced/limited SDR colorspace.

    If I generate an ICC profile with QuickGamma, every single application then has its colours offset by what I’ve set in that ICC profile. I don’t think applications need to ‘understand’ the ICC profile themselves; I highly doubt Doom Eternal understands ICC profiles, yet if I remove all colour profiles it’s extremely red-shifted, and if I do up a quickie with QuickGamma I can correct the red shift.

    Explained many times before: you are correcting the greyscale. It is a global 1D LUT x 3 channels aimed at correcting greyscale, and that greyscale correction is applied to each channel’s gamma ramp.
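
    On Windows, that global “1D LUT x 3” is the GPU gamma ramp; QuickGamma-style tools load it through the GDI call SetDeviceGammaRamp. A minimal sketch of the mechanism (the flat 0.85 scale on red is an illustrative stand-in for a measured correction curve; some drivers reject ramps that deviate too far from identity):

        import ctypes

        # One 256-entry, 16-bit ramp per channel (R, G, B).
        Ramp = (ctypes.c_ushort * 256) * 3

        user32 = ctypes.windll.user32
        gdi32 = ctypes.windll.gdi32
        user32.GetDC.restype = ctypes.c_void_p
        user32.ReleaseDC.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
        gdi32.SetDeviceGammaRamp.argtypes = [ctypes.c_void_p, ctypes.c_void_p]

        ramp = Ramp()
        for i in range(256):
            v = i * 257                    # identity: 0..255 -> 0..65535
            ramp[0][i] = int(v * 0.85)     # pull red down ~15% (illustrative)
            ramp[1][i] = v                 # green untouched
            ramp[2][i] = v                 # blue untouched

        hdc = user32.GetDC(None)           # device context for the whole screen
        gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
        user32.ReleaseDC(None, hdc)

    Because there is one curve per channel and nothing else, a ramp like this can only fix grey balance; it cannot change how the monitor renders mixed colours, which is the limitation discussed above.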

    If your display is ~sRGB, correcting the greyscale corrects everything that can be corrected on the display.
    Otherwise it needs more than a greyscale correction; it needs a full colorspace translation => DWMLUT or an app that supports ICC.
