Struggling to Calibrate for Grading


Viewing 5 posts - 1 through 5 (of 5 total)
  • #7700

    Sam B
    Participant

    Hi,

    This might be a long one, but I’ve spent all day experimenting with this.

    I have a 2012 MacBook Pro connected to an Acer S240HL LED/LCD monitor and a Spyder 2 tool. (I use the monitor on my PC as well, so the profile will be shared between two machines.) I know it’s a big no-no, but this is in my bedroom, with changing lighting environments, going from daylight to 5600K eco bulbs.

    I want to calibrate both screens as best I can to grade in DaVinci Resolve.

    However I’m still really baffled as to what settings and standards I should be calibrating to; what I’ve read online has been so conflicting that I’ve spent all day going round in circles! As well as this, the process of setting the RGB and whitepoint settings is completely baffling to me, but I’ll return to that.

    So, what makes sense to me is to go for a Rec. 709 match, as my content is web and screen based, so that’s the standard which makes the most sense, right? But then other people have thrown Rec. 1886 and sRGB into the mix, saying they are all incredibly similar. What gamma should I then tune my monitor to, with my room environment? I’ve been led to believe that Rec. 709 and 1886 are in the area of 2.4-2.5, whilst 2.2 is more of the standard computer and phone gamma..?

    Secondly, and probably what’s been the BIGGEST issue for me today, has been adjusting the Brightness, Contrast and RGB white balance on the monitor.

    So here’s what I’ve been doing:

    In the Calibration window:

    • The Whitepoint (which I’m led to believe is the monitor’s white balance?) I’m keeping at what the Video and sRGB presets use, which is 6504K (the default is Chromaticity Coordinates, but that’s beyond my grasp still :D)
    • White level: still not sure what this is; I’ve been guessing it’s the luminance/brightness of white?
    • Tone curve: sRGB/Rec.1886/Rec.709 respectively.

    But what’s getting me in a real fiddle, is the Interactive Display Adjustment window.

    As I juggle the RGB sliders to bring the Adjustment window’s bars into the centre (arrows), the resulting balance doesn’t really seem neutral to me, despite the green writing appearing. When the calibration is complete, the resulting image has seemed very warm. Then when I go back to redo the calibration, the RGB balance is completely off again, despite me going back to my default profile and the room ambience not changing.

    The monitor has the simple toggle of Warm/Cold, would I be better off leaving it on Cold, and letting the profile counter it?

    Then with the whitepoint, what cd/m2 should I be aiming for, with my room being like it is? Do I adjust this value with just the Brightness on my monitor (which is all that’s available on the MacBook’s display, obviously), or on my Acer, do I use a combination of both the Brightness and Contrast dials? Contrast has always been quite an ambiguous term, as I’m never too sure what the dial is doing to the rest of the image.

    Thanks so much for bearing with such a long-winded post; would appreciate any help and direction you could give to this DisplayCAL noob!

    #7703

    Florian Höch
    Administrator

    Hi,

    Acer S240HL LED/LCD

    This monitor seems to have a TN panel. TN panels are not very suitable for accurate color work due to the high viewing angle dependency.

    I know it’s a big no-no, but this is in my bedroom, with changing light enviroments, going from daylight to 5600k eco bulbs.

    Not an ideal situation. I would recommend closing the blinds and relying on artificial lighting exclusively when doing color work.

    What gamma should I then tune my monitor to, with my room environment?

    For grading, Rec. 1886 is what’s being used for the 3D LUT, so it makes sense to use that for the 1D calibration as well; that way, non color managed parts of the desktop (i.e. almost everything) will at least look similar in terms of tonal response.

    I’ve been lead to believe that rec.709 and 1886 are in the area of 2.4-2.5 whilst 2.2 is more of the standard computer and phone gamma..?

    Rec. 709 is an encoding-only (“camera”) response and should not be used for calibrating. Rec. 1886 takes the display contrast ratio into account, so it’s a sliding gamma with anywhere from (roughly) 2.2 to 2.4 in the midtones, depending on display black and white level.
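    The sliding gamma Florian describes can be sketched directly from the formula in Annex 1 of the BT.1886 spec. This is a minimal Python sketch, not part of DisplayCAL; the white and black levels (`lw`, `lb`, in cd/m2) are illustrative defaults:

    ```python
    # BT.1886 EOTF (Annex 1): display luminance from a normalized video
    # signal v in [0, 1], given measured white (lw) and black (lb)
    # luminance in cd/m2. The exponent is fixed at 2.4, but the effective
    # mid-tone "gamma" slides with the display contrast ratio via a and b.

    def bt1886_eotf(v: float, lw: float = 100.0, lb: float = 0.1) -> float:
        gamma = 2.4
        lw_root = lw ** (1 / gamma)
        lb_root = lb ** (1 / gamma)
        a = (lw_root - lb_root) ** gamma   # "user gain" (contrast) term
        b = lb_root / (lw_root - lb_root)  # "user black lift" (brightness) term
        return a * max(v + b, 0.0) ** gamma

    # With a zero black level, the curve reduces to a pure 2.4 power law:
    # bt1886_eotf(0.5, lw=100.0, lb=0.0) ≈ 18.95 cd/m2
    ```

    Note how a non-zero black level lifts the shadows, which is exactly why the resulting mid-tone response lands somewhere between 2.2 and 2.4 depending on the display’s contrast.
    
    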

    The Whitepoint (which I’m lead to believe is the monitors white balance? ) I’m keeping as what the Video and sRGB presets are at, which is 6504k (default is on the Chromaticity Coordinates, but that’s beyond my grasps still :D)

    […]

    As I juggle between the RGB sliders to bring the Adjustment window’s bars into the centre (arrows), the resulting balance doesn’t really seem neutral to me, despite the green writing appearing. When the calibration is complete, the resulting image has seemed very warm.

    With a Spyder2, I would recommend setting the calibration whitepoint target to “As measured” – the Spyder2 is simply not accurate enough for a white point adjustment. As soon as you set the whitepoint target to “As measured”, you’ll be asked if you want to use it for the 3D LUT. Confirm this. During interactive adjustment, the best you can do is adjust the monitor by eye so that white looks neutral to you. In the long run, I would recommend getting a better instrument.

    Then with the whitepoint, what cd/m2 should I be aiming for, with my room being like it is?

    In a dim room, 80-100 cd/m2 is a reasonable range.

    on my Acer, do I use a combination of both the Brightness and Contrast dials? The Contrast dial has always been quite a ambiguous term, as I’m never too sure what it’s doing to the rest of the image.

    See the adjustment guide in the documentation.

    #7704

    Sam B
    Participant

    Thank you so much Florian, I’ve never seen someone so dedicated to helping their online community!

    Thanks!

    #17970

    Wire
    Participant

    This thread is stale, but I think the topic is of current importance to many users.

    I want to clarify Florian’s refrain that Rec. 709 is a camera spec. This is not wrong, but it over-simplifies the situation.

    The ITU spec for BT.1886 reads:

    https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.1886-0-201103-I!!PDF-E.pdf

    “While the image capture process of Recommendation ITU-R BT.709 had an optical to electrical transfer function, there has never been an EOTF documented. This was due in part to the fact that display devices until recently were all CRT devices which had somewhat consistent characteristics device to device.”

    Put another way, the origins and evolution of television CRT displays had a lot of variability which had to be tolerated. Studios could afford elaborate Picture Line-Up Generation Equipment: scopes that helped them see what the cameras were doing, and devices which generated standard patterns (the PLUGE) which helped align CRTs for nominal decode of what the cameras were producing. Due to the history of television’s evolution, the industry left the definition of the display up to the complex dance of studio technicians and “reference standard” displays.

    As computers and new display technology transform everything, TV display physics are no longer CRT-like, and Rec.709 became too open-ended for posterity. The point of BT.1886 is to close the gap in the specs in a new era in which digital display technology offers vastly better and more predictable performance.

    Along the way of the desktop publishing revolution, the ICC was formed (by Apple among others), and as computers took over prepress, they also took over video; Rec.709 ICC profiles had to appear, and this meant choosing a gamma. If you ask the ICC for a Rec.709 display profile, they will give you one that says decode gamma 2.4:

    http://www.color.org/rec709.xalter

    What right did the ICC have to say this? None at all, except they’re the International Color Consortium! And without the PC industry, the ITU-R (that’s R for “radio” standards dating back to the telegraph in 1865) had no reason to even think in terms of ICC color management… Except that computers were changing everything in their world as fast as they could keep up!

    The Rec.709 display got defined by the computer industry, and this is not bunk. Look at a reference-class HDTV display, say as described by the Sony BVM-E250 HDTV Professional Video Monitor User’s Guide (this monitor currently sells at a street price of $24,000, BTW, just in case you suspect this is some prosumer one-off):

    https://pro.sony/s3/cms-static-content/operation-manual/4274954141.pdf

    Sony describes one of its several standard color compatibility modes as:

    ITU-R BT.709: Gamma 2.4

    So, contrary to Florian’s very well-intended assertion, Rec.709 can reasonably be observed as a display specification. (I’m going to foreshadow a reference here.) Its relation to the assumptions of video cameras is true of the origins of gamma compensation since the inception of television: gamma is a psycho-visual aspect of TV’s end-to-end transfer function, which accounts for a CRT display inverting the analog signal produced by a camera under the assumption of a dim viewing environment. In other words, gamma is a psycho-visual parameter of image coding, not some hard dweeby etching in standards granite. It just so happened that in TV history, the industry found the camera to be a more cost-effective place for the extra circuitry that compensates for this assumption.

    The rules are changing now with the advent of HDR cine-video.

    So why is BT.1886 important?

    Not because Rec. 709 is an unacceptable or incorrect display standard. Poynton (the originator of 1886) is very clear about the motivation: it’s because legacy television production content is not “scene-referred” (IOW, it is not about making the display match the scene via a camera); legacy production content is “display-referred”, meaning that the director signs off on completion of the production based on what’s seen on his reference display.

    When should a colorist use BT.1886?

    This is tricky, because BT.1886 is an idealization of HDTV display devices predating today’s ICC oriented display tech.

    For those who want to enjoy all the thinking about this matter from the guy who received a Sarnoff prize for inventing HDTV’s square pixels, please peruse the definitive paper on the subject:

    https://poynton.ca/notes/PU-PR-IS/index.html

    BT.1886 is the ITU-R response to the above paper. (BTW—The paper is a great read for anyone who enjoys nerding out on this stuff.)

    In a nutshell, given that reference standard CRTs are an essential part of the HDTV industry, Poynton’s observation was that the ITU should standardize what it means to be a reference HDTV display in terms of the physics of the display hardware that constituted a major era of the TV industry, so that studios using more advanced display physics can properly simulate the olden days. The intent of BT.1886 is for studios using new tech with old-timey content. BT.1886 gives the evolving studio what it needs to ensure that it’s looking at legacy content properly display-referred to the era of its creation. You will see in the BT.1886 spec the variability that is allowed to account for the controls of CRT displays and varying assumptions of viewing conditions between studios and end-users.

    So, as a colorist, do these legacy conditions describe your work? If so, BT.1886 is right for you. You will have to work out the gamma assumption from the assumptions baked into the legacy content.

    However, the existence of BT.1886 for legacy content in no way implies that BT.1886 should replace Rec.709 for general video display. Moreover, 1886 is unnecessarily complicated for many users, because it invites you to wonder which incremental gamma to choose, and about the settings of virtual (and eternally flummoxing) Brightness and Contrast controls (why did Contrast make the TV picture brighter and Brightness vary the contrast? We may never know).

    If you don’t know solid reasons to grapple with BT.1886, you’ll be safer choosing Rec.709 2.4. Just as, if you don’t have solid reasons to do otherwise, you are safer putting web images in sRGB.

    The HDTV gear industry is founded upon Rec.709 compliant devices and Rec.709 is a perfectly appropriate display target for a modern (circa 2019) HDTV colorist.

    Basically, Rec.709 is sRGB for video — And that’s not a bad thing!

    #17971

    Florian Höch
    Administrator

    I want to clarify Florian’s refrain thet Rec. 709 is a camera spec.

    That’s not what I said. I said that the Rec. 709 tone response curve is a “camera” (or encoding) curve (OETF), and its use as an EOTF (decoding “gamma”) produces unintended results. This says nothing about primaries. Using the Rec. 709 primaries for HDTV calibration is fine, but it must be accompanied by an appropriate EOTF (because Rec. BT.709 doesn’t define one itself – instead, it refers to BT.1886).
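    The distinction is easy to see numerically. A short sketch (assuming Python; constants from the Rec. 709 transfer characteristics) shows that decoding a Rec. 709-encoded signal with a 2.4 power law does not return the original scene value, and that this darkening is deliberate:

    ```python
    def rec709_oetf(l: float) -> float:
        """Rec. 709 opto-electronic transfer function (camera encoding)."""
        if l < 0.018:
            return 4.5 * l
        return 1.099 * l ** 0.45 - 0.099

    # Encode 18% scene gray, then decode with a pure 2.4 display gamma
    # (the simple case of a BT.1886 EOTF with zero black level):
    v = rec709_oetf(0.18)   # ≈ 0.409
    displayed = v ** 2.4    # ≈ 0.117, not 0.18
    ```

    The resulting end-to-end gamma of roughly 1.2 is the intended dim-surround rendering; it is why the OETF cannot simply be inverted and used as the display’s EOTF.
    
    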

    If you ask the ICC for a Rec.709 display profile, they will give you one that says decode gamma 2.4

    At the time that specific profile was created, Rec. BT.1886 did not yet exist.




Display Calibration and Characterization powered by ArgyllCMS