DaVinci Resolve viewer vs Media Player Classic Home Cinema (MPC-HC)


    #5010

    Gaetano Cirillo
    Participant

    Hi,

    I started using DisplayCAL a few weeks ago to calibrate my monitor: kudos to you for this marvellous piece of code!
    My monitor is an LG 25UM58-P; LG declares 99% sRGB gamut coverage, and this specification seems in agreement with what I see in the DisplayCAL reports after profiling.
    I have successfully created a number of profiles for this monitor, with different white points and/or white levels.
    In the end, I opted for a profile with these settings:

    CALIBRATION:
    White point: 5000 K, daylight
    White level: 120cd/m^2
    Black level: native
    Gamma 2.2, relative
    Black level output compensation: 100%
    Ambient light level: disabled
    Black level correction: 0%, Ratio: 4
    Calibration speed: High
    PROFILING:
    profile type: XYZ LUT+Matrix, black point compensation disabled
    profile quality: High
    test chart: auto-optimized, 425 patches

    I use Lightroom, GIMP and RawTherapee, and the lab that prints my photos works with a D50 white point.
    Prints, usually 20×30 cm, come out in perfect accordance with what I see on my monitor.

    As I also work with video on this same monitor, I used that profile to create a 3D LUT for DaVinci Resolve (12.5.1). I took care to disable “Apply calibration (vcgt)”, as suggested in various topics on this forum for the case where the monitor used with Resolve is part of the desktop (my OS is Windows 10) rather than an external display, and left all other values at their defaults:

    Source color space Rec.709
    Tone curve Rec.1886
    Gamma 2.4
    Black level output compensation: I don’t remember if it was 0% or 100%
    Gamut mapping mode: Inverse device-to-PCS
    Rendering intent: Absolute colorimetric with white point scaling
    3D Lut file format: IRIDAS(.cube)
    Input encoding: full range
    Output encoding: full range
    3D LUT size: 65x65x65

    The fact is that the same video clip (from a Samsung NX1 mirrorless) shown in the Resolve viewer appears very different from what I see when playing it in Media Player Classic Home Cinema (MPC-HC SHOULD be color managed; I’ve enabled that in its options).
    In particular, the same clip appears more contrasted in MPC-HC (crushed blacks) than in the Resolve viewer; also, MPC-HC shows a more “reddish” white point, while the Resolve viewer seems a lot more neutral and tonally pleasing.
    Moreover, what I see in the Resolve viewer seems similar to what I see on the NX1’s LCD when playing the same clip.

    Hence the question: how can I check which of the two (Resolve vs MPC-HC) is showing me the “real” colours and contrast of the video clip?
    I thought I could open a JPEG in Lightroom or GIMP and compare it with the same JPEG in the Resolve viewer; nevertheless, if they appear different, how can I tell which one is right?
    Maybe Lightroom (version 5.7, by the way) or GIMP isn’t correctly managing the profile I created with DisplayCAL, and Resolve is correctly applying the 3D LUT I created from that same profile… or maybe vice versa… who knows?

    Sorry for the very long post, but I hope the detailed info will at least be useful 🙂

    Thank you

    #5017

    Florian Höch
    Administrator

    Hi,

    In particular, the same clip appears more contrasted in MPC-HC (crushed blacks) than in the Resolve viewer; also, MPC-HC shows a more “reddish” white point,

    MPC-HC only allows for the selection of “output offset” gamma 2.2 (“bright”), 2.35 (“dim”) and 2.4 (“dark”) for tone curve, no BT.1886, so that should explain the more contrasted look. Also, you have to set rendering intent to “Absolute colorimetric” in MPC-HC to get whitepoint simulation of D65 in your actual D50-ish whitepoint.
    A better solution is to use madVR as renderer and create a 3D LUT for it in the same way you did for Resolve (you can use the same profile, so no need to run lengthy measurements again).

    I thought I could open a JPEG in Lightroom or GIMP and compare it with the same JPEG in the Resolve viewer

    GIMP and Lightroom have no notion of BT.1886 either.

    #5020

    Gaetano Cirillo
    Participant

    Hi,
    MPC-HC only allows for the selection of “output offset” gamma 2.2 (“bright”), 2.35 (“dim”) and 2.4 (“dark”) for tone curve, no BT.1886, so that should explain the more contrasted look.

    Hi Florian, thanks for the feedback. However, I’m not sure I get it correctly; I hope you’ll forgive my partial ignorance of some aspects of the (really complex) display calibration subject.
    If I’ve correctly interpreted what I’ve read in hours and hours of Google searching, the “video world” had, until a few years ago, no standard gamma value; in other words, Rec.709, for example, only defines the gamut of this “video color space” (similar to sRGB, if I get it right), not the gamma.
    Hence, BT.1886 tries to overcome this problem by defining a “standard” gamma for the “video world”.
    Now the question: does BT.1886 use a “simple power law” (like a 2.2, 2.3 or 2.4 gamma) or is it a “more complex” function? If it is not a simple power law, what is the meaning of choosing, for example, “BT.1886 – Gamma 2.4” in DisplayCAL when creating the 3D LUT? Does that number represent the value one would choose for a simple power-law curve to get results similar to BT.1886 with the same value? If so, why does choosing 2.4 as the output offset gamma in MPC-HC exacerbate the “black crush” problem instead of solving it?
    Moreover, if MPC-HC is not able to show a color-managed image (I’ve understood this from your answer; correct me if I’m wrong), why is it described as color-managed software? I mean, MPC-HC (version 1.7.10, 64-bit) has an option to enable color management: what does it do?

    Still… when using DisplayCAL for 3D LUT creation and disabling “Apply calibration (vcgt)”, should the gamma be managed only by the video card at the operating system level, or am I completely off track?
    …confused…
    In other words, by whom, and how many times, does the gamma get manipulated? If I choose gamma 2.2 when profiling my monitor, aren’t the video card LUTs the only thing manipulated to obtain this value once and for all? Or do the color spaces (and hence their standard gamma definitions, apart from Rec.709, where you explicitly choose BT.1886 and a gamma value) interact with or manipulate it again in some way? And if so, how? At the video card’s “global” level, or at a local (read: software-dependent) level?

    Also, you have to set rendering intent to “Absolute colorimetric” in MPC-HC to get whitepoint simulation of D65 in your actual D50-ish whitepoint.

    I’ve already tried this setting, but it seems to make no appreciable difference (read: it’s still more “reddish” than the Resolve viewer).

    A better solution is to use madVR as renderer and create a 3D LUT for it in the same way you did for Resolve (you can use the same profile, so no need to run lengthy measurements again).

    I’ll try it and report back here.

    I thought I could open a JPEG in Lightroom or GIMP and compare it with the same JPEG in the Resolve viewer

    GIMP and Lightroom have no notion of BT.1886 either.

    Are you saying that a JPEG opened in Resolve will be read as having a Rec.709 color space and a BT.1886 gamma curve instead of its embedded color space (sRGB in my case)?

    Sorry for the flood of questions; your help is really, really appreciated!!

    Thanks

    #5024

    Florian Höch
    Administrator

    If I’ve correctly interpreted what I’ve read in hours and hours of Google searching, the “video world” had, until a few years ago, no standard gamma value; in other words, Rec.709, for example, only defines the gamut of this “video color space” (similar to sRGB, if I get it right), not the gamma.

    The “de facto” standard for video until BT.1886 was introduced was basically Rec.709 with a 2.2 “output offset” power curve, but this was never formally specified.

    Hence, BT.1886 tries to overcome this problem by defining a “standard” gamma for the “video world”.

    Correct.

    Now the question: does BT.1886 use a “simple power law” (like a 2.2, 2.3 or 2.4 gamma) or is it a “more complex” function?

    It’s a power curve with input offset to accommodate the actual display black level.
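
    For reference, a minimal sketch of that curve as specified in ITU-R BT.1886 (the white/black luminance values below are assumed examples loosely matching the calibration discussed above, not something DisplayCAL outputs):

    ```python
    # Minimal sketch of the BT.1886 EOTF (ITU-R BT.1886, Annex 1).
    # Lw/Lb (white/black luminance in cd/m^2) are assumed example values.
    def bt1886_eotf(V, Lw=120.0, Lb=0.2, gamma=2.4):
        """Map a normalized video signal V in [0, 1] to screen luminance."""
        a = (Lw ** (1 / gamma) - Lb ** (1 / gamma)) ** gamma  # scale factor
        b = Lb ** (1 / gamma) / (Lw ** (1 / gamma) - Lb ** (1 / gamma))  # input offset
        return a * max(V + b, 0.0) ** gamma

    # With a true zero black level (Lb = 0) the offset b vanishes and the
    # curve reduces to a pure power law: Lw * V ** 2.4.
    ```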

    […] why does choosing 2.4 as the output offset gamma in MPC-HC exacerbate the “black crush” problem instead of solving it?

    Because MPC-HC always uses full output offset (it scales and offsets the whole “ideal” curve), which can make steps near black imperceptibly small if the black level is not truly black, instead of input offset (shifting and scaling the curve so that only the part of the “ideal” curve that is actually reproducible 1:1 on the display is used).
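
    To illustrate the difference, here is a hedged sketch (assumed example numbers, not MPC-HC’s actual code) comparing the two approaches for a gamma 2.4 target:

    ```python
    # Output offset vs. input offset for a gamma 2.4 target on a display
    # with Lw = 120 cd/m^2 and Lb = 0.2 cd/m^2 (assumed example values).
    GAMMA, LW, LB = 2.4, 120.0, 0.2

    def output_offset(V):
        # Scale and lift the whole ideal curve by the black level. Near
        # V = 0 the slope of V ** GAMMA approaches zero, so shadow steps
        # sit imperceptibly close together on top of the raised black.
        return (LW - LB) * V ** GAMMA + LB

    def input_offset(V):
        # BT.1886 style: shift the input instead, so only the part of the
        # ideal curve the display can actually reproduce 1:1 is used.
        a = (LW ** (1 / GAMMA) - LB ** (1 / GAMMA)) ** GAMMA
        b = LB ** (1 / GAMMA) / (LW ** (1 / GAMMA) - LB ** (1 / GAMMA))
        return a * (V + b) ** GAMMA

    for v in (0.0, 0.02, 0.05):
        print(f"V={v:.2f}  output offset: {output_offset(v):6.3f} cd/m^2"
              f"  input offset: {input_offset(v):6.3f} cd/m^2")
    ```

    With output offset, the first few steps above black land only a few hundredths of a cd/m² above the raised black level, which is what reads as “crushed” shadows.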

    Moreover, if MPC-HC is not able to show a color-managed image

    It can and does, but there’s simply no support for BT.1886.

    Still… when using DisplayCAL for 3D LUT creation and disabling “Apply calibration (vcgt)”, should the gamma be managed only by the video card at the operating system level […]

    No, it only means the calibration is not part of the 3D LUT and needs to be applied in another way (i.e. via the video card gamma tables). The calibration curves record the difference between the display’s native response and the chosen 1D calibration tone curve target. This is useful to have because the vast majority of the whole system is not color managed at all, so at least the whitepoint and grayscale can be corrected everywhere (by virtue of the video card gamma tables, which applications do not need to have any notion of). If every single application and the system itself were color managed, there would be no inherent need for 1D calibration.
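
    As a purely conceptual sketch of where each correction lives (the lut3d and vcgt callables below are hypothetical, not real Resolve or DisplayCAL APIs):

    ```python
    # Conceptual sketch: where the 3D LUT and the 1D calibration apply.
    # lut3d and vcgt are hypothetical callables, not real APIs.
    def display_pipeline(rgb, lut3d, vcgt):
        # Stage 1: a color-managed application (Resolve viewer, madVR)
        # applies the 3D LUT: Rec.709/BT.1886 source -> display space.
        r, g, b = lut3d(rgb)
        # Stage 2: the video card gamma tables (loaded from the profile's
        # vcgt tag) apply the 1D grayscale/whitepoint calibration to
        # everything on screen, color managed or not.
        return vcgt[0](r), vcgt[1](g), vcgt[2](b)
    ```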

    Are you saying that a JPEG opened in Resolve will be read as having a Rec.709 color space and a BT.1886 gamma curve instead of its embedded color space (sRGB in my case)?

    If using a 3D LUT, this will dictate the color transform. There’s no notion of embedded ICC profiles in Resolve (that I can see, at least), although the latest versions seem to have some rudimentary ICC display profile support (I would strongly recommend against relying on it in any way though, because unfortunately their implementation seems lackluster and incomplete).

    #5066

    Gaetano Cirillo
    Participant

    Hi Florian,
    thanks for the detailed explanation!
    Some more questions (please be patient! 🙂):

    - When generating a 3D LUT starting from an existing profile, apart from the 3D LUT itself, a new .icm profile is also generated. What is its purpose? I’m asking because if I put that .icm in the Windows color profile directory and then try to load it using the DisplayCAL profile loader, it doesn’t appear as an available profile. Also, if I try to “force” loading it through the Windows color profile loader, it says that it is not a valid color profile.
    - When creating a madVR 3D LUT, only the 16-235 input encoding is available: why?

    Thanks!

    #5068

    Florian Höch
    Administrator

    When generating a 3D LUT starting from an existing profile, apart from the 3D LUT itself, a new .icm profile is also generated. What is its purpose?

    That is a device link ICC profile that’s always created by Argyll’s collink. Basically, a 3D LUT in a standardized and compact binary format.

    When creating a madVR 3D LUT, only the 16-235 input encoding is available: why?

    Because it’s the only correct choice. In madVR, the actual output encoding is decoupled from the 3D LUT (you can set PC or TV levels in madVR options).

    #5069

    Gaetano Cirillo
    Participant

    When generating a 3D LUT starting from an existing profile, apart from the 3D LUT itself, a new .icm profile is also generated. What is its purpose?

    That is a device link ICC profile that’s always created by Argyll’s collink. Basically, a 3D LUT in a standardized and compact binary format.

    When creating a madVR 3D LUT, only the 16-235 input encoding is available: why?

    Because it’s the only correct choice. In madVR, the actual output encoding is decoupled from the 3D LUT (you can set PC or TV levels in madVR options).

    OK, so the generated .icm profile is a sort of “encapsulated” 3D LUT. How can it be used? (if possible…)

    On the 16-235 input encoding: I had already seen that madVR lets the user choose between 16-235 and 0-255 OUTPUT levels, but weren’t we talking about INPUT levels? I mean, if my mirrorless camera lets me choose between 16-235 and 0-255 video recording and I opted for the latter (0-255), how can a madVR 3D LUT manage my video clips correctly if it interprets them as if they had been recorded using (if I’ve understood correctly) 16-235 INPUT levels?

    #5071

    Florian Höch
    Administrator

    How can it be used?

    E.g. Photoshop has device link support since CS6, or you could use it with Argyll’s cctiff to transform still images (provided the encoding matches what the LUT expects). These are just examples.
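
    As a hypothetical illustration of the cctiff route (file names are placeholders; cctiff must be installed and on the PATH):

    ```python
    # Hypothetical example: applying the generated device link to a still
    # image with Argyll's cctiff. File names are placeholders.
    import subprocess

    subprocess.run(
        ["cctiff",
         "Rec709_to_display.icm",  # device link created alongside the 3D LUT
         "frame_in.tif",           # source image; its encoding must match
         "frame_out.tif"],         # what the LUT expects (e.g. full range)
        check=True,
    )
    ```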

    how can a madVR 3D LUT manage my video clips correctly if it interprets them

    madVR will compress and expand levels as needed.
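
    As an illustration of the arithmetic involved (the standard 8-bit scaling, not madVR’s internal code):

    ```python
    # Standard 8-bit full-range <-> TV-range luma scaling; illustrative
    # arithmetic only, not madVR's internal implementation.
    def full_to_tv(v):
        """Compress 0-255 full-range code values into 16-235."""
        return 16.0 + v * 219.0 / 255.0

    def tv_to_full(v):
        """Expand 16-235 TV-range code values back to 0-255."""
        return (v - 16.0) * 255.0 / 219.0
    ```

    So a full-range (0-255) clip can be compressed to 16-235 to match the 3D LUT’s expected input encoding, then expanded again afterwards if madVR’s output is set to PC levels.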
