macOS Verification on Resolve

#139617

    carloscfrias
    Participant

    Hi,

I’m trying to get the UI color viewer of Resolve as accurate as it can be on my 2017 iMac Pro. I know it’s not going to be perfect and that I should get a professional monitor and bypass the OS color management, but I don’t have the resources for that yet.

    I’ve profiled the screen outside Resolve and when I run the verification (Simulation profile unchecked) I get great results, so it is well characterized.

Now I want to check whether I can correctly display a BT.1886 signal inside Resolve. For this, I want to try two approaches:

    1. Check ‘Use Mac display profiles’ so that Resolve uses the OS color management.
  What settings should I use to run the Verification? I’m assuming I should check ‘Simulation profile’ and select Rec.709 with the Rec.1886 tone curve, but when I run the Measurement report the results are awful (file attached n1). So I’ve tried setting the Resolve project to 709/g2.2 and testing with a 709/g2.2 simulation profile, because I heard somewhere that the OS color management works better with those settings, and the results are better, yet still not perfect (file attached n2). Am I doing this right? Is this the best result I can get, and should I then try creating a LUT that gets me closer to the ideal?
    2. Uncheck ‘Use Mac display profiles’ and try to create a LUT.
  So first I wanted to run a verification using a Rec.709/Rec.1886 simulation profile without any LUT. This should give bad results, since the screen doesn’t meet that recommendation and there’s no OS color management, but the results were actually quite good (file attached n3). Just to make sure, I ran another verification with the simulation profile set to Rec.709/g2.2, leaving everything else the same, and the results were also quite good (file attached n4). But they shouldn’t be, because I haven’t changed anything, and a screen that accurately displays a BT.1886 signal should not accurately display a g2.2 signal without correction. Furthermore, the results are almost identical and the measurements are equivalent. So I’m guessing I haven’t properly understood what ‘Simulation profile’ means, and that it is in fact “filtering” that profile through the ICC profile in some way, so the display’s behavior changes.
  What settings should I use to measure this properly? Should I check “Use simulation profile as display profile”?
  If that’s the case, what I would try is to profile the display through the UI color viewer and make a LUT with Rec.709 source colorspace and the Rec.1886 tone curve (with ‘Apply calibration’ checked or unchecked?). Then, with the LUT applied, I’d run the verification using a Rec.709/Rec.1886 simulation profile and check ‘Use simulation profile as display profile’. Is that right? Should I also check ‘Device link profile’?

  I’m sorry for the long post; I hope I can get some help. Thanks!!

    #139639

    Vincent
    Participant

    Hi,

I’m trying to get the UI color viewer of Resolve as accurate as it can be on my 2017 iMac Pro. I know it’s not going to be perfect and that I should get a professional monitor and bypass the OS color management, but I don’t have the resources for that yet.

    I’ve profiled the screen outside Resolve and when I run the verification (Simulation profile unchecked) I get great results, so it is well characterized.

~600:1 contrast ratio… maybe there is something wrong there.

Also, Rec.1886 is not something you can reproduce “as you expect” on an IPS display with a native contrast of 1000–2000:1.
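As a rough illustration of why the panel’s black level matters here, below is a minimal sketch of the BT.1886 EOTF; the white/black luminance values are made-up placeholders, not measurements of this display.

```python
# Sketch of the BT.1886 EOTF (ITU-R BT.1886 Annex 1). A higher black level
# (lower contrast ratio) changes the effective curve, which is why a ~600:1
# panel cannot track the "ideal" BT.1886 response. Lw/Lb values here are
# illustrative assumptions only.

def bt1886_eotf(v, lw=100.0, lb=0.1):
    """Screen luminance in cd/m2 for a normalized video signal v in [0, 1]."""
    gamma = 2.4
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

# Compare a ~600:1 panel (black = 100/600 cd/m2) against a much deeper black.
for v in (0.1, 0.25, 0.5, 1.0):
    print(v, round(bt1886_eotf(v, 100.0, 100.0 / 600.0), 3),
          round(bt1886_eotf(v, 100.0, 0.05), 3))
```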

Now I want to check whether I can correctly display a BT.1886 signal inside Resolve. For this, I want to try two approaches:

    1. Check ‘Use Mac display profiles’ so that Resolve uses the OS color management.
  What settings should I use to run the Verification? I’m assuming I should check ‘Simulation profile’ and select Rec.709 with the Rec.1886 tone curve, but when I run the Measurement report the results are awful (file attached n1). So I’ve tried setting the Resolve project to 709/g2.2 and testing with a 709/g2.2 simulation profile, because I heard somewhere that the OS color management works better with those settings, and the results are better, yet still not perfect (file attached n2). Am I doing this right? Is this the best result I can get, and should I then try creating a LUT that gets me closer to the ideal?

I have doubts about how to verify a display connected as a GUI display on macOS while using macOS color management.
The key is understanding what DisplayCAL is doing and what Resolve is doing:
- DisplayCAL compares and, if needed, transforms from the display profile to the simulation profile.
- Resolve transforms RGB values from the source colorspace (Rec.709) to the display colorspace.

Hence, if you want to test the color management done by Resolve, you need the display colorspace to match the source (simulation profile + ‘use simulation profile as display profile’), BUT doing this will clear the VCGT, so you’ll lose your grey calibration.
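To make the source-to-display side of this concrete, here is a minimal sketch of the kind of transform meant above; it is not Resolve’s actual pipeline, and the display matrix and gamma are placeholder assumptions (in reality the display side would come from the ICC profile).

```python
# Conceptual sketch only: decode the Rec.709 signal to linear light, move
# between primaries via XYZ, then re-encode with the display's own tone
# response.
import numpy as np

# Standard Rec.709 (D65) RGB -> XYZ matrix.
RGB709_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                          [0.2126, 0.7152, 0.0722],
                          [0.0193, 0.1192, 0.9505]])

# Placeholder display: assume 709-native primaries, so the matrix simply inverts.
XYZ_TO_DISPLAY = np.linalg.inv(RGB709_TO_XYZ)

def source_to_display(rgb709, display_gamma=2.2):
    linear = np.clip(rgb709, 0, 1) ** 2.4          # simplified BT.1886 decode (zero black)
    xyz = RGB709_TO_XYZ @ linear
    display_linear = np.clip(XYZ_TO_DISPLAY @ xyz, 0, 1)
    return display_linear ** (1 / display_gamma)   # encode for the display's tone response

print(source_to_display(np.array([0.5, 0.5, 0.5])))
```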

I’m not sure that you can configure DisplayCAL to use, as a “comparison” display profile, a profile that is not set as the display profile.
It would be easier to test with sample MP4s containing 100% saturation RGBCMY sweeps plus grayscale steps from 10 to 100 IRE, reading the raw measurements “manually” with HCFR on a Windows laptop.

***IF*** Resolve’s color management (‘Use Mac display profiles’) works as expected, an equivalent verification would be to measure NOT Resolve’s output but the display itself, and then verify against the simulation profile (do NOT use ‘use simulation profile as display profile’).

2. Uncheck ‘Use Mac display profiles’ and try to create a LUT.
  So first I wanted to run a verification using a Rec.709/Rec.1886 simulation profile without any LUT. This should give bad results, since the screen doesn’t meet that recommendation and there’s no OS color management, but the results were actually quite good (file attached n3). Just to make sure, I ran another verification with the simulation profile set to Rec.709/g2.2, leaving everything else the same, and the results were also quite good (file attached n4). But they shouldn’t be, because I haven’t changed anything, and a screen that accurately displays a BT.1886 signal should not accurately display a g2.2 signal without correction. Furthermore, the results are almost identical and the measurements are equivalent. So I’m guessing I haven’t properly understood what ‘Simulation profile’ means, and that it is in fact “filtering” that profile through the ICC profile in some way, so the display’s behavior changes.
  What settings should I use to measure this properly? Should I check “Use simulation profile as display profile”?
  If that’s the case, what I would try is to profile the display through the UI color viewer and make a LUT with Rec.709 source colorspace and the Rec.1886 tone curve (with ‘Apply calibration’ checked or unchecked?). Then, with the LUT applied, I’d run the verification using a Rec.709/Rec.1886 simulation profile and check ‘Use simulation profile as display profile’. Is that right? Should I also check ‘Device link profile’?

  I’m sorry for the long post; I hope I can get some help. Thanks!!

To verify the actual output on a GUI display (i.e. a display connected through a common GPU, like the integrated display on a Mac) with a LUT3D applied, you’ll face the same issue described in the previous point: how to use DisplayCAL without changing the current VCGT while at the same time using an arbitrary “fake” display profile.
It could be done using Rec.709 as the simulation profile + ‘use simulation profile as display profile’, but this forces you to create a LUT3D with the VCGT data embedded… and that is not really what you want on a GUI display.
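For what it’s worth, a conceptual sketch of what “embedding the VCGT into the LUT3D” amounts to: the 1D per-channel calibration curves are folded into the 3D LUT’s output, so the correction is applied even if the GPU gamma ramp is reset. The curve and LUT below are identity placeholders, not real calibration data.

```python
# Conceptual sketch: fold a per-channel VCGT curve into a LUT3D's output values.
import numpy as np

def embed_vcgt(lut3d, vcgt):
    """lut3d: (N, N, N, 3) array of output values in 0..1; vcgt: per-channel curve."""
    out = lut3d.copy()
    for ch in range(3):
        out[..., ch] = vcgt(out[..., ch], ch)
    return out

def fake_vcgt(v, ch):
    return v ** 1.05  # placeholder calibration curve (same for all channels)

N = 17
grid = np.stack(np.meshgrid(*[np.linspace(0, 1, N)] * 3, indexing="ij"), axis=-1)
identity_lut = grid                                 # placeholder "colour" LUT3D

lut_with_vcgt = embed_vcgt(identity_lut, fake_vcgt)  # what a verification pass would use
lut_without_vcgt = identity_lut                      # what you would load for actual work
```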

    #139640

    carloscfrias
    Participant

     

~600:1 contrast ratio… maybe there is something wrong there.

Also, Rec.1886 is not something you can reproduce “as you expect” on an IPS display with a native contrast of 1000–2000:1.

Thanks, I’ll try to create a new profile and see if the contrast ratio increases. I thought that low number was due to the calibration to Rec.1886.

Now I want to check whether I can correctly display a BT.1886 signal inside Resolve. For this, I want to try two approaches:

    1. Check ‘Use Mac display profiles’ so that Resolve uses the OS color management.
  What settings should I use to run the Verification? I’m assuming I should check ‘Simulation profile’ and select Rec.709 with the Rec.1886 tone curve, but when I run the Measurement report the results are awful (file attached n1). So I’ve tried setting the Resolve project to 709/g2.2 and testing with a 709/g2.2 simulation profile, because I heard somewhere that the OS color management works better with those settings, and the results are better, yet still not perfect (file attached n2). Am I doing this right? Is this the best result I can get, and should I then try creating a LUT that gets me closer to the ideal?

I have doubts about how to verify a display connected as a GUI display on macOS while using macOS color management.
The key is understanding what DisplayCAL is doing and what Resolve is doing:
- DisplayCAL compares and, if needed, transforms from the display profile to the simulation profile.
- Resolve transforms RGB values from the source colorspace (Rec.709) to the display colorspace.

Hence, if you want to test the color management done by Resolve, you need the display colorspace to match the source (simulation profile + ‘use simulation profile as display profile’), BUT doing this will clear the VCGT, so you’ll lose your grey calibration. I’m not sure that you can configure DisplayCAL to use, as a “comparison” display profile, a profile that is not set as the display profile.
It would be easier to test with sample MP4s containing 100% saturation RGBCMY sweeps plus grayscale steps from 10 to 100 IRE, reading the raw measurements “manually” with HCFR on a Windows laptop.

I have a Windows laptop, but I don’t understand “reading raw like manual HCFR”. Could you explain, or link to any resource?

***IF*** Resolve’s color management (‘Use Mac display profiles’) works as expected, an equivalent verification would be to measure NOT Resolve’s output but the display itself, and then verify against the simulation profile (do NOT use ‘use simulation profile as display profile’).

2. Uncheck ‘Use Mac display profiles’ and try to create a LUT.
  So first I wanted to run a verification using a Rec.709/Rec.1886 simulation profile without any LUT. This should give bad results, since the screen doesn’t meet that recommendation and there’s no OS color management, but the results were actually quite good (file attached n3). Just to make sure, I ran another verification with the simulation profile set to Rec.709/g2.2, leaving everything else the same, and the results were also quite good (file attached n4). But they shouldn’t be, because I haven’t changed anything, and a screen that accurately displays a BT.1886 signal should not accurately display a g2.2 signal without correction. Furthermore, the results are almost identical and the measurements are equivalent. So I’m guessing I haven’t properly understood what ‘Simulation profile’ means, and that it is in fact “filtering” that profile through the ICC profile in some way, so the display’s behavior changes.
  What settings should I use to measure this properly? Should I check “Use simulation profile as display profile”?
  If that’s the case, what I would try is to profile the display through the UI color viewer and make a LUT with Rec.709 source colorspace and the Rec.1886 tone curve (with ‘Apply calibration’ checked or unchecked?). Then, with the LUT applied, I’d run the verification using a Rec.709/Rec.1886 simulation profile and check ‘Use simulation profile as display profile’. Is that right? Should I also check ‘Device link profile’?

  I’m sorry for the long post; I hope I can get some help. Thanks!!

To verify the actual output on a GUI display (i.e. a display connected through a common GPU, like the integrated display on a Mac) with a LUT3D applied, you’ll face the same issue described in the previous point: how to use DisplayCAL without changing the current VCGT while at the same time using an arbitrary “fake” display profile.
It could be done using Rec.709 as the simulation profile + ‘use simulation profile as display profile’, but this forces you to create a LUT3D with the VCGT data embedded… and that is not really what you want on a GUI display.

If I understand correctly, the VCGT is always there even when ‘Use Mac display profiles’ is unchecked. So I could create, from the same profile, one LUT with the VCGT embedded and one without, use the one with the VCGT to run the verification with Rec.709 as simulation + ‘use simulation profile as display profile’, and, if everything checks out, use the LUT without the VCGT for grading, and it should output correct colors… right? I know this is scrappy, to say the least; I’m just trying to understand the logic of it.

    #139642

    Vincent
    Participant

     

~600:1 contrast ratio… maybe there is something wrong there.

Also, Rec.1886 is not something you can reproduce “as you expect” on an IPS display with a native contrast of 1000–2000:1.

Thanks, I’ll try to create a new profile and see if the contrast ratio increases. I thought that low number was due to the calibration to Rec.1886.

Now I want to check whether I can correctly display a BT.1886 signal inside Resolve. For this, I want to try two approaches:

    1. Check ‘Use Mac display profiles’ so that Resolve uses the OS color management.
  What settings should I use to run the Verification? I’m assuming I should check ‘Simulation profile’ and select Rec.709 with the Rec.1886 tone curve, but when I run the Measurement report the results are awful (file attached n1). So I’ve tried setting the Resolve project to 709/g2.2 and testing with a 709/g2.2 simulation profile, because I heard somewhere that the OS color management works better with those settings, and the results are better, yet still not perfect (file attached n2). Am I doing this right? Is this the best result I can get, and should I then try creating a LUT that gets me closer to the ideal?

I have doubts about how to verify a display connected as a GUI display on macOS while using macOS color management.
The key is understanding what DisplayCAL is doing and what Resolve is doing:
- DisplayCAL compares and, if needed, transforms from the display profile to the simulation profile.
- Resolve transforms RGB values from the source colorspace (Rec.709) to the display colorspace.

Hence, if you want to test the color management done by Resolve, you need the display colorspace to match the source (simulation profile + ‘use simulation profile as display profile’), BUT doing this will clear the VCGT, so you’ll lose your grey calibration. I’m not sure that you can configure DisplayCAL to use, as a “comparison” display profile, a profile that is not set as the display profile.
It would be easier to test with sample MP4s containing 100% saturation RGBCMY sweeps plus grayscale steps from 10 to 100 IRE, reading the raw measurements “manually” with HCFR on a Windows laptop.

I have a Windows laptop, but I don’t understand “reading raw like manual HCFR”. Could you explain, or link to any resource?

HCFR: new, Generator = “DVD Manual”.

HCFR then expects you to put the referenced patch on the (external) display before each measurement. In Resolve you’ll have to import, into a new project, a set of IRE grayscale patches and RGBCMY saturation sweeps. AVSForum had a catalog of Rec.709 MP4s with these patches.

It’s slow, and a last resort.
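If it helps, here is a small sketch of the patch values such a manual run would step through (10–100 IRE grayscale plus 100% RGBCMY); 8-bit video-range (16–235) encoding is assumed here, which is worth double-checking against whatever patch files you actually use.

```python
# Sketch: code values for a manual HCFR-style run. Assumes 8-bit video levels
# (16-235); full-range patches would use 0-255 instead.

def ire_to_8bit(ire):
    """Map 0-100 IRE to 8-bit video-range code values."""
    return round(16 + (235 - 16) * ire / 100)

grayscale = {ire: (ire_to_8bit(ire),) * 3 for ire in range(10, 101, 10)}

rgbcmy = {
    "R": (235, 16, 16), "G": (16, 235, 16), "B": (16, 16, 235),
    "C": (16, 235, 235), "M": (235, 16, 235), "Y": (235, 235, 16),
}

for ire, rgb in grayscale.items():
    print(f"{ire:3d} IRE -> {rgb}")
for name, rgb in rgbcmy.items():
    print(f"100% {name} -> {rgb}")
```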

***IF*** Resolve’s color management (‘Use Mac display profiles’) works as expected, an equivalent verification would be to measure NOT Resolve’s output but the display itself, and then verify against the simulation profile (do NOT use ‘use simulation profile as display profile’).

2. Uncheck ‘Use Mac display profiles’ and try to create a LUT.
  So first I wanted to run a verification using a Rec.709/Rec.1886 simulation profile without any LUT. This should give bad results, since the screen doesn’t meet that recommendation and there’s no OS color management, but the results were actually quite good (file attached n3). Just to make sure, I ran another verification with the simulation profile set to Rec.709/g2.2, leaving everything else the same, and the results were also quite good (file attached n4). But they shouldn’t be, because I haven’t changed anything, and a screen that accurately displays a BT.1886 signal should not accurately display a g2.2 signal without correction. Furthermore, the results are almost identical and the measurements are equivalent. So I’m guessing I haven’t properly understood what ‘Simulation profile’ means, and that it is in fact “filtering” that profile through the ICC profile in some way, so the display’s behavior changes.
  What settings should I use to measure this properly? Should I check “Use simulation profile as display profile”?
  If that’s the case, what I would try is to profile the display through the UI color viewer and make a LUT with Rec.709 source colorspace and the Rec.1886 tone curve (with ‘Apply calibration’ checked or unchecked?). Then, with the LUT applied, I’d run the verification using a Rec.709/Rec.1886 simulation profile and check ‘Use simulation profile as display profile’. Is that right? Should I also check ‘Device link profile’?

  I’m sorry for the long post; I hope I can get some help. Thanks!!

To verify the actual output on a GUI display (i.e. a display connected through a common GPU, like the integrated display on a Mac) with a LUT3D applied, you’ll face the same issue described in the previous point: how to use DisplayCAL without changing the current VCGT while at the same time using an arbitrary “fake” display profile.
It could be done using Rec.709 as the simulation profile + ‘use simulation profile as display profile’, but this forces you to create a LUT3D with the VCGT data embedded… and that is not really what you want on a GUI display.

If I understand correctly, the VCGT is always there even when ‘Use Mac display profiles’ is unchecked. So I could create, from the same profile, one LUT with the VCGT embedded and one without, use the one with the VCGT to run the verification with Rec.709 as simulation + ‘use simulation profile as display profile’, and, if everything checks out, use the LUT without the VCGT for grading, and it should output correct colors… right? I know this is scrappy, to say the least; I’m just trying to understand the logic of it.

It is always there unless the default display profile changes and clears it, so yes, what you describe is correct. Create the two LUT3Ds (one with a valid VCGT embedded, one without), but load the one without the embedded VCGT for work.

    #139644

    carloscfrias
    Participant

    Thank you very much!
