sRGB vs 2.2


Viewing 3 posts - 16 through 18 (of 18 total)
  • #25391

    rpnfan
    Participant

    The gradation curve of the monitor is not relevant, as long as no loss of tonal values occurs due to the calibration, and as long as you have created an ICC profile that describes that exact gradation (gamma / tonal) curve and use color-management-aware programs.
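    To illustrate why the curve itself drops out (a minimal Python sketch with a made-up display gamma of 2.4 standing in for whatever the monitor actually does; this is not real ICC machinery): if the profile accurately describes the display's response, a color-managed application applies the inverse before sending values to the screen, and the two cancel.

    ```python
    import numpy as np

    def display_eotf(v, gamma=2.4):
        """Signal -> light emitted by the (hypothetical) display."""
        return np.asarray(v, dtype=float) ** gamma

    def profile_inverse(linear, gamma=2.4):
        """Desired linear light -> signal that produces it on this display.
        A color-managed app derives this from the ICC profile's measured curve."""
        return np.asarray(linear, dtype=float) ** (1.0 / gamma)

    target = np.linspace(0.0, 1.0, 11)              # linear light we want to show
    shown = display_eotf(profile_inverse(target))   # what actually comes out

    assert np.allclose(shown, target)  # the display's own curve has dropped out
    ```

    The same cancellation happens for any monotonic curve without clipping, which is the "no loss of tonal values" proviso above: only quantization or clipping in the calibration can destroy information.
    
    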

    As soon as you step outside the color-managed world, which by default is still the case in many places in Windows (or is only partly color-managed), the gradation curve will of course change the tonal response of what you see.

    You could assume the sRGB gamut and sRGB gradation as some sort of default. BUT many modern displays have a somewhat larger gamut than sRGB, and the tonal response may not match sRGB perfectly; it is often closer to gamma = 2.2. (This is a common factory calibration target: a simplified gamma = 2.2 curve without any “fancy” gain/offset.) You will only get somewhat close when preparing pictures for the “unknown” anyway, so using the sRGB gamut with either the sRGB curve or gamma = 2.2 will put you in the ballpark.
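    The two curves are close but not identical. A quick Python sketch of the standard sRGB encoding (the IEC 61966-2-1 piecewise curve) next to a plain 2.2 power curve shows where they diverge:

    ```python
    import numpy as np

    def srgb_encode(lin):
        """Linear light -> sRGB signal (IEC 61966-2-1 piecewise curve)."""
        lin = np.asarray(lin, dtype=float)
        return np.where(lin <= 0.0031308,
                        12.92 * lin,
                        1.055 * lin ** (1 / 2.4) - 0.055)

    def gamma22_encode(lin):
        """Linear light -> signal for a plain gamma = 2.2 display."""
        return np.asarray(lin, dtype=float) ** (1 / 2.2)

    lin = np.linspace(0.0, 1.0, 4096)
    diff = gamma22_encode(lin) - srgb_encode(lin)
    # The curves agree exactly at black and white but diverge near black:
    # for the same linear value the pure 2.2 curve encodes to a higher signal,
    # so a display calibrated to pure gamma 2.2 renders sRGB-encoded shadow
    # detail darker than the piecewise sRGB curve intends.
    print(f"max |difference|: {np.abs(diff).max():.4f}")
    ```

    The endpoints match exactly, and the maximum mismatch (a few percent of full scale) sits in the near-black region, which is why the choice between the two targets mostly matters for shadow rendering.
    
    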

    I have a wide-gamut screen and use one calibration with the native gamut for photo work.

    For daily usage I have another calibration to a slightly extended sRGB gamut with an sRGB curve, and one with a 2.2 gamma. I switch between the two calibrations as I like. 😉
    I seldom use a plain “sRGB gamut” + sRGB-curve calibration, as I prefer the slightly enhanced saturation of the others (+10 saturation dialed in on an Eizo display).

    #140990

    yagma
    Participant

    I don’t think most web *audiences* are looking at the sRGB tone curve.

    We all see this, or we wouldn’t bother trying to align them. So to say “…device makers made the switch…” is an odd turn of phrase. Can you say more about this?

    For example, even if devices perfectly targeted standards (which they usually don’t, for many reasons), today there are so many standards it will make your head swim: sRGB, Rec709, AdobeRGB, DCI-P3, DisplayP3, Rec2020 (yikes, this last one has to have support for a virtual gamut because no actual device can display its full gamut). Add gimmicky intermediates like BT.1886, HLG, S-Log, and on and on. Then compound it with profile formats, OS support, app support, colorimeters that can’t handle the light emitted by LEDs made from the ambergris of the White Whale… My god.

    As a further aside, this creates a true conundrum. With so much device variation, and various standards to target, how do we agree? Anyone who has spent any time on this has seen that color management cannot solve this problem. It can help you align to a standard, and characterize a transformation from one regime to another. But there are always one-way streets and barriers. In a perverse sense, the promise of color management has always been a lie: you can never achieve repeatability across all devices without a lowest common denominator. And the industry is forever improving things! Where does that leave you? Chasing after the latest stuff. Don’t get me wrong, this can be fun.

    And this makes sRGB very valuable. It’s a good cut for a lowest common denominator. Notwithstanding the Rec709 TRC for video, arrrgh! See? Pick any standard you like, and you are likely looking at some content that was made for a different assumption: choose sRGB assuming web graphics, then play a video in VLC. Look at Display P3 photos from a phone camera, etc. Maybe you shoot raw and edit in Adobe RGB to preserve all the ju-ju, then export to sRGB to post online. And so forth.

    Something I’ve never heard anyone say, ever, is “my photos just looked so drab when I exported them to sRGB.” Something I hear people say all the time is “my Adobe RGB photos look drab to other people on the web.” This is the true promise of color management: to increase the accuracy with which you make mistakes <wink>
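    That Adobe RGB-on-the-web failure mode is easy to demonstrate numerically. A sketch in Python using the standard published linear-light conversion matrices for Adobe RGB (1998) and sRGB (no ICC machinery, no tone curves): fully saturated Adobe RGB green has no in-gamut sRGB representation, so an unmanaged viewer that reads the stored numbers as if they were sRGB shows a desaturated color.

    ```python
    import numpy as np

    # Linear Adobe RGB (1998) -> XYZ (D65), and XYZ -> linear sRGB
    # (standard published matrices, rounded).
    ADOBE_TO_XYZ = np.array([[0.5767309, 0.1855540, 0.1881852],
                             [0.2973769, 0.6273491, 0.0752741],
                             [0.0270343, 0.0706872, 0.9911085]])
    XYZ_TO_SRGB = np.array([[ 3.2404542, -1.5371385, -0.4985314],
                            [-0.9692660,  1.8760108,  0.0415560],
                            [ 0.0556434, -0.2040259,  1.0572252]])

    # Fully saturated Adobe RGB green, in linear light.
    green_adobe = np.array([0.0, 1.0, 0.0])
    green_srgb = XYZ_TO_SRGB @ ADOBE_TO_XYZ @ green_adobe

    print(green_srgb)  # R and B come out negative: outside the sRGB gamut
    # A naive viewer that ignores the embedded profile just interprets the
    # 8-bit values as sRGB, effectively compressing every color toward the
    # smaller gamut -- the classic "my Adobe RGB photos look drab" effect.
    ```

    A proper export does a gamut-mapped conversion to sRGB first, which is exactly the lowest-common-denominator trade described above.
    
    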

    I envision AI color management could eventually change all of this on many devices: detecting individual images and videos on a monitor and converting each element to a color profile that is native to your display / ICC. If this were implemented at the video-driver level, like Nvidia’s new NV-HDR, that would encourage its universality.

    • This reply was modified 3 weeks, 4 days ago by yagma.
    #140997

    Old Man
    Participant

    What you describe is literally what current color management does.



Display Calibration and Characterization powered by ArgyllCMS