Calibration Recommendation


Viewing 9 posts - 1 through 9 (of 9 total)
  • #8602

    wtester7
    Participant

    Hello there,

    I am pretty new to calibration and wanted to ask some advice from the experts.

I own a factory pre-calibrated Asus PA246Q CCFL wide-gamut monitor with three different modes (Standard (biggest gamut), AdobeRGB and sRGB) and an i1 Display Pro, and I calibrate all three modes. I also sit in a room close to a window with natural sunlight (no lamps), and I use the standard settings of 6500 K, 120 cd/m² and gamma 2.2 for calibration.

Which time of day would be best for monitor calibration: 12 pm (noon), when the sunlight shines through the window, or 12 am (midnight), when there is no light source and the room is completely dark? I searched the web and didn’t find a definitive conclusion on the matter, but at night I get better average and maximum deltaE results…

Should I use the standard WGCCFLFamily_07Feb11.ccss colorimeter correction (which is also used in the i1Profiler software), or are the colorimeter corrections from the DisplayCAL database better? Here is the link:
https://colorimetercorrections.displaycal.net/?get&type=ccss&manufacturer_id=ACI&display=PA246&instrument=i1%20DisplayPro%2C%20ColorMunki%20Display%2C%20Spyder4&html=1
These corrections were made with a GretagMacbeth i1 Pro spectrophotometer and are for my monitor model, so I assume they are more accurate than the standard WGCCFLFamily_07Feb11.ccss?

I am a bit confused about Asus’s pre-calibrated factory settings, because my calibration settings are completely different. On this model I use the factory/service-menu trick to be able to change the R, G, B values; otherwise they are locked. For example:

Asus factory pre-calibration settings:
Standard Mode (biggest gamut): R 89, G 66, B 90, Brightness 50 (default)
AdobeRGB Mode: R 53, G 46, B 66, Brightness 50 (default)
sRGB Mode: R 59, G 45, B 72, Brightness 50 (default)

My settings after calibration:
Standard Mode (biggest gamut): R 63, G 67, B 80, Brightness 54
AdobeRGB Mode: R 0, G 36, B 40, Brightness 46
sRGB Mode: R 1, G 61, B 80, Brightness 34

When I calibrate, I aim for the lowest deltaE, about 0.2-0.3; my verification results are pretty good:
Measured whitepoint: 0.48 (noon), 0.2 (night)
Average deltaE: 0.34 (noon), 0.25 (night)
Maximum deltaE: 1.3 (noon), 1.17 (night)

As you can see in the example above, in sRGB mode the Asus factory pre-calibration has a red value of 59, but when I calibrate I turn the red value down to 0, with higher green and blue adjustments. What is going on? I admit that to my own eyes the overall image after my calibration is less saturated and more pale, obviously because of the reds, but I get good deltaE values! I bought my i1 Display Pro new, so I don’t think it’s broken. Also, when I measure with the Asus factory pre-calibrated settings, I get an average deltaE of 15 in the calibration measurement window! What’s going on with the Asus pre-calibrated settings, or is something wrong on my part?
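For context on where average/maximum deltaE numbers like these come from: verification compares each measured patch against its reference value in CIELAB and reports statistics over all patches. A minimal sketch, using the simple CIE76 formula (real tools often use the more complex deltaE 2000; all patch values below are invented for illustration):

```python
# Sketch: how average/maximum deltaE figures arise from a verification run.
# Uses CIE76 (Euclidean distance in CIELAB); the patch values are made up.

def delta_e_76(lab1, lab2):
    """CIE76 deltaE: Euclidean distance between two CIELAB values."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

# hypothetical reference vs. measured patches (L*, a*, b*)
reference = [(50.0, 0.0, 0.0), (60.0, 20.0, -10.0), (30.0, -15.0, 5.0)]
measured  = [(50.3, 0.2, -0.1), (59.8, 20.5, -9.6), (30.1, -14.7, 5.4)]

errors = [delta_e_76(r, m) for r, m in zip(reference, measured)]
print(f"Average deltaE: {sum(errors) / len(errors):.2f}")
print(f"Maximum deltaE: {max(errors):.2f}")
```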

Last but not least: Chrome, Vivaldi and Opera now have full color management with ICC v4, which is awesome. Firefox has a pretty cool feature (gfx.color_management.mode = 1) that automatically treats untagged images as sRGB. On wide-gamut monitors that don’t use the sRGB emulation mode, images then look normal and not oversaturated. Is this feature also available for Chromium-based web browsers? I didn’t find it on the web…

    Thank you
    wtester7


    #8628

    Florian Höch
    Administrator

    Hi,

Which time of day would be best for monitor calibration: 12 pm (noon), when the sunlight shines through the window, or 12 am (midnight), when there is no light source and the room is completely dark?

    For accurate measurements, no direct light should shine onto the monitor and the instrument. If in doubt, take calibration and profiling measurements in a dim to dark room.

Should I use the standard WGCCFLFamily_07Feb11.ccss colorimeter correction (which is also used in the i1Profiler software), or are the colorimeter corrections from the DisplayCAL database better?

When in doubt, use the generic spectral correction WGCCFLFamily_07Feb11.ccss. All corrections in the database are provided by users; they may or may not improve the absolute accuracy of your specific colorimeter on your specific display.

As you can see in the example above, in sRGB mode the Asus factory pre-calibration has a red value of 59, but when I calibrate I turn the red value down to 0, with higher green and blue adjustments. What is going on? I admit that to my own eyes the overall image after my calibration is less saturated and more pale, obviously because of the reds, but I get good deltaE values!

Without sitting in front of the monitor it’s hard to comment. But if you get expected results visually and verification shows color errors well within tolerances, that would indicate proper operation.

Chrome, Vivaldi and Opera now have full color management with ICC v4, which is awesome

Last I checked, even the latest development version of Chrome didn’t support cLUT profiles and fell back to matrix tags (if present); keep that in mind.

    #8672

    wtester7
    Participant

    Florian, thank you for your reply!

I have used the WGCCFLFamily_07Feb11.ccss correction, and after calibration I get different R, G, B values, especially in sRGB mode.

Old settings after calibration:
Standard Mode (biggest gamut): R 63, G 67, B 80, Brightness 54
sRGB Mode: R 1, G 61, B 80, Brightness 34

New settings after calibration:
Standard Mode (biggest gamut): R 64, G 67, B 79, Brightness 46
sRGB Mode: R 10, G 60, B 70, Brightness 36

The new settings look more accurate to my eyes: less blue, more red. I trust this default correction (as used in the i1Profiler software) more than the user-submitted ones. I still don’t understand the big difference from the Asus pre-calibrated factory settings, because with the original pre-calibrated R, G, B settings I get a deltaE of 15 in the calibration measurement. The only possibility I can think of right now is that the monitor’s backlight has worn out after a long time of use.

If possible, could you please tell me how these colorimeter corrections work, to get a better understanding? As far as I understand, the colorimeter has its own internal LUT, programmed according to industry standards (International Color Consortium?). The user chooses the settings in the software, for example 6500 K, 120 cd/m² and gamma 2.2. During calibration, the user-defined settings are passed to the colorimeter, and its sensor checks the monitor display’s whitepoint, white level and black level. The sensor compares the transmitted information with its internal LUT and shows the result during the calibration measurement as a deltaE accuracy value. So, are the colorimeter corrections just different LUTs? Are they passed to the colorimeter and temporarily (during calibration) overwrite its internal LUT? Is this assumption correct?

I do think my calibration is correct; I have attached my calibrated verification reports for Standard mode and sRGB mode. I assume that if I get these results, there is no way my calibration or my setup is wrong. Could you please check?

I am trying to understand the Asus engineers and this monitor model, because there is a crucial technical question.
Here is an image of my monitor’s Standard mode (biggest gamut) and sRGB mode:
http://i65.tinypic.com/29xf6z7.jpg

Why does the sRGB mode have only 87.2% gamut coverage while Standard mode has 99.5%? Why doesn’t it also have 99.5%? I could understand it with the AdobeRGB gamut, but in sRGB? What is wrong with this sRGB emulation mode? I just don’t understand why it clips its own gamut; that’s a huge 12.3% color loss! Very confusing…
In practice, for example, if I were to design a website or an image in sRGB from scratch, would the best option be to work in Standard mode because of the bigger gamut? I would have more available colors (saturation) in reserve, right? Even the latest non-wide-gamut monitors nowadays achieve nearly 100% sRGB coverage. Also, the sRGB mode wouldn’t even be accurate for browsing images on the web, since it cannot show 99.5% of the sRGB color gamut! So this sRGB mode is NOT precise; it’s useless! Am I right?

Another problem that gave me a headache for half a day is a possible bug in Chromium-based browsers (latest version):
http://i66.tinypic.com/23uy5ae.jpg

I was happy with Iron Browser, Vivaldi and Opera, as colors were accurate in sRGB (finally, color management), until I noticed bad banding and washed-out blacks, as you can see in the photo. When I save my JPEG images with an sRGB ICC profile, the banding in dark tones is clearly visible and the blacks are washed out.
Without an ICC profile it’s back to normal. In Firefox this behavior doesn’t exist. What happened at Google? What are the devs thinking? Is there no quality control?

Does this mean that the best option for me is to use Standard mode with Firefox (to prevent oversaturated colors on wide-gamut displays)?
I personally hate Firefox; it was good until version 3.5, then it went downhill due to the politics of the shareholders. When version 4.0 arrived it was just bloated and slow, and even to this day, in my various browser benchmarks, Firefox came second from the bottom (with Internet Explorer last, of course).

The best are:
1. Iron Browser, tied with Vivaldi (they are fighting for the top all the time)
2. Original Chrome browser (not good: it sends your data to Google’s servers)
3. Opera (was also a good browser; the original team is now developing Vivaldi)
4. Firefox (version 3.5 was the last good version for me)
5. Internet Explorer (always sucks, as everybody knows…)

These problems have got me pretty confused; I hope you can help me out. I would greatly appreciate your effort.
Thank you again!

    wtester7

    Attachments:
    #8681

    Florian Höch
    Administrator

    If possible, could you please tell me how these colorimeter corrections work to get a better understanding?

    It’s a 3×3 matrix that is applied to the measured XYZ values.
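A minimal sketch of what that means in practice: the correction is applied as a matrix multiplication on the raw reading. The coefficients below are invented for illustration; a real matrix is derived from reference measurements (and a .ccss file actually stores spectral samples, from which the software computes a matrix suited to the specific instrument):

```python
# Sketch of a colorimeter correction: a 3x3 matrix multiplied with the
# instrument's raw XYZ reading. Coefficients are invented, not a real
# correction for any instrument/display combination.

def apply_correction(matrix, xyz):
    """Multiply a 3x3 correction matrix with an XYZ triple."""
    return tuple(sum(row[j] * xyz[j] for j in range(3)) for row in matrix)

correction = [
    [ 1.02, -0.01, 0.00],
    [ 0.00,  0.99, 0.01],
    [-0.01,  0.00, 1.03],
]
raw = (95.0, 100.0, 108.0)  # hypothetical raw XYZ reading of a white patch
print("corrected XYZ:", apply_correction(correction, raw))
```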

As far as I understand, the colorimeter has its own internal LUT

No. Colorimeters use (physical) filters (usually red, green and blue) in front of light-sensing photodetectors. The filters are usually designed to provide a good match to the CIE 1931 2° color matching functions, so that the instrument can provide CIE XYZ readings.

During calibration, the user-defined settings are passed to the colorimeter, and its sensor checks the monitor display’s whitepoint, white level and black level. The sensor compares the transmitted information with its internal LUT and shows the result during the calibration measurement as a deltaE accuracy value. […]

    No. The instrument just measures and that’s it. All the other information is calculated from the measurements by the software.
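To illustrate: from a raw XYZ reading the software can derive, for example, the chromaticity coordinates and an approximate correlated color temperature. A sketch using McCamy's well-known approximation (the XYZ input below is roughly a D65 reference white):

```python
# Sketch: the software computes everything from raw XYZ readings, e.g.
# CIE xy chromaticity and an approximate correlated color temperature
# via McCamy's formula. Input is roughly a D65 white.

def xyz_to_xy(X, Y, Z):
    """Project XYZ to CIE xy chromaticity coordinates."""
    s = X + Y + Z
    return X / s, Y / s

def mccamy_cct(x, y):
    """Approximate correlated color temperature (McCamy's formula)."""
    n = (x - 0.3320) / (y - 0.1858)
    return -449.0 * n**3 + 3525.0 * n**2 - 6823.3 * n + 5520.33

x, y = xyz_to_xy(95.047, 100.0, 108.883)  # D65 reference white
print(f"x={x:.4f} y={y:.4f} CCT~{mccamy_cct(x, y):.0f} K")
```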

Why does the sRGB mode have only 87.2% gamut coverage while Standard mode has 99.5%? Why doesn’t it also have 99.5%?

    Apparently the monitor’s sRGB mode limits the gamut more than would be required.

    Does it mean that the best option for me is to use the standard mode with Firefox ( to prevent the oversaturated colors when using Wide-Gamut Displays ) ?

Firefox has its shortcomings, but it is the only browser (on Windows at least) that correctly supports cLUT profiles (with gfx.color_management.enablev4 set to true in about:config).

    #8689

    wtester7
    Participant

Thank you very much for the explanation; this monitor did confuse me a lot. Nonetheless, its quality is great.

One last thing: could you please check my verification reports in the attachment (sRGB_Mode.html, AdobeRGB_Mode.html)
from my previous post and see if you can find any anomalies and whether the results are okay (I personally think they are good)?
I would appreciate it, cheers!

    #8693

    wtester7
    Participant

I forgot to ask you about the Testchart option in the profiling tab. I always use auto-optimized,
but you can choose various testcharts. What does this function do? For example, what’s the difference between
auto-optimized and default? And what exactly does the number of patches do?
I have read here on the forum that the best setting is between 2000 and 3000; I am using 2000 now.
But what difference would it make if you compared 115 patches with 3000? Is it better
color accuracy when using color-managed programs?

    #8698

    Florian Höch
    Administrator

I forgot to ask you about the Testchart option in the profiling tab. I always use auto-optimized,

    You should always use auto-optimized, it will choose a suitable testchart for you.

    And what exactly does the amount of patches do?

More patches = potentially higher accuracy of the resulting profile (although computer monitors are usually relatively linear, so it is recommended to stick to the defaults). For TVs, due to their internal processing and the resulting non-linearities, a higher number is often required, but the testcharts set sensible values.
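For intuition about the patch counts discussed here: a naive testchart can be thought of as a grid of device RGB values, so the count grows cubically with the steps per channel. This is a sketch only; real charts (like the auto-optimized ones) place patches more intelligently rather than on a plain grid:

```python
# Sketch: why patch counts jump so quickly. A naive testchart is a cube
# of device RGB combinations, so patches = steps^3. Auto-optimized charts
# distribute patches more cleverly instead of using a plain grid.

def cube_chart(steps):
    """All combinations of `steps` evenly spaced levels per RGB channel."""
    levels = [i / (steps - 1) for i in range(steps)]
    return [(r, g, b) for r in levels for g in levels for b in levels]

for steps in (5, 13, 14):
    print(f"{steps} steps/channel -> {len(cube_chart(steps))} patches")
```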

    #8716

    wtester7
    Participant

OK, so yesterday I tried various calibration settings and found some interesting facts.
I thought that a display’s color coverage was purely the potential capability of its internal hardware.
For the most part that is true, but I found out that the user-defined whitepoint and gamma also
play a role.

For example, in sRGB mode, if I set whitepoint, white level and tone curve to “as measured”,
I get 1.8% more color coverage (from 87.2% to 89%). This means that via the monitor’s RGB gain settings,
the display extends or cuts the overall color coverage?! If you think about it, it seems logical.
If I set the tone curve to “sRGB” and the whitepoint to “6500K”, I get only 86%.
This means sacrificing 3% of coverage, but it is better to accept this compromise than to be inaccurate…
I guess there is no way to achieve better color coverage results?

Nonetheless, from a consumer’s standpoint this monitor model is very confusing. After calibrating all three modes
(Standard, AdobeRGB, sRGB), I finally came to the conclusion to ditch and forget the sRGB mode and use the AdobeRGB mode
for sRGB work (not using Standard mode, due to oversaturation of images on the web).
In this mode I get 95.5% sRGB coverage and 85% AdobeRGB coverage; it is still confusing why it’s not 99% AdobeRGB while in
AdobeRGB mode! And if I want to work in AdobeRGB or an even larger color space, I will use Standard mode,
which has the best results: 100% sRGB coverage and 98% AdobeRGB coverage.
What were the Asus engineers thinking?

Finally, is there a way to get the original measurement results of the Asus pre-calibrated factory settings without sacrificing and resetting my final calibrations? I want to see the color coverage in all three modes with the original Asus pre-calibrated settings, to compare them with my settings after calibration.
    This would be a great help!

    Thank you Florian!

    #8756

    Florian Höch
    Administrator

This means that via the monitor’s RGB gain settings, the display extends or cuts the overall color coverage?! If you think about it, it seems logical.

    Yes, that is to be expected.

Finally, is there a way to get the original measurement results of the Asus pre-calibrated factory settings without sacrificing and resetting my final calibrations?

    Just write down your monitor settings so you can restore them afterwards.



Display Calibration and Characterization powered by ArgyllCMS