Override minimum display update delay
This topic has 8 replies, 2 voices, and was last updated 8 years, 4 months ago by Florian Höch.
2015-12-15 at 19:53 #1478
Hi, I am trying to calibrate my laptop’s display with an i1 Display Pro. Is it possible to control the integration time? The option “Override minimum display update delay” seems relevant, but it doesn’t seem to change anything, no matter how many msec I put there. When it is measuring patches I get a message saying something like “Measured display update delay of 51 msec, using delay of 168 msec, 0 msec inst reaction”. Shouldn’t it say what I have set?
Thank you.

2015-12-16 at 15:29 #1479
Checking again more carefully, I think the option does work and it is just the message that is wrong. Is there a benefit to adding a small delay for a typical laptop LED display? I have a ColorMunki Display too, which gives equally accurate results when doing a measurement report, but visually it looks more red by the tiniest amount compared to the i1D3 profile. Could that be from measuring slower, or just instrument variation?
2015-12-16 at 21:08 #1480
Is there a benefit of adding a small delay for a typical laptop LED display?
I’ve never needed to.
Could that be from measuring slower or just instrument variation?
Probably the latter.
2015-12-17 at 22:23 #1481
Thank you. In case I wanted to use a custom integration time, how would I do it? The only explanation I have found is on the ArgyllCMS site:
“By default the integration time is adaptive, taking longer when the light level is low. This can be disabled and a fixed integration time used to gain maximum speed at the cost of greatly reduced low light accuracy, by using the -Y A flag.”
but I am not sure how to set that in dispcal.
2015-12-17 at 22:25 #1482at the cost of greatly reduced low light accuracy
You should probably heed that warning, but you can specify additional command line arguments in the “Options” menu.
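For context, this is a sketch of how that flag would look if one ran ArgyllCMS directly rather than through the GUI; the display number and output basename below are hypothetical placeholders, not values from this thread:

```shell
# Sketch only: calibrate with a fixed (non-adaptive) integration time.
# Per the ArgyllCMS docs quoted above, -Y A disables adaptive integration,
# trading low-light accuracy for speed.
# "-d 1" (display number) and "laptop" (output basename) are placeholders.
dispcal -v -d 1 -Y A laptop
```

In the GUI, the equivalent would be adding `-Y A` to the dispcal arguments in the “Options” menu dialog mentioned above.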
2015-12-18 at 9:54 #1483
I know I probably won’t gain anything. I was more thinking of setting a value that is larger than the slowest measurement time, so it doesn’t affect the low light accuracy. Sorry for the noob question, but could you please tell me what the exact argument would be, and on which line of the additional command line arguments I would put it? Thanks.
2015-12-18 at 11:06 #1484
I was more thinking of setting a value that is larger than the slowest measurement time
I wouldn’t recommend that, as the i1D3 allows for up to 20 seconds on very dark patches (black). I would recommend not worrying about the integration time; you will not get a result that’s as good as the default adaptive mode.
2015-12-21 at 12:03 #1485
Thank you Florian, and sorry for troubling you. I think the Argyll drivers use a shorter minimum integration time than X-Rite’s default, possibly giving slightly less stable readings. Also, HCFR, which uses the Argyll drivers as well, has an option for changing the minimum base integration time. That way, it shouldn’t affect the low light accuracy, since whatever extra measurement time is needed would be added on top of the new base integration time.
Could such an option be added to dispcal? (If it isn’t there already.)
Thanks again.

2015-12-21 at 19:50 #1486
I think the Argyll drivers use a shorter minimum integration time than X-Rite’s default, possibly giving slightly less stable readings.
I’ve never seen any indication of that. YMMV.
Also, HCFR, which uses the Argyll drivers as well, has an option for changing the minimum base integration time.
HCFR integrates the Argyll instrument library (instlib), so it is not using Argyll CMS like dispcalGUI does.
Could such an option be added to dispcal?
You’ll have to ask Graeme (via the ArgyllCMS mailing list or directly).