When doing a verify for a G2.2 calibration, sometimes the measurement report references an ideal flat-line gamma, and sometimes it references the cal-time measured (real) gamma.
See the two attached screenshots and their corresponding reports.
I can’t figure out the preconditions for this variance.
Per the previous dialog on the nature and value of this graph, I began to see why the graph is designed as it is once I looked more carefully at the measurements.
In this case, I kind of like the ideal reference (per the dE feature) because it helps me see the nature of the device against its cal target. But I understand why it might be more interesting to know how much the display has deviated from its cal-time performance. For sRGB, the ideal curve on the same scale would be even more interesting to me as a visualization of the distinction between the sRGB TRC and G2.2 (a quick sketch of that comparison is below).
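To make that sRGB-vs-G2.2 distinction concrete, here is a minimal sketch in plain Python/NumPy (nothing from DisplayCal or Argyll; the function names are mine) that tabulates the piecewise sRGB transfer function from IEC 61966-2-1 next to a pure 2.2 power law:

```python
import numpy as np

def srgb_eotf(v):
    # Piecewise sRGB transfer function (IEC 61966-2-1):
    # linear segment below the threshold, power segment above.
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def power_eotf(v, gamma=2.2):
    # The ideal "flat-line" G2.2 reference: a pure power law.
    return np.asarray(v, dtype=float) ** gamma

signal = np.linspace(0.0, 1.0, 11)
for s, a, b in zip(signal, srgb_eotf(signal), power_eotf(signal)):
    print(f"input {s:4.1f}:  sRGB {a:.4f}   G2.2 {b:.4f}")
```

The two curves agree at the endpoints but diverge in the shadows, where sRGB's linear toe yields more light near black than the 2.2 power law does; that divergence is exactly what an ideal-curve overlay would make visible.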
BTW, working with DisplayCal is a true enjoyment because it teaches so much about display performance. If I ever sound gripey about anything, it's unintended and I take it back. There's a true tension between canned behaviors that over-simplify complex subjects and nuanced behaviors that reward some diligence with knowledge. I much prefer the latter! And for me DisplayCal / Argyll are a treasure trove of good thinking.
It depends on the profile TRC, I mean the actual TRC data stored in the ICC profile file. It can be idealized (a neutral power law), idealized neutral (the actual TRC, but idealized to be neutral), or the actual TRC values.
This is caused by the profile type: gamma + matrix, single curve + matrix, curves + matrix, or XYZ + matrix.
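If it helps, here is a hedged sketch of that logic in Python. The type names and the selection function are illustrative, not DisplayCal internals: the point is that a gamma-only profile can only reproduce the idealized power law, while a stored curve carries the cal-time measured response.

```python
from dataclasses import dataclass
from typing import List, Union

# Illustrative stand-ins for what an ICC profile can store as its TRC;
# these are NOT the actual ICC tag structures.
@dataclass
class GammaTRC:
    gamma: float            # "gamma + matrix": one exponent, nothing else

@dataclass
class CurveTRC:
    samples: List[float]    # "single curve / curves + matrix": sampled curve

def reference_curve(trc: Union[GammaTRC, CurveTRC],
                    signal: List[float]) -> List[float]:
    """Build the verification reference from whatever TRC the profile holds."""
    if isinstance(trc, GammaTRC):
        # Only an exponent is stored, so the report can only show the
        # idealized flat-line power law.
        return [v ** trc.gamma for v in signal]
    # A sampled curve preserves the cal-time measured response;
    # linearly interpolate it at the requested signal levels.
    n = len(trc.samples) - 1
    out = []
    for v in signal:
        x = v * n
        i = min(int(x), n - 1)
        f = x - i
        out.append(trc.samples[i] * (1.0 - f) + trc.samples[i + 1] * f)
    return out
```

So whether the verify report compares against the ideal or the real curve falls out of which of these representations the chosen profile type stores.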