{ "numMessagesInTopic": 4, "nextInTime": 682, "senderId": "_HXHEtk7IWKweli7vYYPmPWSvyGds_jSyLIFYrpzHO0HSmhZAOH0GpkQe6zfkIFLLBggRWLaJPcILb2q8VRr6IxvrXeEKqJvkrsDKKGFNw", "systemMessage": false, "subject": "Re: Card dependence of color calibration?", "from": ""fluppeteer" <yahoo@...>", "authorName": "fluppeteer", "msgSnippet": "Hi Yeang, This is a can of worms which I ve opened in the past. I was waiting until I ve got my PCI-e system up before I try fixing it properly, but I ll tell", "msgId": 681, "profile": "fluppeteer", "topicId": 680, "spamInfo": { "reason": "12", "isSpam": false }, "replyTo": "LIST", "userId": 192443393, "messageBody": "
--- In IBM_T2X_LCD@yahoogroups.com, "yeangchng" <yeang_chng@h...> wrote:
>
> Yet another question regarding color....
>
> Let's say you calibrate your T221 with i1 Display, and load the
> calibrated ICC profile to the monitor via USB. Now you uninstall the
> IBM color management utility (which prevents the graphics card from
> imposing its own 8-bit LUT thus convolving both graphics card LUT
> and monitor 10-bit LUT). And you uninstall the ICC profile from the
> Windows control panel. This is my current setup.

Okay, we've got multiple stages here. I last looked at this around
Christmas, so my memory may be a bit flaky, but we have (reading
bottom-up, from the application to the panel):

T221 TFT levels
[T221 internal LUT/dithering]
DVI signals
[graphics card LUT]
values in frame buffer
[ICC colour profile management/rendering intent]
colour application wants to display (in your preferred working space)
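
(To make the ordering concrete, here's a toy Python sketch of that
chain - the function and stage names are mine, not anything from IBM
or Microsoft; each stage is just a mapping applied to the previous
stage's output:)

    def display_pipeline(app_colour, cmm, card_lut, monitor_lut):
        fb = cmm(app_colour)        # ICC profile / rendering intent
        dvi = card_lut(fb)          # graphics card LUT (8-bit in/out)
        tft = monitor_lut(dvi)      # T221 internal LUT + dithering
        return tft                  # what the TFTs actually get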

The colour management/rendering intent bit may do nothing if
your application isn't colour profile aware.

I'll interject my problem with this at this point, before
getting back to what you're trying to do...

If you're using a colour-managed application, it shouldn't
be necessary to have anything other than a linear conversion
in the graphics card LUT or the T221's internal LUT. The T221
only has 8-bit TFTs, and imposes a fixed (I believe) dither
over them; some other TFTs do similarly (Eizo's posh ones,
I think) - the clue is that you get 1020 values, not 1024,
from a "10-bit" input. Some implementations alternate the
dither between frames, which can help a bit, but otherwise
any dithering imposed by the application can clash with it.
99.9% of the time this doesn't happen and the internal LUT is
a good solution - and it's the way forward for apps that
aren't colour-management aware - but you lose the ability to
represent some colours as constant (undithered) areas, and
CM-aware apps ought to be able to dither for themselves
anyway. I'm told Photoshop can, although I've not really had
the chance to prove it yet.
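
(A quick way to see where a number like that comes from - this is my
guess at the mechanism, a 2x2 spatial dither on an 8-bit panel, not
IBM's documented behaviour:)

    # Each 10-bit value v splits into an 8-bit base level plus a
    # fraction, shown by bumping (v mod 4) of the 4 pixels in a
    # block up one level; the top of the range clips at 255.
    def block_average(v10):
        base, frac = divmod(v10, 4)
        pixels = [min(base + (1 if i < frac else 0), 255)
                  for i in range(4)]
        return sum(pixels) / 4.0

    distinct = {block_average(v) for v in range(1024)}
    print(len(distinct))   # ~1021 in this toy model: the last few
                           # codes clip, so you get nowhere near 1024

(The exact count depends on how the top of the range is handled; the
point is that it falls a little short of 1024.)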

More important, to my mind, is the LUT on the graphics card.
This simply maps 8-bit input colours (unless you're in
something like A2R10G10B10 mode, which almost nothing uses)
to the 8-bit output for the DVI (again, unless you've got a
very odd monitor). On a CRT you *do* want this conversion -
the RAMDAC on modern cards is usually 10(ish)-bit, so each
input colour will usually map to a unique output colour,
more linearly distributed throughout the colour range. On
DVI, anything other than a linear mapping (okay, or a
permutation, for the pedants out there) will result in a
reduction in the total number of colours available. I put a
number of long posts in the Adobe colour management forums
seeking clarification on this topic, and my conclusion was
that I'm not the only one who got confused (a lot of people
couldn't understand why I didn't want to calibrate - as
opposed to profile - my monitor, and I never got a clear
reason why I might be talking nonsense).
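
(The colour-loss claim is easy to check in Python - the gamma value
here is an arbitrary example, not anything a real calibration would
produce:)

    # A monotone 8-bit -> 8-bit LUT, e.g. a mild gamma tweak.
    lut = [round(255 * (v / 255.0) ** (1 / 1.2)) for v in range(256)]
    print(len(set(lut)))   # < 256: some inputs collapse onto the
                           # same output code, so fewer distinct
                           # colours ever reach the DVI link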

Chances are the LUT on the graphics card handles colour
channels independently, by the way - it effectively provides
a separate transfer curve for red, green and blue (an
arbitrary function, not just a gamma curve). There's no
conversion between the channels, so you can't have the
amount of red in the input influencing the green level, and
what's pure blue in the frame buffer can never be anything
other than pure blue (albeit possibly of a different
intensity) on the monitor - you can't compensate for the
monitor's phosphors not matching the channels you want. The
same, obviously, is true of any analogue adjustments on a
CRT (which just fiddle with the gain of the electron guns,
whether by explicit levels or colour temperature). I presume
the same is true of the LUT on the T221, although it may be
cleverer. The CMM of a colour-management aware application
*can* do arbitrary colour mappings (between your preferred
colour space's idea of blue and your monitor's idea of it),
but obviously it's slower.
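
(In sketch form - toy code of mine, with a 3x3 matrix standing in
for whatever model the CMM actually uses:)

    def card_lut(rgb, luts):
        # per-channel: each output depends only on its own input
        r, g, b = rgb
        return (luts[0][r], luts[1][g], luts[2][b])

    def cmm_mix(rgb, m):
        # a CMM can mix channels, so the monitor's "blue" can be
        # corrected using the red and green inputs too
        return tuple(sum(m[i][j] * rgb[j] for j in range(3))
                     for i in range(3))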

Due to the cunning way in which the LUT is encoded in the
ICC profile, these stages aren't as independent as they
should be. The Eye-One (I have one too) actually won't
profile the monitor without also trying to calibrate it,
unless you get the posh version of their software.
BasICColor (third-party software) *will* do this, and is
much cheaper, if anyone is in the same situation as me.
Everything struggles a bit with the fact that I run my T221
at low brightness.

In spite of having read the ICC spec on this, I'm still
somewhat hazy on whether it's possible to specify absolute
colorimetric rendering and get the colour displayed on the
monitor to be a match for the print output, scaled by
monitor brightness; I'll worry about out-of-gamut colours
when I hit them (I'd rather avoid them or specifically
switch to perceptual intent), but I've never been given a
good explanation of why working in absolute colorimetric is
considered a bad thing by the CM community. Anyway, that's
getting off-topic.

A problem with Windows is that, because multiple displays
allow windows to be blitted between them without any colour
conversion taking place, only one ICC profile gets used by
a CMM at a time. This means you can set up Photoshop to know
about one of your monitors, but you have to calibrate the
other to match first. If you switch, you have to calibrate
the other way. Macs, I'm told, treat the two displays as
independent and therefore they have separate profiles (at
the cost of not being able to span both with one window).
At least different video heads get loaded with different
LUTs, so a token effort is made at calibrating two displays.

Since I bought a CRT partly to compensate for the gamut
limits of the T221, calibrating it back to the same
boundaries is a bit of a pain...

Anyway, back to *your* problems. It sounds like the IBM
utility reads the colour conversion that would go in the
graphics card LUT and implements it (better) on the monitor,
then kills the card LUT (if I've understood what you said
correctly). I only tried it briefly, because I don't intend
to use it much, but that sounds sensible.
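
("Killing" the card LUT just means loading a linear ramp into it.
For the curious, a minimal Python/ctypes sketch using the Win32 call
for this - error handling omitted, and whether the IBM utility uses
this exact call is my assumption:)

    import ctypes

    # three 256-entry 16-bit ramps, one per channel, linear
    ramp = (ctypes.c_ushort * 256 * 3)()
    for ch in range(3):
        for i in range(256):
            ramp[ch][i] = i * 257           # 0..65535
    hdc = ctypes.windll.user32.GetDC(None)  # whole-screen DC
    ctypes.windll.gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))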

Uninstalling the ICC profile is something you only want to
do if you never use CM-aware applications; otherwise their
internal CMMs can't do a proper conversion when writing
values to the frame buffer. Calibration, and the LUT
loading, will determine what those values look like - but
only within a limited amount of adjustment, as I said above.
The CMM will use the profile to match desired colours to the
best frame buffer value available according to the rendering
intent, and since the calibration may be of arbitrary
quality, you still won't get very close to the colours you
want unless you keep the profile. If, however, you never use
a CM-aware app (and you only want to get your web pages to
look like proper sRGB) then there shouldn't be any harm in
killing the profile once the LUT is on the T221.
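
(What "match desired colours to the best frame buffer value" amounts
to, in a grossly simplified sketch - the forward model and distance
metric are placeholders; a real CMM works from the profile's tables
and a perceptually uniform colour difference:)

    def distance(a, b):
        # naive Euclidean stand-in for a real colour difference
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def best_fb_value(target, profile_forward, candidates):
        # pick the frame-buffer value whose predicted on-screen
        # colour lands nearest the colour the application wanted
        return min(candidates,
                   key=lambda fb: distance(profile_forward(fb), target))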

> So now you have a monitor that has a calibrated LUT loaded, but no
> color management at all in Windows. As long as you stay on the same
> graphics card that you initially calibrated the T221 on, you should
> be fine. However, what happens when you move the monitor on to
> another graphics card (say, ATI FireGL to Nvidia Quadro)?

It's a DVI link - because it's digital, the way the frame buffer
reaches the T221 should be the same regardless of what card you
use (when there's 255,255,255 in the frame buffer, and the LUT
isn't getting in the way, the monitor should see 255,255,255 and
represent it however it can - it won't see some arbitrary analogue
signal which should be interpreted as white). It would be different
if you were using an analogue connection, because then the RAMDAC
differences are significant.

> The new graphics card would still not be color managed under
> windows. However, if the unprofiled color output of the Quadro is
> slightly different from the FireGL, then the calibrated LUT in the
> T221 will be slightly off.

True for analogue, but not for DVI-D. It's good that you thought
to worry about it, but there's no need in this case. :-)

> Anyone know what color variances there are between different
> graphics cards using the DVI outputs (keeping the monitor constant)?
> I'm sure that on analog it will be a complete crapshoot between
> different cards!

Indeed. Mind you, I'm only saying this as my understanding of
what happens; if anyone knows a reason why they *should* differ,
please say (and I'll be very worried)!

Er. That answer turned out to be a lot shorter than my diversionary
discussion on the subject. Oops. Sorry.

--
Fluppeteer