What I do not understand:
With analogue signalling (VGA etc.) you may be able to tweak (calibrate) the colour response with a LUT in the graphics card, within a certain range, almost without losing anything (except perhaps some gamut), of course only as long as a sufficient SNR is maintained.
With digital interfaces (HDMI etc.) you will almost always lose information by tweaking (or calibrating) the colour response in the graphics card, assuming the bit depth is limited. If 8 bits per colour channel are available, the input is 8 bit, and a LUT sits in between, I may lose several code values (effectively bits of tonal resolution) because of the nonlinear response. Having the LUT in the monitor instead, just before the D/A conversion, where it can run at a much higher internal resolution, again makes perfect sense.
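To make the quantisation argument concrete, here is a minimal sketch (Python/NumPy) with a purely illustrative correction curve I made up (a mild gamma tweak, not any real calibration). It shows that once a non-identity curve is rounded back to 8 bits, the 256 input codes collapse onto fewer than 256 distinct output codes:

```python
import numpy as np

# All 8-bit input codes 0..255
levels = np.arange(256)

# Hypothetical correction: shift effective gamma from 2.2 to 2.4,
# i.e. remap x -> x^(2.4/2.2) in normalised space (illustrative only)
gamma_ratio = 2.4 / 2.2
lut = np.round(255.0 * (levels / 255.0) ** gamma_ratio).astype(np.uint8)

# Count how many distinct output codes survive the 8-bit rounding.
# Any value below 256 means some input levels now map to the same
# output level, i.e. tonal resolution has been lost.
unique_out = len(np.unique(lut))
print(f"distinct output codes: {unique_out} of 256")
```

The steeper or more aggressive the correction curve, the more codes merge, which is exactly why the same curve applied in a higher-bit-depth LUT inside the monitor costs so much less.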
Doesn't that mean that calibration is counterproductive for all 8-bit-per-channel interfaces when it is done in the graphics card with digital output, since it not only loses gamut but also colour resolution? Where is my error?