I came upon a page on Gary Ballard’s website that quotes someone writing several years ago on an Apple forum about the suitability of wide gamut monitors for photo editing:
A wide gamut LCD display is not a good thing for most (95%) of high end users. The data that leaves your graphics card and travels over the DVI cable is 8 bits per component.
You can’t change this.
The OS, ICC CMMs, the graphics card, the DVI spec, and Photoshop will all have to be upgraded before this changes, and that's going to take a while.
What does this mean to you?
It means that the same 256 steps per channel are stretched across a larger gamut, so when you send RGB data to a wide gamut display the colorimetric distance between any two adjacent values is much larger.
As an example, let's say you have two adjacent color patches: one is 230,240,200 and the patch next to it is 230,241,200.
On a standard LCD or CRT those two colors may be around 0.8 Delta E apart. On an Adobe RGB display those colors might be 2 Delta E apart; on an ECI RGB display this could be as high as 4 Delta E.
It's very nice to be able to display all kinds of saturated colors you may never use in your photographs; however, if the smallest adjustment you can make to a skin tone is 4 Delta E, you will become very frustrated very quickly.
The argument sounds reasonable to me, but I was wondering if it still applies to current hardware.
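To make the quoted numbers concrete, here is a minimal sketch, in plain Python with no dependencies, that redoes the calculation: it converts each 8-bit patch to CIELAB through a given display color space and reports the CIE76 Delta E between the pair. The matrices and transfer functions are the published sRGB and Adobe RGB (1998) values; treating the quoted "standard LCD or CRT" as sRGB is my assumption, and I've left ECI RGB out.

```python
# Sketch: CIE76 Delta E between two adjacent 8-bit patches, computed
# through two display color spaces. Matrices are the published D65
# RGB-to-XYZ matrices for sRGB and Adobe RGB (1998).

def srgb_to_linear(c):
    # sRGB piecewise transfer function
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def adobe_to_linear(c):
    # Adobe RGB (1998) uses a pure power curve, gamma = 563/256
    return c ** (563 / 256)

SRGB_M = [(0.4124564, 0.3575761, 0.1804375),
          (0.2126729, 0.7151522, 0.0721750),
          (0.0193339, 0.1191920, 0.9503041)]

ADOBE_M = [(0.5767309, 0.1855540, 0.1881852),
           (0.2973769, 0.6273491, 0.0752741),
           (0.0270343, 0.0706872, 0.9911085)]

def to_xyz(rgb8, matrix, to_linear):
    lin = [to_linear(c / 255) for c in rgb8]
    return [sum(m * v for m, v in zip(row, lin)) for row in matrix]

def xyz_to_lab(xyz, white=(0.95047, 1.0, 1.08883)):  # D65 white point
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(c / w) for c, w in zip(xyz, white))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def delta_e76(lab1, lab2):
    # Euclidean distance in CIELAB
    return sum((p - q) ** 2 for p, q in zip(lab1, lab2)) ** 0.5

patch1, patch2 = (230, 240, 200), (230, 241, 200)  # the quoted example
for name, matrix, tf in [("sRGB", SRGB_M, srgb_to_linear),
                         ("Adobe RGB", ADOBE_M, adobe_to_linear)]:
    labs = [xyz_to_lab(to_xyz(p, matrix, tf)) for p in (patch1, patch2)]
    print(f"{name}: Delta E = {delta_e76(*labs):.2f}")
```

Note that CIE76 is the simplest Delta E formula, and the forum post doesn't say which variant it used, so the absolute numbers may not match the quote; what matters for the argument is how the one-step distance grows as the gamut widens.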