What criteria for a good monitor? Any usable monitor database?

@sjjh & all other interested forumers:

There is another important aspect in the Noble Art of Picking a Monitor:

  • How strict is the manufacturer’s monitor pixel policy?
    (Probably more depressing than what you would hope.)

Have fun!
Claes in Lund, Sweden

1 Like

For the price, there is no 10-bit panel on the market, AFAIK.
I don't know how you could verify that yourself; all we have to go on are the specifications.
I hope it's ok to post links; if not, please let me know.

Well, there is no mention of FRC on the European BenQ website; in fact, it is advertised as true 10-bit.
I have also briefly used the screen in 10-bit mode and it seems to work.

You may also want to check whether the monitor uses PWM for backlight dimming. It may strain your eyes.

Just a suggestion: it would be nice if someone made a wiki post summarizing and explaining the features (in words or links) to look out for in a good monitor. :wink:

Regarding the 10-bit debate, has it been shown that the human eye can tell the difference between having 2^24 vs 2^30 colors?

I bet most people don’t have good eyesight and even if they did no one or two eyes have all of the features. :laughing:

1 Like

This is a good resource… https://www.youtube.com/c/ArtIsRight/videos

Oh yes, indeed… :thinking:

It acts as a real 10-bit monitor; that's FRC. And it's not that important. Regarding black, I'm dreaming of an OLED display.
Happy editing!

As I understood color science (I didn't!), the difference between sRGB and AdobeRGB is noticeable, and AdobeRGB is a 10-bit color space.
Please correct me…

You can see the posterization effect on many photos which should have a smooth color gradient (like a sky). It can be reproduced by creating shallow gradients of one color in a bitmap editor.
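
As a quick way to reproduce this without a bitmap editor, here is a minimal sketch in Python (standard library only; the 64–96 range and the file name are arbitrary choices of mine) that writes a shallow grayscale gradient as a plain PGM file you can open in most image viewers:

```python
# Write a shallow grayscale gradient that makes 8-bit banding easy to spot.
# Only 33 of the 256 possible 8-bit levels (64..96) are spread across the
# full width, so each level forms a wide vertical band.

WIDTH, HEIGHT = 1024, 256

row = [round(64 + 32 * x / (WIDTH - 1)) for x in range(WIDTH)]

# Plain-text PGM (P2): header, then one line of gray values per pixel row.
with open("shallow_gradient.pgm", "w") as f:
    f.write(f"P2\n{WIDTH} {HEIGHT}\n255\n")
    for _ in range(HEIGHT):
        f.write(" ".join(map(str, row)) + "\n")
```

If the result shows distinct vertical bands rather than a smooth ramp, that is posterization; widening the 64–96 range makes the bands correspondingly harder to see.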

If I understand correctly, this source (point #15) suggests 460 shades (non-linear) or 255 shades (linear) are needed to see perfect gradients. (Note: this is shades, thus 2^8 = 256 and 2^9 = 512. If we were talking about total colours, I've read the human eye can see approximately 10 million colours. Given 2^24 = 16.8 million, 8 bit would be more than enough, but based on this link at least, it's not as simple as looking at the total.) As monitors have a gamma, thus a non-linearity (of approx. 2.2, depending on the make), 9 bit would be the minimum required: Gamma FAQ - Frequently Asked Questions about Gamma
However, most real world scenes are not perfect gradients, thus 8 bit is fine in the vast majority of circumstances.
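
For what it's worth, the ~460 figure can be reproduced with a small back-of-the-envelope calculation: assuming (as a rough rule of thumb, not a claim taken from the linked FAQ) that adjacent shades must differ by less than about 1% in luminance over a 100:1 display contrast range, the required number of steps is log(100)/log(1.01):

```python
import math

# Rough Weber-law estimate: adjacent shades must differ by < ~1% in
# luminance, over a display contrast range of 100:1 (both are assumptions).
contrast_ratio = 100
threshold = 0.01

steps = math.log(contrast_ratio) / math.log(1 + threshold)
bits = math.ceil(math.log2(steps))
print(f"{steps:.0f} steps -> {bits} bits")  # 463 steps -> 9 bits
```

That lands right around the 460-shade figure, and 463 distinct steps needs 9 bits of code values, matching the "9 bit minimum" conclusion above.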

The primary benefit of 10 bit is knowing that any banding you see is in the image, not due to monitor limitations. If your camera raw is more than 8 bits (probably all modern cameras are), then there shouldn't be banding in the image, and if you introduce banding in editing, it should be easy to spot by turning the module on/off. If you are scanning, scan at 16 bit. So a 10 bit monitor is a luxury you probably don't need.

Mathematically, 10 bit will introduce fewer rounding errors than 8 bit, but the visual difference would probably be minor - I say probably, because I've never actually seen a true 10 bit monitor.
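
One way to see those rounding errors without a 10-bit monitor is to simulate them. This sketch (the 1.07 gain and 20 passes are arbitrary assumptions, not tied to any particular editor) applies a small exposure-like gain and its inverse repeatedly, quantizing to the given bit depth after every step, and measures how far the value drifts from where it started:

```python
def roundtrip(value, bits, passes=20, gain=1.07):
    """Apply a gain and its inverse `passes` times, quantizing each result
    to `bits` of precision, and return the final (drifted) value."""
    scale = 2 ** bits - 1
    for _ in range(passes):
        value = round(value * gain * scale) / scale   # edit, then quantize
        value = round(value / gain * scale) / scale   # undo, then quantize
    return value

for bits in (8, 10, 16):
    drift = abs(roundtrip(0.5, bits) - 0.5)
    print(f"{bits}-bit drift after 20 round trips: {drift:.6f}")
```

The drift shrinks as the bit depth grows, which is the sense in which more bits give "less rounding error" - whether the 8-bit drift is actually visible is a separate question.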

Yes, there is a clear difference between the two spaces; however, no space is defined by a particular bit depth. Instead, however many bits you have get spread across the whole space. Editing 8 bits in a large space (like ProPhoto RGB) gives you less wiggle room, and is thus more likely to introduce artifacts like banding than editing in 16 bit, because there aren't as many bits to spread around. Nowadays, unless you are editing JPEGs (which should be avoided), there aren't many reasons to edit in 8 bit - only things like saving space, using old filters that only work in 8 bit, or using intensive processes that run faster in 8 bit.
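
As a toy illustration of "spreading the bits around" (treating gamut size as a simple 1-D range, and using a 2x width factor as a stand-in for a wider space like ProPhoto RGB - both simplifying assumptions on my part):

```python
def step_inside_srgb(bits, gamut_scale):
    """Quantization step inside an sRGB-sized sub-range, when `bits` worth
    of code values have to cover a space `gamut_scale` times wider."""
    codes_in_srgb = (2 ** bits) / gamut_scale
    return 1.0 / codes_in_srgb  # step size relative to the sRGB range

for bits in (8, 16):
    for scale in (1.0, 2.0):
        step = step_inside_srgb(bits, scale)
        print(f"{bits}-bit, gamut x{scale:.0f}: step = {step:.8f}")
```

Doubling the space doubles the step size at a fixed bit depth (coarser steps, more banding risk), while going from 8 to 16 bits shrinks the step by a factor of 256 - which is why a big working space wants a big bit depth.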

4 Likes

That’s a good point.

2 Likes

Sorry, I was a little busy the last few days. Thanks to everybody who posted additional criteria to look out for. I'll update the initial post again. edit Apparently, I cannot edit it anymore, so I will add the complete criteria list at the bottom of this post.

So I'll add hardware calibration to the list. As far as I understand, there are two possibilities for hardware calibration:

  1. calibration device and software are built into the monitor
  2. external calibration device and software running on the computer

The second solution will probably not work under Linux, as the software is most probably only available for Windows and Mac.

Thanks, will add that to the list.

I'd hope that for the high-quality monitors we're looking at here, this won't play such a big role, but I'll add it to the list. :slight_smile: What would be a reasonable threshold?

Okay, will add that as well.

If I find the function to change my initial post to a wiki-post, I’ll do so. If not, maybe someone else can help out. :slight_smile:

Sounds logical to me.

Okay, will note that down.

I'm still looking for a good searchable/filterable database where I can input all the relevant criteria and get a list of matching monitors to compare. For example, https://www.newegg.com offers nearly no relevant criteria to filter by; https://www.bhphotovideo.com allows filtering by color gamut, but not by color uniformity; https://www.prad.de/ seems to have no filter function at all.

Any additional relevant information to choose a good monitor is welcome. :slight_smile:


List of criteria (as I cannot edit my initial post anymore)

  • screen size / monitor resolution: 24"-27" with 2K or >=32" with 4K
    (27" with 4K needs fractional scaling, that doesn’t work well under Linux yet)
  • panel type: IPS panel
  • color depth: 8bit is enough, 10bit is luxury
    (if your camera captures >8 bit and you can turn image processing modules on/off to inspect the reason for banding) (10 bits might still cause problems under Linux)
  • color gamut: >= 100% sRGB and >= 100% AdobeRGB
  • color uniformity: < 2 Delta-E
    (<=1 → not perceptible by the human eye; 1-2 → perceptible through close observation)
  • luminance uniformity: <5% max. deviation
  • monitor pixel policy: ?
  • PWM for backlight: No
  • calibration: hardware calibration
    (software running on monitor, or available for Linux)

You might also want to look at the NEC Spectraview line. I had a 27" in daily use for 10 years, and it served me well. The calibrator died, but I was able to use a supported third-party device with no problem.

I’ve since moved on to a Dell Ultrasharp 31" 2560x1600 which I calibrate on linux using displaycal and the ColorSpyder 5 I bought when the NEC calibrator died. I’m happy with the size and resolution upgrade. I bought the Dell used locally, from a designer for $100 on CraigsList. I’ve had it for a year and a half, still works fine.

Luckily, it does work under Linux :slight_smile: Look for the DisplayCAL software.

Unfortunately, even “high quality manufacturers” have problems making their monitors 100% free from errors. Threshold? To me, only 0 defects would be acceptable. Luckily, most manufacturers publish their policies (and most of these policies are, to me, not acceptable).

Have fun!
Claes in Lund, Sweden

DisplayCAL does software calibration. For hardware calibration, I think Eizo has software for Linux; BenQ is Windows-only…

1 Like

10 bits is supported under Linux/NVIDIA.

AFAIK only Photoshop, Krita, and vkdt can do 10 bits.

On Linux, go to NVIDIA X Server Settings → X Server Display Configuration; in the layout, Ctrl+left-click on the 10-bit-capable screen, and below it you can choose the color depth. Works with darktable, GIMP, Hugin, XnView, etc. Doesn't work with CUPS, Gutenprint, TurboPrint, VueScan, etc.
On Linux, go to nvidia x server settings/x servers display configuration, in layout, ctrl+left mouse on the screen 10bits capable & under, you can choose in color depth. Works with Dt, Gimp, Hugin, Xnview, etc. Doesn’t work with Cups, Gutenprint, Turboprint, Vuescan, etc