How can you prove this? Have you switched to 10bit?
SW240 Specifications
Specifications of BenQ SW240 24-inch AdobeRGB Photographer Monitor
@sjjh & all other interested forumers:
There is another important aspect in the Noble Art of Picking a Monitor:
Have fun!
Claes in Lund, Sweden
Technical data and specifications of the BenQ SW240. Display: 24.1 in, AHVA IPS, W-LED, 1920 x 1200 pixels, viewing angles (H/V): 178°/178°, brightness: 250 cd/m², contrast: 1000:1, dynamic contrast: 20,000,000:1, refresh rate: 50…
BenQ PhotoVue SW240: Accurate AdobeRGB Color for Under $400. Conclusion: Excellent Color & Good Backlight in a Simple, Professional Package. Color 9.5, Screen Surface 9.5, Backlight 8.5, Design 7, Value 7; overall 8.3, Very Nice. BenQ SW240 Review: Excellent Color. I’ve spent…
For the price, there is no 10-bit panel on the market, AFAIK.
I don’t know how you could prove it yourself; I’m just going by the specifications.
I hope it’s OK to post links; if not, please let me know.
Well, there is no mention of FRC on the European BenQ website; in fact, it is advertised as true 10-bit.
I have also briefly used the screen in 10bit mode and it seems to work.
You may also check whether the monitor uses PWM for backlight dimming; PWM flicker may strain your eyes.
Just a suggestion: it would be nice if someone made a wiki post to summarize and explain the features (in words or links) to look out for in a good monitor.
Regarding the 10-bit debate, has it been shown that the human eye can tell the difference between having 2^24 vs 2^30 colors?
I bet most people don’t have good eyesight, and even if they did, no single pair of eyes has all of these capabilities.
Well, there is no mention of FRC on the European BenQ website; in fact, it is advertised as true 10-bit.
Oh yes, indeed…
I have also briefly used the screen in 10bit mode and it seems to work.
It acts as a true 10-bit monitor; that is what FRC does. And it’s not that important. As for black levels, I’m dreaming of an OLED display.
Happy editing!
As I understood colour science (I didn’t!), the difference between sRGB and AdobeRGB is noticeable, and AdobeRGB is a 10-bit colour space.
Please correct me…
Regarding the 10-bit debate, has it been shown that the human eye can tell the difference between having 2^24 vs 2^30 colors?
You can see the posterization effect in many photos that (should) have a smooth colour gradient (like a sky). It can be reproduced by creating shallow gradients of a single colour in a bitmap editor.
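If you want to reproduce it without a bitmap editor, here is a minimal sketch in plain Python (standard library only; the `gradient.ppm` filename is arbitrary) that writes a shallow grey gradient as a PPM file. Viewed full-screen on an 8-bit panel, the ~30 discrete steps tend to show up as visible bands:

```python
# Write a shallow horizontal grey gradient (values 100..130 over 1024
# pixels) as a binary PPM file; the narrow value range exaggerates banding.
WIDTH, HEIGHT = 1024, 256
LO, HI = 100, 130

row = bytearray()
for x in range(WIDTH):
    v = LO + (HI - LO) * x // (WIDTH - 1)  # quantized 8-bit grey value
    row += bytes((v, v, v))                # one grey pixel (R = G = B)

with open("gradient.ppm", "wb") as f:
    f.write(b"P6 %d %d 255\n" % (WIDTH, HEIGHT))
    for _ in range(HEIGHT):
        f.write(row)
```

Most image viewers (and GIMP) open PPM files directly.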
If I understand, this source (point #15) suggests 460 shades (non-linear) or 255 shades (linear) are needed to see perfect gradients. (Note: this is shades, thus 2^8 = 256 and 2^9 = 512. If we were talking about total colours, I’ve read the human eye can see approximately 10 million colours. Given 2^24 = 16.8 million, 8-bit would be more than enough, but based on this link at least, it’s not as simple as looking at the total.) As monitors have a gamma, thus a non-linearity (of approximately 2.2, depending on the make), 9 bits would be the minimum required: Gamma FAQ - Frequently Asked Questions about Gamma
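The gamma argument can be checked numerically. A small sketch, assuming a plain 2.2 power law (real display curves are only approximately this): adjacent 8-bit code values near black are several percent apart in linear luminance, while near white they are well under 1% apart, which is why matching an 8-bit gamma-encoded signal takes roughly 9–10 linear bits:

```python
GAMMA = 2.2  # assumed simple power-law display gamma

def lum(code):
    """Linear relative luminance of an 8-bit gamma-encoded code value."""
    return (code / 255.0) ** GAMMA

# Relative size of one code-value step at the dark, middle and bright end
for lo, hi in ((25, 26), (127, 128), (254, 255)):
    step = lum(hi) - lum(lo)
    print(f"codes {lo}->{hi}: step = {step:.6f} "
          f"({step / lum(hi) * 100:.2f}% of the brighter level)")
```

Near black the step is an ~8% luminance jump; near white it is under 1%.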
However, most real world scenes are not perfect gradients, thus 8 bit is fine in the vast majority of circumstances.
The primary benefit of 10-bit is knowing that any banding you see is in the image, not due to monitor limitations. If your camera raw has more than 8 bits (probably all modern cameras do), then there shouldn’t be banding in the image, and if you introduce banding in editing, it should be easy to spot by turning the module on/off. If you are scanning, scan at 16-bit. So a 10-bit monitor is a luxury you probably don’t need.
Perhaps mathematically, 10-bit will introduce fewer rounding errors than 8-bit, but the visual difference would probably be minor - I say probably, because I’ve never actually seen a true 10-bit monitor.
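The rounding-error point is easy to quantify. A quick sketch: quantizing the same values to 8 and to 10 bits shows the worst-case error shrinking by a factor of four, which is real but small in absolute terms:

```python
def quantize(x, bits):
    """Round x in [0, 1] to the nearest representable level at the given bit depth."""
    levels = (1 << bits) - 1
    return round(x * levels) / levels

samples = [i / 9999 for i in range(10000)]
err8 = max(abs(x - quantize(x, 8)) for x in samples)
err10 = max(abs(x - quantize(x, 10)) for x in samples)
print(f"max error  8-bit: {err8:.6f}")   # about 1/(2*255)
print(f"max error 10-bit: {err10:.6f}")  # about 1/(2*1023)
```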
As I understood colorscience(I didn’t!), the difference between SRGB versus AdobeRGB is noticable, and AdobeRGB is a 10 bit colorspace.
Please correct me…
Yes, there is a clear difference between the two spaces; however, no space is defined by a particular bit depth. Instead, however many bits you have get spread across the whole space. Editing 8-bit in a large space (like ProPhoto RGB) gives you less wiggle room, and is thus more likely to introduce artifacts like banding than editing 16-bit, because there aren’t as many bits to spread around. Nowadays, unless you are editing JPEGs (which should be avoided), there aren’t many reasons to edit in 8-bit - only things like saving space, using old filters that only work in 8-bit, or using intensive processes that run faster in 8-bit.
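This "wiggle room" effect can be simulated. A sketch (the gamma-3 curve here is an arbitrary stand-in for a heavy edit, not any particular tool's operation): apply a strong tone curve and then undo it, quantizing to the working bit depth at each step, and count how many of the original 256 levels survive:

```python
def roundtrip(bits, gamma=3.0, n=256):
    """Apply a strong curve and its inverse at the given working bit depth;
    return how many distinct 8-bit output levels survive."""
    levels = (1 << bits) - 1
    survivors = set()
    for i in range(n):
        x = i / (n - 1)
        y = round(x ** gamma * levels) / levels        # the "edit", quantized
        z = round(y ** (1 / gamma) * levels) / levels  # undo it, quantized
        survivors.add(round(z * 255))                  # final display value
    return len(survivors)

print(f" 8-bit pipeline keeps {roundtrip(8)} of 256 levels")
print(f"16-bit pipeline keeps {roundtrip(16)} of 256 levels")
```

The 8-bit pipeline collapses a large fraction of the levels (posterization, mostly in the shadows), while the 16-bit one keeps nearly all of them.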
The primary benefit of 10-bit is knowing that any banding you see is in the image, not due to monitor limitations.
That’s a good point.
Sorry, I was a little busy the last few days. Thanks to everybody who posted additional criteria to look out for. I’ll again update the initial post. Edit: Apparently, I cannot edit anymore; I will add the complete criteria list at the bottom of this post.
The difference between hardware and software calibration is well explained here (in German).
So I’ll add hardware calibrated to the list. As far as I understood, there are two possibilities to do the hardware calibration:
The second solution will probably not work under Linux, as the software is most probably only available for Windows and Mac.
The ideal monitor would have no difference in luminance/brightness across the whole screen, but of course, monitors are not made so perfectly. A maximum deviation of <5% is desirable.
There can also be colour temperature deviation. It is said that a difference of <2 ΔE is not really perceptible to human eyes.
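For reference, the simplest such metric is the CIE76 colour difference: just the Euclidean distance between two colours in Lab space (the two patch values below are made up for illustration):

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in Lab space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

center = (62.0, 10.0, -4.0)   # hypothetical screen-centre patch
corner = (61.2, 10.5, -3.1)   # hypothetical corner patch
print(f"dE76 = {delta_e76(center, corner):.2f}")  # under 2, so barely perceptible
```

Uniformity reports from calibration tools usually quote exactly this kind of number per screen region (newer formulae like ΔE2000 weight the terms perceptually, but the idea is the same).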
Thanks, will add that to the list.
How strict is the manufacturer’s monitor pixel policy?
I’d hope that for high quality monitors we’re looking at here, this won’t play such a big role, but I’ll add it to the list. What would be a reasonable threshold?
You may also check if the monitor is using PWM for backlight.
Okay, will add that as well.
it would be nice if someone makes a wiki-post
If I find the function to change my initial post to a wiki-post, I’ll do so. If not, maybe someone else can help out.
The primary benefit of 10-bit is knowing that any banding you see is in the image, not due to monitor limitations. If your camera raw has more than 8 bits (probably all modern cameras do), then there shouldn’t be banding in the image, and if you introduce banding in editing, it should be easy to spot by turning the module on/off.
Sounds logical to me.
So a 10-bit monitor is a luxury you probably don’t need.
Okay, will note that down.
I’m still looking for a good searchable/filterable database where I can input all the relevant criteria and get a list of matching monitors to compare. For example, https://www.newegg.com offers nearly no relevant criteria to filter for, https://www.bhphotovideo.com allows filtering by color gamut but not by color uniformity, and https://www.prad.de/ seems to have no filter function at all.
Any additional relevant information to choose a good monitor is welcome.
List of criteria (as I cannot edit my initial post anymore)
You might also want to look at the NEC Spectraview line. I had a 27" in daily use for 10 years, and it served me well. The calibrator died, but I was able to use a supported third party with no problem.
I’ve since moved on to a Dell Ultrasharp 31" 2560x1600 which I calibrate on linux using displaycal and the ColorSpyder 5 I bought when the NEC calibrator died. I’m happy with the size and resolution upgrade. I bought the Dell used locally, from a designer for $100 on CraigsList. I’ve had it for a year and a half, still works fine.
… external calibration device and software run on the computer
… will probably not work under Linux, as the software is most probably only available for Windows and Mac.
Luckily, it does work under Linux. Look for the DisplayCal software.
… the manufacturer’s monitor pixel policy?
I’d hope that for high quality monitors we’re looking at here, this won’t play such a big role, but I’ll add it to the list. What would be a reasonable threshold?
Unfortunately, even “high quality manufacturers” have problems making their monitors 100% free from errors. Threshold? To me, only 0 defects would be acceptable. Luckily, most manufacturers publish their policies (and most of these policies are, to me, not acceptable).
Have fun!
Claes in Lund, Sweden
there are two possibilities to do the hardware calibration:
Luckily, it does work under Linux. Look for the DisplayCal software.
DisplayCal does software calibration. For hardware calibration, I think Eizo has software for Linux; BenQ is Windows-only…
10-bit is supported under Linux/NVIDIA.
AFAIK only Photoshop, Krita, and vkdt can do 10-bit.