In the process of buying a new laptop, I’ve ranked an OLED screen as one of the most important attributes, as I perceive “real blacks” to be one of the gains OLEDs offer, besides a wider gamut and more vivid colors.
But I’ve now begun to wonder whether I’m fooling myself by putting such emphasis on this:
Have I ever looked at the LCD screen of my current or former PCs and thought, “Oh, I really wish I could get some true black in the pictures here!”?
Nope. Our fantastically adaptable, TI*-enhanced, brain-based vision sees black where it ought to see black rather than dark grey.
The reality is that our vision can, in an instant, transform the perception of an area even from white to black. (If that surprises you, look at a projection screen, likely the whitest surface in the room. Then project a text document from your PC onto that screen. What do you see? Black text on a white background. But we didn’t project any black, only white light – much more light than the (then perceived) white surface was receiving before – and voilà, in the areas of the font lines, formerly white, our brain now sees black.)
I asked my wife, who for some months has been using a PC with an OLED screen, whether she now sees more “black pictures” on screen. The answer was not what an OLED seller would prefer.
Of course, if we put an OLED and an LCD screen side by side, we will clearly see the same picture with blacker blacks on the OLED. But we don’t normally look at pictures this way.
And where will the images I, as a hobbyist, process on my PC later be seen? If at all, on paper with much less dynamic range, or on someone else’s screen, which for some time to come likely(?) won’t display such a broad contrast range and gamut.
If I just want to look at my images on my own OLED screen, I can of course take advantage of the improved ranges of OLEDs by, e.g., creating gradients where an LCD would show only a solid area, which is good.
But if I exploit the increased OLED range in my own processing in this way and then send the images for others to see on more limited screens, might an investment in OLED rather lead me to convey images that look worse to them than if I had stayed within the confines of the LCD range?
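The gradient-versus-solid-area point can be made concrete with a small sketch (my own illustration, not from this thread): several distinct steps of a subtle 10-bit gradient can collapse into a single 8-bit value, so a display or export pipeline limited to 8 bits shows a flat patch where a higher-bit-depth pipeline would show a smooth ramp.

```python
# Hypothetical illustration: quantizing a subtle 10-bit gradient to 8 bits.
def to_8bit(v10):
    """Quantize a 10-bit code value (0..1023) to the nearest 8-bit value (0..255)."""
    return round(v10 * 255 / 1023)

subtle_gradient = [508, 509, 510, 511]        # four distinct 10-bit steps
print([to_8bit(v) for v in subtle_gradient])  # → [127, 127, 127, 127]: one flat value
```

Four visibly distinct steps on a high-bit-depth display become a single solid value after quantization, which is exactly the banding/flattening risk when an edit made on a wider-range screen is exported for a narrower one.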
A few years back I bought a new Sony TV (not OLED), and there were three price points. The main difference, or selling point, between these was how black the blacks were. When viewed side by side the difference was apparent. I bought the most expensive one, with the deepest blacks. In retrospect I suckered myself into the most expensive TV: once they were no longer in a side-by-side comparison, I could not tell what I had paid 250% more for. But do listen to the advice of those who have bought the OLED screens.
Vision is a funny thing. We see not like a camera, but as a mammal: what matters are object properties: is this a shiny apple or a dull pebble? Is this dappled light on grass, or a sabre-tooth tiger, waiting to eat us?
This is probably at the core of your observation; our brains look at what is portrayed, not how it is displayed.
Not unlike how a good movie remains good, even if watched on a bad TV. By the same token, however, I do get an additional kick out of a fine rendition. The question (to me anyway) is: is this additional kick additive or orthogonal to my enjoyment of the movie? I think it is mostly orthogonal: the movie does not get better, so long as a minimum bar of quality is reached. On the other hand, there are some renditions where the medium is part of the message; IMAX and VR come to mind. These provide experiences that a mere TV cannot reproduce.
But most mundane photography (and office work) is probably just fine on an LCD. It is fine printed on even more limited paper, after all. But there are some special cases as well, which hit differently because they are SO BIG, such as Monet’s Water Lilies, or the building-scale panoramas of Yadegar Asisi (do go see them if you have the chance! They’re amazing!).
It makes me very happy to read this, and I’m so glad people realize it. Chasing numbers (nits, contrast ratios, etc.) is moot when our brains do gain regulation on the fly. It’s silly and it promotes a completely incorrect way of thinking. (I understand the monetary incentive for companies to keep playing the ‘big number = better’ game.)
Personally, I would take stability and predictability any day over ‘HDR’ or any other combination of letters.
That’s how I was suckered into the HiDPI fad in 2016 on a 13" laptop. Spent months trying to make everything work, then gave up and used it 2x downscaled.
I think that display makers come up with various fads from time to time (curved displays, HiDPI, wide gamut, HDR) that serve a niche purpose, so OS/software support fails to materialize or takes very long (often long enough for the original fad to fade, to be replaced by the next one). As far as I am concerned, give me a decent 99% sRGB IPS panel and I am happy.
Good point, and I’d like to explore the opposite approach to this problem:
In my opinion you rightly ask yourself: will changing display technology negatively affect my ability to properly edit my images? And in the process of answering this question, I’d like to explore the opposite supposition: “Will true black and a wider gamut help to properly edit pictures?”
In my opinion, viewing properly edited images on OLED can be more enjoyable, as colors and contrasts “pop” more. But as long as any display is calibrated and has a decent enough contrast ratio, so that it does not fool you color-wise and you can distinguish every gradient, your eyes and brain will automagically adjust/calibrate your vision to the limits of the screen/medium, and you won’t be limited by it.
If you aim for HDR edits, it’s another story though…
I would reverse the question: given any display or print technology, ask if great pictures have been seen before its introduction, and what it adds to them. If not much, then its contribution is marginal.
In particular, for wide gamut: yes, there is a wow factor when you see it for the first time, especially in comparison with previous tech. But if you just see it on its own, it is hard to say what the fuss is about.
Personally, I find the “every color should pop” aesthetic boring. It is great for advertising your new display, but tiresome and trite after a while.
I think that if a photo evokes emotions and tells a story, then it will do that on an old screen, or printed with a cheapo drugstore printer. If it doesn’t, then no amount of tech magic will save it.
Totally agree with that and your answer as a whole, but it does not tackle the part where you want to edit and be able to view all the data you are supposed to! That said, the data does not have to pop.
Many years ago I was trained by a graphic artist to use Photoshop, and it was interesting that when he used Levels he would not set the output to pure black or white as a photographer would; instead he raised the blacks a little, to a very dark grey, and brought the whites down to a little off-white. I believe his rationale was that the image would be printed in a magazine, where there would be no pure whites or blacks. Maybe we are best off editing on the same screen on which the image will be viewed, which becomes impossible if our images are intended for the internet. I notice that even on this forum, I do an edit I am happy with, but once I post the result it often has a dull appearance compared to my editing environment, and that is with the same screen.
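The output-levels technique described above amounts to a simple linear remap. A minimal sketch (the values 12 and 243 are my own illustrative choices, not the artist’s actual settings):

```python
def remap_output_levels(pixels, out_black=12, out_white=243):
    """Map 8-bit values linearly so 0 becomes out_black and 255 becomes out_white,
    keeping shadows and highlights inside what a magazine press can reproduce."""
    scale = (out_white - out_black) / 255
    return [round(p * scale + out_black) for p in pixels]

print(remap_output_levels([0, 128, 255]))  # → [12, 128, 243]
```

Nothing in the result reaches pure black (0) or pure white (255), so the press never has to render a tone it physically cannot.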
I noticed that as well. The rendition needs to fit the intended presentation; and quite often needs adjustments for different presentations. Prints need different editing, different print sizes have an influence, the papers matter, a forum presentation is different again… I most keenly feel this with our annual photo book, where I ideally should re-edit all the photos to follow a common style. But realistically I can only redo a few with the available time.
I understand, but in practice there is very little that is outside sRGB or Adobe RGB, so it is hard to find a practical application for photography.
If you are color-grading movies then it is another matter, but even then pure gamut is just one of your concerns; color accuracy and consistency are equally important. There are pro monitors which cover just 90% of P3, and they are fine for a lot of stuff.
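One rough way to sanity-check the “very little real-world content lies outside sRGB” claim on your own files is to count pixels pinned to the gamut boundary after conversion to sRGB. This is my own sketch, and channel clipping is only a proxy for out-of-gamut color (a pixel can clip for exposure reasons too):

```python
def clipped_fraction(pixels, lo=0, hi=255):
    """Fraction of (r, g, b) pixels with at least one channel pinned to the
    boundary; frequent clipping hints at colors the target space can't hold."""
    if not pixels:
        return 0.0
    clipped = sum(1 for (r, g, b) in pixels
                  if min(r, g, b) <= lo or max(r, g, b) >= hi)
    return clipped / len(pixels)

sample = [(0, 10, 10), (100, 100, 100), (255, 1, 1), (50, 60, 70)]
print(clipped_fraction(sample))  # → 0.5 (two of four pixels touch the boundary)
```

If the fraction stays near zero across your typical photos, the wider gamut is buying you very little in practice.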
Obviously not, for my part.
I’ve searched for something like that, to no avail – until now that I know it is there. (Just saw @kofa’s dark screenshot in another thread.)
Thanks!