That would also depend on the physical size of the monitor. You didn’t tell us the size of those 8K monitors (size in inches/feet, I mean), but that does play a role (as does the viewing distance).
And for reference, my not-so-recent camera produces images that are 6000 × 4000 pixels. That’s more than enough for a 60 × 40 cm print (which a non-photographer wouldn’t be looking at from 30 cm away).
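The arithmetic behind that claim, as a quick sketch (plain Python; the ~300 ppi figure is just the rule of thumb often quoted for prints examined up close):

```python
# Print resolution of a 6000 x 4000 px image printed at 60 x 40 cm.
CM_PER_INCH = 2.54

width_px, width_cm = 6000, 60.0
ppi = width_px / (width_cm / CM_PER_INCH)

print(f"{ppi:.0f} ppi")  # ~254 ppi: a little under the ~300 ppi rule of
                         # thumb for close inspection, and plenty at any
                         # normal viewing distance for a print this size
```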
32-inch Dell UltraSharp. Understood regarding size and viewing distance. My curiosity was purely academic: I just wondered if there was any benefit to having a monitor with a higher resolution than the photos you are editing. Talking purely about resolution, not brightness, contrast ratio, colour space coverage, price point or anything else. The display can’t add detail, so zooming past 100% can only degrade an image, can’t it, even if the degradation is so subtle that it is not noticed in any practical sense?
As I say, just an academic curiosity not born out of any real practical application on my part.
If I have a good photograph, by which I mean sharp and in focus, with a resolution of 1920 × 1080, it looks great viewed on my 40" 1080p display at a sensible and comfortable viewing distance. Will it look any better on a 40" higher-resolution display under the same viewing conditions? I don’t have one, so I can’t experiment, but I suspect it will look the same by any practical evaluation.
Some arithmetic is needed if “the same viewing conditions” includes the same viewing distance. There is a published graph of human visual acuity (contrast sensitivity against spatial frequency) which tells me that, with my poor vision, the higher-resolution display will look worse (less detailed).
Case in point: my monitor has a 0.265 mm pixel pitch, which I view from about 450 mm. The highest spatial frequency it can reproduce (one cycle needs two pixels) is 450 / (0.265 × 2) ≈ 849 cycles per radian ≈ 15 cycles per degree. From the graph, it would appear that this old man doesn’t have much contrast sensitivity left at that frequency, indicating that a 4K or 8K monitor at that viewing distance would, for me, be a complete waste of money.
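Spelled out as a quick sketch (plain Python; the pitch and distance are my numbers, so substitute your own):

```python
import math

pixel_pitch_mm = 0.265  # monitor pixel pitch
distance_mm = 450.0     # viewing distance

# One cycle (a light/dark line pair) needs two pixels, so the highest
# spatial frequency the display can reproduce is:
cycles_per_radian = distance_mm / (2 * pixel_pitch_mm)
cycles_per_degree = cycles_per_radian * math.pi / 180

print(f"{cycles_per_radian:.0f} cycles/radian")  # ~849
print(f"{cycles_per_degree:.1f} cycles/degree")  # ~14.8, i.e. roughly 15

# A same-size 4K panel halves the pitch, doubling the frequency to ~30
# cycles/degree -- well past where my contrast sensitivity has dropped off.
print(f"{2 * cycles_per_degree:.1f} cycles/degree for a same-size 4K panel")
```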
Does it actually work like that in practice though? I am not entirely sure it does. I think that an image is resampled when it is scaled, regardless of the scale factor.
Of course an image is resampled when it is scaled, but what we are discussing are specific scale factors, so it is not correct to say “regardless of the scale factor” unless you can prove otherwise.
In particular, the Nearest Neighbor zoom algorithm introduces no new colors for integer scale factors of 1×, 2×, 3×, 4×, etc.
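A minimal sketch of why (plain Python with NumPy, assuming it is available): at integer factors, nearest neighbor reduces to pixel replication, so the set of colors cannot change.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(4, 4, 3), dtype=np.uint8)  # tiny RGB "image"

def nn_upscale(a: np.ndarray, n: int) -> np.ndarray:
    """Nearest-neighbor zoom at an integer factor n: each source pixel
    becomes an n x n block of identical copies."""
    return np.repeat(np.repeat(a, n, axis=0), n, axis=1)

for n in (2, 3, 4):
    up = nn_upscale(img, n)
    src_colors = {tuple(p) for p in img.reshape(-1, 3)}
    up_colors = {tuple(p) for p in up.reshape(-1, 3)}
    print(n, up_colors == src_colors)  # True for every integer factor
```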
In my experience, it does produce unrealistic blockiness, though. Clusters of 4/9/16 pixels are much more square than single pixels or sensels. The result is more pixelated than the source data.
1080p content looks much more pixelated on a 4K display than on an equal-size 1080p screen. Also, subpixel font smoothing breaks.
So do I; this was the point of my initial post. When zooming in further than 100%, especially on a display with a much higher resolution than the photograph being edited, are there any practical issues that are clearly visible?
My question is academic anyway; it was just something that triggered my curiosity. I will leave it there, and thanks for all the replies. I am happy with 1080p; it’s good enough for me for now.
It depends on the resampling algorithm and the scale factor. Your “regardless of the scale factor” covers all scaling factors and is therefore incorrect, sorry.
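To make both dependencies concrete, a sketch (Python with a recent Pillow, assuming it is installed; the tiny random image is made up for illustration):

```python
import numpy as np
from PIL import Image

rng = np.random.default_rng(1)
src = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)  # tiny grayscale image
im = Image.fromarray(src)

def values(image: Image.Image) -> set:
    return set(image.getdata())

# Nearest neighbor at an integer factor: pure replication, no new values.
nn_2x = im.resize((8, 8), Image.Resampling.NEAREST)
print("NN 2x new values:", values(nn_2x) - values(im))     # empty set

# Nearest neighbor at a non-integer factor: still no new values, but rows
# and columns are duplicated unevenly, so the blockiness is irregular.
nn_15x = im.resize((6, 6), Image.Resampling.NEAREST)
print("NN 1.5x new values:", values(nn_15x) - values(im))  # empty set

# Bilinear at the same 2x factor: interpolation invents in-between values.
bl_2x = im.resize((8, 8), Image.Resampling.BILINEAR)
print("Bilinear 2x new values:", len(values(bl_2x) - values(im)))  # almost
                                                                   # always > 0
```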