Developments in the near future - HDR

I see in the tech news that Apple is making their own LOFIC image sensor, OmniVision already has one, and it has been used by a phone maker.
Samsung, Sony, and others are also expected to release theirs in 2026-2028.
And they are heralded as a solution for automotive sensors too, since they offer more stops and can eliminate the infamous LED flicker (pulsed LEDs can fall between the short sub-exposures of conventional multi-exposure HDR, whereas a single long exposure catches them).
A LOFIC (lateral overflow integration capacitor) sensor is an image sensor with extended dynamic range that can do HDR in a single exposure: each pixel has a capacitor that catches the charge overflowing from the photodiode, which increases its range.
All of this means those sensors will become widely available soon, I would expect.
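To put rough numbers on the "more stops" claim: dynamic range is often quoted as log2(maximum signal / noise floor), so extra well capacity translates directly into extra stops. A back-of-the-envelope sketch in Python, where the capacities and noise figures are made-up illustrative values, not real datasheet numbers:

```python
import math

# Illustrative numbers only (my assumptions, not datasheet values):
read_noise_e = 2.0          # read noise in electrons
photodiode_fwc_e = 10_000   # full-well capacity of the photodiode alone
lofic_extra_e = 500_000     # extra charge the overflow capacitor can hold

def dynamic_range_stops(full_well_e, noise_e):
    """Dynamic range in stops = log2(max signal / noise floor)."""
    return math.log2(full_well_e / noise_e)

print(dynamic_range_stops(photodiode_fwc_e, read_noise_e))                  # ~12.3 stops
print(dynamic_range_stops(photodiode_fwc_e + lofic_extra_e, read_noise_e))  # ~18.0 stops
```

In a real sensor the overflow path is read out at a higher noise level, so the gain is smaller than this naive calculation suggests, but the principle is the same.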

Meanwhile, the PDF Association announced they will support JPEG XL to get HDR support in images.
Plus, you see more and more monitors with 10-bit color or more, and HDR.

I think I was right when I argued here in the forum that the future will move to higher bit depths and that software makers should prepare for it.

So what do you guys think? A fad (in terms of consumer hardware), or a steady movement towards a new standard?
Or as so often: No comment :slight_smile:

I really wonder if classic JPEG will finally be retired though. It’s stuck at 24-bit (8 bits per channel) and is a bit dated, and yet all the expensive cameras, phones, et cetera still overwhelmingly use JPEG, with RAW as the second option. But RAW you need to process yourself, and that’s not convenient for consumers.
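To illustrate why 8 bits per channel gets tight once you have HDR-capable displays, here is a small numpy sketch (using a plain linear quantization just for illustration):

```python
import numpy as np

# A smooth luminance ramp, e.g. a sky gradient, as linear floats in [0, 1].
ramp = np.linspace(0.0, 1.0, 100_000)

# Quantize to 8 bits per channel (classic JPEG) and 16 bits (e.g. PNG, JPEG XL).
q8 = np.round(ramp * 255).astype(np.uint8)
q16 = np.round(ramp * 65535).astype(np.uint16)

print(len(np.unique(q8)))   # 256 levels -> visible banding on a wide ramp
print(len(np.unique(q16)))  # 65536 levels -> smooth
```

256 levels already band on a wide SDR gradient; stretch those same levels over the larger brightness range of an HDR display and the steps only get bigger.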

Luckily, G'MIC has its internal 32-bit float engine; that was a good move.
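The benefit shows up as soon as you chain operations: an 8-bit integer pipeline clips and rounds at every intermediate step, while a float pipeline preserves everything until the final export. A small numpy sketch of the principle (not G'MIC's actual code, just the idea):

```python
import numpy as np

pixel = np.array([200], dtype=np.uint8)

# 8-bit pipeline: brighten by 2x, then darken by 2x.
bright8 = np.clip(pixel.astype(np.int32) * 2, 0, 255).astype(np.uint8)  # 400 -> clipped to 255
back8 = bright8 // 2                                                    # 127: highlight detail lost

# Float pipeline: same operations, no clipping until the end.
pixel_f = pixel.astype(np.float32)
back_f = (pixel_f * 2.0) / 2.0                                          # 200.0: fully recovered

print(back8[0], back_f[0])  # 127 200.0
```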

I was reluctant to buy a 4K TV when they first hit the market because nobody was broadcasting in 4K. Time has proven me wrong, and I am so glad I have a 4K TV. I too hope that we will move away from 8-bit JPEGs to JPEG XL. Probably not a fad, but something that supports improved technology. JPEG was originally designed so photojournalists could transmit images back to the editorial office over very slow connections (well, that is the story I was told). The internet is now so much quicker.

Desktop displays are slowly transitioning towards wide-gamut, high-DPI, HDR technology. But unless you’re using a MacBook, they are still relatively rare. Smartphones and TVs, meanwhile, have mostly migrated over to these technologies.

Windows now has passable HDR support and good high-DPI support, but wide-gamut still looks garish. macOS can do all three, but HDR with third-party screens is a bit hit-and-miss. Wayland HDR and color management is slooowly rolling out as well.

I’d say we still have a good way to go until wide-gamut, high-DPI, HDR can be relied upon on the desktop.

Thank you for your comment, bastibe, very informative to hear how things are going across the various platforms.

And to Terry: I think that on paper JPEG XL sounds pretty good, but I hear the implementations are unstable, and perhaps that makes it too late now for it to gain traction.
What I see is that websites first went WebP and have now all switched to AVIF images. So I think at this point AVIF is going to be the winner, but as I said, they switched from WebP relatively quickly, so perhaps they will move to another thing pretty quickly too.
I myself still like PNG for personal use because it doesn’t have the awful compression artifacts and it is compatible with all the (often dated) software I like to use. I am considering using something else more often, though.
And that is also the issue with higher bit depths, of course: a format like PNG supports 16 bits per channel, but a lot of the software I use doesn’t like that at all and goes bonkers if I try to load such a file, showing black or distorted images, or even getting buggy.
And most software is oriented towards 8 bits per plane; I have to accept that reality.
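That "black or distorted" symptom is exactly what you would expect when 8-bit-oriented code misreads 16-bit samples. A numpy sketch of one such failure mode (assuming big-endian 16-bit data, which is how PNG stores it):

```python
import numpy as np

# A mid-grey 16-bit image (value 32768 out of 65535), stored big-endian
# as PNG does: each sample becomes the byte pair (0x80, 0x00).
img16 = np.full((4, 4), 32768, dtype='>u2')
raw = img16.tobytes()

# A reader that assumes one byte per sample sees alternating 0x80 and
# 0x00 bytes: a striped, half-black mess. A dark image (values < 256)
# would have 0x00 high bytes everywhere and render mostly black.
wrong = np.frombuffer(raw, dtype=np.uint8)
print(wrong[:8])  # [128   0 128   0 128   0 128   0]
```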
I incidentally read that AVIF supports a thing called ‘gain maps’, where the base image is 8-bit and the extra dynamic range is added separately for software that can use it, which sounds like a good fix, except Wikipedia says there is no encoder that uses that capability… from which I conclude it’s unlikely to be accepted as input by any software either.
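For what it’s worth, the gain-map idea itself is simple: ship an SDR base image plus a single-channel map of per-pixel log2 boost factors, and let HDR-aware viewers multiply them back together. A simplified numpy sketch of the reconstruction step (my simplification of the concept, not the exact spec math, which adds offsets and per-channel parameters):

```python
import numpy as np

def apply_gain_map(sdr_linear, gain_map, headroom_stops=3.0):
    """Reconstruct an HDR image from an SDR base plus a gain map.

    sdr_linear:     SDR base image in linear light, floats in [0, 1]
    gain_map:       per-pixel values in [0, 1]; 1.0 = full boost
    headroom_stops: how many extra stops the target display can show
    """
    return sdr_linear * np.exp2(gain_map * headroom_stops)

sdr = np.array([[0.5, 0.9]])
gain = np.array([[0.0, 1.0]])     # left pixel unchanged, right pixel boosted
print(apply_gain_map(sdr, gain))  # [[0.5  7.2]]
```

An SDR-only viewer simply ignores the map and shows the base image, which is what makes the scheme backwards-compatible.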
I think that’s generally a big issue: the switchover to higher bit depth/HDR, if there is one, is hard. You basically have to run two systems internally if you want to create software that is universal.

Wondering if this is the modulo camera that was promised over a decade ago. :slight_smile: