Leica SL3 sensor

I just read this elsewhere:

The SL3 adopts a 60MP BSI CMOS sensor, making it both the highest resolution SL camera yet and also the one most likely to play nicely with Leica’s M series rangefinder lenses (BSI sensors are better at receiving light from close-mounted lenses, as their photosensitive region is closer to the surface)

Doesn’t sound right to me.

I’m thinking that the ratio of sensitivities BSI vs. “Normal” doesn’t change with the lens distance from the sensor?

Like they are equally “better” at receiving light from far-mounted lenses, or equally better at receiving light from anywhere!

I read that comment, and it didn’t make sense to me. I suspect it was written by someone who didn’t understand what they were writing about, perhaps trying to summarise an accurate but more complicated description.

Maybe it refers to the angle at which the light hits the sensor: far-mounted lenses presumably send it in more directly, so it may reach deeper into the sensor. I am sure someone can confirm whether this is the case or not.

I think they’re referring to the smearing issue that can occur with wide angle rangefinder lenses. The light that hits the sensor comes in at an oblique angle and can activate multiple photosites if the sensor is too thick. This is a known issue with adapting rangefinder lenses to Sony cameras as the filter stack is pretty thick.


So let’s use a simple sine rule and the handy-dandy 30 vs. 60 degs, measuring the angle of incidence from the sensor surface.

At 30º incidence, both the BSI and the normal sensor get the same light: 50% of the incoming light from the lens.

At 60º incidence, both the BSI and the normal sensor get the same light, now 87% of the incoming light from the lens.

At 90º incidence (head-on), both the BSI and the normal sensor get the same light: 100% of the incoming light from the lens.

How does the angle of incidence affect the ratio of BSI to normal sensitivity, i.e. the amount of “betterness”?

by “light”, I mean photons/sec/unit area at a wavelength of good old 555nm.
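The geometric factor above can be sketched in a few lines; angles are measured from the sensor surface, matching the 30/60/90 examples, and the point is that this factor does not depend on the sensor architecture at all:

```python
import math

# Geometric ("sine rule") factor for the photon flux density landing on
# the sensor plane, with the angle measured from the sensor surface.
# It is identical for BSI and FSI: it depends only on the geometry of the
# incoming light, not on where the photosensitive layer sits.
def flux_factor(angle_from_surface_deg):
    return math.sin(math.radians(angle_from_surface_deg))

for angle in (30, 60, 90):
    print(f"{angle:2d} deg from the surface: "
          f"{flux_factor(angle):.0%} of the incoming flux")
```

Running it reproduces the 50% / 87% / 100% figures quoted above.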

The light has to traverse the metal layer in the FSI case. The greater the angle of incidence (measured from the normal), the longer the path across it, and the more light gets lost/absorbed there.

And by the by, this particular sensor has been used before.


I am talking about the ratio of sensitivity between BSI and FSI - is it dependent on the angle of incidence or not?

I just explained how it is, but here goes again: you lose even more light with a more oblique angle on FSI, so it is “less sensitive”, and this ratio then of course also changes with the angle.

We are talking in too general terminology. Also, we are both ignoring the effect of the microlenses. If only we could find comparative cross-sections and apply some actual math …

You can model this in as much detail as you want. Fundamentally, w/ FSI you have to traverse the circuit layer; w/ BSI you don’t. Let’s say the thickness of that layer is d, and the angle of incidence (from the normal) is \theta. Then, as a first estimate, your photons have to travel d/cos\theta through the circuit layer until they reach the silicon w/ FSI, while w/ BSI they don’t have this. The number of photons you lose in the circuit layer w/ FSI is proportional to the distance travelled, and hence grows as 1/cos\theta with the angle of incidence.

Edit: the microlenses should reduce/control the actual angle of incidence for the bulk of the photons so the effect is not as dramatic/direct, but it’s there.
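The first-order model above can be put in numbers. This is only a sketch: it assumes simple exponential (Beer–Lambert) absorption in the circuit layer, and the layer thickness and attenuation coefficient are made-up illustrative values, not real sensor specs:

```python
import math

# Toy model of the FSI penalty: photons cross a circuit layer of thickness
# D before reaching the silicon, travelling D / cos(theta) (theta measured
# from the sensor normal). BSI skips the layer entirely.
ALPHA = 0.3  # attenuation per micron -- illustrative value only
D = 1.0      # circuit-layer thickness in microns -- illustrative only

def fsi_transmission(theta_deg):
    path = D / math.cos(math.radians(theta_deg))
    return math.exp(-ALPHA * path)

def bsi_transmission(theta_deg):
    return 1.0  # no circuit layer to cross in this toy model

for theta in (0, 30, 60):
    ratio = bsi_transmission(theta) / fsi_transmission(theta)
    print(f"theta={theta:2d} deg: FSI transmission={fsi_transmission(theta):.3f}, "
          f"BSI/FSI ratio={ratio:.3f}")
```

The BSI/FSI ratio grows with the angle of incidence, which is the claim being made: the “betterness” is angle-dependent.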

Each sensel sits at the bottom of a small well. This leads to some shading for light at shallow angles. BSI sensels sit in shallower wells than FSI sensels, and experience less shading.

This is independent of the loss of light density due to the angle of incidence (cosine rule).

Some fixed-lens rangefinder cameras mitigate the shading effect with slightly offset microlenses.
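The well-shading effect can also be sketched with a toy geometric model: the wall of a well of depth h shadows a strip of the floor of width h·tan(theta). The well depths and sensel width below are illustrative numbers, not real sensor dimensions:

```python
import math

# Toy shading model: a sensel floor of width w at the bottom of a well of
# depth h. At incidence theta (from the normal), the wall shadows a strip
# of width h * tan(theta), so the lit fraction of the floor is roughly
# max(0, 1 - h * tan(theta) / w). All dimensions are illustrative only.
def lit_fraction(well_depth, well_width, theta_deg):
    shadow = well_depth * math.tan(math.radians(theta_deg))
    return max(0.0, 1.0 - shadow / well_width)

W = 4.0      # sensel width, microns (illustrative)
H_FSI = 3.0  # deeper well for FSI (illustrative)
H_BSI = 1.0  # shallower well for BSI (illustrative)

for theta in (0, 20, 40):
    print(f"theta={theta:2d} deg: FSI lit={lit_fraction(H_FSI, W, theta):.2f}, "
          f"BSI lit={lit_fraction(H_BSI, W, theta):.2f}")
```

Head-on, both sensels are fully lit; at shallower angles the deeper (FSI) well loses noticeably more of its floor to shadow.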

Thanks for the more detailed description. I found this in your first link:

I’ll be thinking some more about it. Later.