Manufacturer lens correction module for Darktable

Last weekend I spent some time writing a manufacturer lens correction module for Darktable. The idea is to take advantage of the fact that most cameras embed distortion, chromatic aberration, and vignetting correction data in their RAW files. By interpreting these we can correct lens issues without waiting for a lens/camera combination to first be supported by lensfun (something which can take months or even years, and whose results depend heavily on the quality of the calibration).

Currently the module understands Sony and Fuji (APS-C) correction data. The models themselves were all derived through a combination of reverse engineering and trial and error. As an example, here is an image taken with a Fuji 16mm F2.8 lens in Darktable.

Notice the extreme amount of vignetting and distortion. Enabling the correction module we find:

Which has straightened everything up and produces results reasonably similar to the out-of-camera JPEGs. (In case anyone is wondering, that is what -11.6% distortion looks like.)

The module is reasonably adept at handling edge cases including Sony full frame cameras in 1.5x crop mode and Fuji cameras with their 1.25x crop mode. For those who are interested the relevant PR can be found at although there is still quite a bit of work to do.


Nice work.

Am I correct in assuming that this only works for certain (recent) Sony and Fuji cameras and only with recent brand lenses? (and in those cases you absolutely need a correction module to get a good result given the distortion and vignetting in the first image above :wink: ).

This is nice!
Unfortunately, in the case of m43 Lumix cameras, for instance, the data is encoded and not easily accessible. But it’s there. It would be pretty nice to read the corrections straight from the embedded data.
Good luck!

IMHO this is really important given the recent tendency for lens manufacturers to include such corrections in their design trade-offs. I’ve been meaning to pick at Nikon’s Z metadata for this, but that requires an attention span I haven’t been able to corral recently… :woozy_face:

I’m not sure it’s a tendency I like. On the one hand, it should allow cheaper lenses with decent image quality after correction; on the other, such corrections (except for vignetting) need interpolation, with the associated loss of sharpness/detail. Not a problem for screen display, though.
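To make the interpolation point concrete, here is a minimal sketch of a generic radial distortion correction. This is an illustrative polynomial model with made-up coefficients, not Darktable’s or any manufacturer’s actual implementation: the key point is that each output pixel is resampled from a non-integer source position, which is where the loss of sharpness comes from.

```python
import numpy as np

def correct_distortion(img, k, cx=0.5, cy=0.5):
    """Warp an image using a hypothetical radial polynomial model.

    For each output pixel we evaluate a polynomial in the normalized
    radius to find the source position in the distorted input, then
    sample it with bilinear interpolation.  k = (k0, k1, k2) are
    illustrative coefficients; real cameras embed vendor-specific terms.
    """
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    # Normalize coordinates so the half-diagonal has radius 1.
    half_diag = 0.5 * np.hypot(w, h)
    dx = (xs - cx * w) / half_diag
    dy = (ys - cy * h) / half_diag
    r = np.hypot(dx, dy)
    # Source radius scale: r_src = r * (k0 + k1*r^2 + k2*r^4)
    k0, k1, k2 = k
    scale = k0 + k1 * r**2 + k2 * r**4
    sx = cx * w + dx * scale * half_diag
    sy = cy * h + dy * scale * half_diag
    # Bilinear sampling, clamped to the image border.
    x0 = np.clip(np.floor(sx).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(sy).astype(int), 0, h - 2)
    fx = np.clip(sx - x0, 0.0, 1.0)
    fy = np.clip(sy - y0, 0.0, 1.0)
    top = img[y0, x0] * (1 - fx) + img[y0, x0 + 1] * fx
    bot = img[y0 + 1, x0] * (1 - fx) + img[y0 + 1, x0 + 1] * fx
    return top * (1 - fy) + bot * fy
```

With k = (1, 0, 0) the mapping is the identity; any other coefficients force bilinear resampling, which softens fine detail slightly. Vignetting, by contrast, is a pure per-pixel gain and needs no resampling at all.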

And, of course, the required data are encoded in the makernote section of the exif metadata with no official documentation whatsoever…

There is also the LCP data from Adobe that could be used.

Yes, it only supports more recent first-party lenses. However, lenses which do not include this information typically cannot get away with excessive aberrations, or ordinary Lightroom users would complain too much. First-party lenses, however, can really go to town on aberrations, knowing that they’ll be auto-corrected upon import.

I must say that I am pretty satisfied with the current lens correction module in DT. As far as distortion correction is concerned, I get better results in DT than in the OOC JPG. The only thing I am missing is the ability to apply some attenuation to the amount of distortion correction, which would let me strike a good balance between the correction and IQ. However, I have already heard that it is too complex to implement. Just my two cents.

That is to be expected, as older cameras do not apply any lens correction (or store correction parameters). Until fairly recently, lenses were designed to be as close to perfect as technically possible within a given budget (that’s what drove the development/use of different kinds of materials and aspherical elements). So in-camera correction of the lens defects wasn’t a priority (and not really worth the extra costs, I suppose). Some prime lenses especially didn’t really need any correction in post (e.g. the Tamron 90mm macro).

Now some lenses are developed with known aberrations that can be corrected in software. This makes the lenses easier (cheaper) to design and cheaper to produce, but it imposes correction in software, even for OOC JPGs (see the images in @fdw’s first post). While the distortion might be acceptable for certain images, the vignetting rarely is… But of course, the correction parameters are not applied to raw files, nor are they in an easily accessible part of the metadata. Let’s not even start about third-party lenses.

@fdw: how does your new module handle third-party lenses which report an ID identical to one of the first-party lenses? Not sure that’s relevant right now, but I’m sure there will be a clash in the (near?) future (Murphy’s law).


I am shooting m43, where most lenses are designed in that manner. My camera applies all necessary corrections to the JPG. Distortion is obviously corrected as well, but the IQ at the edges is much more deteriorated compared to the same correction applied in DT on the raw file.

The module does not care what the lens reports itself as. So long as third party lenses transmit distortion correction data to the body (which then passes it into the RAW file), everything will work as expected. This is actually one of the nice things about using the embedded coefficients.

Here it is important to distinguish between the quality of the model which is embedded in the RAW files, and the means by which the camera uses the model. Cameras are battery powered devices which need to be able to develop 10~20 RAW files a second; this includes applying corrections. By comparison Darktable can sometimes take several seconds to process just a single image even on a fast machine with a 100 W power budget. In terms of what you can do algorithmically they are simply not in the same league.


This is really interesting stuff; I have already looked at your code in the dt git. You probably know that DNG files carry such correction parameters in the opcode list; those can already be read via exiv2 and could also be used by your code. The DNG specs 1.4 and 1.5 tell you what parameters you have.

The Leica Q2 example DNG here on this site certainly has these tags included.
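For reference, the DNG opcodes in question boil down to simple polynomial models. Here is a hedged sketch of how I read the spec (the radial part of WarpRectilinear and the FixVignetteRadial gain; tangential warp terms and the normalized-center bookkeeping are omitted for brevity, so check the DNG 1.4 specification before relying on this):

```python
def vignette_gain(r, k):
    """Gain for DNG's FixVignetteRadial opcode (as I read the spec).

    r is the normalized distance from the optical centre; k holds the
    five coefficients k0..k4.  The corrected pixel is pixel * gain,
    with gain = 1 + k0*r^2 + k1*r^4 + k2*r^6 + k3*r^8 + k4*r^10.
    """
    k0, k1, k2, k3, k4 = k
    r2 = r * r
    # Horner evaluation of the even polynomial in r.
    return 1.0 + r2 * (k0 + r2 * (k1 + r2 * (k2 + r2 * (k3 + r2 * k4))))

def warp_rectilinear_radial(r, kr):
    """Radial term of DNG's WarpRectilinear opcode (sketch).

    Maps a normalized destination radius r to a source radius via
    r_src = r * (kr0 + kr1*r^2 + kr2*r^4 + kr3*r^6).
    """
    kr0, kr1, kr2, kr3 = kr
    r2 = r * r
    return r * (kr0 + r2 * (kr1 + r2 * (kr2 + r2 * kr3)))
```

With kr = (1, 0, 0, 0) the warp is the identity, and at the optical centre (r = 0) the vignette gain is exactly 1, which is a handy sanity check when decoding real opcode parameter lists.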


@fdw I’m disappointed that you don’t acknowledge and didn’t say “Thank You” for my 40 hours of effort to modify libexiv2 to provide the metadata to enable this.


@fdw, good stuff, but many don’t have recent lenses, and lots of us have existing third party lenses. Wouldn’t there be much more reward for your effort if you leveraged LCP data?

Great job @clanmills and @fdw! This is really useful. Could it support Canon and Nikon too?

Perhaps this could be implemented as a second (or even first) tab in the current lens correction module?

Imho, there is no point in having two separate modules from a UI/UX standpoint.

@clanmills: thank you for that information, it puts @fdw’s contribution in a slightly different light…

This is already supported by the existing lens correction module (which is built on lensfun). Using lensfun-convert-lcp you can import a set of LCPs and use them with lensfun. That said, the lensfun documentation is somewhat disparaging about the quality of these profiles; see

Of course, for this to work seamlessly your lenses need to be correctly identified, which is very much a hit-and-miss affair with third-party lenses.

This is why exiv2 supports a user configuration file; with it you can make sure your lens is correctly identified.

I would be surprised if recent Canon and Nikon cameras (at least the RF and Z mount ones) do not embed correction data. However, it first needs to be found in the RAW files, and then the models need to be reverse engineered. ExifTool is usually quite good at identifying the tags, although there are still a huge number of (possibly encrypted) tags it does not yet understand. On the Canon side there are also pending issues around CR3 support which need to be solved first.

In terms of modules this should probably remain separate since the existing module has a dependency on lensfun whereas this one does not.
