Some things to check are:
- The same lens model name reported in darktable as in Lensfun.
- The same maker name.
- The same focal length reported in darktable as in Lensfun.
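If it helps, a quick way to verify the first and last points is to query the Lensfun database with the exact strings your files report. Here is a minimal sketch using the lensfunpy Python bindings; the maker/model strings are just examples, substitute whatever exiftool shows for your shots:

```
import lensfunpy

db = lensfunpy.Database()

# Example strings only; they must match what your EXIF actually says,
# otherwise darktable/Lensfun won't find the entry either.
cams = db.find_cameras('Nikon Corporation', 'NIKON Z 8')
if not cams:
    print('camera not found in the Lensfun database')
else:
    lenses = db.find_lenses(cams[0], None, 'Nikon Z 70-180mm f/2.8')
    if not lenses:
        print('lens not found in the Lensfun database')
    else:
        lens = lenses[0]
        # Compare these against the EXIF lens name and focal length of your files.
        print(lens.maker, lens.model, lens.min_focal, lens.max_focal)
```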
There is a known bug about this: 2255539 – ModuleNotFoundError: No module named 'lensfun'
Good points. I'll be reshooting the calibration shots. In the camera, I had mistakenly entered 1090 instead of 1000 mm for the non-CPU lens info. It does seem darktable is sensitive to lens or camera attributes. I've made some "progress" in getting corrections applied, but there seem to be a whole bunch of inconsistencies or unexplained behaviors in the lens correction module. E.g. if "only vignetting" is chosen for "corrections", "corrections applied" reports "none" even though some vignetting correction seems to be applied, and the histogram clearly changes shape. The f (stops) and d(istance) parameters in the module show a bunch of apertures and distances other than those in the lensfun database for the lens, which could simply be darktable inter- and/or extrapolating the given values. I need to start from a clean slate instead of trying to understand idiosyncrasies that are probably due to bad input data. I'm also slowly familiarizing myself with darktable's sqlite database.
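In case it helps with the database poking: darktable's library.db can be read directly with sqlite to see exactly which maker/model/lens strings it stored for each image. A minimal read-only sketch, assuming a default install location and the usual column names (close darktable first, and work on a copy if in doubt):

```
import os
import sqlite3

# Path and column names assume a default Linux install of darktable;
# adjust if your library.db lives elsewhere.
db_path = os.path.expanduser('~/.config/darktable/library.db')
con = sqlite3.connect(f'file:{db_path}?mode=ro', uri=True)
for maker, model, lens, focal in con.execute(
        'SELECT maker, model, lens, focal_length FROM images LIMIT 20'):
    print(maker, '|', model, '|', lens, '|', focal)
con.close()
```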
Then, there's some debate as to whether the camera (body) applies vignetting correction. Models other than the Z8 and Z9 would only apply that (and distortion and presumably TCA) to JPEGs, but not to raw. The interwebs speak of the Z8 and Z9 applying it to raw as well, and I checked that the camera had vignetting correction enabled (which I would not want for the calibration shots). I can't imagine it would apply corrections to a non-CPU lens, and checking the exif info doesn't show any vignetting correction applied to the raw images, fwiw.
Yeah, thanks for pointing that out. I had already noticed Arch and Gentoo discussions on a packaging error, as well as a fix in something like 0.3.4. The Fedora 39 rpm is ~0.3.3, but the version is not necessarily to blame, as Debian 12 also has ~0.3.3 and lensfun-update-data works fine on Debian. I had copied the xml updates from Debian to Fedora, as Debian has its own troubles with the Z8, notably in darktable.
I'm (slowly) getting there.
Hi,
For the moment I used the script for vignetting only.
I also had a problem with the script at the beginning, but after installing gnuplot everything ran fine. It could be useful to add gnuplot to the list of dependencies in this tutorial.
Thank you
We're measuring transverse (lateral) chromatic aberration, which doesn't change when opening/closing the aperture (longitudinal chromatic aberration does).
TCA shifts each colour component by a different radial distance on the focal plane; e.g. white dots look fine at the centre, but near the edges you'd see a red dot slightly closer to the centre and a blue dot slightly farther, as in the picture (ref):
It changes with focal length, though, so I wonder why it's not taken into account here. It means the effect will change at different focus distances, too.
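As a toy illustration of the geometry described above (this is not the Lensfun correction model, and the scaling factors are invented), transverse CA can be pictured as each colour channel getting a slightly different radial magnification on the focal plane:

```
# Toy model: red and blue are imaged with a slightly different radial
# magnification than green, so red lands slightly closer to the centre
# and blue slightly farther, as described above. k values are made up.
def tca_landing_points(x, y, cx, cy, k_red=0.9996, k_blue=1.0004):
    dx, dy = x - cx, y - cy
    red = (cx + k_red * dx, cy + k_red * dy)
    blue = (cx + k_blue * dx, cy + k_blue * dy)
    return red, blue

print(tca_landing_points(3000, 2000, 3000, 2000))  # at the centre: no shift
print(tca_landing_points(5800, 3800, 3000, 2000))  # near a corner: visible shift
```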
For people in Europe, here's a link to the infamous Amazon website:
In general, look for a white-coloured Plexiglas annotated WH10, which corresponds to 70% translucence; it's the equivalent of what you initially recommended.
I'm a little sceptical about the method, though, because the sky is never uniform. I did a test with a 35 mm f/2 lens when the sky looked uniform (smooth, overcast weather) and found a large offset in the typical "oval" shape.
I did another test in front of a maximized window on my monitor and found that one to be symmetrical, so I was relieved it wasn't an issue with the lens. Is the script really able to take the inevitable bias into account?
Since it requires a raw file, it's hard to take pictures at different positions and average them (if that's even enough).
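If someone does want to try averaging, the reading-and-averaging half is straightforward; it's writing the result back into a raw file the script will accept that stays awkward. A rough sketch, assuming rawpy and numpy are available:

```
import glob
import numpy as np
import rawpy

# Average the undemosaiced Bayer data of several sky shots to smooth out
# sky gradients. Writing the averaged mosaic back out as a NEF/DNG the
# calibration script can consume is not handled here.
files = sorted(glob.glob('sky_*.NEF'))
acc = None
for f in files:
    with rawpy.imread(f) as raw:
        data = raw.raw_image.astype(np.float64)
        acc = data if acc is None else acc + data
mean_mosaic = acc / len(files)
print(mean_mosaic.shape, float(mean_mosaic.mean()))
```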
@system Is there any updated version of the calibration scripts?
They're pretty old and some dependencies don't seem to be installable any more. For example, pip3 can't install python3-exiv2 because of missing files, at least on Windows, and on Linux I got other problems trying to install lens_calibration (on Manjaro/Arch).
I see there's a module exiv2 now for Python 3, but the structure doesn't seem to be compatible (pyexiv2.metadata => no exiv2.metadata equivalent).
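For reference, this is roughly how the two APIs compare; the python-exiv2 half mirrors the C++ Exiv2 API, but I haven't tested it, so treat it as a sketch rather than a drop-in fix for the script:

```
# Old API (pyexiv2 / py3exiv2), as the calibration script expects:
#   import pyexiv2
#   md = pyexiv2.ImageMetadata('shot.nef')
#   md.read()
#   focal = md['Exif.Photo.FocalLength'].value
#
# Newer python-exiv2 package ("exiv2" on PyPI), which follows the C++ API.
# Untested sketch.
import exiv2

img = exiv2.ImageFactory.open('shot.nef')
img.readMetadata()
for datum in img.exifData():
    if datum.key() == 'Exif.Photo.FocalLength':
        print(datum.key(), datum.toString())
```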
I am not aware of any updates to this script. Might be nice to roll it into a container or something, though, if it's not getting updated.
Hello @paperdigits
Just out of curiosity…
Does it make sense to calibrate a modern Z lens (the ones used for Nikon mirrorless: Z9, Z8, etc.) in order to add this data to the lensfun database?
As you know, in this regard, nowadays everything is done internally by the Nikon firmware when you take your pictures with a modern Z lens.
I am asking this because, for instance, for the Z 70-180mm f/2.8 it looks like there is nothing in the lensfun database.
However, since its distortions might already be corrected in the shots (and this data is available in the NEFs when you edit your pictures), this is not a big deal in the end.
Am I correct, or am I missing something?
If the correction data are baked into the exif data, but darktable's lens correction module can't use them, then you need to provide a lensfun profile …
Hello @MStraeten
Thanks for your reply!
In the past, as regards RawTherapee, I used to get the LCP profiles for my old Nikon lenses from the ones provided by Adobe (shipped in the Adobe DNG Converter folder).
At present, Adobe no longer ships these files for modern Nikon Z lenses, because this information is already present in the NEF files and, as soon as you work with them (e.g. with Lightroom or other software), you are already covered as regards lens distortions.
In all truth, I usually take only macro shots, with a tripod, in a laboratory (plant diseases), and do not need to correct any lens distortions.
exiftool has recently exposed the distortion and vignetting data for Nikon Z lenses. I've hand-extracted it and tested it on a few images, but I think there's still some data missing to make it work properly. I'm at the end of my understanding of things; someone else more familiar with lens correction and metadata would need to pick up the charge.
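If anyone wants to see what their exiftool version exposes on their own files, dumping everything as JSON and filtering by tag name avoids guessing the exact tag names (they vary between exiftool releases). A small sketch, assuming exiftool is on the PATH:

```
import json
import subprocess

# -j: JSON output, -G: include group names, -n: numeric (unconverted) values.
out = subprocess.run(['exiftool', '-j', '-G', '-n', 'shot.nef'],
                     capture_output=True, text=True, check=True)
tags = json.loads(out.stdout)[0]
for key, value in tags.items():
    if 'distortion' in key.lower() or 'vignett' in key.lower():
        print(key, '=', value)
```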
For now, yes, as darktable can't use the Nikon embedded metadata yet. @ggbutcher and a few others were working on this, but it isn't in the correction module (yet).
But also, if you create a lensfun profile for yourself, you've profiled your specific lens, and the corrections are potentially better than the embedded metadata.
Peter Wemmert shared an OVA file that can be imported into VirtualBox, with the necessary scripts and tools. I did the test with VirtualBox 7.1.2 and it was still working fine (I haven't run the scripts yet).
Thanks a lot to him for sharing this VM; it's very helpful!
(search for "ova")
After importing the file, don't forget to check the settings before starting the VM:
It's Ubuntu, but it will do for this purpose. If necessary, you can select your keyboard layout from the arrow at the top right, which gives access to the system settings. There are no updates available for that version, so disable the regular update check to avoid the popups.
You may want to install the VirtualBox Guest Additions to make your life easier. It went without any problem for me.
You probably also want to add a shared folder to access the raw files from your disk. To get permission to the mounted folder, add the user to the vboxsf group, as usual (the password is "abc123"). After that, rebooting is the easiest way for the OS to take that group into account:
sudo adduser ubuntu21 vboxsf
sudo shutdown -r now
I noticed that darktable didn't benefit from OpenCL, even though the libopcl2 package is installed. Maybe it's not critical for the calibration scripts.
I'm happy to help here. (I did a substantial part of the reverse engineering for the Fuji and Sony corrections some years ago.)
Do we have any idea what kind of model is being used?
Regards, Freddie.
It seems to be the Adobe model, but the scaling coefficient is not in the exposed metadata and I'm too stupid to divine it.
Do you know if Adobe's DNG converter makes use of these coefficients? If so, we can likely figure out the constant very easily.
Specifically, we take the coefficients in the .NEF and modify them (should be easy enough) to create a few thousand .NEF files. Then, we run each through the DNG converter and inspect the resulting correction parameters (which are documented and in the Adobe model as it is a DNG).
From there it is just a straightforward symbolic regression task to determine the mapping.
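Once a batch of perturbed NEFs has been pushed through the converter and the input/output coefficients collected (say into a two-column CSV, which is just my own convention here), the fitting step itself is tiny; a sketch:

```
import numpy as np

# pairs.csv: column 1 = coefficient written into the NEF,
#            column 2 = coefficient Adobe's converter wrote into the DNG.
nef_k, dng_k = np.loadtxt('pairs.csv', delimiter=',', unpack=True)

# Try low-degree polynomial fits; if the mapping is a simple rescaling,
# the degree-1 fit will show a near-zero intercept and a clean slope.
for degree in (1, 2, 3):
    coeffs = np.polyfit(nef_k, dng_k, degree)
    max_resid = np.max(np.abs(np.polyval(coeffs, nef_k) - dng_k))
    print(degree, coeffs, 'max residual:', max_resid)
```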
Regards, Freddie.
I have a couple of NEFs from which I generated DNGs to compare the parameters; beyond determining they were not the same, I didn't take it any further. Based on that simple inspection, it's not clear to me that Adobe DNG Converter is using them.
The easiest way to find out is to modify them. Sometimes exiftool lets you change parameters. Otherwise, you'll need to do it by hand by opening the NEF in a hex editor and manually tweaking them. (It should be possible to get exiftool to tell you where in the file these parameters are.) So long as there are no checksums, you can run the modified file through the converter and see what happens.
If the DNG coefficients change then we're in business. Personally, I'd be surprised if the converter doesn't use them, as it makes Adobe's life a lot easier. Otherwise, they need to acquire a copy of each lens and test it themselves to get the coefficients.
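For the hand-patching route, something like the following is enough once exiftool -v3 (or -htmlDump) has told you where the value sits in the file; the offset and the little-endian float format below are placeholders, not the real Nikon layout:

```
import shutil
import struct

OFFSET = 0x12345     # hypothetical file offset; take the real one from exiftool -v3
NEW_VALUE = 1.05     # hypothetical perturbed coefficient
FMT = '<f'           # assumed little-endian float; check the actual tag type

# Patch a copy, never the original, then run the copy through the DNG converter
# and compare the resulting correction opcodes (e.g. OpcodeList3) against the
# unmodified conversion.
shutil.copyfile('shot.nef', 'shot_patched.nef')
with open('shot_patched.nef', 'r+b') as f:
    f.seek(OFFSET)
    f.write(struct.pack(FMT, NEW_VALUE))
```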
Regards, Freddie.
Could you achieve the same thing for a zoom lens with multiple images shot at increasing focal lengths?