Panasonic LX100 II RAW files too dark in Darktable

I recently bought a Panasonic LX100 II, and I was surprised that the RAW files decoded by Darktable are too dark. The standard base curve for this camera was probably changed compared to the one used for older sensors.
Based on my first estimates, normal RAW files are underexposed by 0.75 EV, and images with iDynamic set to Standard are underexposed by 1.2 EV.
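To put those offsets in linear terms, here is a quick check (plain arithmetic on the numbers above, nothing camera-specific):

```python
# An EV offset corresponds to a power-of-two multiplier in linear light.
for ev in (0.75, 1.2):
    print(f"{ev} EV underexposure -> linear multiplier {2 ** ev:.2f}x")
# 0.75 EV -> 1.68x, 1.2 EV -> 2.30x
```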
Can somebody create a new basecurve for this camera for the different iDynamic levels? I can help by providing color chart images.
It’s a bit annoying to begin editing every image by adjusting the exposure.

By the way, I created some styles for this camera to simulate some of the photo styles (Standard, Vivid, L.Monochrome D, Portrait and Natural) for iDynamic Off and Standard at ISO 200.
You can download them here: https://drive.google.com/open?id=1yzTEGjfgZLZjOwe4jbWjVoCwb4ob2_PM
(I’ll try to add the remaining styles too for this camera in the following days).

Darktable no longer ships camera-specific base curves. There is a base curve tool to help you generate your own, if you’d like: https://github.com/darktable-org/darktable/blob/master/tools/basecurve/README.md
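For anyone curious what such a tool does conceptually, here is a rough Python sketch of the idea, not darktable’s actual code: render the raw linearly, take the camera JPEG of the same scene, and fit a monotonic curve through corresponding pixel values.

```python
import numpy as np

def fit_base_curve(raw_linear, jpeg, n_knots=16):
    """Estimate a tone curve mapping linear raw values to JPEG values.

    raw_linear, jpeg: same-size grayscale arrays scaled to [0, 1].
    Returns knot positions and curve values for spline interpolation.
    """
    x, y = raw_linear.ravel(), jpeg.ravel()
    knots = np.linspace(0.0, 1.0, n_knots)
    curve = np.empty_like(knots)
    half_bin = 0.5 / (n_knots - 1)
    for i, k in enumerate(knots):
        mask = np.abs(x - k) <= half_bin
        # Median per bin is robust against noise and demosaic outliers.
        curve[i] = np.median(y[mask]) if mask.any() else k
    # Enforce monotonicity so the result behaves like a tone curve.
    return knots, np.maximum.accumulate(curve)
```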

If you’re constantly editing the exposure, create a preset for it and apply that preset at import or when you open the file.

You may be interested in the filmic module; I’m getting much better results with it than I did with the base curve.

Thanks for the tip! I was not aware of this base curve tool, only of darktable-chart for camera calibration.

My pleasure!

I have no idea what iDynamic is, but if it is anything like Nikon’s Active D-Lighting, I’d probably avoid using it, as it seems to mess with the raw data.

Hi @kbarni and welcome!

Instead of using the base curve module in dt, you could try the basic adjustments module.

Have fun!
Claes in Lund, Sweden

@paperdigits you are probably right. This feature does dynamic range compression, so it’s probably similar to Active D-Lighting.
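If that’s right, a toy model of what the camera does might look like this (purely illustrative numbers and curve shape, not Panasonic’s actual processing); it would explain why the JPGs look fine while the raws stay dark:

```python
import numpy as np

def simulate_dr_compression(scene_linear, protect_ev=1.2):
    """Toy model: underexpose to protect highlights, lift shadows in the JPEG."""
    raw = scene_linear / 2.0 ** protect_ev       # the raw file comes out dark
    # A simple power curve stands in for the in-camera shadow lift.
    jpeg = np.clip(raw, 0.0, 1.0) ** (1.0 / (1.0 + 0.5 * protect_ev))
    return raw, jpeg
```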

I went on a three-week trip this summer and left the camera on iDynamic Standard (the JPG files looked better). When I got home it was too late: I found myself with 1500 dark RAW files.

Eek! Well I hope you can make something nice out of them still!

On my last trip to Europe, I discovered with like two days left that I’d been shooting small JPGs the whole time, so at least you didn’t do that!


This isn’t a darktable reply, but I think the perspective is germane to your problem:

Batch processing raws has gotten a bit of denigration lately, but I think it’s extraordinarily useful in cases like this. When I wrote the command-line corollary to my rawproc program, I defaulted it to skip output images that had already been created. This has turned out to be a real boon: I can review the initial batch proofs, and if I don’t like the way some were processed, I just delete those JPEGs, change the batch command, and rerun it; only the missing JPEGs are re-processed from the raws!
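Not rawproc itself, but a minimal Python sketch of that skip-existing behavior under some assumptions: the folder names are made up, and the `darktable-cli input output` call stands in for whichever raw developer you actually run.

```python
from pathlib import Path
import subprocess

RAW_DIR = Path("raws")      # hypothetical input folder
OUT_DIR = Path("proofs")    # hypothetical output folder

OUT_DIR.mkdir(exist_ok=True)
for raw in sorted(RAW_DIR.glob("*.RW2")):
    out = OUT_DIR / raw.with_suffix(".jpg").name
    if out.exists():
        continue  # already proofed; delete the JPEG to force a re-render
    # Stand-in for your actual batch command (rawproc's CLI, darktable-cli, ...).
    subprocess.run(["darktable-cli", str(raw), str(out)], check=True)
```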

I’ve used this quite a bit recently, playing with the filmic curve and certain ETTR attempts. A lot easier than individually re-opening each image and working it by hand…

No, I’m not advocating rawproc here, but I am describing what I think would be more efficient behavior for any batch processor.