Direct reading of an XMP file

I wonder if there has been any work on making an XMP file easily and directly readable, slider percentages and all?
It would make comparing versions of the same image quite simple and avoid having to open an image and hunt for changes.

I should have said a converter through which the XMP could be run.

To the best of my knowledge, no.

I know of no work that “develops” an XMP, but XMP files are best viewed with an app such as CoffeeCup that properly opens XML files. That retains the hierarchical formatting instead of opening them as jumbled text in Notepad.

XML is not whitespace-sensitive, so it doesn’t matter whether it’s all on one line or pretty-printed; it’s interpreted the same.

XML is designed to be human-readable, and I do not find the darktable XMP files a jumbled mess. Perhaps they appear that way under Windows, which uses CR-LF pairs as line separators. Linux just uses LF characters.

But just showing the XML in a nice layout isn’t enough for what OP wants, you’ll also have to decode the “darktable:params” and “darktable:blendop_params” attributes. And that’s a different proposition.

The code to read and decode the xmp files should already exist (after all, dt does read its own xmp).

From there to finding someone who is interested enough (and able) to make the editing info into a readable format is another matter. The mask parts might be a bit tricky as well.

1 Like

Yes, I agree that the masks would be impossible, but the ability to directly read the slider positions and general settings would, IMO, be quite useful.

I would find this useful too

I thought that by “read” he meant viewed on a monitor screen.

The module data in the darktable xmp files are just binary blobs of each module’s parameters and can really only be read and written with darktable itself. I would imagine that making all that information human-readable might be quite a lot of work, and of course it would still have to be backward-compatible with older xmp files.

This is a fairly hacky approach, but it’s possible to read the param values from the XML file by unpacking the binary data, provided you know the expected struct format.

As a simple example, if you look at the source for the temperature module, you can find the definition for the dt_iop_temperature_params_t (link). This module simply stores four float values in the param struct.

Finding the temperature operation element in the XML file for my test image, the param value is f3b420400000803f6ae5d43f0000807f. This can be unpacked in a Python interpreter using the struct module (docs).

>>> from struct import *
>>> # no byte-order prefix means native order; '<4f' would make little-endian explicit
>>> unpack('ffff', bytearray.fromhex('f3b420400000803f6ae5d43f0000807f'))
(2.5110442638397217, 1.0, 1.6632511615753174, inf)

These match the expected coefficients I see in the darktable UI for the white balance module.

To write values, you can use:

>>> pack('ffff', 1.0, 2.0, 1.2, float('inf')).hex()
'0000803f000000409a99993f0000807f'

Pasting this value back into the XML file should update it. (When I tried, I had to reimport the sidecar. Also, it’s helpful to pay attention to the state of the history stack; I made sure it was fully compressed before doing this.)

I tried decoding a couple of modules, and this approach seemed reliable. With the more complex modules, the number of params can be significantly higher, and some modules base64-encode the params rather than simply writing hex strings. I’m not sure if this is per-module or depends on the length of the param values. Additionally, I didn’t dig into the mask or blendop values, but I’d imagine they’re stored similarly.
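
For what it’s worth, my guess from skimming darktable’s exif code is that the longer blobs are zlib-compressed and written as “gz” plus a two-digit size factor followed by base64, while the short ones stay plain hex. Treat that layout as an assumption rather than a fact, but a small decoder that handles both could look roughly like this:

import base64
import zlib

def decode_params(value):
    """Return the raw binary blob for a darktable:params attribute value."""
    if value.startswith("gz"):
        # assumed layout: "gz" marker, two digits of size factor, then
        # base64 of zlib-compressed data; pad defensively before decoding
        payload = value[4:]
        payload += "=" * (-len(payload) % 4)
        return zlib.decompress(base64.b64decode(payload))
    # short params appear to be written as plain hex strings
    return bytes.fromhex(value)

# decode_params('f3b420400000803f6ae5d43f0000807f') gives the same 16 bytes as above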

With a little manual work to grab the param struct formats, it should be possible to turn this into a simple Python script that could read/write specific module values.
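
For the reading side, I imagine something like the sketch below. The namespace URI and the way the history shows up as rdf:li items carrying darktable:operation / darktable:params attributes match the sidecars on my machine, but double-check against yours; the filename is just a placeholder, and it assumes plain hex params (base64/compressed ones would need the decoding above):

import struct
import xml.etree.ElementTree as ET

DT = "http://darktable.sf.net/"
RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"

def read_module_params(xmp_path, operation, fmt):
    """Unpack the params of the last history entry for the given operation."""
    tree = ET.parse(xmp_path)
    result = None
    for li in tree.iter(f"{{{RDF}}}li"):
        if li.get(f"{{{DT}}}operation") == operation:
            blob = bytes.fromhex(li.get(f"{{{DT}}}params"))
            result = struct.unpack(fmt, blob)  # later history entries win
    return result

# 'ffff' matches the four floats of dt_iop_temperature_params_t from the example above
print(read_module_params("IMG_0001.CR2.xmp", "temperature", "ffff"))

Writing would be the mirror image: pack() the new values, set the attribute again, and write the tree back out (with the reimport caveat mentioned above).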

1 Like

This is great! I have a rough idea of an app I could write if this library existed

… but someone decided a long time ago that writing binary blobs is an “awesome idea” – I’d rather call it lazy hack coding – instead of sticking with the idea of XML and writing human-readable clear-text values. Values that any XML library could read and write in a breeze. But that is one of the disadvantages of darktable: it has a very distinct “remote island design” when it comes to the XMP files.

PS: Please refrain from the not-helpful “it’s open source, just improve it” comments. Technical debt that is so deeply ingrained into an app is not something you can just simply improve.

1 Like

The XML is human-readable, and any XML library can read the values with no problem.
That some fields cannot be understood (because they are binary blobs, or for whatever other reason) doesn’t change that. It’s the difference between syntax and semantics. XML never guarantees that data read from an XML stream is usable by the reader…

And those binary blobs in the XML file are “human readable clear text values”: encoded either as base64 or as hexadecimal numbers.

Similarly, if I receive a letter in Italian (or Chinese), I can open the envelope and extract the data (the letter); that doesn’t mean I understand any of the information.

1 Like

Is that just nitpicking for nitpicking’s sake or did you really not understand what I was trying to say? If the latter is the case please let us know so I can try to word it better for your spectrum of textual reception and comprehension.

1 Like

Please don’t fight.
I understand @grubernd’s frustration, and I agree that the binary blobs in darktable’s XML (xmp) are not human-readable. However, even if they weren’t binary and had readable names and values, they might still not be easy to interpret, since what each parameter does and what its value means only make sense in the context of the algorithm using them.

I don’t see a fight; we might just have a text-based misunderstanding.

Anyway … since the XMP cannot be read or written without all the darktable structs, it is impossible to mass-edit single tool parameters in it. And since the GUI copy/paste does not support this either, some edits are effectively impossible without destroying other parts of the edit.

And there are a lot of tools that might need a later change of a single parameter, so one has to be VERY cautious and conscious about tools and their interactions when editing a large group of images.

In other editors you can simply run a search and replace in a text editor, or a sed on the command line, and change hundreds or thousands of image settings in a breeze. Going the Lua route in darktable is probably possible, but rather a far stretch for something so simple.
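
Just to show the size of the gap: with @Bordwall’s approach, the darktable equivalent of that one-line sed means unpacking and repacking the blob for every file, roughly like the sketch below. It reuses the assumptions from the earlier posts (hex-encoded params, the 'ffff' layout of the temperature module, the darktable namespace); ElementTree may rewrite namespace prefixes it doesn’t know about, and you absolutely want backups and a reimport afterwards:

import glob
import struct
import xml.etree.ElementTree as ET

DT = "http://darktable.sf.net/"
RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
ET.register_namespace("darktable", DT)  # try to keep the original prefixes on write
ET.register_namespace("rdf", RDF)

for path in glob.glob("*.xmp"):
    tree = ET.parse(path)
    changed = False
    for li in tree.iter(f"{{{RDF}}}li"):
        if li.get(f"{{{DT}}}operation") != "temperature":
            continue
        vals = list(struct.unpack("ffff", bytes.fromhex(li.get(f"{{{DT}}}params"))))
        vals[1] = 1.0  # hypothetical tweak: pin the second coefficient to 1.0
        li.set(f"{{{DT}}}params", struct.pack("ffff", *vals).hex())
        changed = True
    if changed:
        tree.write(path, encoding="UTF-8", xml_declaration=True)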

1 Like

When your rhetoric is pedantic, you get pedantic rhetoric in return. The problem isn’t the XML, which often contains all sorts of shit, but rather the design choice of using binary blobs.

Instead of rehashing the same tired and condescendingly worded argument you’ve made multiple times (and it always goes nowhere), why not try to add something positive to what @Bordwall has posted?

1 Like

One could do that, but there are a few principal downsides or bits of “heavy stuff”:

  1. You’d have to make sure to keep following all dt updates as far as modules are involved.
  2. The blob mechanism allows us dt devs to keep old edits working perfectly in almost all cases. If not, we would consider that to be a dt bug.
  3. Masking stuff “translated” to human readability would certainly run into “darkness of understanding”.
  4. Any idea about raster masks? They might depend on many modules.

The argument “using blobs has been a bad design decision” is, from my understanding, just “not right”. By using those blobs we can expand/modify a module’s code quite easily. If everything had to be readable/writable “as text”, the modules’ code would certainly explode and maintenance would become a nightmare.

dt has simply not been designed as a scripting tool besides the rudimentary ‘cli’ interface.

BUT - feel free to enter the dt ship and start working on that; I don’t think anyone will block code that adds functionality.

2 Likes

Sorry for not being more clear: I didn’t mean that they should contribute code (though that is welcome), but rather add an idea, a positive comment, or really anything besides the same negative, thread-derailing rant that we have seen a number of times.

I agree that maintaining an external library has its drawbacks, but if you don’t want to modify dt code and still want to manipulate the XML file, then it’s not a bad solution, I think. Yes, each version of the conversion tool will be dependent on a specific dt version, but it is what it is :joy:

1 Like