Spectral film simulations from scratch

Very cool project! I haven't had time to explore it yet, but it looks fascinating.

For those who struggle to install it: nowadays you can use uv to manage and install Python programs.
With absolutely nothing installed on your system (not even Python), you just need to execute the following:

# ! you only need to execute this command the first time, to install uv !
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

cd path/to/agx-emulsion/download/dir

# ! you only need to execute this command the first time !
# this will take some time to run because it needs to cache all the dependencies
uv run --python 3.11 --with-requirements requirements.txt --no-project --with-editable . imageio_download_bin freeimage

# and this is the command you will call every time to launch the program
uv run --python 3.11 --with-requirements requirements.txt --no-project --with-editable . agx_emulsion/gui/main.py

The above works for Windows in PowerShell; on other systems you just need to adapt the uv install command by checking their manual: Installation | uv

And @arctic, to get rid of the annoying imageio download step you could use another image I/O library like OpenImageIO; it recently became available as a pip package.
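
In case it helps, a minimal sketch of reading an image into a float NumPy array with OpenImageIO's Python bindings (the file path is just a placeholder):

import OpenImageIO as oiio  # pip install OpenImageIO
buf = oiio.ImageBuf("photo.exr")        # placeholder path
pixels = buf.get_pixels(oiio.FLOAT)     # NumPy array of shape (height, width, channels)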

2 Likes

So I have it running in Windows 11 now - eventually realised all I had to do was install Anaconda then pretty much follow the instructions in the readme…

Me like it! Lots of learning to do to make the most of it, and I appreciate it’s still at the experimental stage, but I'm loving the results on some photos. Just shared one here: [Capture Challenge] Charge your battery and take some photos - #2913 by 123sg

1 Like

Thanks for the uv tip. On Debian I made a venv, pip-installed uv and then copy-pasted your uv commands. It worked!

1 Like

This looks really awesome! I just gave it a quick try, mostly with default settings (the number of options overwhelms me a bit :D) and the result looks really great. I’ll definitely play around with it a bit more when I have some time.

2 Likes

So awesome to see how far you have got by starting from first principles and the underlying chemical processes. Looks awesome so far, looking forward to peeking into the code and trying it some more!

What would it take to use images that exceed the sRGB limits? This repo is a treasure trove of test images, and most are linear BT.709-encoded OpenEXR files with negative values in some components. As far as I can tell, the program expects values that have been encoded with the sRGB inverse EOTF.
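
A rough workaround I can think of (just a sketch, assuming my guess about the sRGB expectation is right; the pixel values are made up) is to clip and encode the linear data before loading it:

import numpy as np
import colour  # colour-science
# example linear BT.709 pixel; note the negative, out-of-gamut component
rgb_linear = np.array([[[0.8, -0.02, 0.3]]])
# clip to [0, 1] and apply the sRGB inverse EOTF before feeding the image to the program
rgb_srgb = colour.cctf_encoding(np.clip(rgb_linear, 0.0, 1.0), function="sRGB")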

3 Likes

Been testing it and I feel like I’m looking at my scans! Takes a while to understand the knobs because I never developed colour myself.

I’m curious why my photos need huge exposure compensation, up towards -40 EV, to show anything?

I’m exporting from RawTherapee, and the imported files look very contrasty on import even though they are extremely flat in other viewers.

I’ve noticed that too, importing 16-bit TIFFs from darktable, but once I run the simulation they seem fine - the excess contrast disappears.

Mine don’t need that… interesting - maybe a colour profile issue? I am using the auto exposure though.

Thanks a lot for the instructions, I didn’t know about uv! I will definitely have a look at it.

Oh nice! Also great suggestion.

For now, what limits the input color space is the way I am converting RGB to spectral data at the very beginning of the pipeline. I am using colour.recovery.RGB_to_sd_Mallett2019, which is very convenient, robust and fast but works only for sRGB. A different spectral input conversion could allow wider gamuts. I have the feeling it would not change the results much, given the wide absorptions of the film layers. But of course we should experiment and verify.
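
For reference, recovering the spectrum of a single pixel looks roughly like this (a minimal sketch with a made-up RGB triplet, not the actual pipeline code):

import colour
# linear sRGB triplet; the Mallett (2019) basis functions are defined for sRGB
sd = colour.recovery.RGB_to_sd_Mallett2019([0.25, 0.40, 0.15])
print(sd.shape)  # spectral range and interval of the recovered distribution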

That sounds very strange, I’ve never used more than a few EV of compensation with the camera auto-exposure active. Could you share a .pp3 or a low-res file you use so I can reproduce it? Do you export 16-bit PNG and import using the file picker widget? Importing directly with napari, for example, might not work well and might convert to 8 bit.

That looks impressive! :slightly_smiling_face:

2 Likes

I did a couple of mini optimizations on the main branch.
Mainly, I increased the wavelength step of the spectral calculations from 5 nm to 10 nm, sacrificing a little accuracy for the sake of efficiency. I didn’t notice big changes, but the spectra (especially for filters and film/print absorptions) are now more coarsely sampled and look a bit ugly.

With these minimal changes I managed to process a 20-megapixel image on my laptop (32 GB of RAM): Kodak Gold 200 and Portra Endura, raw file from signatureedits.com.

I also have in mind a couple of major optimizations that could facilitate a translation to GPU (I think) and drastically reduce memory needs while keeping a 5 nm step. I will prototype them soon and update here.
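
For a rough sense of scale, a back-of-the-envelope estimate, assuming one float32 spectrum per pixel over a 380-780 nm range (the actual pipeline details differ):

pixels = 20e6                           # 20 megapixel image
samples_5nm = (780 - 380) // 5 + 1      # 81 spectral samples
samples_10nm = (780 - 380) // 10 + 1    # 41 spectral samples
print(pixels * samples_5nm * 4 / 1e9)   # ~6.5 GB for one float32 spectral buffer at 5 nm
print(pixels * samples_10nm * 4 / 1e9)  # ~3.3 GB at 10 nm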

6 Likes

WOW!
@arctic, I don’t know of any other effort to simulate film to this depth.
You basically mimic every physical step of film development, and as far as I can see it really pays off.
Preflashing and DIR simulation, a proper grain size distribution simulation?
I’m floored.
It is beyond comprehension how complete this simulation is.

Kudos.

I’ll look for where I have the Kodak stuff, where they describe cine film stock (before the Vision film stocks) from the '70s or '80s… somewhere on my hard drive.

my mind is properly blown.

EDIT: found it! From before ECN-2 developer chemistry. It even contains dye-ageing estimation plots…:smirk:

5 Likes

What a thorough approach!

Sadly, it segfaults in libpython3 on my machine - I need to debug it properly :frowning:

You might try upgrading all the packages from the requirements. It helped me get things running. I'm using PyCharm.

1 Like

Thank you for the kind words @PhotoPhysicsGuy.

Interesting! I’m becoming kind of a collector of technical documents from Kodak. It would be nice to have a look at them. My sources for technical documents have been these websites: Index of /docs/film, Photographic & Darkroom Products by Brand, Browse The Analog Film Stock Library | Filmtypes, https://analogfilm.space/.

I also noticed that older datasheets from Kodak tend to have better quality. Newer ones can have images copy-pasted from older ones, so I usually opted for the oldest when I could choose. :slight_smile:

Regarding grain, I am fitting the characteristic curves (D-logE) with three normal CDFs (one per sublayer). This is an OK minimal model if we assume a lognormally distributed area of the silver halide particles in every layer (which roughly holds, according to old references) and a sensitivity proportional to the area of the particles. So the multi-layer structure arises directly from the curves themselves. Here is an example plot of a fitted structure.


Then, adjusting a binomial distribution (for the probability of development) and a Poisson distribution (for the random position of the particles) for each layer, we can mock up a decent RMS granularity profile.
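
To illustrate the idea, a toy sketch (not the code in the repository; all numbers are made up):

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def density_curve(loge, d_max, *params):
    # characteristic curve as the sum of three normal CDFs, one per sublayer;
    # params = (mu1, sigma1, mu2, sigma2, mu3, sigma3)
    mus, sigmas = params[0::2], params[1::2]
    return sum(d_max / 3 * norm.cdf(loge, mu, sigma) for mu, sigma in zip(mus, sigmas))

# fit a measured D-logE curve (here synthetic data stands in for datasheet points)
log_exposure = np.linspace(-3, 1, 50)
density = density_curve(log_exposure, 2.2, -2.0, 0.5, -1.0, 0.6, 0.2, 0.8)
popt, _ = curve_fit(density_curve, log_exposure, density,
                    p0=[2.0, -2.0, 0.4, -1.0, 0.5, 0.0, 0.7])

# granularity: Poisson particle counts per pixel, binomial development of each particle
rng = np.random.default_rng(0)
n_particles = rng.poisson(lam=100, size=(256, 256))
developed = rng.binomial(n_particles, 0.3)  # development probability would depend on exposure
print(developed.std() / developed.mean())   # crude RMS granularity estimate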

An aspect that I was super surprised by is that film has embedded “chemical sharpening”. DIR couplers released in high-density areas diffuse in space (about 10-15 um) and produce local contrast with the surrounding lower-density parts of the image by inhibiting them. That sounds kind of crazy to me.
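
As a toy sketch of that effect (not the repository code; variable names and values are made up, using a Gaussian blur of the density as the diffused inhibitor):

import numpy as np
from scipy.ndimage import gaussian_filter

film_format_mm = 35.0   # long edge of the negative
image_width_px = 4000   # pixels along that edge
sigma_um = 12.0         # coupler diffusion distance, roughly 10-15 um
sigma_px = sigma_um / (film_format_mm * 1000 / image_width_px)  # um converted to pixels

rng = np.random.default_rng(1)
density = rng.random((512, 512))                       # stand-in for a developed-density layer
inhibitor = gaussian_filter(density, sigma=sigma_px)   # inhibitor released in dense areas, then diffused
strength = 0.5                                         # inhibition strength (made up)
sharpened = density - strength * inhibitor             # dense neighbourhoods suppress development nearby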

4 Likes

Oh, this is great!

Decent is a bit of an understatement here. I would qualify this as a very complex grain model. Could also be that I am unaware of other grain modeling efforts though.

Ahhh! That’s why some MTF plots for film show a transfer above 1 at higher frequencies. I always thought only stand-development-style local developer depletion could do this (like what is simulated in Filmulator), which of course isn’t possible in cine film development.

Sure! I’ll PM you.

2 Likes

I think that’s what filmulator implemented!

(As an aside, the chemical sharpening clearly operates on small areas; I believe that a part of “the medium format look” was the different size of this sharpening, relative to the negative size. It seems that some image editing programs still rely on pixel sizes in their sharpening algorithms, which exhibits a similar difference for high-megapixel images.)

3 Likes

I 100% agree with this.

1 Like

As far as I understood, DIR-coupler sharpening will happen in normally agitated development and will affect only a very short range, depending on the diffusion properties of the coupler molecules in the emulsion phase. A reasonable guess is 10-15 um, but I need a better reference for this. :grin:

I didn’t know about that, I should definitely dig more into the filmulator project.

I also agree here. In the simulation the diffusion parameter of the DIR couplers is in micrometers, so changing the size of the film negative (film_format_mm) will take this into account.

2 Likes

Cool, switching to compatible releases did the trick for me. Sent it as a PR.

2 Likes

Caution @arctic, I was told firmly by an admin a few days ago to stop quoting AI responses…

1 Like