Spectral film simulations from scratch

uv 0.7.1 (90f46f89a 2025-04-30)

Running in PowerShell btw

It looks like you’re running uv inside what may be a conda environment; did you install uv from conda?

1 Like

Fascinating project! I wanted to dip my toes into the emulsion as well but am having trouble installing.

Running Fedora 41 inside distrobox with the dependencies installed (as far as I could tell): python, git, gcc.
Trying `uvx --from git+https://github.com/andreavolpato/agx-emulsion.git agx-emulsion` fails because it apparently can’t build vispy (output)

Using pip yields a seemingly similar issue (output)

Can someone give me a hint how to further troubleshoot or what I am missing?

You could maybe use the info here from Steps 1 and 2… the third step is specific to an implementation in ART, but the first two might help…

https://art.pixls.us/AgXEmulsionLutHowto

I installed on F41 (though the real thing, without distrobox) using the pip and conda methods, both worked.

1 Like

I really don’t like this software. /sarcasm Because I finally got a look I want out of darktable (after years) and on the same day I find this… insane insane insane film emulation wow. I can’t wait for more man!

In case someone searches for this: I solved it. Apparently I was missing python3-devel and PyQt5 as dependencies. To run this inside a Fedora 41 distrobox I installed git, gcc, python, uv, python3-devel, and PyQt5, and was then able to install agx-emulsion via the “pip route” as well as via uv.
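For anyone who hits the same vispy build failure, here is a quick sanity check (generic Python, nothing specific to agx-emulsion) for the two pieces that were missing in my case: the CPython headers that gcc needs to compile wheels from source, and an importable PyQt5.

```python
import sysconfig
from pathlib import Path

# Without the python3-devel package, Python.h is missing and source
# builds of packages with C extensions (like vispy) fail at compile time.
include_dir = Path(sysconfig.get_paths()["include"])
have_headers = (include_dir / "Python.h").exists()
print("Python headers present:", have_headers)

# Napari needs a Qt binding; check that PyQt5 is importable.
try:
    import PyQt5  # noqa: F401
    have_pyqt5 = True
except ImportError:
    have_pyqt5 = False
print("PyQt5 importable:", have_pyqt5)
```

If either line prints `False`, install the corresponding distro package before retrying the uv/pip install.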

This is way too much fun. Awesome and utterly fascinating work @arctic!

7 Likes

When I did all the steps in the documentation on the ART page, it says the LUT is invalid.
I managed to launch Napari, but nothing seems to work with the test image.
I hope someone can help!

Take a look at this image; when I change the settings, nothing seems to work!

Look at the message I get when I turn on the film simulation, and the same message in the color correction tab.

Hi,
If the standalone app doesn’t work, it means something went wrong somewhere in the installation. Until you figure that out, there’s nothing that can be done on the ART side, I’m afraid, sorry.

1 Like

Can’t speak to ART, but in Napari you probably have to scroll down in the right panel and “run” the emulation!

1 Like

This! On my Win 11 machine I also have to do a bit of drag-and-drop module rearranging to get the run button visible.

1 Like

I’ll try that, thanks


I managed to try an image of mine, and I have to say I have never seen anything that comes even close to this software… just mind-blowing. The only downsides are the processing power it requires and the general difficulty of use; other than that, it’s fascinating.
A major thanks to @arctic.

6 Likes

Hello everyone, I was wondering: is it possible to use the software only for the grain emulation? I’ve been tinkering with the settings and couldn’t find a way to disable the “color profile”.

1 Like

I stumbled across this project and have to give kudos to @arctic for making it happen. I’ve been shooting film for the past 3 years, and none of the plugins come this close to the film look (Yedlin is also pretty close, but his code is not publicly available, and you need Nuke to run it).

A small tip I’d like to share: adding a narrow black frame in Darktable acts like a film rebate, which should be the darkest point during inversion. Also, if Napari’s background is set to white, it makes it easier to evaluate contrast and white balance.

Would it be too much of a hassle to include Agfa films? I love the look they had, muted colors with dense primaries, too bad they stopped production (NC500 should be similar, but it’s too grainy).

Thanks again!

AGFA.F-AF-E5.pdf (163.7 KB)

2 Likes

Hi Andrea (arctic),

I want to start by thanking you for creating this project. While my Python is extremely rusty (the last time I touched it, Barack Obama was still in office), I feel that you have created an extremely close and elegant emulation of the real thing. The closest thing I tested that gave somewhat similar results was Filmulator.

I have compared digitizations of Fuji 400 (X-Tra) to the same frames taken on my D810 at the time. While I will not post the images here (portraits of a close friend), I can say that running the ART integration of agx-emulsion and starting from the default values gets very close. To add some flavor, I compared both the fancy Noritsu scans (minilab) and my DSLR digitizations. While the colors obviously differed a little (proprietary profiles and weird piece-wise contrast curves), the ballpark was the same; the look and feel was there.

As a side-note, I also compared Portra 400 prints on matte paper to results from the emulation and they were really close, granted I didn’t have the exact frame for reference, so I could only trust my eyes.

I have scans of ColorChecker SG and Passport targets at different exposure levels, under D50, StdA, and flash illumination, if you feel they might be useful for a few stocks: Portra 400, Fuji 400, Pro Image 100, and Ektar 100.

Thank you again for the work done on this project; I look forward to further developments of the code.
Best Regards,
John

5 Likes

Does anyone have any tips for verifying that the Napari-launched version on macOS is running correctly? I’ve gotten familiar with it over the past couple of weeks, and coming from darkroom printing it’s pretty intuitive, but I’ve had the nagging feeling it’s not installed correctly.

I downloaded one of the example images @arctic used and matched it to his Darktable output, but when I use the same settings in agx it seems impossible to match the example output.

A screenshot dump of my settings and comparisons below.

As for the settings used in Darktable: I went through this and the other thread to find as much info as I could on the settings used for the initial raw conversion export. I’ve tried many combinations of color profiles aside from these, and I still end up with a result that doesn’t match. An sRGB-processed file in Photoshop matches the example JPEG downloaded from the post.

The settings in agx that give me pause are the ‘cctf de/encoding’ ones, though no amount of check/uncheck combinations gets me closer.

I’m pretty stumped! The resulting contrast and clipping are a big jump from what I think I should be expecting.

Is it possible that loading the app straight from GitHub could be the source? I’ve only had a few hours to try to figure out the CUDA approach, but I’m so out of my wheelhouse that I consider myself lucky I got the terminal aspect to actually load. I would try it with the ART program, but again, I have no idea how to even install it on a Mac.

Apologies for this wall of text!

Did you import the JPG into agx-emulsion? You set the input colour space to bt2020, but the screenshot looks like it’s actually sRGB. Not sure about the cctf (you should check that box if the input is in fact an sRGB JPG, but is it?).
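For reference, the ‘cctf decoding’ box corresponds to undoing the sRGB transfer curve before any linear-light processing. A minimal numpy sketch of the standard sRGB decode (the textbook IEC 61966-2-1 formula, not agx-emulsion’s actual code):

```python
import numpy as np

def srgb_cctf_decode(v):
    """Standard sRGB transfer function: encoded values in [0, 1] -> linear light."""
    v = np.asarray(v, dtype=float)
    # Linear segment near black, power curve elsewhere.
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

# Treating sRGB-encoded pixels as if they were already linear (i.e. leaving
# the box unchecked for a plain JPG) skips this curve entirely.
print(srgb_cctf_decode([0.0, 0.5, 1.0]))
```

Skipping (or double-applying) this decode, or picking the wrong input colour space, would produce exactly the kind of contrast and clipping jump described above.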